VFS code questions

Kevin M. Kilbride kmk at ssl.org
Fri Feb 25 01:05:35 PST 2005

I'm new to the FreeBSD/DragonflyBSD system code, so I don't know what 
some of the underlying motivations for its design have been, but what 
I'm seeing while making a WARNS=6 pass at sbin/fsck is raising a few 
uncomfortable questions for me. I'm hoping someone will entertain my 
ignorance for a moment and save me much time by providing insight. If 
there is a better forum for such discussion, my apologies in advance, 
but I couldn't see anyplace else that looked more inviting.

As I'm attempting to thread my way back into the deep hierarchy of 
preprocessor macros in the UFS headers, I'm finding macros two or more 
layers deep with variable return types that perform comparisons between 
signed and unsigned objects and (more disturbingly) use int32 masks on 
int64 values. The invoking code then coerces the results of these macros 
to fix their type and explicitly discard high-order bits that may have 
been left set by mask sign-extension and which (presently) aren't needed.

Not only does all this strike me as cavalier, it is also extremely 
confusing and difficult to unravel from a reverse perspective---to say 
nothing of what it does for debugging.

My questions, then, are twofold:

1. Is there a coding policy issue at Dragonfly that precludes the use of 
static inline functions, as opposed to preprocessor macros? Is somebody 
going to shoot me if I attempt to thread my way back through the 
type-spaghetti of these header files and write them out as inlines? If 
this is done with the __inline__ keyword, it shouldn't break user code 
compiled with GCC even if the pedantic ANSI mode is used. It may, 
however, force separate instances to be emitted in each compilation unit 
if optimization is turned off (actually, in GCC 3.4 you can explicitly 
tag functions for inlining even when optimization is turned off, but 
that's of no use at this point, since GCC 2.95 is the default system 
compiler). Anyone compiling without optimization is not likely to be 
concerned about performance penalties incurred from the function call 
overhead, and it seems a small price to pay for making these logic atoms 
penetrable to type analysis by both humans and compilers. It's a lot 
easier to find a fault when the compiler points you to a line of code in 
your inline definition than it is if it complains about fifty different 
instances of macros that have been expanded into single lines of source.

2. Being forced to unravel the underlying types of objects in these 
macros almost begs for a careful reconsideration of existing type 
relationships. For example, why are so many obviously non-negative 
parameters (like masks) in the FFS superblock struct declared as signed 
integers? Actually, almost everything is signed. Will it break something 
horribly if these are flipped to unsigned integers of comparable width?

Since I know very little about the code I'm looking at, per se, I have 
visions of re-installing Dragonfly after wiping out my hard drive if 
something I do compiles, but fails later as a result of non-explicit 
logical dependencies (objects being passed through void pointers, for example). 
How many monsters are there in these waters, and where might they be?

More information about the Submit mailing list