Efficient coding considered harmful?

Guy Harris <guy@auspex.UUCP>
Sat Nov 12 07:23:12 AEST 1988


>Either way we lose.

"Either way" is generally used when there are two ways.  There is a
third way, which, although it may not catch type mismatches *all* the
time, will probably catch a hell of a lot of them:

	put the prototypes into an #include file, and make sure the
	module that defines a function includes the #include file that
	declares that function.

The compiler will complain if the prototype in the #include file does
not match the prototype on the function itself.

BTW, putting function declarations in an #include file is a Very Good
Idea even if you don't have prototypes; without prototypes, you may not
be able to declare the types of the arguments to the function, but you
can declare the type of the function's return value, which is
important.... 

>But don't cram your views that make a machine run slower down my
>throat!

I have yet to see any indication that adoption of stronger-type-checking
notions in the dpANS will "make a machine run slower" by any significant
amount.

In fact, if prototypes had been there since Day 1 and had been the
*only* way of declaring functions (this may perhaps have made the
language too big for the PDP-11 to compile, I don't know - I'm not
saying that this would necessarily have been the best thing), and if
"varargs" or "stdarg" had been the only permissible way of writing
variable-length-argument-list functions, calling sequences where the
callee, rather than the caller, pops the arguments could safely have
been used, since the compiler could feel reasonably confident that if
a function expects N arguments of particular types, it will be passed
just such a list of arguments.  It has at least been asserted that on
some machines such calling sequences are faster (e.g., the debates
over the "C" and "Pascal" calling sequences on 80*86 machines).

"Strong typing is for weak minds" is for weak minds.



More information about the Comp.lang.c mailing list