Func Protos with K&R Func Defs

Steve Summit scs@adam.mit.edu
Fri Mar 1 12:57:49 AEST 1991


In article <11634@jpl-devvax.JPL.NASA.GOV> david@jpl-devvax.JPL.NASA.GOV (David E. Smyth) writes:
>scs@adam.mit.edu (Steve Summit) writes that ANSI C may well stop
>supporting old-style function definitions at some time in the future.
>Therefore he recommends that writing K&R style definitions is probably
>a bad idea in the long term.
>Well, the problem is this: ANSI is not portable, K&R is.
>I actually wanted to use ANSI for the Widget Creation Library, and
>initially implemented it using ANSI.  Problem:  Many, many people could
>not compile it!  So, I went back to K&R.  Now its portable.

I was so busy covering all the bases it was probably hard to tell
what I was recommending.  It sort of depends on who's asking.
Remember, programming is an essentially pragmatic activity; we
don't have to be idealists about these things.  The best
long-term strategy is not necessarily the best short-term
strategy.  (I *hate* that attitude, and I'd rather it didn't
apply, but I suppose sometimes it does.)

In my own work, I use the "disrecommended" old-style function
definitions almost exclusively (with the exception of varargs
functions).  Most of the time, I don't even bother with a bunch
of externally-visible prototype declarations: they're not
required, they cost me effort to create and to find homes for,
and they cause extra recompilations, because the right place to
put them is in header files, of which there must then be more,
#included more widely.
(Many people claim that prototypes are also good for cross-file
argument type checking; I use lint for that.)
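
To make the contrast concrete, here is one hypothetical function
(the name imax is mine) written both ways; the #ifdef is there
only so the fragment also gets past a strict ANSI compiler, since
in my own code the old-style form would simply appear by itself:

```c
#include <assert.h>

/* One function, both ways.  Under the practice described above,
 * the old-style form below would be used unconditionally; the
 * #ifdef is only so this fragment is also acceptable ANSI C. */
#ifdef __STDC__
int imax(int a, int b)          /* new-style, prototype definition */
{
    return a > b ? a : b;
}
#else
int imax(a, b)                  /* old-style ("K&R") definition */
int a, b;
{
    return a > b ? a : b;
}
#endif
```

Note that with the old-style definition, a caller with no
declaration in scope gets no argument checking at all; that is
the checking I leave to lint.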

Another popular technique is to write code using new-style,
prototype function definitions, using a simple converter when the
source must be passed to an old-style tool.  If you always write
your function definitions in the same, stylized way, the
converter can be quite simple indeed, practically an awk or sed
script.
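
For instance, if every definition always keeps its full prototype
alone on one line (scale here is a made-up example), the
converter never has to parse anything but that one line:

```c
#include <assert.h>

/* A definition written in a fixed, stylized layout: the return
 * type on its own line, the complete prototype alone on the next.
 * A simple textual converter only ever rewrites the prototype
 * line, knocking the parameter types down into separate
 * declarations. */
double
scale(double value, double factor)
{
    return value * factor;
}
```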

However, if you're shipping code all over the place, it's
probably easier if you ship old-style code, rather than having to
ship the conversion tool (and teach people how to use it).

My previous two articles on this topic did *not* come out and say
"I use old-style function definitions for portability; doing so
is clearly superior and everybody else should, too."  There
aren't always easy, simplistic answers; that's why programming
can be hard work and why debates here (and elsewhere) can rage
for so long.  Whether I like it or not, the use of modern,
prototyped declarations and function definitions is the currently
recommended practice, and I would not be doing anybody a favor if
I went around actively discouraging their use.

Certainly, there are situations in which interim avoidance of
full-scale function prototypes is advisable; it sounds like the
X toolkit is a good example.

There are several internally-consistent strategies I can think
of, with a number of tradeoffs between them:

Strategy 1: "Reactionary."  Use "Old C" exclusively.

Of course, "Old C" is not as precisely defined as "ANSI C"; there
are a number of variants.  Someone using this strategy codes only
for a small number of machines, none of which has an ANSI
compiler.  The code is not terribly portable outside of that
group of machines.  The coding style is tailored to the "Old C"
dialects present on the old machines in question.

Strategy 2: "Curmudgeon."  Use the intersection of "Old C" and
ANSI C.

This is probably the most demanding strategy, because you can't
use any ANSI features, and you can't use a number of "Old C"
features which have been disallowed, or had their definitions
changed, by X3J11.  Like all good, simplistic tradeoffs, however,
this most demanding strategy is also the most portable, at least
for now.

You can modify this strategy, to encompass certain kinds of ANSI
features, if necessary.  For example, any functions which accept
a variable number of arguments must be defined for an ANSI
compiler using a new style, prototype definition, inside #ifdef
__STDC__.  You can use newer standard library routines (strtol,
vfprintf, etc.), even if your old systems don't support them, if
you can provide your own implementations.
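
A sketch of the varargs case, with made-up names (fmtmsg and its
arguments are mine): an ANSI compiler sees a prototype definition
using <stdarg.h>, while an old compiler sees a <varargs.h>
definition instead.

```c
#include <stdio.h>
#include <string.h>

#ifdef __STDC__
#include <stdarg.h>

/* ANSI version: a prototype definition with "..." */
int fmtmsg(char *buf, char *fmt, ...)
{
    va_list ap;
    int n;

    va_start(ap, fmt);
    n = vsprintf(buf, fmt, ap);
    va_end(ap);
    return n;
}
#else
#include <varargs.h>

/* Old version: every argument, even the fixed ones, is fetched
 * with va_arg. */
int fmtmsg(va_alist)
va_dcl
{
    va_list ap;
    char *buf, *fmt;
    int n;

    va_start(ap);
    buf = va_arg(ap, char *);
    fmt = va_arg(ap, char *);
    n = vsprintf(buf, fmt, ap);
    va_end(ap);
    return n;
}
#endif
```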

Strategy 3: "Schizophrenic."  Use old-style function definitions
and new-style, prototype declarations, keeping the declarations
inside #ifdef __STDC__ or the equivalent.

This is like 2, except that you start adding prototypes.  (It's
therefore more work, so maybe 2 isn't the most demanding after
all.  Never mind.)  There are a couple of reasons for adding the
prototypes, while leaving the definitions old style: to placate
people who have "grown up" with prototypes, and are used to
seeing them, and to placate new compilers, which often issue
warnings for function calls without prototypes in scope.
If the code is compiled on systems with new compilers but without
lint, the prototypes can help keep the calls correct there.
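
A miniature of this strategy (triple is a made-up example).  Under
the strategy proper, only the declaration is guarded and the
definition stays old style unconditionally; the definition is
guarded here as well merely so the fragment is also acceptable to
a strict ANSI compiler.

```c
#include <assert.h>

/* The declaration a caller sees: a checked prototype under an
 * ANSI compiler, an empty-parenthesis declaration under an old
 * one. */
#ifdef __STDC__
extern long triple(long n);
#else
extern long triple();
#endif

/* The definition.  (In strategy 3 proper, the old-style form in
 * the #else branch would be used by itself.) */
#ifdef __STDC__
long triple(long n)
#else
long triple(n)
long n;
#endif
{
    return 3 * n;
}
```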

There is a drawback to this "fence sitting" strategy, though: if
several people are developing, maintaining, and modifying the
code, some with new compilers and some with old ones, it's easy
for the function prototypes to get screwed up, because only half
the programmers care about them, and only half the compilers see
(and check) them.  (You can use some Stupid Preprocessor Tricks
to arrange for there to be one set of declarations and/or
definitions, rather than two, but they typically require
counterintuitive double parentheses.  Several people claim this
is a reasonable tradeoff; I'll not argue with them.)
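
The usual trick looks something like this (PROTO and gcd are
illustrative names of mine): a macro that expands to a real
argument list under ANSI C and to empty parentheses under an old
compiler.  The inner parentheses keep the commas in the argument
list from looking like multiple macro arguments -- hence the
counterintuitive doubling.

```c
#include <assert.h>

#ifdef __STDC__
#define PROTO(args) args        /* ANSI: keep the prototype */
#else
#define PROTO(args) ()          /* Old C: empty parentheses */
#endif

/* One declaration serves both kinds of compiler. */
extern int gcd PROTO((int m, int n));

/* (Definition shown new style here for brevity.) */
int gcd(int m, int n)
{
    while (n != 0) {
        int t = m % n;
        m = n;
        n = t;
    }
    return m;
}
```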

Strategy 4: "Liberal."  Use a subset of ANSI C, such that simple
tools can reliably convert the code back to old-style, when
necessary.

This is a nice, forward-thinking strategy.  (Until you run it
through the converter, the code is 100% ANSI compliant, and it
doesn't run afoul of the "future directions," namely that
old-style support may go away some day.)  It's a bit of work to
steer clear of new features (especially those in the
preprocessor, such as # and ##), which can't be converted back to
old style trivially, and to stay within whatever formatting
conventions the converters expect, simplistic as they may be.
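
For example, these two ANSI-only preprocessor operators have no
"Old C" spelling at all, so no simple textual converter can
handle code that uses them (the macro names are mine):

```c
#include <assert.h>
#include <string.h>

/* # turns a macro argument into a string literal; ## pastes two
 * tokens together.  Neither has an equivalent under an old
 * preprocessor. */
#define STRINGIZE(x)  #x
#define PASTE(a, b)   a##b

int PASTE(get_, answer)(void)   /* defines a function get_answer */
{
    return 42;
}
```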

Strategy 5: "Don't look back."  Use ANSI C exclusively and
comprehensively.

Obviously, you can use this only if you aren't likely to move the
code to a system with an old compiler, and you aren't interested
in making it easy for anyone else who might have to.  It's also
the easiest strategy, since all the new "Programming in C" books
describe ANSI C.  (Soon enough, converters will exist which will
do a good job of translating function prototypes back to
old-style, but you'd need a full compiler, with a "K&R" back-end
"code generator," to translate code which makes full use of any
and all available ANSI features, as whole-hog "Don't Look Back"
code inevitably would.)


We could sit around and discuss the interlocking tradeoffs
between these and other strategies for a long, long time.  If
widespread portability is important, choose 2 or 3.  If your main
development compiler accepts prototypes, strategy 4 is
attractive; if most of your work is under older compilers,
strategies 2 and 3 look better.  If you'd rather not worry about
the caveats associated with mixing prototypes and old-style code
(these have been discussed in this thread already), lean towards
5, or maybe 4.  If you'd like to avoid a wholesale rewrite on the
day when the "obsolescent" old-style forms become truly obsolete,
definitely use 4 or 5.  If you believe that decent automated
converters will certainly be available by then, don't worry about
using 2 and 3.

Already, large numbers of people appear to be jumping into
strategy 5 with both feet.  This strategy appears to hold the
fewest surprises (you don't have to worry about f(x) float x;
stuff), but it is a little surprising how fast the cry of "I just
picked up this nice program but it's written in ANSI C and my
old compiler chokes on it; what should I do?" has become
widespread.


I don't know whether a future revision of the Standard will
actually delete support for old-style functions or not.  Karl
Heuer and I discussed it once, and agreed that the situation was
comparable to the old =op operators (=+, =-, etc.).  These were
already obsolescent in 1978, and the compilers were starting to
issue warnings for them.  Ten years later, most (but not all!) of
the =op's in old code had been weeded out, and X3J11 could delete
support for them with impunity.

On the other hand, there were fewer compilers and fewer lines of
code around in the late 70's, so there probably wasn't the kind
of backwards pressure then (to keep using =op in code, for
portability's sake) as there is now (to keep using old-style
functions).  If the reactionaries, curmudgeons, and
schizophrenics keep it up, there may be so much "old" code
around when the standard comes up for review that the old
functions will be left in (or supported by most compilers as an
extension).

On the other other hand, there's an awful lot of extra baggage in
an ANSI compiler to handle both types of function declarations,
and it would be best to pare that down eventually.  Since
deleting prototypes isn't an option, old-style functions probably
ought to go some day.  (Anyway, I'm sure there will be good
converters available by then, to automate the conversion of any
remaining old-style code.)

Personally, I'll probably switch over to prototype-style
definitions within the next few years, although I'll never be big
on prototyped external declarations.  (The implicit declaration
rule remains in force, and I don't think it's slated for removal,
so it will remain legal to define int-return, fixed-number-of-
arguments functions using new-style, prototype syntax, and to
call them without any external declarations in scope at all,
prototypes or no.  Once old-style syntax has been fully retired,
this could be classed as a "New Reactionary" or "Wishful Thinker"
strategy.)

                                            Steve Summit
                                            scs@adam.mit.edu


