C common practice. (was: low level optimization)

Richard A. O'Keefe ok at goanna.cs.rmit.oz.au
Tue Apr 30 11:53:13 AEST 1991


In article <22649 at lanl.gov>, jlg at cochiti.lanl.gov (Jim Giles) writes:
> Even closely related codes tend to be fragmented this way.
> For example: the X11R4 xterm program is maintained as 16 separate
> source files even though it is _one_ utility and I know of no other
> program which shares any code with these 16 files.  There is not
> really any reason for this code to be maintained in more than _one_
> file.

There are two separate good reasons that I can think of.
Not having read any of the X sources, I have no idea which, if either,
is relevant to xterm.

1.  Modules.  Functions and variables declared 'static' are private to
    a file, and can't be confused with functions or variables of the
    same name in other files. This is a Good Thing. (A small sketch
    appears just after this list.)

    It would, of course, be a Better Thing to have an explicit module
    construct in the language.  See Modula-2, Modula-3, Ada, Fortran-90.
    There has been talk on comp.std.c++ about a module construct for C++.
    
2.  Recompilation speed.  If you change one file, you recompile one file.
    That can be a lot faster than recompiling the whole program.

    It would, of course, be a Better Thing to have compilers which just
    recompiled the parts of the program which depended on the change.
    In the 70s I had the privilege of using a Burroughs B6700; now that
    was a batch, card-oriented system, but if you gave their compilers
    a "patch file" they were smart enough to recompile just the routines
    which had been affected by the changes.  It worked wonderfully well;
    the operating system was *one* giant Algol (well, Espol) source file,
    and a patch could be compiled in a minute.  One thing that helped was
    that macros (DEFINE) in their languages were lexically scoped.  All
    this on a machine which was small and slow by comparison with today's
    UNIX boxes.
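
To make point 1 concrete, here is a minimal sketch; the file and function
names are invented for the example, not taken from xterm or anything else
in the X sources.  Each file keeps its own private log_event() and
refresh_count, the linker never sees either name, and so the two files
cannot clash:

    /* screen.c -- these names are private to this file */
    static int refresh_count;

    static void log_event(void)
    {
        refresh_count++;
    }

    void redraw_screen(void)
    {
        log_event();
    }

    /* input.c -- the same names again, but entirely separate objects;
       the two files still link together without any complaint */
    static int refresh_count;

    static void log_event(void)
    {
        refresh_count--;
    }

    void handle_keypress(void)
    {
        log_event();
    }

Collapse the two files into one and you must rename or merge one pair;
keep them apart and 'static' does the bookkeeping for you.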

I like C, but if I had a PC, I'd buy an Ada compiler for it.

> I don't maintain that keeping code in separate file is necessarily
> bad or good.  But to pretend that it is not common practice is to 
> ignore reality.  This common practice may in (the near) future result 
> in less efficient code because of missed optimization.

It is difficult to see why.  Is there any reason why a C compiler should
not have an option "-c.a" (make a ".a" file) so that
	cc -o lib.a -c.a *.c
would make a library file in one step, with cross-file optimisation
possible?  The IBM mainframes and VAX/VMS support "concatenated files",
where 
	compile foo.c+bar.c+ugh.c
acts _as_though_ the files had been concatenated; if you just imagine
a directive "remove static symbols from symbol table" interposed between
the files, compiling a collection of files into one object with
cross-file optimisation ought to be easy.
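
As a small illustration of what cross-file optimisation could buy (again,
the files and the function are made up for the example): compiled one
file at a time, the call to dot3() below has to be a genuine call,
because the compiler never sees its body while compiling main.c; a
compiler handed all the files at once, by either scheme above, is free
to expand it in line.

    /* vec.c */
    double dot3(const double a[3], const double b[3])
    {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    }

    /* main.c -- compiled separately, only the declaration is visible */
    #include <math.h>

    extern double dot3(const double a[3], const double b[3]);

    double cosine_of_angle(const double a[3], const double b[3])
    {
        return dot3(a, b) / sqrt(dot3(a, a) * dot3(b, b));
    }

Each per-file "cc -c" run forgets the body of dot3() before main.c is
compiled; that is the missed optimisation being worried about, and it is
a property of how the compiler is driven, not of keeping the sources in
separate files.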
-- 
Bad things happen periodically, and they're going to happen to somebody.
Why not you?					-- John Allen Paulos.


