C common practice. (what it really is)

Thomas M. Breuel tmb at ai.mit.edu
Sat Apr 27 04:57:19 AEST 1991


In article <22354 at lanl.gov> jlg at cochiti.lanl.gov (Jim Giles) writes:
   On the contrary.  Putting each C procedure into a separate file _is_
   common practice.  It is promoted as "good" style by C gurus.  Skilled
   C programmers recommend it - they don't avoid it or condemn it.

Lest some novice C programmer get confused out there, jlg's statement
is false. In the X11R4 dix server code, xterm, and twm, for example,
the average is about 20-30 functions per source file, and that is
probably fairly typical of C code in general.

Common C practice is to put related functions into a single source
file, since C implements a simple form of modularization at the
source-file (read: compilation-unit) level, via the "static"
declaration.

Back in the days of the PDP-11 with its 64k code space, people would
split up library source code so that each source file contained only
one function. This was because the UNIX link editor pulls in all the
functions and global data defined in an object file from a library,
even if only one of them is actually used.

There was some thought given to modifying the UNIX link editor, but
with the advent of virtual memory and shared libraries, there is very
little incentive to do so on UNIX systems. Some vendors actually have
more sophisticated link editors that manage to pull apart object
files, for example for cross-compilers for embedded controllers, where
every byte counts.

I know of no C programmer who considered this sort of butchery
(splitting up your sources to get around a limitation of the link
editor) "good style", though.

In terms of compilation speed and code optimization, having only one
function per source file is about the worst you can possibly do: you
pay the cost of parsing lots of include files for every function you
compile, and, with most compilers, you inhibit all global
(cross-function) optimization.

						Thomas.
