'nmake'

Steve Parker sparker@unisoft.UUCP
Thu May 25 05:52:38 AEST 1989


In article <11581@ulysses.homer.nj.att.com>, ekrell@hector.UUCP (Eduardo Krell) writes:
> The need for these is that the make model is too simplistic: As I said
> before, time stamps are not good enough to determine when a file should
> be recompiled. When projects get too big, the makefiles get too complicated
> and one is never 100% sure that make is recompiling all the files it REALLY
> needs to and that it's not compiling too much.

However, writing a program that knows everything there is to know about
software regeneration is bound to be folly.  Just as no one programming
language fits everyone, no one method of software regeneration fits
everyone.  Instead, I prefer to choose an understandable, predictable,
and easily adaptable tool.  I especially prefer that to a complex,
difficult-to-understand tool with a difficult-to-read syntax.
Thus, I view nmake as an example of the de-evolution of make.

> On a small system, this might not make a big difference, but on a big
> project where building the system takes a day of CPU time or more,
> it's critical that this process be as efficient and as reliable as
> possible.  Many of the projects using nmake now have cut their building
> times because back when they used make, they had to rebuild everything
> from scratch as they didn't trust make to do the right thing.

So nmake maximizes both efficiency and reliability?  Let me see:  The
most reliable way to regenerate software is to recompile the whole world
every time.  (No errors or variability are possible.)  The most efficient
way is to recompile only the absolute minimum number of files.  Sounds
like a trade-off to me.  And again, I assert that nmake is based on a
complicated and confused paradigm of software regeneration that falls
somewhere in the middle.

> >	"As a testimony to the strength of this metalanguage, most new make
> >	features and ideas have resulted in changes to the standard builtin rules
> >	(written as a makefile) rather than in changes to the [nmake source]."
> 
> But what bothers you: that he did succeed in doing that? Are you annoyed
> that someone can write a general purpose make engine which can be tailored
> with higher level rules?

No.  I have always been able to get make to do pretty damn close to anything
I want.  (Granted, sometimes it has not been as pretty as I would have
liked.)  And in fact, I would probably be much happier if he had succeeded
at making a general-purpose make engine.  Based on my experiences, it was
not a success.

> >This does the wrong thing for some #ifdef'd source.  It still makes nmake
> >slower than it need be, and wrong to boot!?!?
> 
> First of all, you can turn off source scanning if you want. And, anyway,
> source files are rescanned only if they've changed since the last time
> they were compiled, so you're clearly exaggerating the overhead.

Sorry, it makes it slightly slower.  And wrong.

> And it doesn't do the wrong thing for #include's within #ifdef's:
> it knows about them and it doesn't require the included file to
> be there.

Maybe _your_ version doesn't.  Mine does.  The scanning can be turned off,
or cpp can be used instead.  (The latter is clearly the right way.)  But the
default behavior I see is to generate false dependencies.
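
For instance, given a fragment like this (a made-up example; "vaxonly.h"
isn't a real header):

	#ifdef vax
	#include "vaxonly.h"
	#endif

the scan records a dependency on vaxonly.h even when I'm building for a
target where that #ifdef can never be true.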

> I've rewritten the Unix kernel makefiles to a single nmake makefile.
> It went down from about 40 pages of makefile to 1.5 and the system
> can be recompiled much faster now.

Interesting.  After I'd reached the point of having burned three weeks
of my time on nmake bugs with a kernel nmake file, I converted it to
a make file.  It was less than 10% larger than the nmake file.  (I work
in a fairly complicated cross-development environment.)  While I haven't
made measurements, my sense is that make is about 30% faster at figuring
out whether anything needs to be made.  And it recompiles the right things
every time now!  Wow!  Make sure is a great tool....

> How about the use of a co-shell so that individual actions don't need
> to spawn new shells all the time?

Fine idea.  Right now, when I go to build a library, a bug in either ksh
or nmake causes a process to be left hanging around, spinning forever.
(It does build the library okay, though.  I'm thankful for that :-()

> How about having compiled makefiles so that they don't need to be reparsed
> every time you run make?

Bad idea.  Besides, you were just telling me about how short nmake files
are.  Why, the amount of time spent parsing them must be so insignificant
now.  I don't like things I can't see!  I guess I'm a moss-covered old-timer.
I remember when UNIX meant all files are streams of bytes, and the only
ones I couldn't 'cat' were machine executables.  Something about
flexibility and combining tools at will comes to mind....

> How about using a state variable so that if you change its
> value in the Makefile, all the files which use that symbol will be
> recompiled (with the new -D flag)?

How about a shell script that takes the name of a variable I've changed,
uses grep to go through the source files looking for actual uses of
that variable, and touches only _those_ files?  That eliminates the
need for that other nasty binary file that holds all the saved state.
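
Something along these lines would do (an untested sketch; the name
"touchsym" is made up, and the file list is whatever suits your tree):

	#!/bin/sh
	# touchsym: touch the source files that actually refer to a symbol,
	# so that the next run of plain old make recompiles just those files.
	# Usage: touchsym SYMBOL file.c ...
	# Note: plain grep matches substrings; use grep -w if yours has it.
	sym=$1; shift
	files=`grep -l "$sym" "$@"`
	test -n "$files" && touch $files

Run it as, say, "touchsym DEBUG *.c", then run make as usual.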

> How about using .SOURCE rules
> to specify lists of directories where different kinds of source and
> header files are and let nmake generate the right -I flags and compile
> source from those directories?

Path searching for source files is available in numerous makes.  Many
of them are predictable and reliable.

> The list could go on, but I've already made the point.
>     
> Eduardo Krell                   AT&T Bell Laboratories, Murray Hill, NJ
> UUCP: {att,decvax,ucbvax}!ulysses!ekrell  Internet: ekrell@ulysses.att.com

Pleeez.

In my opinion, nmake is a bad idea, poorly done.  Whether it is fixed in
mumble-fatz is a secondary point at best.  My experiences with it lead me
to believe its quality level is lower than what I require of myself before
I release software.

My main feeling, however, is that it is a bad idea.  It violates the
rules of good software design.  The following items are quotes from
Rob Pike's paper on designing windowing interfaces in USENIX's Computing
Systems, V1#4:

	o Simple systems are more expressive.
	o Balance the benefit of a feature against its overall cost.
	o Design with a specific customer (or task) in mind.

Nmake isn't simple.  Putting in features like scanning C source files for
dependencies is a poor cost/benefit decision.  And nmake's design reflects
an attempt to design with _all_ possible customers in mind rather than a
specific one.

In my opinion, nmake is a poorer tool even than make.  And other new
makes (e.g., mk and GNU make) have a less confusing, more sensible view
of the world.

Steve Parker
sparker@unisoft.com
{sun,uunet,ucbvax}!unisoft!sparker


