HELP, WE'RE DROWNING!!

der Mouse mouse@thunder.mcrcim.mcgill.edu
Sat Jun 29 00:18:07 AEST 1991


In article <1991Jun24.163819.3125@email.tuwien.ac.at>, hp@vmars.tuwien.ac.at (Peter Holzer) writes:
> mouse@thunder.mcrcim.mcgill.edu (der Mouse) writes:
>> I did (Turbo C, at least).  The only advantages of it I can see over
>> the Sun cc is the fine-grained control over warning generation and a
>> certain degree of ANSIness.  Comparing it to gcc, I see no
>> advantages.
> When I have to work on an ASCII terminal, I see many advantages.
> E.g. source code and compiler error messages are simultaneously
> visible.

This is true anyway.  If you have a decent editor (e.g., (almost?) any
emacs variant) you just split the screen into two windows; if not, use
error(1) to stuff the errors into comments in the source....

> If I have an X-terminal, still one advantage remains.

Now wait a minute, Turbo-C is a MS-DOS program.  What's this about
ASCII terminals and X terminals?  Since when do DOS machines run
terminal lines or X clients?

> The debugger.  Turbo-Debugger is the best debugger I have ever seen.

Have you tried ups?

It's been a while since I did anything with Turbo C, so I don't
remember too many details, but as I recall the debugger was roughly
comparable with ups, as a debugger.  And given all the other problems
with Turbo C, even if it were a substantially better debugger I would
still hold the environment inferior.

>> A partial list of disadvantages I find in 2.0 (these are just the
>> ones I can remember or find in a quick skim of the manuals):

>> - Memory models
> That's not the fault of the compiler but of the hardware

Mostly so, I suppose.  But even if you have a '386 or '486, Turbo C
doesn't know how to use it, or at least 2.0 showed no awareness of
anything past the '286.

> (of course the compiler could just use 32bit pointers everywhere, but
> sometimes you want to save space or execution time.  If you don't
> care, just always use the large memory model, and don't care about
> the rest).

Not even.  You cannot ignore memory models unless you are willing to
stick with 64KB.  Not even the huge model works - the default pointer
type is still far rather than huge.

>> - Not free
> True, but not expensive, either.

Why spend $170 for the privilege of using a binary-only compiler on a
brain-dead excuse for an operating system on a badly mangled
architecture when you can spend $0 and use a source-available compiler
on a real operating system on nearly any architecture you please
(including that same mangled one)?

>> - Source not available

(This alone is enough to ensure I won't use it for anything serious.
And let me add that even what source *is* available isn't
redistributable.)

> That's a pity.  But most people don't know enough of compilers to fix
> bugs anyway.

That hardly excuses it.

>> - "Integrated" environment's editor is almost unconfigurable
> Yes, I miss vi.  But using the integrated environment is faster than
> using vi, make and a debugger.

Perhaps for someone equally used to both.  But in my case, the
slowdowns necessitated by fighting against the stupidities hampering me
at every turn more than outweigh the (very impressive, btw) compilation
speed.

>> - Make is pretty stupid; in particular, it has no default rules, as
>>    far as I could tell.
> I don't know about the default rule (I never used it with UNIX),

That's the thing that means you don't have to write

foo.o: foo.c
	$(CC) -c foo.c

for every .c file.  (Or .obj instead of .o for DOS.)  I don't think
even writing an explicit wildcard rule

.c.obj: ; $(CC) -c $*.c

worked....
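For contrast, a sketch of what the default rules buy you on Unix: with
a stock Unix make, a makefile like this is complete, because the
built-in .c.o suffix rule supplies the compile step for each object
file (prog, foo, and bar are hypothetical names here):

```make
# Relies entirely on make's built-in .c.o rule; no per-file
# compile commands need to be written out.
prog: foo.o bar.o
	$(CC) -o prog foo.o bar.o
```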

So, most of the problems I see with Turbo C are not actually with the C
compiler at all but with the rest of the environment.

					der Mouse

			old: mcgill-vision!mouse
			new: mouse@larry.mcrcim.mcgill.edu


