Atomic #defines (was Re: Password checking program)

der Mouse mouse@mcgill-vision.UUCP
Mon Aug 28 18:32:11 AEST 1989


In article <10765@smoke.BRL.MIL>, gwyn@smoke.BRL.MIL (Doug Gwyn) writes:
> In article <13569@bloom-beacon.MIT.EDU> scs@adam.pika.mit.edu (Steve Summit) writes:
>>	#define ERROR (EOF-1)
> Don't do this.  You don't know what EOF might be defined as, so this
> might not work right.

Then as far as I can see there is *no* way to (portably) choose an int
value which is neither EOF nor a valid char.  Anything I do will (for
some choice of EOF, int range, and char range) either overflow, re-use
the value of EOF, or re-use the value of some char.
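
To make the trap concrete, here is a small sketch; the failure modes
named in the comments are hypothetical implementation choices, not any
machine I know of:

    #include <limits.h>
    #include <stdio.h>

    #define ERROR (EOF - 1)   /* the construct under discussion */

    int main(void)
    {
        /*
         * If EOF happened to be INT_MIN, (EOF - 1) would overflow.
         * If char were as wide as int, every int value -- including
         * (EOF - 1) -- would also be a possible char, so ERROR would
         * collide with some character.  On the usual implementations
         * (EOF == -1, 8-bit chars) it works, but nothing guarantees it.
         */
        printf("EOF = %d, ERROR = %d, char range = [%d, %d]\n",
               EOF, ERROR, CHAR_MIN, CHAR_MAX);
        return 0;
    }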

> EOF belongs to the C implementation.  Invent your own symbols for
> your own uses.

Fine.  But how?
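
The only dodge I can see is to give up on an in-band value entirely
and report status out of band, something like the sketch below
(get_one_char is just an invented name, nothing standard):

    #include <stdio.h>

    /*
     * Report success/failure separately from the character, so no
     * "impossible" int value is ever needed.  Returns 1 and stores the
     * char through chp on success, 0 on end-of-file or error (the
     * caller can ask feof()/ferror() which it was).
     */
    static int get_one_char(FILE *fp, int *chp)
    {
        int c = getc(fp);

        if (c == EOF)
            return 0;
        *chp = c;
        return 1;
    }

    int main(void)
    {
        int ch;
        long count = 0;

        while (get_one_char(stdin, &ch))
            count++;
        printf("%ld characters read\n", count);
        return 0;
    }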

By the way, it seems to me that the required existence of EOF implies
that it is not a legal implementation choice to make char and int
identical.  Is this true?  (Flame retardant: I didn't say "sane", I
said "legal".)

					der Mouse

			old: mcgill-vision!mouse
			new: mouse@larry.mcrcim.mcgill.edu


