Atomic #defines (was Re: Password checking program)
Steve Summit
scs at hstbme.mit.edu
Tue Aug 29 15:50:38 AEST 1989
In article <1625 at mcgill-vision.UUCP> mouse at mcgill-vision.UUCP (der Mouse) writes:
>In article <10765 at smoke.BRL.MIL>, gwyn at smoke.BRL.MIL (Doug Gwyn) writes:
>> In article <13569 at bloom-beacon.MIT.EDU> scs at adam.pika.mit.edu (Steve Summit) writes:
>>> #define ERROR (EOF-1)
>> Don't do this. You don't know what EOF might be defined as, so this
>> might not work right.
>Then as far as I can see there is *no* way to (portably) choose an int
>value which is not EOF and not a valid char. Anything I can do, as far
>as I can see, will... either overflow, re-use the value of EOF,
>or re-use the value of some char.
Sure there is:
#if EOF != -2
#define ERROR (-2)
#else
#define ERROR (-3)
#endif
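(As a minimal sketch of how a caller might then distinguish
the three cases; the function getdigit here is hypothetical,
not from the original program:

	#include <stdio.h>
	#include <ctype.h>

	#if EOF != -2		/* same trick as above */
	#define ERROR (-2)
	#else
	#define ERROR (-3)
	#endif

	/* Read one decimal digit from fp: returns its value
	 * 0-9, EOF at end of file, or ERROR for any other
	 * character.  (Illustrative only.)
	 */
	int getdigit(FILE *fp)
	{
		int c = getc(fp);

		if (c == EOF)
			return EOF;
		if (!isdigit(c))
			return ERROR;
		return c - '0';
	}

Since ERROR is chosen to differ from EOF as well as from every
nonnegative value getc() can return, the three results can't
collide.)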
The only reason I didn't use this trick in the program that
prompted the
original complaint was that I didn't want to require that
<stdio.h> be #included before the header file that was trying to
#define ERROR. (I can't yet depend on ANSI's guarantee that
multiple #inclusions of standard header files are safe, so I
didn't have the header file in question #include <stdio.h>
itself. It occurs to me that
#ifndef EOF
#include <stdio.h>
#endif
is probably a safe way to protect against a non-idempotent
<stdio.h>. I used to occasionally cheat and use #ifndef FILE for
this purpose, but FILE isn't necessarily a preprocessor macro and
should probably be a typedef instead. But EOF is required to be
a macro, right?)
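(Concretely, the header arrangement I have in mind would look
something like this; the header name is made up for illustration:

	/* error.h -- wants EOF from <stdio.h> */

	#ifndef EOF	/* guard against a non-idempotent <stdio.h> */
	#include <stdio.h>
	#endif

	#if EOF != -2
	#define ERROR (-2)
	#else
	#define ERROR (-3)
	#endif

Any file that #includes error.h then gets an ERROR guaranteed
distinct from EOF, whether or not it has already #included
<stdio.h> itself.)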
Steve Summit