Time to standardize "true" and "false"

Rahul Dhesi dhesi@sun505.UUCP
Mon Sep 25 12:02:04 AEST 1989


(The referenced article had a Followup-To header of "poster", which I
think is a nasty thing to have done.)

In article <9464@attctc.Dallas.TX.US> wjf@attctc.Dallas.TX.US (Jesse Furqueron)
writes:
>#define FALSE	0
>#define TRUE	!0 

I suggest that defensive programmers eschew these constants because the
temptation to say

     if (x == TRUE) ...

may overcome you some day, and you will suffer unless you can
guarantee that you never absent-mindedly wrote something like

     x = isdigit(c);
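
Remember that isdigit() is only guaranteed to return zero or some
nonzero value, not necessarily 1, so "x == TRUE" can be false even
when c is a digit.  Here is a minimal sketch of the failure (is_vowel()
and its return value of 4 are my own stand-ins for any such
classifier):

     #include <stdio.h>

     #define FALSE 0
     #define TRUE  !0       /* evaluates to 1 */

     /* Like isdigit(), this is only required to return zero for
      * "false" and some nonzero value for "true"; returning 4 is
      * perfectly legal.
      */
     static int is_vowel(int c)
     {
          return (c == 'a' || c == 'e' || c == 'i' ||
                  c == 'o' || c == 'u') ? 4 : 0;
     }

     int main(void)
     {
          int x = is_vowel('e');

          if (x == TRUE)     /* 4 == 1 is false: this test silently fails */
               printf("x == TRUE\n");
          if (x)             /* any nonzero value is true: this works */
               printf("x is true\n");
          return 0;
     }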


If, on the other hand, you are willing either to be careful to
always say

     x = (isdigit(c) != 0);

or, alternatively, to define

     #define ISTRUE(x)	(x)
     #define ISFALSE(x)	(!(x))

and say

     if (ISTRUE(x)) ...
     if (ISFALSE(y)) ...

instead, then the use of TRUE and FALSE is not so dangerous.
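
For instance, here is a minimal sketch (count_digits() is my own
example, not from the referenced article) in which ISTRUE() is safe
no matter what nonzero value isdigit() happens to return:

     #include <ctype.h>

     #define ISTRUE(x)	(x)
     #define ISFALSE(x)	(!(x))

     /* Count the digits in a string.  ISTRUE() expands to (x), so it
      * is correct for any nonzero result, where "== TRUE" would not be.
      */
     int count_digits(char *s)
     {
          int n = 0;

          while (*s != '\0') {
               if (ISTRUE(isdigit((unsigned char) *s)))
                    n++;
               s++;
          }
          return n;
     }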

Best of all, just think binary and say:

     x = 0;		/* x is false */
     y = 1;		/* y is true */

and for testing use

     if (x)  ...	/* if x is true */
     if (!y) ...	/* if y is false */

If you really must define a macro, try:

     #define ONE   1
     #define ZERO  0

Now if you see

     if (x == ONE) ...

you immediately realize that this could fail to work.

The problem is that in C any nonzero value is considered to be
true when tested in a boolean context, so

     #define TRUE  1

is misleading.  In a richer language you could perhaps say:

     #define TRUE  [-inf..-1, 1..+inf]

Rahul Dhesi <dhesi%cirrusl@oliveb.ATC.olivetti.com>
UUCP:  oliveb!cirrusl!dhesi


