Style guides and portability

Adrian McCarthy adrian at mti.mti.com
Wed Jan 16 06:18:56 AEST 1991


In article <1991Jan13.182655.17672 at athena.mit.edu> scs at adam.mit.edu writes:
>Doug Gwyn (I think) wrote:
>> No, any C compiler worth using (and certainly any that conforms to the
>> standard) will provide at least 16 bits for an int, at least 32 bits
>> for a long, and at least 8 bits for a char.

Who cares how many bits are used in the representation?  What really matters
is the range of legal values.  Just because an int is 32 bits doesn't
guarantee that its range is -(2^31) to +(2^31 - 1).  That assumes a
two's-complement machine.  While this may be the only representation you
ever run into, if you're trying to remain portable I'd watch out for this.
Someday you might meet a sign-magnitude or even a BCD (Binary Coded
Decimal) machine.
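
For instance, a minimal sketch along these lines (the long value and the
narrowing step are made up purely for illustration) tests the actual range
from <limits.h> rather than assuming a representation:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        long population = 100000L;      /* hypothetical value */

        /* Narrow to int only if the value actually fits; don't
           assume int has 32 bits or uses two's complement. */
        if (population >= INT_MIN && population <= INT_MAX) {
            int n = (int)population;
            printf("fits in an int: %d\n", n);
        } else {
            printf("needs a long: %ld\n", population);
        }
        return 0;
    }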

Granted, if you're trying to put a bitmask into an int, it's the number of
bits that counts.
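
If that's what you're after, something like this (untested, and it leans
only on <limits.h>) counts the value bits an unsigned int actually has at
run time instead of guessing 16 or 32:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned int mask = UINT_MAX;
        int bits = 0;

        /* Shift the all-ones value down until it's gone; the shift
           count is the number of value bits in an unsigned int. */
        while (mask != 0) {
            bits++;
            mask >>= 1;
        }
        printf("unsigned int has %d value bits\n", bits);
        printf("CHAR_BIT * sizeof(int) = %d\n",
               (int)(CHAR_BIT * sizeof(int)));
        return 0;
    }

(On a machine with padding bits those two numbers needn't agree, which is
rather the point.)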

>In any case, the "minimum maxima" for <limits.h> in section
>2.2.4.2.1, combined with the requirement of a "pure binary
>numeration system" and other language in section 3.1.2.5,
>effectively imply the 16 and 32 bit sizes.)

Yes, use <limits.h>.  But does "pure binary numeration system" imply that
you can't make an ANSI-compliant C compiler for a BCD machine?

Aid.  (adrian at gonzo.mti.com)


