bits in an int vs. long?

Frotz frotz at drivax.UUCP
Fri Oct 6 05:49:13 AEST 1989


logan at inpnms.UUCP (James Logan) writes:

>There is not a definition for int, so I have to use LONG.  The only
>time I can see this falling apart is if we port to a UNIX system
>with an odd-sized int or long.  (I realize that it is wrong to make
>assumptions about the number of bits in an int or a long, BTW.  I
>just can't convince anyone else.)

>Unless there is a clear real-world argument against the
>assumption that int's and long's are the same size, I will have
>to treat the two as interchangeable.  Comments?

Compilers for the Intel 186, 286, and 386 processors all use 16-bit ints
by default.  The i80386 can support 32-bit ints, but you need a compiler
with a 386 code generator to get them.  If you are using a compiler that
DOES NOT GENERATE 386 code, you will most likely NOT get 32-bit ints...
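
If you need a type that holds 32 bits on both kinds of compilers, you can
pick it at compile time instead of assuming sizeof(int) == sizeof(long).
A sketch, assuming an ANSI <limits.h>; the name 'int32' is only for
illustration:

	#include <limits.h>

	#if INT_MAX >= 2147483647
	typedef int  int32;	/* int already holds 32 bits here */
	#else
	typedef long int32;	/* else fall back to long (>= 32 bits) */
	#endif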

It is my understanding that the only guaranteed relationship is:

	sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long)

with a short of at least 16 bits and a long of at least 32 bits.  The
common sizes

	sizeof(short) == 2
	sizeof(long)  == 4

hold on most byte-addressed machines, but the language does not promise
them anywhere.

	'int' may be the same size as a 'short' or a 'long' depending on the
hardware, since it is meant to be the machine's natural word size...  I
have heard that there are processors in the world that use odd word
sizes (20-bit integers???).
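
If you want to see what your own compiler actually does, here is a
two-minute test (a sketch, assuming an ANSI compiler; K&R users can drop
the prototype):

	#include <stdio.h>

	int main(void)
	{
		/* sizeof has an unsigned result; cast to unsigned long so
		 * the same printf format works everywhere */
		printf("char  = %lu\n", (unsigned long) sizeof(char));
		printf("short = %lu\n", (unsigned long) sizeof(short));
		printf("int   = %lu\n", (unsigned long) sizeof(int));
		printf("long  = %lu\n", (unsigned long) sizeof(long));
		return 0;
	}

On a 286 compiler you should see int = 2; on a 32-bit UNIX box, int = 4
and long = 4.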

"My two bits...  clink... clink..."
--
Frotz


