bits in an int vs. long?

James Logan logan at inpnms.UUCP
Thu Oct 5 02:45:52 AEST 1989


I would like to take a quick poll: do any modern compilers, on a
680x0, 80386, or RISC architecture, use anything besides 32 bits for
their int's and long's?  Please email any comments on this.  

My current project has the following definitions that I must
choose from when using UNIX system calls:  

#define LONG	long
#define BYTE	unsigned char
#define CHAR	char
(others, such as UWORD, etc.)
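
(The idea behind these names is that they pin down an exact size no
matter what the host's int and long happen to be, so on a new port
the header would presumably be adjusted per machine.  Roughly what I
picture that looking like, as a sketch only; the m68k/i386 macros
below are made-up stand-ins, not what our header actually tests:

/* project_types.h -- sketch; platform macros are hypothetical */
#if defined(m68k) || defined(i386)   /* 32-bit long, 16-bit short */
#define LONG	long                 /* exactly 32 bits           */
#define UWORD	unsigned short       /* exactly 16 bits           */
#else
#error "port me: pick 32- and 16-bit types for this machine"
#endif
#define BYTE	unsigned char        /* exactly 8 bits            */
#define CHAR	char
)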

For a while I was using the variable types that the section 2 & 3
man pages declare to interface with the system calls and library
routines, and using the #define'ed types when sending and
receiving data to and from foreign microprocessors.  Now I have
been directed to use these #define'ed types for EVERYTHING.  :-(

There is not a definition for int, so I have to use LONG.  The only
time I can see this falling apart is if we port to a UNIX system
with an odd-sized int or long.  (I realize that it is wrong to make
assumptions about the number of bits in an int or a long, BTW.  I
just can't convince anyone else.)
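
To make the worry concrete, here is the sort of thing I expect would
break if we ported to a machine where long is 64 bits.  The machine
is hypothetical and this is only a sketch, but it is representative
of how we pull fields off the wire from the foreign micros:

#include <unistd.h>

#define LONG long    /* the project's definition */

/* A record from one of the foreign micros carries a 4-byte field.
 * Today sizeof(LONG) is 4 on everything we build on, so reading
 * straight into a LONG works.  On a machine where long is 64 bits,
 * read() fills only half of the object and the value is garbage. */
void get_field(int fd, LONG *value)
{
	read(fd, (char *) value, 4);
}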

Unless there is a clear real-world argument against the
assumption that int's and long's are the same size, I will have
to treat the two as interchangeable.  Comments?
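
If I do end up treating them as interchangeable, the best fallback I
can think of is to make the assumption break the build instead of
failing silently.  Just a sketch, using the old negative-array-size
trick (the typedef names are made up):

#define LONG long    /* as in our header */

/* These compile to nothing, but if either assumption ever stops
 * holding, the array size goes negative and compilation fails. */
typedef char assert_LONG_is_32_bits[sizeof(LONG) == 4 ? 1 : -1];
typedef char assert_int_matches_LONG[sizeof(int) == sizeof(LONG) ? 1 : -1];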

-- 
James Logan                       UUCP: uunet!inpnms!logan
Data General Telecommunications   Inet: logan%inpnms at uunet.uu.net
(301) 590-3069


