What breaks? (was Re: 64 bit longs?)

Anthony DeBoer adeboer at gjetor.geac.COM
Wed Jan 16 07:21:23 AEST 1991


In article <1991Jan15.053356.2631 at zoo.toronto.edu> henry at zoo.toronto.edu (Henry Spencer) writes:
>In article <54379 at eerie.acsu.Buffalo.EDU> chu at acsu.buffalo.edu (john c chu) writes:
>>>It is intuitively appealing, but I would be surprised to see anyone
>>>implementing it:  it would break far too much badly-written software.
>>
>>Can someone please tell me what would break under that model and why?
>
>There is an awful lot of crufty, amateurish code -- notably the Berkeley
>kernel networking stuff, but it's not alone -- which has truly pervasive
>assumptions that int, long, and pointers are all the same size:  32 bits.
>
>At least one manufacturer of 64-bit machines has 32-bit longs and 64-bit
>long longs for exactly this reason.
>
>The problem can largely be avoided if you define symbolic names for your
>important types (say, for example, net32_t for a 32-bit number in a TCP/IP
>header) and consistently use those types, with care taken when converting
>between them, moving them in and out from external storage, and passing
>them as parameters.  This is a nuisance.  It's a lot easier to just treat
>all your major types as interchangeable, but God will get you for it.

It seems to me that there really isn't any _portable_ way to declare a 32-bit
long, for example.  Not that I would want to advocate changing the syntax of C
[again], but for most software the key thing is that an integer has at least
enough bits, rather than a precise number of them.  Perhaps if there were some
declaration sequence like "int[n] variable", where n was the minimum number of
bits needed, and the compiler substituted the smallest integer size that met
the requirement (so an int[10] declaration would get 16-bit integers, for
example), then the language might take a step toward portability.  A
bitsizeof() operator that told you how many bits you actually had to play with
might help too, but even then you'd have to allow for machines that don't use
two's complement representation.

I suppose it's only a minor pain now: use a type called net32_t or int32_t and
define all your types in a short header file you rewrite on each new machine
on which you're going to compile.  There are enough funny situations possible
that it's probably best for a human programmer to look at the application and
the architecture and call the shots.  We've got to earn our salaries once in a
while :-)
-- 
Anthony DeBoer - NAUI #Z8800                           adeboer at gjetor.geac.com
Programmer, Geac J&E Systems Ltd.             uunet!jtsv16!geac!gjetor!adeboer
Toronto, Ontario, Canada             #include <std.random.opinions.disclaimer>