Uses of "short" ?

Guy Harris guy at sun.uucp
Thu Sep 12 14:33:18 AEST 1985


> > Thinking of data objects not as lumps of machine words
> > but as abstractions will, I suspect, improve the quality of your code in
> > general, and specifically its portability.
> 
> This sounds great.  I agree with it to a point.  But doesn't it depend upon
> what one is trying to accomplish?  Certainly those implementing
> communications protocols DO care.  Kernel hackers probably care in lots
> of places, too, especially those writing device drivers.

"Do care" about what?  Even device drivers, protocol modules, etc. are
probably much more likely to be correct if you think of the objects they
manipulate abstractly.  Some bits of code in device drivers on machines with
memory-mapped I/O might have to treat structures which represent device
registers, say, as low-level objects, but 99% of the data the OS manipulates
- and probably a high percentage of the data the device driver manipulates -
are not such low-level objects.

> The net is that sometimes one wants to get close to the machine, and
> sometimes one wants to use C as a high(er)-level language.  That's the
> beauty of C; it CAN be used either way.  But it is up to the programmer.

Even when writing grubby device driver code, you should use C as a
higher-level language.  There's no benefit to be gained from thinking of,
say, the I/O operation/buffer header queue of a disk driver as a bunch of
words, some of which contain addresses, some of which contain counts, etc.
The ability of C to get "close to the machine" is vastly overemphasized; 99%
of the code people write, even in OSes and the like, doesn't need to get
"close to the machine" in the same sense as an assembler language gets
"close to the machine" and, in most cases, *doesn't* get "close to the
machine" in that sense.

> If you really believe what you say, would you support the abolition of the
> short and long data types?  Double as well?  I'd be interested.

No, I don't support the abolition of those data types.  "int" means "most
convenient integral type, guaranteed to hold numbers in the range -32767 to
32767 but not guaranteed to hold anything outside that range".  Needless to
say, this
is an extremely inappropriate type to represent "number in the range -1
million to 1 million".  The type "long" is needed for this (in ANSI C,
"long"s are guaranteed to hold values between -(2^31-1) to (2^31-1)).  "int"
is "closer to the machine" than "long", so if anything "int" should go if
you're trying to increase the distance of code from the machine, not "long"
or "short".

"int" should not be abolished, though, because in a lot of cases "short" and
"long" overspecify the type and don't allow the language implementation
enough freedom to choose the most appropriate type.  If you have some
variable which can use any reasonable amount of space without any serious
effect on the space requirements of the program, and which is *never* going
to be outside the range -32767 to 32767, "int" is the appropriate choice.

	Guy Harris


