int divided by unsigned.

Jim Giles jlg at lanl.gov
Wed Jul 12 09:49:18 AEST 1989


From article <1989Jul11.215930.9042 at jarvis.csri.toronto.edu>, by flaps at jarvis.csri.toronto.edu (Alan J Rosenthal):
> jlg at lanl.gov (Jim Giles) writes:
>>C foolishly doesn't require 'short', 'int', and 'long' to be different data
>>types.  Oh well.
> 
> Sure it does.  It just doesn't require them to be different sizes.  Requiring
> them to be different sizes would be foolish.

Requiring them to be different sizes would make sense.  Allowing them to
be the same size is done for backward compatibility (_both_ meanings of
the word backward intended).  This failing wouldn't have existed if the
language had made proper requirements on the data types from the start.
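
Just to make the status quo concrete, here is a small sketch (mine, not part
of the original exchange).  The standard only promises that the sizes are
ordered, so nothing stops two, or even all three, of them from coinciding:

    #include <stdio.h>

    int main(void)
    {
        /* C guarantees sizeof(short) <= sizeof(int) <= sizeof(long),
           but any of these may be equal.  On many 32-bit machines this
           prints "2 4 4"; on a 16-bit machine it may print "2 2 4". */
        printf("%lu %lu %lu\n",
               (unsigned long) sizeof(short),
               (unsigned long) sizeof(int),
               (unsigned long) sizeof(long));
        return 0;
    }

The three remain distinct types regardless (an ANSI compiler must still
complain about assigning an int * to a long *), which is all the language
actually requires.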

The only case I can think of where it would be useful to have two distinct
data types allowed to be identically implemented would be 'char' vs. 'ascii'.
Here, 'char' could be the machine-specific character set and 'ascii' would
be the ASCII character set.  The two would be identical on machines in
which the usual character set _is_ ASCII.  Other than that, if a language
has two distinct data types, they should have different properties.  Even
better, distinct data types should differ from each other in predictable
ways.  What's wrong with requiring that short be at least twice as precise
as char?  Or that int be at least twice as precise as short?  Or that long
be at least twice as precise as int?  Etc.
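
For what it's worth, here is a sketch (my own, using storage size as a rough
stand-in for precision) of how such a rule could be checked in C itself.
Each typedef declares an array whose size is 1 if the relation holds and -1,
a compile-time error, if it does not:

    /* C89-style compile-time assertions.  On an implementation where,
       say, sizeof(int) == sizeof(long), the last one fails to compile,
       which is exactly the complaint above. */
    typedef char check_short[sizeof(short) >= 2 * sizeof(char)  ? 1 : -1];
    typedef char check_int  [sizeof(int)   >= 2 * sizeof(short) ? 1 : -1];
    typedef char check_long [sizeof(long)  >= 2 * sizeof(int)   ? 1 : -1];

The language itself promises none of these relations; it only promises the
ordering, which is why the check can fail.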


