signed/unsigned char/short/int/long [was: #defines with parameters]

T. William Wells bill at twwells.uucp
Thu Dec 8 15:49:48 AEST 1988


In article <330 at aber-cs.UUCP> pcg at cs.aber.ac.uk (Piercarlo Grandi) writes:
:                                                          I think there
: are two issues here, one the introduction of signed as a keyword, and
: the other neat ways of defining semantics. As to the latter, the whole
: type system would be greatly simplified if one were to say that

`Signed' is there so that we can have `signed char'. As to why one
would want that: the original C specified that, while a char is an
integer type, whether it can hold negative values is up to the
implementer. `Signed char' has the same size as a char, but it is
always signed. (And yes, there are good reasons not to change the
current definition of `char'.)

`Signed' was then allowed as a modifier for the other types, for
reasons of symmetry.

(This is all in the Rationale, in section 3.1.2.5.)
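
A minimal sketch of the difference, assuming a compiler that provides
the draft standard's <limits.h> (the macro names come from there):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* Plain char: whether it can hold negative values is up to
         * the implementation.  CHAR_MIN is 0 where plain char is
         * unsigned, and equal to SCHAR_MIN where it is signed.
         */
        printf("plain char is %s here\n",
               CHAR_MIN < 0 ? "signed" : "unsigned");

        /* Signed char: always signed, same size as plain char. */
        printf("signed char range: %d to %d\n", SCHAR_MIN, SCHAR_MAX);
        return 0;
    }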

As for simplifying the type system: the current one is as simple as
it can be, given that it *must* be compatible with the old one, and
given the addition of a `signed char'.

: [1] there are two distinct types, int and unsigned; they are distinct
: types because different rules of arithmetic apply to them. This would
: make it clear, and I have found very few people that actually
: understand that unsigned is not just "positive int", that the same
: operators applied to int and unsigned have very different semantics.

Then you have simply been hanging around inexperienced C programmers.

Not only that, but you are wrong about there being two distinct
types; from the view you are adopting, there are three.  See below.
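
For what it's worth, the difference those experienced programmers do
understand looks like this; a minimal sketch, assuming a draft-ANSI
compiler:

    #include <stdio.h>

    int main(void)
    {
        int i = -1;
        unsigned u = 1;

        /* When int meets unsigned, the int operand is converted to
         * unsigned, so -1 becomes a very large positive value and
         * the "obvious" comparison fails.
         */
        if (i < u)
            printf("signed rules applied\n");
        else
            printf("-1 < 1 came out false: unsigned rules applied\n");

        /* Unsigned arithmetic is defined to wrap modulo 2^N. */
        printf("u - 2 = %u\n", u - 2);
        return 0;
    }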

: [2] Each of the two distinct types may come in three different extra
: lengths, char, short and long, that are exactly equivalent among them
: for the same type except for the different size.

This would be nice if it were compatible with history; however...

: As to the last point, char has been so far just a short short; a char
: value can be operated upon exactly as an integer. Historically char
: constants have been really the size of integer constants...

This is false. Char has not been, and ought not to be made, just a
short short.  Char is a funny type: it is neither signed nor unsigned,
though it is always implemented as one or the other.

Let me repeat this: there are three signednesses in C:

    1) integers - these have positive and negative values.
    2) unsigned - these have positive values only.
    3) char - these have positive values; sometimes they have
       negative values as well, but that depends on the implementation.
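
A minimal sketch of why the third one matters, assuming the usual
8-bit char:

    #include <stdio.h>

    int main(void)
    {
        char c = '\377';   /* all bits set, assuming 8-bit chars */

        /* Where plain char is implemented as signed this prints -1;
         * where it is unsigned it prints 255.  Neither answer is
         * wrong: plain char is the third signedness.
         */
        printf("'\\377' stored in a plain char = %d\n", c);
        return 0;
    }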

: I would have liked, instead of the unnecessary and confusing signed
: modifier, a nice range(abs_max) modifier for types integer and
: unsigned, and char/short/long defined in terms of it. This would have
: added tremendously to the portability of programs.

And here again you are wrong. While it is sometimes nice to be able
to specify numeric ranges, it is hardly necessary for portability.
Once you understand the C paradigm, you can use the existing types to
write portable code. I do it all the time; my code is regularly
ported to everything from micros to mainframes.
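
As a minimal sketch of the sort of reasoning I mean, relying on the
guaranteed minimum ranges rather than on an invented range() modifier
(the names below are just for illustration):

    /* int must hold at least -32767..32767, so a per-file line
     * count is portable as written.
     */
    int line_count;

    /* Totals that might exceed 32767 go in a long, which must hold
     * at least -2147483647..2147483647.
     */
    long byte_total;

    /* Raw bytes and bit patterns go in unsigned char, which avoids
     * depending on plain char's signedness at all.
     */
    unsigned char buffer[512];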

:                       to sanction the existing practice of some
: compilers (PCC had it, more recent BSD versions fixed this "bug") to
: say "int char" or better "char int"?

`Char int' is, and always has been, illegal. I don't know of a single
compiler that accepts it. I just checked: the one on my system
(Microport, a System V) and the one on my work machine (SunOS,
more-or-less BSD) both reject it. I believe that both are based on
the PCC.  I know that the C compilers (from ISC) we used on our
PDP-11 and our VAX, between four and six years ago, both rejected
`char int' and similar constructs. I'm almost certain that both
compilers were PCC-based.
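
For completeness, a sketch of what does and does not parse (the
illegal line is left commented out):

    /* char int c1; */    /* rejected: `char' and `int' do not combine */
    char          c2;     /* plain char; signedness up to the compiler */
    signed char   c3;     /* always signed, same size as plain char    */
    unsigned char c4;     /* always unsigned                           */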

---

Let me suggest that you carefully read the various books on C,
starting with K&R (the original), the Harbison & Steele book, and
some other good C textbooks. *Then* read the latest draft of the C
standard.  Do all of this without reference to what you would wish
the language to be. Understand what it *is*; develop familiarity with
the paradigms and facts of the language. Then, and only then, will
you be equipped to complain about C's deficiencies.

In the meantime, I'm going to stop responding to your statements
about what C ought to be.  You have formed these opinions in the
absence of knowledge; I have come to believe that I am wasting my
time trying to correct them.

If you have questions about the language, go ahead and ask them, and
you'll even get a civil reply, but please stop telling us what the
language should be until you know what it *is*.

---
Bill
{uunet|novavax}!proxftl!twwells!bill


