characters

Paul John Falstad pfalstad at phoenix.Princeton.EDU
Thu Sep 20 13:36:25 AEST 1990


In article <1990Sep18.162407.15525 at zoo.toronto.edu> henry at zoo.toronto.edu (Henry Spencer) writes:
>In article <2517 at idunno.Princeton.EDU> pfalstad at phoenix.Princeton.EDU (Paul John Falstad) writes:
>>I, for one, loathe the concept of signed chars.  I've wasted countless
>>hours of programming time searching for bugs caused because I forgot that
>>chars are signed by default.  I think chars (in fact, all integer types)
>>should be unsigned by default.  Comments?
>Extending this to the rest of the integer types is silly, unless you
>stop calling them (e.g.) "int" and make it, say, "nat" instead.  People

Ok, THAT was silly.  I got a bit carried away.

In some programs, I just got so annoyed that I did a typedef unsigned
char uchar, and then used uchar instead.  The problem with that is all
the library functions use char *.  I think an ANSI C compiler is actually
required to diagnose mixing the two, since char * and unsigned char * are
incompatible pointer types.  This leaves me with three options.  The first
is to edit the include files (bad idea).  The second is to do a cast each
time, but having to write strlen((char *) str) everywhere is very
annoying.

The third option is to just remember that chars are signed and watch out
for problems caused by the sign extension.  Hmmm.  But complaining is so much
easier...

Incidentally, someone sent me mail saying that neither K&R nor ANSI
specifies that chars must be signed.  True?  All the implementations I've
come across have signed chars, however, which is the only fact I'm
interested in.

Paul Falstad, pfalstad at phoenix.princeton.edu PLink:HYPNOS GEnie:P.FALSTAD
For viewers at home, the answer is coming up on your screen.  For those of
you who wish to play it the hard way, stand upside down with your head in a
bucket of piranha fish.


