binary to ascii

Dan Bernstein brnstnd at kramden.acf.nyu.edu
Sat Sep 15 20:09:41 AEST 1990


In article <13680 at hydra.gatech.EDU> cc100aa at prism.gatech.EDU (Ray Spalding) writes:
> In article <574 at demott.COM> kdq at demott.COM (Kevin D. Quitt) writes:
> >In article <371 at bally.Bally.COM> siva at bally.Bally.COM (Siva Chelliah) writes:
    [ i = ((int) c) & 0x00FF ]
> >    Try "i = (unsigned int) c;" and you'll see it isn't necessary.  
> This is incorrect (where c is a signed char).

We discussed this a few months ago (``How to convert a char into an int
from 0 to 255?''). The conclusion was that (int) (unsigned char) c takes
a character to an integer from 0 through UCHAR_MAX. Other sequences of
casts do not do the job: where char is signed and c is negative,
(unsigned int) c converts the value modulo UINT_MAX+1, which is nowhere
near 0 through 255. Using & is overkill.
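
For concreteness, a small ANSI C illustration (the variable names and
sample value are mine, not from the thread; the difference only shows up
on a machine where plain char is signed):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        char c = -1;   /* negative if plain char is signed */

        unsigned int u = (unsigned int) c;   /* negative value wraps modulo
                                                UINT_MAX+1: not 0..255 */
        int i = (int) (unsigned char) c;     /* always 0 through UCHAR_MAX */

        printf("(unsigned int) c        = %u\n", u);
        printf("(int)(unsigned char) c  = %d  (UCHAR_MAX = %d)\n",
               i, UCHAR_MAX);
        return 0;
    }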

---Dan
