binary to ascii

Ray Spalding cc100aa at prism.gatech.EDU
Sat Sep 15 04:00:34 AEST 1990


In article <574 at demott.COM> kdq at demott.COM (Kevin D. Quitt) writes:
>In article <371 at bally.Bally.COM> siva at bally.Bally.COM (Siva Chelliah) writes:
>>    i=(int ) c;
>>    i=i &  0x00FF;   /* this is necessary because when you read, sign is 
>>                        extended in c   */
>    Try "i = (unsigned int) c;" and you'll see it isn't necessary.  

This is incorrect (where c is a signed char).  When converting from a
signed integral type to a wider, unsigned one, sign extension IS
performed (in two's complement representations), so a char holding -1
becomes 0xFFFFFFFF, not 0x00FF, and the mask is still needed.  See
K&R II section A6.2, "Integral Conversions".
-- 
Ray Spalding, Technical Services, Office of Information Technology
Georgia Institute of Technology, Atlanta Georgia, 30332-0275
uucp:     ...!{allegra,amd,hplabs,ut-ngp}!gatech!prism!cc100aa
Internet: cc100aa at prism.gatech.edu


