C compiler implements wrong semantics

Richard Kuhns rjk at mrstve.UUCP
Wed Feb 5 04:03:17 AEST 1986


In article <2147 at brl-tgr.ARPA> NEVILLE%umass-cs.csnet at csnet-relay.arpa writes:
(I shortened the note...)
>This is posted to unix-wizards instead of to net.lang because i believe
>that it shows faulty semantics in the Unix C compiler.  i don't know
>the proper way to get to arbitrary newsgroups, being an Internet person, so
>if the moderator would kindly forward it there i would appreciate it.
>
>According to my books, if a is 5, then (a++ + a) ought to evaluate to 11.
>On 4.2bsd (or any system using the pcc, i imagine) it evaluates to 10.
>On VMS with VAX-C v2.1, it evaluates to 11.
>
>So the questions for the day are:  Is pcc "right" because it is sort of the
>de facto standard?  (i have a friend who claims that BNF and such are useless,
>the compiler is the only definition of a language that counts)  Is this
>discrepancy between Unix C's behaviour and description already widely known
>and carefully worked around?  Should i attempt to fix it and possibly break
>some code or leave it alone for old time's sake?
>

According to everything quote official unquote I've read on the subject,
there is no discrepancy between Unix C's behaviour and description.
Quoting from "A C PROGRAM CHECKER - lint" (dist. with AT&T SYSV.2),

"In order that the efficiency of C language on a particular machine not be
unduly compromised, the C language leaves the order of evaluation of
complicated expressions up to the local compiler. ...
*In particular, if any variable is changed by a side effect and also used
elsewhere in the same expression, the result is explicitly undefined.* "
-- 
Rich Kuhns		{ihnp4, decvax, etc...}!pur-ee!pur-phy!mrstve!rjk
