What machines core dump on deref of NULL?

Jan Brittenson bson at wheaties.ai.mit.edu
Sat Jul 7 09:01:02 AEST 1990


In article <1990Jul6.152722.5320 at eng.umd.edu>
russotto at eng.umd.edu (Matthew T. Russotto) writes:

>In article <418 at minya.UUCP> jc at minya.UUCP (John Chambers) writes:

>>And on page 192 of my C bible I find the paragraph:
>>	[...] it is guaranteed that assignment of
>>	the constant 0 to a pointer will produce a null pointer distinguish-
>>	able from a pointer to any object.
>>

>Distinguishable in this sense just means you can use tests like
>
>p=0;
>if (p == &someobject)
>	code();
>else
>	othercode();
>
>which will always fail

   Not if someobject happens to reside at address 0, in which case p
does point to an object. In addition, an address calculation can
itself yield a pointer to address 0. E.g.,

		char foo[1];
		... &foo[ -(int) &foo ] ...
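
   Spelled out as a full (if deliberately dubious) program - a sketch
only, since the cast assumes an int is wide enough to hold a pointer,
and the whole construction is undefined behavior in ANSI C:

	#include <stdio.h>

	char foo[1];

	int main(void)
	{
		/* foo advanced by minus its own address: on such an
		 * implementation the expression lands on numeric
		 * address 0 (assuming the address fits in an int). */
		char *p = &foo[ -(int) &foo ];

		/* Where the null pointer is the all-zero address, the
		 * two compare equal even though p was never assigned
		 * the constant 0. */
		if (p == (char *) 0)
			printf("computed address equals the null pointer\n");
		else
			printf("computed address differs from the null pointer\n");
		return 0;
	}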

   Even limiting ourselves to unix and the more common
implementations, we can come up with a reason for treating (char *) 0
as a pointer to a specific object: probing the access rights of one's
u area, for instance. But for compatibility's sake NULL should be
treated as "a pointer not pointing at any object" - especially when
the intention is to later port the software from the unix environment
to a non-unix environment where variables may very well be located at
address 0.
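
   A minimal sketch of that convention (the lookup function and its
names are hypothetical, not from any post in this thread): NULL is
returned to mean "no object", and callers test for exactly that,
without assuming anything about which addresses real variables occupy.

	#include <stddef.h>
	#include <string.h>

	/* Hypothetical lookup: returns a pointer to the matching
	 * entry, or NULL meaning "not pointing at any object" -
	 * never some magic numeric address. */
	const char *find(const char **table, size_t n, const char *key)
	{
		size_t i;

		for (i = 0; i < n; i++)
			if (strcmp(table[i], key) == 0)
				return table[i];	/* a real object */

		return NULL;				/* no object */
	}

A caller then tests the result against NULL and nothing else, which
keeps the intended "no object" meaning and leaves it to the
implementation to keep the null pointer distinct from real objects.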


> (it probably also means that malloc, etc, can never
> allocate an object at 0)

   If malloc() returns NULL, that should be regarded as "no object
allocated." In some future implementation, 0 may be the first location
of an upwards-growing heap for all we know... Of course the problem
could easily be avoided by never using address 0, but that - as far as
I can tell - is exactly what John Chambers argued against in the first
place. To summarize, I think John's argument is quite valid, although
the problem can easily be dodged at the cost of some grace.
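
   A correspondingly small sketch in standard C: the caller reads NULL
from malloc() only as "no object allocated", with no assumption about
where the heap begins.

	#include <stdio.h>
	#include <stdlib.h>

	int main(void)
	{
		char *buf = malloc(1024);

		if (buf == NULL) {	/* "no object allocated" */
			fprintf(stderr, "malloc failed\n");
			return 1;
		}

		/* ... use buf; a successful malloc() never hands back
		 * the null pointer, wherever the heap happens to
		 * start ... */
		free(buf);
		return 0;
	}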

(Disflamer: all above is IMHO.)


