gcc and NULL function pointers.

Daniel Jimenez djimenez at ringer.cs.utsa.edu
Thu Jun 27 11:19:59 AEST 1991


In article <1991Jun26.134355.29334 at cs.odu.edu> kremer at cs.odu.edu (Lloyd Kremer) writes:
>In article <1991Jun26.053508.3634 at ringer.cs.utsa.edu> djimenez at ringer.cs.utsa.edu (Daniel Jimenez) writes:
>>I thought 0 in a context where a pointer is expected (e.g., int *p; p = 0;)
>>wasn't the integer 0, rather whatever that machine's representation of
>>a null pointer is.
>
>Yes.  But problems most often occur when the compiler does not recognize that
>a pointer interpretation is appropriate such as when passing the pointer 0 as
>a function argument with no cast and no prototype in scope.

I know.  When I said that, I was responding to someone who was considering
0 in a pointer context.  I know about execl and other functions where 0
will not do.
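
For anyone who hasn't hit this one: here is a minimal sketch of the classic
trouble spot (my own illustration, not code from the original discussion).
execl() is variadic, so even with a declaration in scope the compiler has no
way to know the terminating argument must be a char pointer; a bare 0 gets
pushed as an int, which may have the wrong size or representation.

#include <unistd.h>

int main(void)
{
    /* right: cast the terminator so a char *-sized null pointer is passed */
    execl("/bin/echo", "echo", "hello", (char *)0);

    /* wrong: execl("/bin/echo", "echo", "hello", 0);
       the bare 0 is an int, not necessarily a valid char * null pointer */

    return 1;   /* execl only returns on failure */
}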

I found my mailbox stuffed this morning with responses to my article.
Ok, ok, there exist machines where char*'s are larger than int*'s.

That being the case, I have to revise this:
>>For what it's worth, here's my opinion on NULL:
>>We should all contribute to a fund to help build a time machine so
>>someone can go back in time and tell K&R to include something like
>>Pascal's nil in C. :-)
>
>OK, I'll hold the money until funding is complete.  :-)

(I'll e-mail you my cash donation :-)
It won't do just to have a Pascal 'nil' in all pointer contexts,
because pointers can be of different sizes, and in an unprototyped
call the compiler has no way to know which size to pass.  So we
should tell K&R just to have a macro

#define NULL(type) ((type *)0) /* no, I didn't make it up.  I stole it. */

instead of just

#define NULL ((your_favorite_type *)0)
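
Just to make the idea concrete, here is a sketch of how such a typed macro
might be used (purely hypothetical; the real NULL comes from the standard
headers and couldn't actually be redefined as a function-like macro):

#define NULL(type) ((type *)0)   /* hypothetical typed null macro */

/* old-style definition, so callers have no prototype in scope */
int is_null(p)
char *p;
{
    return p == NULL(char);
}

int main(void)
{
    /* the typed macro passes a char *-sized, char *-shaped null
       pointer even though the call is unprototyped */
    return is_null(NULL(char)) ? 0 : 1;
}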

I'm sure everyone can agree on that one.  The problem is, time travel
aside, what to do with all those programs where NULL is defined
as 0 or (some_type*)0 when they are ported to weird machines with big
char*'s.

On those architectures, you could add a compiler switch "-stupid_NULL"
that would compile the program so that every pointer passed to or
received by a function is guaranteed to be the machine's largest
pointer size.  Then, when those parameters are used inside the
function, they would be cast back to the right size automagically.
The only case in which they wouldn't be converted would be something like

char *p;
...
if (p == 0) blah_blah ();

because checking for the null pointer then requires only testing for all bits zero.
(before you flame me, read on)
The null pointer would have to be represented by "all bits zero" in this
case, but could be something different during normal compilation.  Any ints
passed to or received by functions would also have to be widened to the
size of the largest pointer (unless integers are larger, in which case
everything would be converted to integers), lest an integer 0 should make
its way into a place where a null pointer was expected.

This switch would be used whenever one of those programs is compiled.
The decreased efficiency of those programs would be an incentive for the
programmers to change their ways.

The fact that I am proposing such a silly solution illustrates the
futility of the whole #define NULL question.
-- 
*    Daniel A. Jimenez			*  Please excuse my longwindedness.
*    djimenez at ringer.cs.utsa.edu	*  This Sun terminal makes everything
*    dajim at lonestar.utsa.edu		*  I write seem important.
* Opinions expressed here are mine only, and not those of UTSA.
