I'm finally putting aside my lurker roots with a predicament:
Here's the story: a quiz in one of my programming classes asked us to add an element to the end of a linked list. Sounds simple enough. I got the quiz back and found I'd lost two points over a small error in traversing the list. Still straightforward, right? Except the reason I lost points wasn't that the traversal was inherently flawed; it was what I can only describe as semantics:
Code:
for(;temp->next;temp=temp->next); //only works if NULL==0
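For context, here's a minimal, self-contained sketch of the kind of traversal I mean. The struct and function names are placeholders of my own, not the actual quiz code:
Code:
struct node {
    int          value;
    struct node *next;
};

/* Walk to the last node (assumes the list is non-empty). The loop
 * condition is just temp->next, which the compiler evaluates as
 * temp->next != 0, i.e. "is this pointer non-null?" -- exactly the
 * construct that got marked off. */
struct node *find_tail(struct node *temp)
{
    for (; temp->next; temp = temp->next)
        ;
    return temp;
}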
I was reasonably irked by that comment. After all, I'd written code like that before and it had worked, so maybe it had been a fluke that only a compiler like gcc would forgive. Or maybe it was limited to a certain subset of compilers, since I've seen a similar line in the decompiled source code of Super Mario 64 (which was compiled with IDO 5.3).
I decided to look into how exactly NULL and 0 are related, or whether they're related at all.
After consulting The C Programming Language (K&R C), I came across this passage:
"Pointers and integers are not interchangeable. Zero is the sole exception: the constant zero may be assigned to a pointer, and a pointer may be compared with the constant zero. The symbolic constant NULL is often used in place of zero, as a mnemonic to indicate more clearly that this is a special value for a pointer. NULL is defined in <stdio.h>. We will use NULL henceforth" (Kernighan & Ritchie, 1978)
What this passage seems to convey is that NULL is a constant put in place so that pointers aren't compared against a bare magic number (0 in this case). So its definition would look something like:
Code:
#define NULL 0
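With a definition like that, comparing a pointer against NULL, comparing it against 0, and testing it for truth all amount to the same check once the preprocessor is done. A quick toy program to illustrate (my own example, not from the quiz):
Code:
#include <stdio.h>
#include <stddef.h> /* NULL */

int main(void)
{
    int *p = NULL; /* identical to: int *p = 0; */

    /* All three conditions ask the same question: is p a null pointer? */
    if (p == NULL) puts("p == NULL");
    if (p == 0)    puts("p == 0");
    if (!p)        puts("!p");

    return 0;
}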
To back my claim with another source, I consulted the ANSI C89 standard, just to make sure the definition of NULL hadn't changed in the transition to standardization (after all, K&R C and ANSI C have quite a few differences).
What I found is that the ANSI C89 standard defines the NULL macro as one that "expands to an implementation-defined null pointer constant", where a null pointer constant is defined as "an integral constant expression with the value 0, or such an expression cast to type void *". This further supports the idea that NULL is just a stand-in for 0, or at least for (void *)0.
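If I'm reading that wording right, an implementation simply picks one of two spellings, and both of the following would be conforming definitions (my own illustration, not copied from any real header):
Code:
/* Under C89, an implementation may define NULL either way: */
#define NULL 0            /* an integral constant expression with value 0 */
#undef  NULL
#define NULL ((void *)0)  /* ...or that same expression cast to void *    */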
With these sources gathered, I emailed the professor to get a second opinion on the issue. I got this in response:
"You are correct in that NULL is most always 0. However, the standard does not REQUIRE that it be 0."
What could they have meant by that? Do such evil compilers actually exist, ones where NULL isn't 0? Could this become an issue in the future? What are your thoughts on the matter?
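For what it's worth, my best guess at the distinction (and I'd love confirmation) is this: the constant 0 written in source code always converts to a null pointer, but the bit pattern the machine actually uses for a null pointer is up to the implementation and need not be all bits zero. A small sketch of where those two ideas would diverge on such a machine:
Code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    int *p = 0;               /* portable: p is a null pointer           */
    int *q;
    memset(&q, 0, sizeof q);  /* all-bits-zero: NOT guaranteed to be a
                                 null pointer on every implementation    */

    printf("p == 0: %d\n", p == 0); /* always prints 1                   */
    printf("q == 0: %d\n", q == 0); /* 1 on mainstream machines, but
                                       formally implementation-dependent */
    return 0;
}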