From: Vesa Jääskeläinen
Subject: Re: [avr-gcc-list] Re: WinAVR 20050214 (gcc 3.4.3) and optimizer bug.
Date: Mon, 09 May 2005 19:47:40 +0300
User-agent: Mozilla Thunderbird (Windows/20050412)
Jeff Epler wrote:
On Mon, May 09, 2005 at 08:21:12AM -0700, Larry Barello wrote:
> I didn't understand everything that was said, but it seemed clear to me
> that having the construct
>
>     char *foo;
>     if (foo && (*foo != 0))
>
> randomly break seemed pretty harsh.

As I recall, your original code was more like:

    1  char *foo;
    2  char bar = *foo;
    3  if (foo && (*foo != 0)) { ... }

Dereferencing a NULL pointer causes undefined behavior. GCC assumes
unixish semantics, where the dereference would cause the program to exit.
In that sense, the test of foo on line 3 has effectively already been
performed on line 2, when foo is dereferenced.
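The pattern Jeff quotes can be condensed into a compilable sketch (the function and variable names here are illustrative, not from the original code). Because the dereference happens before the null test, GCC is entitled to assume `foo` is non-NULL and, with `-fdelete-null-pointer-checks` (enabled at higher `-O` levels on hosted targets), may drop the `foo &&` test entirely:

```c
#include <stddef.h>

/* Sketch of the problematic pattern discussed in the thread. */
int first_char_nonzero(const char *foo)
{
    char bar = *foo;            /* dereference first: UB if foo == NULL */
    (void)bar;                  /* silence unused-variable warning */
    if (foo && (*foo != 0))    /* the "foo &&" check may be optimized away */
        return 1;
    return 0;
}
```

On a hosted system the early dereference would trap anyway, so deleting the check changes nothing observable; on the AVR, where address zero is readable, the deleted check silently changes behavior for NULL inputs.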
Since AVR doesn't have unixish semantics for "NULL"-pointer dereferences, maybe -fno-delete-null-pointer-checks should be the default no matter the -O level. Maybe the bit pattern for NULL should not be all zeros, but that's an even bigger change to contemplate.
Now that you mention it, I notice there is actually a "hidden bug" :) — in reality the function just reads from address zero whenever it is entered with a NULL pointer, and on this architecture that read does not crash, for better or worse.
Now I begin to understand why this "optimization" happened. For now I think it is safer to use this flag to disable the optimization, and then go through every function in the code base looking for similar issues and fix them as they are found.
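For reference, the source-level fix for such functions (a sketch, not code from the thread; the name is hypothetical) is simply to test the pointer before any dereference, so there is no undefined behavior for the optimizer to exploit and the NULL check survives at every optimization level:

```c
#include <stddef.h>

/* Corrected version: the pointer is checked before it is ever
   dereferenced, so the compiler cannot infer it is non-NULL and
   the check cannot legally be removed. */
int first_char_nonzero_checked(const char *foo)
{
    if (foo == NULL)
        return 0;
    return *foo != 0;
}
```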
Thanks for the explanation and for providing a workaround.

Vesa Jääskeläinen