Testing on NULL and uninitialized values
Testing on NULL and uninitialized values
Hello,
for a few days now I've been browsing through the GIMP code.
What I have seen so far looks very tidy.
But I also found some things that I would do differently, throughout the whole code, and maybe also in the libs (I haven't looked at them in detail).
I would set EVERY pointer to NULL when defining it. And normally I would also set ANY other variable to a definite value when defining it.
This has helped me to track errors as early as possible.
Example:
==============================================
/*****************************************************************************/
/*  public functions  ********************************************************/

GimpContext *
gimp_context_new (Gimp        *gimp,
                  const gchar *name,
                  GimpContext *template)
{
  GimpContext *context;

  g_return_val_if_fail (GIMP_IS_GIMP (gimp), NULL);
  g_return_val_if_fail (name != NULL, NULL);
  g_return_val_if_fail (! template || GIMP_IS_CONTEXT (template), NULL);

  context = g_object_new (GIMP_TYPE_CONTEXT,
                          "name", name,
                          "gimp", gimp,
                          NULL);

  if (template)
    {
      context->defined_props = template->defined_props;

      gimp_context_copy_properties (template, context,
                                    GIMP_CONTEXT_ALL_PROPS_MASK);
    }

  return context;
}
==============================================
The test
if (template)
only makes sense if you can be sure that uninitialized values will definitely be NULL.
If you are not sure that uninitialized values will be NULL, then the test
if (template)
makes no sense.
So:
- either consistently set all pointers to NULL, even if you intend to assign another value right afterwards; if you forget that assignment, then at least you have NULL in the pointer.
- or: you can completely avoid tests like the one mentioned above, because you are sure that all values are already initialized.
The former choice is the one that leads to easily maintainable code. The latter choice is what I would call a recipe for code that turns into crap.
It's a lot of work to look for such stuff in all the files.
But I would say, it will definitely help in tracking down errors early.
I can say this from many years of C programming.
Any comments welcome.
Ciao, Oliver
Testing on NULL and uninitialized values
The test
if (template)
only makes sense if you can be sure that uninitialized values will definitely be NULL.
You must have missed the g_return_val_if_fail (! template || GIMP_IS_CONTEXT (template), NULL).
It checks if template is NULL or a pointer to a valid GimpContext. If template is some random non-NULL value, the test will fail and a warning message will be printed. Such warning messages indicate a programmer error and should be dealt with during development.
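For reference, a guard like that behaves roughly like the simplified stand-in below. my_return_val_if_fail is a made-up name; the real GLib macro lives in gmessages.h, logs through GLib's logging machinery and can be compiled out, but the basic idea is the same: if the check fails, warn and bail out of the current function with the given return value.
==============================================
#include <stdio.h>

/* Simplified, illustrative stand-in for a g_return_val_if_fail-style
 * guard (not the real GLib implementation). */
#define my_return_val_if_fail(expr, val)                        \
  do {                                                          \
    if (!(expr))                                                \
      {                                                         \
        fprintf (stderr, "assertion `%s' failed\n", #expr);     \
        return (val);                                           \
      }                                                         \
  } while (0)
==============================================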
--tml
Testing on NULL and uninitialized values
Quoting "Tor Lillqvist":
The test
if (template)
only makes sense if you can be sure that uninitialized values will definitely be NULL.
You must have missed the g_return_val_if_fail (! template || GIMP_IS_CONTEXT (template), NULL).
It checks if template is NULL or a pointer to a valid GimpContext. If template is some random non-NULL value, the test will fail and a warning message will be printed. Such warning messages indicate a programmer error and should be dealt with during development.
[...]
Nice to know, but I was talking about things like the *context in that function.
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
The function I mentioned was just an example.
Uninitialized values are something I see nearly everywhere in the code.
Dereferencing NULL is easy to find, because it crashes early.
Ciao, Oliver
Testing on NULL and uninitialized values
On 04/21/2010 11:58 AM, Oliver Bandel wrote:
Quoting "Tor Lillqvist":
The test
if (template)
only makes sense if you can be sure that uninitialized values will definitely be NULL.
You must have missed the g_return_val_if_fail (! template || GIMP_IS_CONTEXT (template), NULL).
It checks if template is NULL or a pointer to a valid GimpContext. If template is some random non-NULL value, the test will fail and a warning message will be printed. Such warning messages indicate a programmer error and should be dealt with during development.
[...]
Nice to know, but I was talking about things like the *context in that function.
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
The function I mentioned was just an example.
Uninitialized values are something I see nearly everywhere in the code.
Dereferencing NULL is easy to find, because it crashes early.
Hi, Oliver
Have you programmed with glib before? A lot of defensive programming techniques differ between straight C and C-with-glib. For instance, the guards at the top are common, and (I imagine) gimp_context_copy_properties has similar guards. As such, it's the job of the called function, not the caller, to check if a pointer they want to dereference is NULL.
This has the advantage that you don't check a pointer for NULL 10 times across 10 different function calls when you only use it once, all the way at the bottom. Of course, if you actually dereference a value (like the template pointer in the snippet you posted), you should test it before you dereference it.
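A rough sketch of that placement (Thing and thing_frobnicate() are made-up names here; only where the guards live is the point): the callee validates its own arguments once, so callers don't re-check before every call.
==============================================
#include <glib.h>

/* Illustrative only: a plain struct stands in for a real GIMP object. */
typedef struct { gint value; } Thing;

static void
thing_frobnicate (Thing *thing, gint amount)
{
  /* The callee checks its own arguments once... */
  g_return_if_fail (thing != NULL);
  g_return_if_fail (amount >= 0);

  /* ...so callers do not need to re-check before every call. */
  thing->value += amount;
}
==============================================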
In short, you might want to see what sort of defensive techniques are customary or appropriate for a given context before concluding that we're programming blind.
--xsdg
Testing on NULL and uninitialized values
On 04/21/2010 01:58 PM, Oliver Bandel wrote:
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
I agree, and I try to initialize all local variables that I either add or modify the declaration of. I don't think it would be worthwhile to commit a patch that initializes all variables, though, because it would break git blame.
/ Martin
Testing on NULL and uninitialized values
On Wed, 2010-04-21 at 13:58 +0200, Oliver Bandel wrote:
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
Should be totally unnecessary, as the compiler will warn you if your code uses uninitialized variables. We are compiling with -Wall and we try hard to eliminate all compiler warnings. What you are suggesting will not improve the code at all; it would most likely even degrade its readability.
Sven
Testing on NULL and uninitialized values
On 04/21/2010 07:53 PM, Sven Neumann wrote:
On Wed, 2010-04-21 at 13:58 +0200, Oliver Bandel wrote:
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
Should be totally unnecessary, as the compiler will warn you if your code uses uninitialized variables. We are compiling with -Wall and we try hard to eliminate all compiler warnings. What you are suggesting will not improve the code at all; it would most likely even degrade its readability.
The compiler doesn't catch all cases, like this one:
#include <stdio.h>

int main (int argc, char **argv)
{
  int var;

  if (argc == 2)
    var = 42;

  printf ("var = %d", var);
  return 0;
}
Since use of uninitialized variables can very well cause severe and hard-to-reproduce crashes, and since unpredictability is never a good thing when it comes to computers, I think it is pretty clear what the recommendation should be with regard to initialization of variables.
/ Martin
Testing on NULL and uninitialized values
On Wed, 2010-04-21 at 12:33 +0200, Oliver Bandel wrote:
Example:
==============================================
/*****************************************************************************/
/*  public functions  ********************************************************/

GimpContext *
gimp_context_new (Gimp *gimp, const gchar *name, GimpContext *template)
{
  GimpContext *context;

  g_return_val_if_fail (GIMP_IS_GIMP (gimp), NULL);
  g_return_val_if_fail (name != NULL, NULL);
  g_return_val_if_fail (! template || GIMP_IS_CONTEXT (template), NULL);

  context = g_object_new (GIMP_TYPE_CONTEXT, "name", name, "gimp", gimp, NULL);

  if (template)
    {
      context->defined_props = template->defined_props;
      gimp_context_copy_properties (template, context, GIMP_CONTEXT_ALL_PROPS_MASK);
    }

  return context;
}
==============================================

The test
if (template)
only makes sense if you can be sure that uninitialized values will definitely be NULL.
"template" isn't uninitialized here. It is a parameter passed to gimp_context_new() and it may either be NULL or a pointer to a valid GimpContext object. This is even checked right at the beginning of the function.
Sven
Testing on NULL and uninitialized values
Quoting "Sven Neumann":
On Wed, 2010-04-21 at 12:33 +0200, Oliver Bandel wrote:
Example:
==============================================
/*****************************************************************************/
/*  public functions  ********************************************************/

GimpContext *
gimp_context_new (Gimp *gimp, const gchar *name, GimpContext *template)
{
  GimpContext *context;

  g_return_val_if_fail (GIMP_IS_GIMP (gimp), NULL);
  g_return_val_if_fail (name != NULL, NULL);
  g_return_val_if_fail (! template || GIMP_IS_CONTEXT (template), NULL);

  context = g_object_new (GIMP_TYPE_CONTEXT, "name", name, "gimp", gimp, NULL);

  if (template)
    {
      context->defined_props = template->defined_props;
      gimp_context_copy_properties (template, context, GIMP_CONTEXT_ALL_PROPS_MASK);
    }

  return context;
}
==============================================

The test
if (template)
only makes sense if you can be sure that uninitialized values will definitely be NULL.

"template" isn't uninitialized here. It is a parameter passed to gimp_context_new() and it may either be NULL or a pointer to a valid GimpContext object. This is even checked right at the beginning of the function.
Yes, you are right.
But "context" is not initialized at definition.
It gets its value later on.
When changing code, forgetting to set a value later can cause problems.
GimpContext *context = NULL;
Initializing it right at the beginning of the function is what I mean.
In this small function one can easily see what's going on. In larger functions it's not always obvious, and such seemingly unnecessary initializations can help shrink debugging time from weeks to minutes, especially in big projects.
I prefer programming in paranoid mode ;-) Early core dumps help when the coffee has run out... ;-)
When I see the huge code base and complexity of GIMP, I prefer such an approach even more. :)
When I look at scheme.c, it has some thousands of lines and some functions are many screens long... DEK would say a function should not be larger than one page or one screen... I agree on that point.
Ciao, Oliver
Testing on NULL and uninitialized values
Hi,
Quoting "Omari Stephens":
On 04/21/2010 11:58 AM, Oliver Bandel wrote:
Quoting "Tor Lillqvist":
[...]
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
The function I mentioned was just an example.
Uninitialized values are something I see nearly everywhere in the code.
Dereferencing NULL is easy to find, because it crashes early.
Hi, Oliver
Have you programmed with glib before?
[...]
No, I'm new to glib.
I've had it in mind for a while, because it also provides different kinds of trees.
But only now am I having real contact with it.
A lot of defensive programming
techniques differ between straight C and C-with-glib. For instance, the guards at the top are common, and (I imagine) gimp_context_copy_properties has similar guards. As such, it's the job of the called function, not the caller, to check if a pointer they want to dereference is NULL.
Of course the called function has to test for NULL/non-NULL.
But the function that creates a pointer does its job best if it starts with NULL right at the time of definition.
My rule of thumb, which has helped me a lot, is: ALWAYS INITIALIZE, even if you assign a value a few lines later.
But if you forget that part, or change code and the assignment is lost by accident, then there is a pointer that is NOT NULL.
Result: your tests for NULL fail!
So: all your guarding is disabled if there is an uninitialized pointer.
This has the advantage that you don't check a pointer for NULL 10 times across 10 different function calls when you only use it once, all the way at the bottom.
I prefer checking it ten times to checking it 0 times ;-)
Of course, if you actually dereference a value (like the template pointer in the snippet you posted), you should test it before you dereference it.
The test against NULL will fail if you forgot to assign a value.
If the value is assigned at definition (NULL for a pointer), the checks always work.
Maybe I did not explain very well in my first mail what I meant.
So, I hope I have clarified it now.
In short, you might want to see what sort of defensive techniques are customary or appropriate for a given context before concluding that we're programming blind.
I didn't say programming blind.
But maybe switching the light on is a good idea, too. ;-)
Ciao, Oliver
Testing on NULL and uninitialized values
Quoting "Martin Nordholts":
On 04/21/2010 01:58 PM, Oliver Bandel wrote:
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
I agree, and I try to initialize all local variables that I either add or modify the declaration of. I don't think it would be worthwhile to commit a patch that initializes all variables, though
Hmhhh...
...but the next time you work on a function, you could just do that part too?
I have really thought about a patch that is intended to do just that: adding the initializations.
because it would break git
blame.
git blame?
Can you explain that to me?
Ciao, Oliver
Testing on NULL and uninitialized values
Quoting "Sven Neumann":
On Wed, 2010-04-21 at 12:33 +0200, Oliver Bandel wrote:
Example:
==============================================
/*****************************************************************************/
/*  public functions  ********************************************************/

GimpContext *
gimp_context_new (Gimp *gimp, const gchar *name, GimpContext *template)
{
  GimpContext *context;

  g_return_val_if_fail (GIMP_IS_GIMP (gimp), NULL);
  g_return_val_if_fail (name != NULL, NULL);
  g_return_val_if_fail (! template || GIMP_IS_CONTEXT (template), NULL);

  context = g_object_new (GIMP_TYPE_CONTEXT, "name", name, "gimp", gimp, NULL);

  if (template)
    {
      context->defined_props = template->defined_props;
      gimp_context_copy_properties (template, context, GIMP_CONTEXT_ALL_PROPS_MASK);
    }

  return context;
}
==============================================

The test
if (template)
only makes sense if you can be sure that uninitialized values will definitely be NULL.

"template" isn't uninitialized here.
[...]
To explain what I meant:
The function that calls gimp_context_new() passes template to it. If THE CALLING function - for some reason - passes a non-NULL but invalid pointer to gimp_context_new(), shit happens.
Ciao, Oliver
Testing on NULL and uninitialized values
On 04/21/2010 11:45 PM, Oliver Bandel wrote:
Quoting "Martin Nordholts":
On 04/21/2010 01:58 PM, Oliver Bandel wrote:
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
I agree, and I try to initialize all local variables that I either add or modify the declaration of. I don't think it would be worthwhile to commit a patch that initializes all variables, though
Hmhhh...
...but the next time you work on a function, you could just do that part too?
Only if I touch the variable declaration itself, not only the function. Since GIMP aligns local variables, it isn't that uncommon that you touch variable declarations you didn't add yourself.
git blame?
Run it on a file in GIMP and you'll see which commit last touched each line. That is very helpful if you are uncertain why the code looks like it does in a particular place. If the committer did a good job, all you have to do is read the commit message for the commit that last touched that place, and the intentions of the original programmer should be clear(er).
/ Martin
Testing on NULL and uninitialized values
On 04/22/2010 03:54 AM, Marc Lehmann wrote:
On Wed, Apr 21, 2010 at 08:14:33PM +0200, Martin Nordholts wrote:
The compiler doesn't catch all cases, like this one:
#include <stdio.h>
int main(int argc, char **argv)
{
int var;
if (argc == 2)
var = 42;
printf ("var = %d", var);
return 0;
}
1. initialising var will not fix the bug, if there is any.
It won't, but it will make the bug consistently occur, which is a big plus.
2. initialising var will prevent other static analysers from diagnosing a possible problem.
The problem to diagnose would be that of using an uninitialized variable, no? The fix would then be to initialize the variable.
3. initialising var will prevent "weird effects" and just *might* decrease chances of finding the bug further.
Why would you want "weird effects" in software? That's exactly what you don't want. At worst, a bug should manifest itself by making a program not do what it was intended to do, not doing something unpredictable.
Since use of uninitialized variables can very well cause severe and hard-to-reproduce crashes, and since unpredictability is never a good
Actually, it's easy to diagnose those bugs though, just look at the coredump.
The coredump gives you the state of the program when it crashed, not the cause leading up to the crash, which could have been an uninitialized local variable that's no longer in any stack frame.
Yes, don't do it unnecessarily, it tends to hide bugs.
Rather "As a rule of thumb, initialize local variables.". As always there are cases where it's better not to initialize local variables.
/ Martin
Testing on NULL and uninitialized values
A couple of very small coins.
On Thu, Apr 22, 2010 at 06:55, Martin Nordholts wrote:
On 04/22/2010 03:54 AM, Marc Lehmann wrote:
On Wed, Apr 21, 2010 at 08:14:33PM +0200, Martin Nordholts wrote:
The compiler doesn't catch all cases, like this one:
#include <stdio.h>
int main(int argc, char **argv)
{
int var;
if (argc == 2)
var = 42;
printf ("var = %d", var);
return 0;
}
1. initialising var will not fix the bug, if there is any.
It won't, but it will make the bug consistently occur, which is a big plus.
2. initialising var will prevent other static analysers from diagnosing a possible problem.
The problem to diagnose would be that of using an uninitialized variable, no? The fix would then be to initialize the variable.
I think what he's trying to say here is that initializing it to 0 is still uninitialized, just deterministically so. And neither valgrind nor static analyzers will notice that you're reading an uninitialized zero. The fix would be to initialize the variable in all possible execution paths, but not necessarily to 0.
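To sketch that effect, reusing Martin's snippet with only an "= 0" added: the "may be used uninitialized" warning and the valgrind error disappear, yet on the forgotten path (argc != 2) var is still just a guess.
==============================================
#include <stdio.h>

int main (int argc, char **argv)
{
  /* The initializer keeps -Wall and valgrind quiet, but var silently
   * stays 0 on the code path the programmer forgot about (argc != 2),
   * and no tool flags it anymore. */
  int var = 0;

  if (argc == 2)
    var = 42;

  printf ("var = %d\n", var);
  return 0;
}
==============================================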
3. initialising var will prevent "weird effects" and just *might* decrease chances of finding the bug further.
Why would you want "weird effects" in software? That's exactly what you don't want. At worst, a bug should manifest itself by making a program not do what it was intended to do, not doing something unpredictable.
Nondeterministic behavior will expose a bug; deterministic but slightly wrong behavior will probably hide it.
Since use of uninitialized variables can very well cause severe and hard-to-reproduce crashes, and since unpredictability is never a good
Actually, it's easy to diagnose those bugs though, just look at the coredump.
The coredump gives you the state of the program when it crashed, not the cause leading up to the crash, which could have been an uninitialized local variable that's no longer in any stack frame.
Yes, don't do it unnecessarily, it tends to hide bugs.
Rather "As a rule of thumb, initialize local variables.". As always there are cases where it's better not to initialize local variables.
The compiler is actually smart enough to give you a warning "might be used uninitialized", always initializing to something will hide that warning. And you'll use your uninitialized value (which will always be zero, or whatever) unaware of that it's not sensibly initialized.
Fredrik.
Testing on NULL and uninitialized values
Quoting "Fredrik Alströmer":
A couple of very small coins.
On Thu, Apr 22, 2010 at 06:55, Martin Nordholts wrote:
On 04/22/2010 03:54 AM, Marc Lehmann wrote:
On Wed, Apr 21, 2010 at 08:14:33PM +0200, Martin Nordholts wrote:
The compiler doesn't catch all cases, like this one:
#include <stdio.h>
int main(int argc, char **argv)
{
int var;
if (argc == 2)
var = 42;
printf ("var = %d", var);
return 0;
}
1. initialising var will not fix the bug, if there is any.
It won't, but it will make the bug consistently occur, which is a big plus.
2. initialising var will prevent other static analysers from diagnosing a possible problem.
The problem to diagnose would be that of using an uninitialized variable, no? The fix would then be to initialize the variable.
I think what he's trying to say here is that initializing it to 0 is still uninitialized, just deterministically so.
That's just rhetoric.
And neither valgrind nor static analyzers will notice that you're reading an uninitialized zero.
No problem.
You have that defined value, and each run gives you the same value. That means: the bug is not fixed, but it can be recreated with every run. That is: you can track it down, because it always gives you the same behaviour. In this case the value seems not to be changed... a good start: you know what to look for.
And if the bug is that the value gets changed in the meantime, then it's also easier to track, because something that is not 0, as it definitely has to be, is easier to detect than something that can take any value in the whole range the variable can hold, to compare with any other value.
For example: you set it to zero, you know that no function should work on it, and it changes nevertheless. If it starts with some arbitrary value instead, have fun with debugging ;-)
It's always good to know where to start.
Languages like OCaml, for example, never have undefined values. If you create something, it has a value. That's fun.
The fix would be to initialize the variable in all possible execution paths, but not necessarily to 0.
Can you explain that?
Why should every initialization make sense?
You first set it to 0 (or NULL when it's a pointer), and right afterwards you set the real value. So in the case of correct code, you get the initialisation that you want to have.
If it's a value that is definitely always !0, but fixed (a constant start value), then setting it to THAT value is OK too. But then it's best to do it at definition time, not one or many lines later. And it's also good not to hard-code that fixed starting point there, but to use #define.
If you have a fixed starting point, that's good for debugging.
If you later remove your init, or if the function that does the init produces nonsense, you can at least detect the difference.
difference = A - B
If one of A and B is fixed, you can find the difference. If both are undetermined, happy debugging. ;-)
Bugs that are untrackable are untrackable because of those problems.
If for example the x-mouse-value is always 0, even when you move the mouse, you know what's wrong at the beginning of debugging. And you know what to look for. And a constant 0 is easier to spot than some arbitrary constant.
It's distinct and clear. No need to look up manpages or science books.
You know there is always 0 or NULL. Fine, if that's wrong. :)
3. initialising var will prevent "weird effects" and just *might* decrease chances of finding the bug further.
Why would you want "weird effects" in software? That's exactly what you don't want. At worst, a bug should manifest itself by making a program not do what it was intended to do, not doing something unpredictable.
Nondeterministic behavior will expose a bug; deterministic but slightly wrong behavior will probably hide it.
Heheh. funny.
Deterministic behaviour will also expose a bug: it will always show you the same wrong result.
The same wrong result every time is easier to track down than behaviour that is different every time. And it's even more complicated if it has to be different every time: you have to compare all possible values that should occur with all possible values that do occur.
See the difference example above.
In other words:
error = actual_value - wanted_value
If you know math, you know how to find the problem.
Since use of uninitialized variables can very well cause severe and hard-to-reproduce crashes, and since unpredictability is never a good
Actually, it's easy to diagnose those bugs though, just look at the coredump.
The coredump gives you the state of the program when it crashed, not the cause leading up to the crash, which could have been an uninitialized local variable that's no longer in any stack frame.
Yes, don't do it unnecessarily, it tends to hide bugs.
Rather "As a rule of thumb, initialize local variables.". As always there are cases where it's better not to initialize local variables.
The compiler is actually smart enough to give you a warning "might be used uninitialized", always initializing to something will hide that warning. And you'll use your uninitialized value (which will always be zero, or whatever) unaware of that it's not sensibly initialized.
You don't need that warning anymore. They were invented for languages that allow undefined values, and prohgrammers that leave them undefined (mostly by accident).
You definitley know that there is one certain start value. But which value is it have? Is it always the same? And the same on all computers.
Ciao,
Oliver
P.S.:
For example, it's also good to set the pointer to NULL after freeing memory.
Some people say: not necessary.
But it has helped me find bugs.
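A minimal sketch of that habit (free_and_null is just an illustrative helper name):
==============================================
#include <stdlib.h>

/* NULL the pointer right after freeing it, through a tiny helper. */
static void
free_and_null (char **ptr)
{
  free (*ptr);
  *ptr = NULL;   /* later NULL checks (and accidental double frees)
                    now behave predictably */
}

/* usage:
 *   char *buf = malloc (64);
 *   ...
 *   free_and_null (&buf);
 *   if (buf != NULL)  ...    <- never taken after the free
 */
==============================================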
Testing on NULL and uninitialized values
On Thu, Apr 22, 2010 at 14:00, Oliver Bandel wrote:
Quoting "Fredrik Alströmer":
And neither valgrind nor static analyzers will notice that you're reading an uninitialized zero.
No problem.
You have that defined value, and each run gives you the same value. That means: the bug is not fixed, but it can be recreated with every run. That is: you can track it down, because it always gives you the same behaviour. In this case the value seems not to be changed... a good start: you know what to look for.
Let's say you pass that value down the stack, 0 IS a valid value for this particular algorithm, and will produce results which are similar, but not identical, to a non-zero value. If there are many factors which might distort the result, you'd have no idea where to start looking. If you DIDN'T initialize your variable, valgrind would tell you that the algorithm is reading uninitialized memory, and it'd also tell you which one.
-snip-
Languages like OCaml, for example, never have undefined values. If you create something, it has a value. That's fun.
Right, so does Java, yet still they're uninitialized and produce a warning (or error actually, in some cases), if you use it.
The fix would be to initialize the variable in all possible execution paths, but not necessarily to 0.
Can you explain that?
Well, if you end up with an uninitialized variable, there's an execution path which CAN reach that state, and chances are, you didn't think of it. And IN that case the variable needs to be initialized, but to what depends on the case (the one you forgot about).
Why should every initialization make sense?
You first set it to 0 (or NULL when it's a pointer), and right afterwards you set the real value. So in the case of correct code, you get the initialisation that you want to have.
This is just... weird. If you assign the real value directly afterwards anyway, why bloat the code?
If it's a value that is definitely always !0, but fixed (a constant start value), then setting it to THAT value is OK too. But then it's best to do it at definition time, not one or many lines later. And it's also good not to hard-code that fixed starting point there, but to use #define.
If you have a fixed starting point, that's good for debugging.
If you later remove your init, or if the function that does the init produces nonsense, you can at least detect the difference.
difference = A - B
If one of A and B is fixed, you can find the difference. If both are undetermined, happy debugging. ;-)
Bugs that are untrackable are untrackable because of those problems.
You've never used tools like static analyzers and valgrind, right? Bugs that are untrackable are usually not of this kind, rather pertaining to race conditions or intricate semantic problems.
-- snip --
3. initialising var will prevent "weird effects" and just *might* decrease chances of finding the bug further.
Why would you want "weird effects" in software? That's exactly what you don't want. At worst, a bug should manifest itself by making a program not do what it was intended to do, not doing something unpredictable.
Nondeterministic behavior will expose a bug; deterministic but slightly wrong behavior will probably hide it.
Heheh. funny.
Careful.
Deterministic behaviour will also expose a bug: it will always show you the same wrong result.
The same wrong result every time is easier to track down than behaviour that is different every time. And it's even more complicated if it has to be different every time: you have to compare all possible values that should occur with all possible values that do occur. See the difference example above.
In other words:
error = actual_value - wanted_value
If you know math, you know how to find the problem.
This is true for obvious bugs. For not so obvious ones, where 0 IS a valid value, and SHOULD be 0 most of the time, you will NOT spot the error until someone hits that corner case where it shouldn't be.
-- snip --
The compiler is actually smart enough to give you a warning "might be used uninitialized"; always initializing to something will hide that warning. And you'll use your uninitialized value (which will always be zero, or whatever) unaware that it's not sensibly initialized.
You don't need that warning anymore. Such warnings were invented for languages that allow undefined values, and for programmers who leave them undefined (mostly by accident).
This is simply wrong. It's rather naive to equate uninitialized with undefined, as I've tried to explain above. As I also said, Java will also give you an error in this case even though it's still well defined. Using something uninitialized (defined or not) suggests you've forgotten an execution path, and your code is semantically wrong.
You definitely know that there is one certain start value. But which value does it have? Is it always the same? And the same on all computers?
If there was a way to define a value without initializing it, I'd be all for it. But AFAIK, there isn't. Unfortunately.
Ciao,
Oliver

P.S.:
For example, it's also good to set the pointer to NULL after freeing memory. Some people say: not necessary.
But it has helped me find bugs.
That's a completely different case, and makes a lot more sense, as after freeing it's initialized but undefined (yes, I know, it keeps pointing where it used to point, but that's not the point, no pun intended).
For the record, I'm not necessarily against setting a predefined value to variables sometimes. I'm just against doing it for the wrong reasons, and I'd much rather have the compiler say "Warning: might be used uninitialized in this context" as a part of the static analysis, rather than chase down the bug where a value is 0 at run time (remember, I'm primarily talking corner cases here).
Fredrik.
Testing on NULL and uninitialized values
On Wed, 2010-04-21 at 18:30 +0200, Martin Nordholts wrote:
On 04/21/2010 01:58 PM, Oliver Bandel wrote:
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
I agree, and I try to initialize all local variables that I either add or modify the declaration of. I don't think it would be worthwhile to commit a patch that initializes all variables, though, because it would break git blame.
I don't think the git history of a file is a reason that should keep anyone from committing code cleanups. The question is rather if this particular cleanup is worth the hassle and if it would really result in an improvement when it comes to readability and maintainability of the code-base.
Sven
Testing on NULL and uninitialized values
Hi,
On Thu, 2010-04-22 at 14:38 +0200, Fredrik Alströmer wrote:
For the record, I'm not necessarily against setting a predefined value to variables sometimes. I'm just against doing it for the wrong reasons, and I'd much rather have the compiler say "Warning: might be used uninitialized in this context" as a part of the static analysis, rather than chase down the bug where a value is 0 at run time (remember, I'm primarily talking corner cases here).
Totally agreed. I also prefer the compiler telling me that a refactoring I've just done is not correct because I forgot to initialize a variable in a code path. This has happened to me, and the compiler warning caught some potential bugs. If we always initialized all variables, this mistake would have gone unnoticed.
Sven
Testing on NULL and uninitialized values
Quoting "Sven Neumann":
On Wed, 2010-04-21 at 18:30 +0200, Martin Nordholts wrote:
On 04/21/2010 01:58 PM, Oliver Bandel wrote:
Even values that are only temporary, if set to a definite value like 0 or NULL, will help in finding problems.
I agree, and I try to initialize all local variables that I either add or modify the declaration of. I don't think it would be worthwhile to commit a patch that initializes all variables, though, because it would break git blame.
I don't think the git history of a file is a reason that should keep anyone from committing code cleanups.
Oh, yes. I see it the same way.
The question is rather if this
particular cleanup is worth the hassle and if it would really result in an improvement when it comes to readability and maintainability of the code-base.
The answer from my experience is: it absolutely is worth it.
And I doubt that the code would be that much less readable. In situations where it bloats up the code, I would guess there are reasons that should rather lead to refactoring the code.
Ciao, Oliver
Testing on NULL and uninitialized values
Hi Fredrik,
my main intent was to mention the problem of pointers with regard to uninitialized values. That's why I insisted on NULL, and it often makes sense to use 0 or 0.0 for other values. As "strings" are char*, NULL should be used, not "".
You are right that in some rare situations it might make sense to initialize values to other start values. But they should always be predictable.
If, for example, 0 is part of the used range of values and -1 is not in that range, then -1 might make sense. For example, file-descriptor code in Unix-level applications might start with fd = -1, and if something goes wrong or you accidentally removed your initializing function, the -1 will show you the problem.
Other such values might be, for example, INT_MAX or INT_MIN.
What is special about all those values?
The special thing is that they are EXCEPTIONS to what is normal.
Other languages have exceptions built in, and you can catch them. The same holds here: a NULL pointer is an exception. And it's the only exception for your pointers that works all the time. If you have a set of function pointers which alone can be valid, you can also test against them. But even in that case, NULL would be an exception too.
For int values, enums are often a good idea, or language/system constants.
So, all in all: the most interesting values for initialization at definition are those values that should never occur.
Only against them can you test. glib provides those tests for incoming parameters; but if the caller has forgotten the initialization, your test gives you a wrong feeling of security. And if in your own function you miss the init to the real values, then at least with the EXCEPTIONAL init right at the beginning of your function, the called functions - if they test their arguments - will fail.
So, to make the tests work, you must provide values that they can detect.
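A minimal sketch of the fd = -1 idea mentioned above (the file path in open() is only an example):
==============================================
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>

/* -1 can never be a valid descriptor, so the sentinel doubles as the
 * "never assigned / something went wrong" check. */
int main (void)
{
  int fd = -1;

  fd = open ("/etc/hostname", O_RDONLY);

  if (fd == -1)
    {
      perror ("open");
      return 1;
    }

  close (fd);
  return 0;
}
==============================================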
So much for now.
Ciao, Oliver
Testing on NULL and uninitialized values
Quoting "Sven Neumann":
Hi,
On Thu, 2010-04-22 at 14:38 +0200, Fredrik Alströmer wrote:
For the record, I'm not necessarily against setting a predefined value to variables sometimes. I'm just against doing it for the wrong reasons, and I'd much rather have the compiler say "Warning: might be used uninitialized in this context" as a part of the static analysis, rather than chase down the bug where a value is 0 at run time (remember, I'm primarily talking corner cases here).
Totally agreed. I also prefer the compiler telling me that a refactoring I've just done is not correct because I forgot to initialize a variable in a code path. This has happened to me, and the compiler warning caught some potential bugs. If we always initialized all variables, this mistake would have gone unnoticed.
[...]
If this case were undetected by the compiler, but you had initialized the variable to your EXCEPTIONAL value on the same line on which it is defined, then running the code would have shown you the problem.
Will the compiler stop on any warning? It should, and not compile any code that gives warnings, otherwise this approach will not work. People will ignore warnings "just for testing". And that's the beginning of the problems ;-)
The other case is: values change in the meantime. If you reset them to the exceptional values after usage - I mean pointers here especially - that's a good idea. And in those cases the compiler would not show you a warning about uninitialized values, but the validity tests will work.
BTW: just some days ago I saw a bug fix in glib. In this fix, after freeing some memory, the pointer was set to NULL. I was happy to see this. :)
Ciao, Oliver
Testing on NULL and uninitialized values
You are right that in some rare situations it might make sense to initialize values to other start values. But they should always be predictable.
You didn't get the reasoning about letting the compiler, or valgrind, catch use of uninitialized variables, did you?
The same is here: a NULL-pointer is an exception.
Only if you try to dereference it. There are some standard C library functions, and many GTK+ stack functions, where passing a NULL pointer for a parameter is a documented fully normal way to specify some semantic variation of its API.
And it's the only
exception that you have for your pointers,
Well, as such some API could very well define some "well-known" special ("exceptional") pointer values and give them semantic meaning by themselves (i.e. the contents of what they point to would be irrelevant). That doesn't happen often, but it would be perfectly normal C.
I mean something like:
typedef struct FooBarStruct *FooBar;
extern const FooBar FooBarMUMBLE;
extern const FooBar FooBarPLUGH;
where the actual contents of the structs pointed to by FooBarMUMBLE and FooBarPLUGH would have no meaning, the only meaning would be that if for some function a FooBar argument equals FooBarMUMBLE it would have a special meaning (and the pointer would not be dereferenced), and ditto for FooBarPLUGH.
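Filled in a little (all the FooBar names are the hypothetical ones from above, and the struct contents are deliberately meaningless), such an API could look like this:
==============================================
#include <stdio.h>

/* The contents are irrelevant; the two pointers only serve as
 * distinguishable "well-known" values that are never dereferenced. */
struct FooBarStruct { int unused; };
typedef struct FooBarStruct *FooBar;

static struct FooBarStruct mumble_storage, plugh_storage;
const FooBar FooBarMUMBLE = &mumble_storage;
const FooBar FooBarPLUGH  = &plugh_storage;

static void
foo_frobnicate (FooBar arg)
{
  if (arg == FooBarMUMBLE)
    printf ("MUMBLE variant requested\n");
  else if (arg == FooBarPLUGH)
    printf ("PLUGH variant requested\n");
  else
    printf ("ordinary FooBar argument\n");  /* would be dereferenced here */
}
==============================================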
--tml
Testing on NULL and uninitialized values
Will the compiler stop on any warning? It should, and not compile any code that gives warnings, otherwise this approach will not work. People will ignore warnings "just for testing".
That depends on the project. Many projects do use flags like -Werror, although that is not always possible. And most good programmers in the FLOSS world use flags like -Wall and take compiler warnings quite seriously.
--tml
Testing on NULL and uninitialized values
Quoting "Tor Lillqvist":
You are right that in some rare situations it might make sense to initialize values to other start values. But they should always be predictable.
You didn't get the reasoning about letting the compiler, or valgrind, catch use of uninitialized variables, did you?
I got it.
But the compiler throws a warning and not an error on it.
So, it's possible to get running code.
The same is here: a NULL-pointer is an exception.
Only if you try to dereference it.
No, I mean exception in a different way.
It's an exception even if you don't dereference it, because it will become one if you dereference it.
Dereferencing a pointer that is not NULL but contains nonsense does not necessarily show up as a problem.
But it will be a problem when the code is in use, that is: at the customer's or, more generally, at the user's.
Murphy's Law. :)
There are some standard C library functions, and many GTK+ stack functions, where passing a NULL pointer for a parameter is a documented fully normal way to specify some semantic variation of its API.
And the way it is handled is to check against NULL, because NULL is special.
And it's the only exception that you have for your pointers,
Well, as such some API could very well define some "well-known" special ("exceptional") pointer values and give them semantic meaning by themselves (i.e. the contents of what they point to would be irrelevant). That doesn't happen often, but it would be perfectly normal C.
I mean something like:
typedef struct FooBarStruct *FooBar;
extern const FooBar FooBarMUMBLE;
extern const FooBar FooBarPLUGH;
Yes, I like this idea.
It's not used often.
But it just adds more exceptional values besides NULL. And you can only detect them as exceptional if you know those definitions.
If you cast them to something else, you will have problems detecting them.
But NULL is given by the language as a really special value. (And normally it should be (void*) 0.)
The above idea is nice and adds more exceptional values, but they are not as distinguishable as NULL.
Ciao, Oliver
Testing on NULL and uninitialized values
Quoting "Torsten Neuer":
On Friday, 23 April 2010 08:39:52, Oliver Bandel wrote:
Quoting "Sven Neumann":
Hi,
On Thu, 2010-04-22 at 14:38 +0200, Fredrik Alströmer wrote:
For the record, I'm not necessarily against setting a predefined value to variables sometimes. I'm just against doing it for the wrong reasons, and I'd much rather have the compiler say "Warning: might be used uninitialized in this context" as a part of the static analysis, rather than chase down the bug where a value is 0 at run time (remember, I'm primarily talking corner cases here).
Totally agreed. I also prefer the compiler telling me that a refactoring I've just done is not correct because I forgot to initialize a variable in a code path. This has happened to me, and the compiler warning caught some potential bugs. If we always initialized all variables, this mistake would have gone unnoticed.
[...]
If this case were undetected by the compiler, but you had initialized the variable to your EXCEPTIONAL value on the same line on which it is defined, then running the code would have shown you the problem.
You are comparing a bug turning up when actually running the code vs. turning up when compiling it. I prefer to find and fix it *before* the code gets executed.
I do prefer this too.
Whenever a compiler is able to tell you about a possible bug, development is quickened. Which is one of the reasons some languages have explicitly been designed to allow for good static code analysis.
Yes.
[...]
Ignoring warnings "just for testing" is bad style and contra-productive. Any serious programmer doing that should be slapped with a wet trout.
I have seen this all too often at work. "It's just a warning..."
The other case is: values change in the meantime. If you reset them to the exceptional values after usage - I mean pointers here especially - that's a good idea.
Agreed with that. But then again, optimally these pointers themselves should not exist any more (which means you can't dereference them).
What you seem to be talking about here is the use of global variables that get re-used over and over. Resetting those to sane values is absolutely a must, but one should avoid the use of global variables wherever possible in the first place.
Take, for example, a field in a structure that is still used after the free. And even if normally that pointer will not be used after the free... be sure that one day it will be re-used, and the NULL check then fails. :) That's going to be fun. :)
In case of library functions dealing with pointers, on the other hand, one cannot be sure whether the pointer itself is destroyed after the memory is freed. So setting the pointer to a sane value is a must and cannot be avoided.
OK, so I see you agree.
Many people are perhaps not aware of that problem, I would think.
Ciao, Oliver
Testing on NULL and uninitialized values
Quoting "Tor Lillqvist":
You are right that in some rare situations it might make sense to initialize values to other start values. But they should always be predictable.
You didn't get the reasoning about letting the compiler, or valgrind, catch use of uninitialized variables, did you?
[...]
Let's just take an example that makes it clearer.
See the attachment (I hope the list allows attachments).
I get two different results. With char* mytext = NULL;
=========================
oliver@siouxsie:~$ a.out
selected number: 0
Text: ATTENTION! No text selected. Fix Bug, please!
selected number: 1
Text: two
selected number: 2
Aborted
oliver@siouxsie:~$
=========================
Instead of abort() I could do whatever I want. I, as the programmer, have control over it.
With char* mytext;
I get
=========================
oliver@siouxsie:~$ a.out
selected number: 0
Text: ATTENTION! No text selected. Fix Bug, please!
selected number: 1
Text: two
selected number: 2
Text: two
selected number: 3
Segmentation fault
oliver@siouxsie:~$
=========================
In this case I have no control.
Compiling with -Wall:
=========================
oliver@siouxsie:~$ vim checks.c
oliver@siouxsie:~$ gcc -g -Wall checks.c
oliver@siouxsie:~$ a.out
(...)
=========================
Using valgrind: ----------------
In case of char* mytext = NULL;
=========================
oliver@siouxsie:~$ vim checks.c
oliver@siouxsie:~$ gcc -Wall checks.c
oliver@siouxsie:~$ valgrind a.out
==29758== Memcheck, a memory error detector
==29758== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al.
==29758== Using Valgrind-3.5.0-Debian and LibVEX; rerun with -h for
copyright info
==29758== Command: a.out
==29758==
selected number: 0
Text: ATTENTION! No text selected. Fix Bug, please!
selected number: 1
Text: two
selected number: 2
==29758==
==29758== HEAP SUMMARY:
==29758== in use at exit: 4 bytes in 1 blocks
==29758== total heap usage: 1 allocs, 0 frees, 4 bytes allocated
==29758==
==29758== LEAK SUMMARY:
==29758== definitely lost: 4 bytes in 1 blocks
==29758== indirectly lost: 0 bytes in 0 blocks
==29758== possibly lost: 0 bytes in 0 blocks
==29758== still reachable: 0 bytes in 0 blocks
==29758== suppressed: 0 bytes in 0 blocks
==29758== Rerun with --leak-check=full to see details of leaked memory
==29758==
==29758== For counts of detected and suppressed errors, rerun with: -v
==29758== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 4 from 4)
Aborted
oliver@siouxsie:~$
=========================
In Case of char* mytext;
=========================
oliver@siouxsie:~$ vim checks.c
oliver@siouxsie:~$ gcc -Wall checks.c
oliver@siouxsie:~$ valgrind a.out
==29795== Memcheck, a memory error detector
==29795== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al.
==29795== Using Valgrind-3.5.0-Debian and LibVEX; rerun with -h for
copyright info
==29795== Command: a.out
==29795==
selected number: 0
Text: ATTENTION! No text selected. Fix Bug, please!
selected number: 1
Text: two
selected number: 2
==29795== Conditional jump or move depends on uninitialised value(s)
==29795== at 0x4005C5: print_text_checks_null (in /home/oliver/a.out)
==29795== by 0x400692: print_all_messages (in /home/oliver/a.out)
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
==29795== Conditional jump or move depends on uninitialised value(s)
==29795== at 0x4E6FA50: vfprintf (vfprintf.c:1601)
==29795== by 0x4E79BE9: printf (printf.c:35)
==29795== by 0x4005DF: print_text_checks_null (in /home/oliver/a.out)
==29795== by 0x400692: print_all_messages (in /home/oliver/a.out)
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
==29795== Use of uninitialised value of size 8
==29795== at 0x4E72A87: vfprintf (vfprintf.c:1601)
==29795== by 0x4E79BE9: printf (printf.c:35)
==29795== by 0x4005DF: print_text_checks_null (in /home/oliver/a.out)
==29795== by 0x400692: print_all_messages (in /home/oliver/a.out)
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
==29795== Conditional jump or move depends on uninitialised value(s)
==29795== at 0x4E9B6ED: _IO_file_xsputn@@GLIBC_2.2.5 (fileops.c:1314)
==29795== by 0x4E72711: vfprintf (vfprintf.c:1601)
==29795== by 0x4E79BE9: printf (printf.c:35)
==29795== by 0x4005DF: print_text_checks_null (in /home/oliver/a.out)
==29795== by 0x400692: print_all_messages (in /home/oliver/a.out)
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
==29795== Use of uninitialised value of size 8
==29795== at 0x4E9B6F3: _IO_file_xsputn@@GLIBC_2.2.5 (fileops.c:1316)
==29795== by 0x4E72711: vfprintf (vfprintf.c:1601)
==29795== by 0x4E79BE9: printf (printf.c:35)
==29795== by 0x4005DF: print_text_checks_null (in /home/oliver/a.out)
==29795== by 0x400692: print_all_messages (in /home/oliver/a.out)
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
==29795== Conditional jump or move depends on uninitialised value(s)
==29795== at 0x4E9B703: _IO_file_xsputn@@GLIBC_2.2.5 (fileops.c:1314)
==29795== by 0x4E72711: vfprintf (vfprintf.c:1601)
==29795== by 0x4E79BE9: printf (printf.c:35)
==29795== by 0x4005DF: print_text_checks_null (in /home/oliver/a.out)
==29795== by 0x400692: print_all_messages (in /home/oliver/a.out)
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
==29795== Use of uninitialised value of size 8
==29795== at 0x4E9B70D: _IO_file_xsputn@@GLIBC_2.2.5 (fileops.c:1316)
==29795== by 0x4E72711: vfprintf (vfprintf.c:1601)
==29795== by 0x4E79BE9: printf (printf.c:35)
==29795== by 0x4005DF: print_text_checks_null (in /home/oliver/a.out)
==29795== by 0x400692: print_all_messages (in /home/oliver/a.out)
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
==29795== Use of uninitialised value of size 8
==29795== at 0x4E9B6B0: _IO_file_xsputn@@GLIBC_2.2.5 (fileops.c:1348)
==29795== by 0x4E72711: vfprintf (vfprintf.c:1601)
==29795== by 0x4E79BE9: printf (printf.c:35)
==29795== by 0x4005DF: print_text_checks_null (in /home/oliver/a.out)
==29795== by 0x400692: print_all_messages (in /home/oliver/a.out)
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
==29795== Conditional jump or move depends on uninitialised value(s)
==29795== at 0x4E9B6C0: _IO_file_xsputn@@GLIBC_2.2.5 (fileops.c:1347)
==29795== by 0x4E72711: vfprintf (vfprintf.c:1601)
==29795== by 0x4E79BE9: printf (printf.c:35)
==29795== by 0x4005DF: print_text_checks_null (in /home/oliver/a.out)
==29795== by 0x400692: print_all_messages (in /home/oliver/a.out)
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
Text: two
selected number: 3
==29795== Jump to the invalid address stated on the next line
==29795== at 0x0: ???
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795== Address 0x0 is not stack'd, malloc'd or (recently) free'd
==29795==
==29795==
==29795== Process terminating with default action of signal 11 (SIGSEGV)
==29795== Bad permissions for mapped region at address 0x0
==29795== at 0x0: ???
==29795== by 0x4006CD: main (in /home/oliver/a.out)
==29795==
==29795== HEAP SUMMARY:
==29795== in use at exit: 4 bytes in 1 blocks
==29795== total heap usage: 1 allocs, 0 frees, 4 bytes allocated
==29795==
==29795== LEAK SUMMARY:
==29795== definitely lost: 4 bytes in 1 blocks
==29795== indirectly lost: 0 bytes in 0 blocks
==29795== possibly lost: 0 bytes in 0 blocks
==29795== still reachable: 0 bytes in 0 blocks
==29795== suppressed: 0 bytes in 0 blocks
==29795== Rerun with --leak-check=full to see details of leaked memory
==29795==
==29795== For counts of detected and suppressed errors, rerun with: -v
==29795== Use --track-origins=yes to see where uninitialised values come from
==29795== ERROR SUMMARY: 14 errors from 10 contexts (suppressed: 4 from 4)
Segmentation fault
oliver@siouxsie:~$
=========================
Now you can meditate on the many valgrind messages about undefined values and how they relate to the uninitialized values in the GIMP code.
I didn't know that result before, because I have only been using valgrind for two days, on the recommendation of the GIMP IRC people.
But it looks like it confirms what I'm talking about.
-Wall has no effect here, and valgrind says the same as me. :)
So, have fun with uninitialized values, or enter the realm of clean programming ;-)
Ciao, Oliver
P.S.: In case the attachment is suppressed, I paste the code here again. You may ignore the rest of the mail if you can get the file from the attachment.
***********************************************
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
/* just prints us the text */
/* if text is undefined, we abort. */
/* ------------------------------- */
void print_text_checks_null ( char* print_me )
{
//fprintf( stderr, "ptr: %p\n", print_me );
if( print_me != NULL )
{
printf("Text: %s\n", print_me);
}
else
{
abort(); /* abort() or handle otherwise */
}
return;
}
/* some functions that generate us a text, */
/* giving us fresh allocated strings.      */
char* put_string_1()
{
return strdup("one");
}
char* put_string_2()
{
return strdup("two");
}
char* put_string_3()
{
return strdup("three");
}
/* just a vector of string generating functions */
/* -------------------------------------------- */
typedef char* (*STRINGFUNC)();
STRINGFUNC func_vec[] = { put_string_1, put_string_2, put_string_3 };
void print_all_messages ( unsigned int selection )
{
int num_of_functions = sizeof(func_vec)/sizeof(STRINGFUNC);
//char* mytext = NULL;
char* mytext;
/* check maximum index */
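/* note: this bounds check is off by one (it should be >=);
   selection == 3 slips through and case 3 then indexes past the end
   of func_vec - that is the wrong indexing admitted in the follow-up
   mail below */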
if( selection > num_of_functions )
{
abort(); /* abort() or handle otherwise */
}
switch( selection )
{
case 1:
mytext = (*func_vec[ selection ])();
break;
case 2: /* we need a different text here! */
/* decide later, which to choose
mytext = "I want a different text here!";
mytext = "I may want another text here...";
*/
break;
case 3:
mytext = (*func_vec[ selection ])();
break;
default:
mytext = "ATTENTION! No text selected. Fix Bug, please!";
break;
}
print_text_checks_null( mytext );
return;
}
int main()
{
int idx = 0;
for( idx = 0; idx < 10; idx++ )
{
printf(" selected number: %d\n", idx );
print_all_messages( idx );
}
return 0;
}
***********************************************
Testing on NULL and uninitialized values
On Friday, 23 April 2010, Oliver Bandel wrote:
Quoting "Tor Lillqvist":
You are right that in some rare situations it might make sense to initialize values to other start values. But they should always be predictable.
You didn't get the reasoning about letting the compiler, or valgrind, catch use of uninitialized variables, did you?
I got it.
But the compiler throws a warning and not an error on it. So, it's possible to get running code.
So what? If the programmer decides to ignore warnings it's his deliberate choice. At least others have the chance to see those warnings and fix them/slap the one who caused them.
Tobias
Testing on NULL and uninitialized values
hehe,
the segfault did not come from the char* mytext, but from wrong indexing into the vector. :( My fault. :(
Heheh... nevertheless valgrind is on my side ;-)
Somehow I got no crash from the uninitialized char*, but that might happen only after release, on the user's computer: it's unpredictable. Maybe there was a \0 at that address.
The main thing I wanted to show is how to track such variables. The compilers and helper tools do not always help.
If this were more than just a demonstration of that problem, I should now clean up my code... to one day also enter the realm of clean programming ;-)
Ciao, Oliver
Testing on NULL and uninitialized values
Hello,
just because I found a nice Jargon File entry which supports my view, I am coming back to this old topic again.
On Wed, Apr 21, 2010 at 12:33:15PM +0200, Oliver Bandel wrote: [...]
I would set EVERY pointer to NULL when defining it. And normally I would also set ANY other variable to a definite value when defining it.
This has helped me to track errors as early as possible.
[...]
To "Heisenbug" you find:
"In C, nine out of ten heisenbugs result from uninitialized auto variables, fandango on core phenomena (esp. lossage related to corruption of the malloc arena) or errors that smash the stack." http://www.jargon.net/jargonfile/h/heisenbug.html
So, uninitialized auto variables are explicitly mentioned.
Ciao, Oliver