Comment by nullc 19 hours ago

meh, the compiler can almost always eliminate the spurious default initialization, because it can prove that the first use of the variable is the real initialization overwriting it. The only time an optimizing compiler will actually emit the redundant initialization is when it can't prove it's redundant.
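
A minimal sketch of what I mean (function names made up): a compiler like GCC or Clang at -O2 eliminates the store of 0 here as a dead store, since the first use of x is the real initialization:

    extern int compute(void);

    int f(void)
    {
        int x = 0;       /* spurious default initialization */
        x = compute();   /* the real initialization: the first use of
                            x is this store, so the 0 is provably dead */
        return x;
    }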

I think the better reason to not default initialize as a part of the language syntax is that it hides bugs.

If the developer's intent is that the correct initial state is 0, they should just explicitly initialize to zero. If they haven't, then they must intend the correct initial state to be the dynamic one their code computes, and the compiler silently slipping in a 0 in cases the programmer overlooked is a missed opportunity to detect a bug caused by under-specifying the program.
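
A sketch of the kind of bug that gets hidden (enum and values made up). With no initializer, a warning like GCC's -Wmaybe-uninitialized can flag the missed case; an implicit 0 would silence it:

    enum state { NEW, RUNNING, STOPPED };

    int exit_code(enum state s)
    {
        int code;                      /* deliberately uninitialized */
        switch (s) {
        case RUNNING: code = 1; break;
        case STOPPED: code = 2; break;
        /* NEW was overlooked: the compiler can warn that code may be
           used uninitialized below; with an implicit 0 the program
           would silently return 0 for NEW and the bug goes undetected */
        }
        return code;
    }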

RustyRussell 18 hours ago

In recent years I've come to rely on this non-initialization idiom, both because the compiler can warn for the simple cases as code paths change, and because running tests under Valgrind catches the rest.
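
A sketch of the sort of read Valgrind's memcheck flags ("Conditional jump or move depends on uninitialised value(s)") at run time; in real code the read is usually buried behind calls or pointers where the compiler can't warn:

    #include <stdio.h>

    int main(void)
    {
        int flag;        /* never written */
        if (flag)        /* memcheck reports the uninitialised read
                            here, with a stack trace when built -g */
            puts("set");
        return 0;
    }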

bluecalm 13 hours ago

It only works for simple variables, where initialisation to 0 is counterproductive because you lose a useful compiler warning (about use of an uninitialised variable).

The main case is arrays. There it's often impossible to prove that some part of the array is read before initialisation, and there is no warning. It becomes a tradeoff: potentially costly initialisation (arrays can be very big) versus potentially reading random values other than 0.
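
A sketch of that tradeoff (names made up): only the first n slots are written, so zeroing the whole array up front can be wasted work, yet no warning fires if the consumer reads past n:

    #include <stddef.h>

    extern double produce(size_t i);
    extern void consume(const double *buf, size_t n);

    void run(size_t n)
    {
        double buf[4096];           /* 32 KB; zeroing it all may be
                                       wasted work when n is small */
        for (size_t i = 0; i < n; i++)
            buf[i] = produce(i);    /* only buf[0..n-1] initialized */
        consume(buf, n);            /* no warning if this reads a
                                       slot at index >= n */
    }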

  • nullc 2 hours ago

    Fair point, though compilers could presumably do much better warning there on arrays -- at least treating the whole array like a single variable and warning when they know you've read it without ever writing to it.