The implications of making uninitialized memory act like poison seem pretty scary to me, and I’d expect such a change to break a lot more code than just the bitfield manipulation the compiler generates internally. I think there’s a lot of other code out there that does similar things at the source level, with the expectation that math on the result of an uninitialized memory load will at least give some nondeterministic result.
E.g. look at something like https://github.com/bminor/musl/blob/master/src/string/strlen.c – it reads the string word-by-word until it sees a word containing a zero byte. If there’s uninitialized data after the end-of-string zero, that’s no problem under the undef-bits semantics. But if a load touching any uninitialized byte turns the entire result into poison, that code is totally busted.
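For reference, the linked file boils down to the word-at-a-time loop below (musl’s actual macros and logic, comments mine). The interesting load is the `*w` in the middle loop: the word containing the terminator may also contain bytes past the end of the string.

```c
#include <string.h>
#include <stdint.h>
#include <limits.h>

#define ALIGN (sizeof(size_t))
#define ONES ((size_t)-1 / UCHAR_MAX)           /* 0x0101...01 */
#define HIGHS (ONES * (UCHAR_MAX/2 + 1))        /* 0x8080...80 */
#define HASZERO(x) (((x)-ONES) & ~(x) & HIGHS)  /* nonzero iff x has a zero byte */

size_t strlen(const char *s)
{
	const char *a = s;
	const size_t *w;
	/* Walk byte-by-byte up to a word boundary. */
	for (; (uintptr_t)s % ALIGN; s++)
		if (!*s) return s - a;
	/* Walk word-by-word. The aligned load can't cross a page boundary,
	   but the word holding the terminator may also hold uninitialized
	   bytes past the end of the string. Under undef-bits semantics the
	   zero byte still forces HASZERO to fire: the string bytes below it
	   are known-nonzero, and undef bytes above it can't clear the zero
	   lane's high bit. Under whole-load poison, *w is just poison. */
	for (w = (const void *)s; !HASZERO(*w); w++);
	/* Find the exact zero byte within that word. */
	for (s = (const void *)w; *s; s++);
	return s - a;
}
```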
> With undef semantics, multiple loads of the same address result in the same nondeterministic value.
Are you sure about that? It doesn’t sound right.
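For what it’s worth, the LangRef treats each use of undef as an independent choice, so two loads of the same uninitialized location aren’t obliged to agree. A tiny sketch of mine (not from the thread) showing where that bites:

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int *p = malloc(sizeof *p);   /* *p is never written, so it's uninitialized */
    if (!p) return 1;
    int a = *p;                   /* first load */
    int b = *p;                   /* second load of the same address */
    /* Each load yields an independent undef, so the optimizer is free
       to fold the two loads to different values: this may legitimately
       print either 0 or 1 after optimization. */
    printf("%d\n", a == b);
    free(p);
    return 0;
}
```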
> For example, the following xor_a function can be optimized into a single return statement of zero.
It can be optimized to zero without that guarantee, because xor undef, <anything> is undef, and undef can be correctly replaced by any arbitrary value, such as zero.
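The original xor_a isn’t quoted above, but presumably it’s something of this shape (my reconstruction; everything but the name is hypothetical):

```c
/* Hypothetical reconstruction of the xor_a example under discussion. */
int xor_a(void)
{
    int a;         /* never initialized, so each read of 'a' is undef */
    return a ^ a;  /* folds to 0, but not because both reads must match */
}
```

In IR terms the fold is xor undef, undef -> undef, after which undef is refined to the concrete constant 0; at no point does the optimizer need the two reads of a to observe the same value.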