That's just broken, but not too surprising, since it's not uncommon for compilers to have miscompile bugs in tough and unusual corner cases. But the context of this conversation is: do you need a new compiler to fix UB?
The fact that llvm has bugs doesn't really change the answer. Even a compiler that was designed to avoid UB could have miscompile bugs. Also, the context of the conversation allows for making changes to llvm - so this seems like the kind of thing that could be fixed.
I guess I don't understand the context. It seems to be about C, and I don't see how you can resolve that problem for C without coming up with a model to enforce a form of memory safety. What is the scope of UB that should be avoided? You mean, for a language like Rust or Swift?
The question of whether memory unsafety implies UB is sort of at the heart of the disconnect between the C spec and C practitioners. As a practitioner (and compiler guy) I view memory unsafety as a separate thing - after all, a "bad" store still stores to a well-defined place.
There is nothing well-defined about what an out-of-bounds access or use-after-free will touch. The compiler, linker, and even the runtime environment assume it never happens, and nothing in the C code defines what the consequences are going to be.
In the machine code output by the compiler, it's perfectly well defined what an out-of-bounds access or use-after-free will do, and to what, although of course it depends on runtime state. It's just undefined in the input C code.
That's not relevant to the thread. He states that he wants an optimizing compiler with a comparable amount of optimization, where the programmer is writing code for an abstract machine and the compiler is making transforms that preserve abstract semantics.
That wasn't my interpretation of his Tweet at the time, but on looking at further context, I think you are correct.
I would definitely say that the standard should not say things are 'undefined' but rather come up with sensible constraints on how it should be implemented. Guaranteeing that signed overflow wraps would be a regression for safe implementations by forbidding them from trapping.
Yes, let's genocide nasal demons. I'm not totally convinced about signed overflow: overflow semantics have certainly caused many security holes, but a potential crash in every (signed) arithmetic operation also seems unsuitable for software designed not to fail.
Software designed to be robust has to be designed to handle failure. It has to know how to recover from ending up in an unexpected state that the design wasn't written to handle. It's important to do things with transactions and to be able to recover from unknown states.
There are different kinds of robust software. Some of it has to be designed to not fail, rather than to handle failure. There's the possibility that a hardware failure will put you in an unknown state, but that's why we have TMR (triple modular redundancy) and duplex lockstep processors.