Conversation

Hardware doesn't implement C, so there isn't a standard behavior defined by hardware. It's up to the compiler to map C onto the hardware or the virtual machine. Compilers get to choose how to handle each kind of undefined or implementation-defined behavior, and everything else.
No, that's not what I've been saying. I think it would be a serious regression to break compatibility with safe implementations by making code that's incompatible with them standards-correct. You're proposing to massively roll back safety and security, especially if you want to remove it by default.
It is a big deal, because it would start on the path towards making trapping on signed integer overflow as hard to enable as trapping on unsigned integer overflow. The current definition in the standard makes it far easier, and therefore lets C be a safer language when the implementation wants that.
Trapping on signed integer overflow isn't a new behavior. It has been around as a compiler feature for ages and is broadly deployed. Trapping on unsigned integer overflow is a newer feature, and it's far harder to deploy largely due to it not being standards compliant.
Signed integer overflow is always a bug (unless -fwrapv is passed, which isn't portable), so it can be reported and fixed as a bug in software. It's also detected by default by tools like UBSan designed to catch UB. Since unsigned overflow is well-defined, catching unintended unsigned wraparound is much harder.
Yeah, definitely. It's quite rare for -fwrapv to actually make a difference anyway. It's unfortunate if projects depend on it and decide not to avoid signed overflow, or at least to mark intended overflows with the intrinsics. That makes it way harder to find the bugs or harden the code.