Conversation

Okay, and that also means that using MADV_FREE in malloc and elsewhere would not be possible either, which is a massive performance cost. Uninitialized memory can and does change value at runtime, beyond just compiler optimizations that avoid saving uninitialized data via spill/restore.
That's likely what glibc is going to be doing for their stack cache, since MADV_DONTNEED is a significant performance cost for their implementation. And it doesn't become a non-issue if restricted to malloc, since it still means that uninitialized memory can change between reads.
Reading uninitialized data being undefined, instead of locking it to an unspecified value, permits massive optimizations like MADV_FREE and more efficient register allocation/spilling. Similarly, other memory safety issues being undefined permits optimization and freedom of implementation.
Many programs have bugs where they read data that has just been freed but handle it being an arbitrary value. The issue is often benign with common allocators. However, with other implementations the access will fault and they crash. It's good that implementations aren't required to let it work.
Also, signed overflow being undefined rather than defined as wrapping means that more secure implementations where it traps are permitted. Passing -fsanitize=signed-integer-overflow -fsanitize-trap=signed-integer-overflow is standards compliant and used for hardening in AOSP.
Similarly, lots of other UB that's easy to catch with simple branches at runtime (not most memory and type safety issues, but lots of other bugs) can be made to trap while remaining standards compliant, including enforcing that many objects aren't dereferenced out of bounds via object-size checks.
Implementations of memory safety for C via fat pointers, etc. also depend on these things being undefined. By making it acceptable to index from one object into another and dereference the pointer, you would be forbidding memory-safe implementations of C, which are very important.
If the fat pointers define new behavior then it's OK for that definition to contradict "classic C". C making it defined or not only enables you to "say" that a secure version of C is "compliant". It's not a bad thing if you had to say that it was non-compliant with classic C.
I think it's you that doesn't understand C. The whole point behind the C standard is that it's extremely portable and permits a broad range of implementation choices and aggressive optimization. You want a new variation of the language, which is fine, but I don't think it helps.
These things were being done before the C89 standard, if that's what you mean by classic C (i.e. C code written before 1989). People cared about being able to pay a performance cost to compile code as memory safe even back then, or to use a tracing GC even just to identify leaks.