If fat pointers define new behavior, then it's fine for that definition to contradict “classic C”. Whether C makes the behavior defined or not only lets you “say” that a secure version of C is “compliant”. It wouldn't be a bad thing to have to say it's non-compliant with classic C.
They don't define any new behavior. They catch undefined behavior like out-of-bounds accesses and use-after-free. These things are already broken and non-portable. C is deliberately very portable and permits implementations that are memory safe. It even permits using a GC.
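For example, a use-after-free like the toy program below is undefined behavior in standard C, so a memory-safe implementation is free to detect it and trap at runtime rather than silently return stale data (an illustrative sketch; any real tool's exact behavior will differ):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        int *p = malloc(sizeof *p);
        if (!p) return 1;
        *p = 42;
        free(p);
        /* Use-after-free: undefined behavior in standard C. A conforming
         * memory-safe implementation (quarantined allocations, fat
         * pointers, etc.) may trap here instead of reading whatever now
         * happens to sit in the freed memory. */
        printf("%d\n", *p);
        return 0;
    }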
You keep saying what C *is*, which isn't relevant to me, since I'm proposing what it *should be*. (More specifically, I think that, like many folks, I want an -fno-bullshit flag that turns off optimizations introduced by people who don't understand C.)
I think it's you that doesn't understand C. The whole point behind the C standard is that it's extremely portable and permits a broad range of implementation choices and aggressive optimization. You want a new variation of the language, which is fine, but I don't think it helps.
These issues are not the practical problems with C in the real world. Making all these little things well-defined doesn't fix the fact that C code is plagued by type and memory safety errors, which lead to software being extremely unsafe, unreliable and vulnerable. It's mostly academic.
This Tweet was deleted by the Tweet author.
I'm not the one with a fundamental misunderstanding of how C is supposed to work. You're making an extreme misinterpretation of what I've been stating. Making C much more permissive forbids optimizations, safety improvements, portability, alternate implementation approaches, etc.
The fact that C leaves type and memory safety issues, along with other classes of bugs, undefined is exactly what permits much more secure implementations, where temporal and inter-object spatial memory safety violations are partially or even fully detected at runtime, etc.
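As an illustrative sketch (not any particular tool's output), an out-of-bounds write that runs past one object into a neighbouring one is undefined, so a bounds-checking implementation is allowed to trap instead of corrupting the adjacent object:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char a[8];
        char b[8] = "hello";
        /* Writing 12 bytes through a pointer derived from the 8-byte
         * object `a` is an inter-object spatial violation: the write runs
         * past the end of `a` and may corrupt `b` or anything else nearby.
         * Because this is undefined behavior, an implementation that
         * tracks per-object bounds (fat pointers, memory tagging, etc.)
         * is allowed to trap here instead of silently corrupting its
         * neighbour. */
        memset(a, 'A', 12);
        printf("%s\n", b);
        return 0;
    }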
Permitting trapping on signed integer overflow is another good example of that. If the standard defined signed overflow as guaranteed to wrap, it would disallow more secure implementations where it traps. Even specifying that it must either wrap or strictly trap would be a safety regression.
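Concretely, because signed overflow is undefined, existing trapping modes such as GCC/Clang's -ftrapv or UBSan's -fsanitize=signed-integer-overflow are already conforming; a toy example (illustrative only):

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        int x = INT_MAX;
        /* Signed integer overflow is undefined in standard C, so an
         * implementation may trap on the addition below (e.g. builds
         * using -ftrapv or -fsanitize=signed-integer-overflow). If the
         * standard guaranteed wrapping instead, those trapping builds
         * would no longer be conforming. */
        int y = x + 1;
        printf("%d\n", y);
        return 0;
    }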
A strict wrap-or-trap requirement would be a regression because the standard should also permit implementations that do more efficient lazy trapping, which is more likely to be widely adopted thanks to significantly better performance, especially with hardware support. It's how some next-generation hardware approaches handling overflow by default.
i.e. hardware support for poison values, where you can still use them to perform more arithmetic (producing more poison values), but as soon as you actually read them in order to depend on them in some other way, it traps.
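A toy software model of that idea (purely illustrative; the type and function names are made up, and real support would live in hardware rather than a struct):

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct {
        int32_t value;
        bool    poisoned; /* set on overflow, propagated by arithmetic */
    } lazy_int;

    static lazy_int lazy_add(lazy_int a, lazy_int b) {
        lazy_int r;
        /* __builtin_add_overflow (GCC/Clang) reports whether the addition
         * overflowed without invoking undefined behavior. */
        bool overflowed = __builtin_add_overflow(a.value, b.value, &r.value);
        r.poisoned = a.poisoned || b.poisoned || overflowed;
        return r; /* no trap yet: arithmetic on poison just spreads poison */
    }

    static int32_t lazy_read(lazy_int a) {
        /* The trap is deferred to the point where the value is actually
         * consumed (stored, branched on, used as an address, ...). */
        if (a.poisoned) {
            fprintf(stderr, "trap: use of poisoned value\n");
            abort();
        }
        return a.value;
    }

    int main(void) {
        lazy_int x = { INT32_MAX, false };
        lazy_int y = { 1, false };
        lazy_int z = lazy_add(x, y); /* overflows: z becomes poison */
        lazy_int w = lazy_add(z, y); /* fine: poison merely propagates */
        printf("%d\n", lazy_read(w)); /* only this use traps */
        return 0;
    }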
Redefining C to forbid lots of important real-world safety features, even extreme basics like type-based CFI, _FORTIFY_SOURCE and -fsanitize=object-size, would be horrible. I think you need to think over what making it more permissive would do to safety and portability.
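For instance (an illustrative sketch; exact diagnostics vary by compiler and libc), an overflowing copy into a destination whose size is known at compile time is exactly what those mechanisms may turn into a runtime abort, and only because the overflow is undefined:

    #include <string.h>

    int main(void) {
        char buf[16];
        const char *input = "this string is much longer than sixteen bytes";
        /* The destination size is known at compile time, so a fortified
         * build (e.g. glibc with -D_FORTIFY_SOURCE=2 and optimization
         * enabled) replaces strcpy with a checked variant that aborts at
         * runtime, and -fsanitize=object-size can flag the out-of-bounds
         * write. Both are conforming only because the overflow is
         * undefined behavior in the first place. */
        strcpy(buf, input);
        return buf[0];
    }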
Replacing ‘undefined’ with stricter definitions could often be a good thing, but simply making the language more permissive and forbidding memory- and type-safe implementations is not a good way to do that. I'm glad the people involved in making the standard are smarter than that.