And that memory-safe language can certainly be a subset of C with annotations / rules that make it memory safe. No problem with that; I just don't think it's a very useful / practical thing to do, personally, since I'd rather use a nicer language if it has to be built from scratch anyway.
We know how to write software with decent security and these kinds of capabilities. It's not a mystery. We choose to use software architectures and languages that make it unrealistic to provide decent security. Even if you claim that it's due to programmer incompetence, not tools...
... then clearly there are near 0% competent C programmers. The whole point of safer tooling is that humans aren't being trusted to never make a mistake or miss something. It's extremely hard to write completely correct software, and those bugs should not be remotely exploitable.
The tools can't allow the assorted little mistakes that are certainly going to happen to become code execution vulnerabilities. It just isn't a viable way to build a secure computing ecosystem still capable of doing the same kinds of things, like loading up a GIMP / Krita project.
Should definitely work to avoid bugs as much as possible. A huge part of that is eliminating bug classes with better tools. It's also important to prevent the remaining bug classes from being exploitable in ways like gaining code execution, and we know how to get 99.9% there.
There's a big distinction between weak warnings / static analysis and doing it properly. For example, GCC / Clang will warn for things like possibly uninitialized variables. However, they don't implement a system for guaranteeing that C code has no uninitialized variable use.
It's easy to implement that, but it requires the code to actually be written with it in mind. The compiler simply enforces that every code path must initialize the variable before it's used (this also applies to parts of structures / arrays). They can define how clever the analysis is.
It actually has to be clearly defined: if the standard set of rules can't demonstrate that something is initialized before use, you have to initialize it, or, if it's perf-critical and can be demonstrated correct to humans, you could mark it unsafe and use an intrinsic.
Now, you do this for all the other safety issues, which requires coming up with more systems for enforcing safety. You end up with C code that is still just as portable, but with the rules and annotations is actually verifiable as memory safe. It essentially reinvents Rust.
You can't assume that code verified by one compiler implementing these standards is also safe when built with other compilers, due to the C preprocessor and non-portable extensions. Still, it would make things far better with other compilers too.
As I said earlier, though, if you are going to define a new language requiring extensive rewrites / porting, I think it might as well just be a modern language with higher productivity and a better type system. After memory safety is addressed, much more can be done for safety.
