There is no risk of creating a language dialect that already exists. Real-world C software is already written in a dialect of C where tons of undefined behavior maps onto behavior that keeps the program working. Cases that break get noticed and fixed; what's left is this stuff.
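As a hedged illustration of the kind of undefined behavior that "keeps the program working" (my example, not from the thread): the classic overflow check below is UB in ISO C, yet plenty of shipped code relies on the wraparound it usually gets, and an optimizer is equally entitled to delete the check.

```c
#include <stdio.h>
#include <limits.h>

/* UB in ISO C: if x == INT_MAX, x + 1 overflows a signed int. Lots of
 * deployed code nonetheless relies on getting wraparound here; an
 * optimizer may just as legally fold the comparison to 0.              */
static int will_overflow(int x)
{
    return x + 1 < x;
}

int main(void)
{
    /* Typically prints 1 at -O0 or with -fwrapv; may print 0 at -O2. */
    printf("%d\n", will_overflow(INT_MAX));
    return 0;
}
```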
Of course there's a risk if you add an option that guarantees the behavior in even more scenarios, and as an explicit guarantee rather than as a side effect of something else!
I do want to refine (or perhaps change) my position though. I don't think you can look at a language feature and say whether it creates a dialect or not. Yes, you can have some litmus tests as I suggested above, e.g., checking if the feature allows programs to function correctly where...
... where they otherwise wouldn't (so -ftrapv would fail this test, but -fwrapv would pass), but ultimately whether a dialect *is* created is an emergent property of future behavior that we can't completely predict. It is based on how much code ends up relying on it,
how useful the feature is (how hard it is to stop using it), how many compilers support it, the market share and attributes of the developers involved, performance and security concerns, and so on.
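To make that litmus test concrete, a sketch (my own, assuming GCC/Clang semantics for these flags): a hash loop that lets signed arithmetic overflow keeps functioning under -fwrapv, because wrapping becomes defined, but aborts under -ftrapv, which is why the former passes the "allows programs to function correctly" test and the latter fails it.

```c
/* Build variants (file and function names are illustrative):
 *   cc -O2 -fwrapv hash.c   -> the overflow wraps; the hash keeps working
 *   cc -O2 -ftrapv hash.c   -> the same overflow aborts at runtime        */
#include <stdio.h>

static unsigned djb2_like(const char *s)
{
    int h = 5381;              /* signed on purpose: overflow is UB ...   */
    while (*s)
        h = h * 33 + *s++;     /* ... yet code like this is everywhere    */
    return (unsigned)h;
}

int main(void)
{
    printf("%u\n", djb2_like("a long enough string to overflow the hash"));
    return 0;
}
```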
That is, it's just a big sea of grey. So the details matter: yes, it's possible that ...
... the way some compilers insert a 0 byte in their stack protector check could cause a large body of programs to emerge that rely on this, and don't treat it as a bug, but so far that hasn't happened. It is possible that this zero-by-default behavior will also be heavily ...
... relied on to the extent that it creates a dialect, but I would *guess not*. It is certainly more likely than the stack protector thing. So I'm not a fan of binary thinking in the sense of "well, technically any program could rely on any of a million implementation details ..."
"... and we don't call those dialects", because this isn't a precise black and white thing. It's all probabilities and developer behavior and egos and so on. Just like libc can't change their memory allocator due to emacs: they have created a dialect that guarantees ...
... the emacs stuff keeps working, but you couldn't have determined that 20 years ago by looking at the design of the allocator. You couldn't even predict it by knowing that emacs came to depend on it: you'd need to know the relationship between emacs and glibc, the people ...
... involved, emacs's market share and mindshare, and all the little factors that went into glibc deciding "shit, we can't change this now".
In that case, though, glibc literally exported a pair of functions to store and load the heap. Clang already added the pattern-initialization switch, and if they care about backwards compatibility they have already committed to continuing to support it. Zero or no zero changes little.
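For reference, the exported pair is, as far as I know, glibc's malloc_get_state()/malloc_set_state(), which emacs's dumping relied on. A minimal sketch of the shape of that interface, assuming a glibc old enough (pre-2.25) to still declare and export it:

```c
/* Sketch only: these symbols were dropped from glibc's headers in 2.25 and
 * survive only as compatibility stubs, so this won't build against a modern
 * glibc. Shown just to illustrate what "store and load the heap" meant.     */
#include <malloc.h>   /* declared malloc_get_state/malloc_set_state pre-2.25 */

void *dump_heap_state(void)
{
    /* Returns an opaque snapshot of malloc's internal bookkeeping, which
     * the emacs dumper wrote out alongside its heap.                        */
    return malloc_get_state();
}

int restore_heap_state(void *state)
{
    /* On startup, the dumped emacs handed the snapshot back so the old
     * heap kept working under the (possibly newer) libc.                    */
    return malloc_set_state(state);
}
```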
They already allow 255 of the 256 single-byte patterns for automatic initialization, and I don't see any way that removing the switch blocking zero could ever create the need to maintain more code. If they didn't special-case it, they'd be guaranteed to have less code for good.
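For context, the switch being discussed appears to be clang's -ftrivial-auto-var-init=, where =pattern has been available for a while and =zero was gated behind an extra opt-in flag. A hedged sketch (my own) of what code that ends up relying on the zero behavior looks like:

```c
/* Build sketches (clang):
 *   clang -O1 -ftrivial-auto-var-init=pattern init.c  -> flag is a nonzero fill
 *   clang -O1 -ftrivial-auto-var-init=zero    init.c  -> flag is 0
 * (older clang releases additionally gated =zero behind
 *  -enable-trivial-auto-var-init-zero-knowing-it-will-be-removed-from-clang)   */
#include <stdio.h>

int main(void)
{
    int flag;              /* never assigned: reading it is UB in ISO C        */
    if (flag == 0)         /* only reliably true under =zero                   */
        puts("took the zero path");
    else
        printf("flag = 0x%x\n", (unsigned)flag);
    return 0;
}
```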
I'm not talking about clang needing to maintain more code; I couldn't care less about that.
I thought it was about the concern that software might come to rely on the zero-init behavior, and that this could be a bad thing. That's how I understand the "no dialects, please" side of the debate.

