It's not like the C specification is stable... and it only gained a memory model with C11, which is a big deal.
Previously, POSIX specified a much stronger (more constrained) memory model with which all usable compilers had to comply.
And the pre-C11 [lack of a] memory model in C was sufficient to fully specify non-concurrent behavior.
So IMO, from a practical standpoint, C11 significantly weakened the memory model rather than adding one.
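
A minimal sketch of the contrast being drawn here (names illustrative, not from the thread): POSIX requires functions like pthread_mutex_lock()/pthread_mutex_unlock() to synchronize memory, so pre-C11 portable code got one strong, simple guarantee, while C11 atomics let code opt all the way down to memory_order_relaxed, which orders nothing beyond the atomic operation itself.

#include <pthread.h>
#include <stdatomic.h>

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static int shared_counter;              /* protected by 'lock' */

void posix_increment(void)
{
    pthread_mutex_lock(&lock);          /* POSIX: synchronizes memory */
    shared_counter++;
    pthread_mutex_unlock(&lock);        /* POSIX: synchronizes memory */
}

static atomic_int c11_counter;

void c11_increment(void)
{
    /* C11: the increment is atomic, but nothing around it is ordered. */
    atomic_fetch_add_explicit(&c11_counter, 1, memory_order_relaxed);
}
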
From a practical standpoint, they've had major issues with their own informal model not matching how GCC works.
They = Linux kernel devs? I don't doubt it one bit. Their informal model is "C is a high-level assembler". :-(
Yeah, they make a lot of assumptions about the memory model, often incorrect even at the hardware level outside x86.
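
A sketch of the classic trap alluded to, assuming a publish/consume pattern (names illustrative): writing a data field and then a plain flag happens to appear ordered under x86's strong (TSO) model, but on ARM or POWER either the compiler or the CPU may reorder the two stores, and under C11 the plain-flag version is a data race regardless. Release/acquire makes the intent explicit and portable:

#include <stdatomic.h>
#include <stdbool.h>

static int payload;                 /* ordinary data, published via 'ready' */
static atomic_bool ready;

void publish(int v)
{
    payload = v;                    /* plain store, ordered by the release */
    atomic_store_explicit(&ready, true, memory_order_release);
}

int consume(void)
{
    while (!atomic_load_explicit(&ready, memory_order_acquire))
        ;                           /* spin until published */
    return payload;                 /* acquire pairs with release: sees v */
}
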
Linus puts the blame on the compilers and weak memory models. They used their own loose version of the C11 rules.
But... without carefully considering how it lined up with the reality of compilers and other architectures.
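
The kernel's "own loose version" centers on idioms like READ_ONCE()/WRITE_ONCE(), which for scalar types boil down to volatile casts rather than C11 _Atomic. This is a simplified sketch using GNU C's typeof, not the real macros, which also handle other access sizes:

#define READ_ONCE(x)      (*(const volatile typeof(x) *)&(x))
#define WRITE_ONCE(x, v)  (*(volatile typeof(x) *)&(x) = (v))

static int seen;

void writer(void) { WRITE_ONCE(seen, 1); }    /* no tearing, no caching */
int  reader(void) { return READ_ONCE(seen); } /* forced fresh load */
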
And having many concurrent data structures / algorithms not understood even by core developers seems a bit bad.
I consider the whole "concurrent data structures" game a mistake. Locks are cheap and safe. RCU is awful.
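
The lock-based alternative being endorsed, as a minimal sketch (names illustrative): one mutex, one invariant, and no ordering subtleties to reason about.

#include <pthread.h>
#include <stdlib.h>

struct node { struct node *next; int value; };

static pthread_mutex_t stack_lock = PTHREAD_MUTEX_INITIALIZER;
static struct node *top;                 /* protected by stack_lock */

int push(int value)
{
    struct node *n = malloc(sizeof *n);  /* allocate outside the lock... */
    if (!n)
        return -1;
    n->value = value;
    pthread_mutex_lock(&stack_lock);     /* ...and hold it only briefly */
    n->next = top;
    top = n;
    pthread_mutex_unlock(&stack_lock);
    return 0;
}
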