Even if it did not hurt performance, it would hurt the very idea of programming: code should express our ideas and algorithms, not those of the compiler authors or random emergent behaviours.
It's also a very shaky foundation to base your optimization on. Someone recompiling x years from now could shoot themselves in the foot.
End of conversation
New conversation
BUT I DON'T WANNAAAAAAAAAAAA
Even if it DID produce better optimizations, I really don't want to be treated like a child because someone over at clang HQ noticed a slight opening that they could attack my code with.
Is there a known-good compiler (tinycc?) for using what this paper calls C*? Or are there real security implications when you go too far back?
@nothings uses VS 6, right? Are these concerns part of his reasoning, or is he worried more about the IDE?
It's just the IDE for me. In general VS hasn't gone down the "C" path so much. Yet.
End of conversation
New conversation
With my former compiler writer hat on, I have to agree. Compiler writers should spend less time on getting another 0.05% on SPEC* benchmarks and start providing some optimisation GUARANTEES. Where's the C++ compiler that promises that copy elision always happens when it applies?