At some point, someone is going to have to explain why - despite nobody ever wanting this in production code, for any reason - the default CSR state for divide by zero on most platforms is fault instead of flush.
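For context: "the CSR" here is the floating-point control/status register - on x86 with SSE that's MXCSR, which holds both the per-exception mask bits and the flush-to-zero (FTZ) / denormals-are-zero (DAZ) bits. A minimal sketch of what poking that state looks like, using the standard SSE intrinsics from xmmintrin.h (which bits a process actually starts with varies by OS and C runtime, which is the whole complaint; the function name below is hypothetical). Note that the hardware FTZ bit only affects denormals - getting a zero out of x/0 itself takes the extra step shown further down the thread.

```c
#include <xmmintrin.h>

// Sketch: flip MXCSR into the "permissive" state the thread is asking
// for. Assumes x86 with SSE available.
void set_permissive_fp_state(void)
{
    // Mask the divide-by-zero exception so x/0 yields +/-Inf
    // instead of raising SIGFPE / a structured exception.
    _MM_SET_EXCEPTION_MASK(_MM_GET_EXCEPTION_MASK() | _MM_MASK_DIV_ZERO);

    // Set the FTZ bit so denormal results flush to zero.
    _MM_SET_FLUSH_ZERO_MODE(_MM_FLUSH_ZERO_ON);
}
```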
-
This is such a tremendous pain in the ass. Most code that has to do an "if this isn't zero, then divide" could be written to "just work" if it could generally assume that the CSRs were always set to flush. But it can't, because they usually aren't.
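In scalar form, the pattern that has to be written today; a sketch, with safe_div as a hypothetical helper:

```c
// The guard everyone writes now: branch before every single divide.
float safe_div(float a, float b)
{
    if (b != 0.0f)
        return a / b;
    return 0.0f; // the "flushed" default value the thread wants for free
}
```

If code could assume the divide never faults, that branch could disappear entirely - see the SSE version at the end of the thread.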
-
Since most numerical code today is in libraries, and libraries can't go setting the CSR because different libraries might conflict, etc., it is really a huge issue. Please someone just change this globally, by fiat. Microsoft? Linux? WASM? PLEASE???
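For illustration, the save/restore dance a library would otherwise have to do around every entry point, which is exactly why none of them bother; a sketch, with lib_compute as a hypothetical exported function:

```c
#include <xmmintrin.h>

// Sketch: a library that wants div-by-zero masked without stomping on
// the caller's floating-point state. Every exported function would
// need this wrapper, and it only covers SSE state, not x87.
float lib_compute(float a, float b)
{
    unsigned int saved = _mm_getcsr();      // remember the caller's MXCSR
    _mm_setcsr(saved | _MM_MASK_DIV_ZERO);  // mask div-by-zero locally
    float result = a / b;                   // +/-Inf instead of a fault
    _mm_setcsr(saved);                      // restore before returning
    return result;
}
```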
-
Maybe somebody has already, and if so, I commend you. But as far as I can tell, everybody still defaults to _crashing the program_ when it divides by zero.
-
Replying to @cmuratori
Funnily enough I'm reading this tweet while waiting for an app to compile & start in Debug because I just got a divide-by-zero crash in it. I swear. Uncanny.
-
It is just so silly at this point. Basically all SSE code that does a divide could be trivially extended to make flush-to-zero produce a usable default value with no branching. How is this still a thing??
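One common branchless shape for this (a sketch of the usual mask-and-AND trick, not necessarily the exact extension meant here): compare the denominator against zero to build a per-lane mask, divide unconditionally, then AND the quotient with the mask so the offending lanes come out as +0.0. The catch is that this only works if the divide-by-zero exception is masked in MXCSR so the divide itself can't fault - which is precisely the CSR default the whole thread is about.

```c
#include <xmmintrin.h>

// Sketch: divide four lanes at once, flushing any lane whose
// denominator is zero to +0.0, with no branches. Assumes the
// div-by-zero exception is masked so x/0 produces +/-Inf (or NaN
// for 0/0) rather than trapping.
static inline __m128 div_or_zero(__m128 num, __m128 den)
{
    __m128 nonzero = _mm_cmpneq_ps(den, _mm_setzero_ps()); // all-ones where den != 0
    __m128 q       = _mm_div_ps(num, den);                 // Inf/NaN in the zero lanes
    return _mm_and_ps(q, nonzero);                         // flush those lanes to +0.0
}
```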