Files Are Hard - via @sadisticsystems https://danluu.com/file-consistency/
-
-
From my understanding, that works as long as you assume the write of the hash itself always succeeds; otherwise an incorrect hash could invalidate correct data
-
hmmm, yeah true. I think the gist is that the data needs to be hashed twice: once before the write, and once after, to check that they match. If they don't, you trace back to what went wrong - and never ACK the request.
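A minimal sketch of that hash-twice, never-ACK-on-mismatch idea, assuming a plain local file and SHA-256 (both my choices, not from the thread). `write_and_verify` is a hypothetical helper name; a real implementation would also need something like O_DIRECT, since the read-back below can be served from the page cache rather than the disk:

```python
import hashlib
import os

def write_and_verify(path: str, data: bytes) -> str:
    # Hash once before the write.
    digest_before = hashlib.sha256(data).hexdigest()

    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    try:
        os.write(fd, data)
        os.fsync(fd)  # push through the OS cache toward the device
    finally:
        os.close(fd)

    # Hash again after the write and compare.
    with open(path, "rb") as f:
        digest_after = hashlib.sha256(f.read()).hexdigest()

    if digest_before != digest_after:
        # Mismatch: trace back to what went wrong, and never ACK.
        raise IOError(f"write verification failed for {path}")

    return digest_before  # only now is it safe to ACK
```

Note that os.write can also return a short write; production code would loop until every byte is written, and would store the digest alongside the data so later reads can be verified too.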
End of conversation
New conversation -
-
-
Have you heard about cosmic rays and RAM? :trollface: https://twitter.com/whitequark/status/980522328151834624
-
haha, yeah I read that a while ago. I mean... yeah :|
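A toy illustration of why that matters for the checksum discussion above: a single flipped bit, the kind a stray particle can cause in non-ECC RAM, changes the digest completely. `flip_bit` is a made-up helper of mine, not anything from the linked tweet:

```python
import hashlib

def flip_bit(data: bytes, bit_index: int) -> bytes:
    # Flip a single bit, the way a stray particle might in non-ECC RAM.
    byte_index, offset = divmod(bit_index, 8)
    corrupted = bytearray(data)
    corrupted[byte_index] ^= 1 << offset
    return bytes(corrupted)

original = b"important payload"
corrupted = flip_bit(original, 42)

# One flipped bit yields a completely different SHA-256 digest, which is
# why end-to-end checksums catch this class of fault even without ECC.
assert hashlib.sha256(original).digest() != hashlib.sha256(corrupted).digest()
```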
End of conversation
New conversation -
-
-
When I worked at an HPC lab, they were all about ECC at every layer. They would buy ECC RAM exclusively and encouraged ECC on important data paths in software. If things were really, really important, they would even run redundant computation in some cases
-
A Ph.D. student there did a spinout called eDNA for exactly this purpose, targeting high-risk environments such as nuclear reactor control systems and flight controllers for rockets and satellites
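The "redundant computation" part can be as simple as triple modular redundancy with a majority vote. Here's a toy sketch of the general technique (my own illustration, not how the lab or eDNA actually did it); real TMR runs the replicas on independent hardware so a single fault can't corrupt all three:

```python
from collections import Counter
from typing import Callable, TypeVar

T = TypeVar("T")

def run_with_tmr(compute: Callable[[], T]) -> T:
    # Run the same computation three times and take the majority answer.
    # Assumes results are hashable so they can be counted.
    results = [compute() for _ in range(3)]
    value, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        # No two replicas agree: fail loudly instead of guessing.
        raise RuntimeError(f"no majority among replica results: {results}")
    return value
```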
End of conversation