The fact that << and >> are big-endian is the "Benjamin Franklin mixing up the direction of current" of programming.
Most binary processing is little-endian these days, because most processors are little-endian. Some network protocols use big-endian, but I think that's shifting (Sia uses little-endian, as do protocol buffers, Cap'n Proto, etc.). I would prefer it to be displayed little-endian too.
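To make the "<< and >> are big-endian" quip concrete: shifting left moves a byte toward higher significance, so decoding a little-endian integer with shifts reads the bytes in the opposite order from how they'd be displayed big-endian. A minimal sketch (the helper name `u32_le` is mine, not from any particular library):

```python
import struct

def u32_le(b: bytes) -> int:
    # Little-endian: byte 0 is least significant (no shift),
    # and each subsequent byte shifts left by 8 more bits.
    return b[0] | (b[1] << 8) | (b[2] << 16) | (b[3] << 24)

data = bytes([0x78, 0x56, 0x34, 0x12])  # little-endian encoding of 0x12345678
print(hex(u32_le(data)))                # 0x12345678
# Matches the stdlib's little-endian unpack:
assert u32_le(data) == struct.unpack("<I", data)[0]
```

Note how the hex literal `0x12345678` reads most-significant byte first (big-endian display), while the byte sequence stores the least-significant byte first.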
Working with binary is sufficiently different from everyday base-10 math that I don't see familiarity as an advantage -- better to start from a clean slate.