The annoying thing about computing is that every damn thing has to live in an environment of scale. It's like if there were only 2 scales of cooking: cooking a meal for your family at home, and cooking for the entire planet. No intermediate scales of intermediate difficulty.
You still have to do most of the work of getting from 1 to 7 billion in the 1-to-say-1000 phase... the higher orders of scaling are largely handled by infrastructure that's mostly agnostic to code specifics, right?
-
I dunno, feels like an empirical research question
-
I’ve written systems (still running) for a single department of a large company. Tons of that software exists. It was not designed to, and could not, scale much beyond its current size, but has been running for 9 years.
-
Nope. Consider that Twitter was originally written in Ruby and had to be rewritten in Scala to scale https://www.artima.com/scalazine/articles/twitter_on_scala.html
-
I agree with Lawrence here. Things keep changing as progressively higher-order black swans become the norm. Weirder things happen at larger and larger scales. At 1000x you still expect networking to "just work", security is just locking the door, and earthquakes don't exist