Only a tiny minority of professional programmers have a clear picture in their minds of how fast modern computers are. 99.9% have next to no idea. How does this affect what software is even conceived? (Ignoring, for a moment, what is actually built, which we know is very slow.)
-
To make the speed point again, for an attempt at clarity: Programmers have a picture of their computer, in their minds, that they use to figure out what to do. For 99.9%+, that picture is inaccurate: the imagined computer is 100x-1000x slower than the real computer.
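The 100x-1000x figure is easy to demonstrate. The rough sketch below (not a rigorous benchmark) times simple integer additions in an interpreted Python loop; the comparison figure in the comments, that a modern core can retire on the order of 10^9-10^10 simple integer operations per second, is a ballpark assumption, not a measurement from the thread.

```python
# Rough sketch: how many simple additions does an interpreted loop do
# per second, versus what the hardware underneath is capable of?
import time

N = 5_000_000

t0 = time.perf_counter()
total = 0
for i in range(N):
    total += i
elapsed = time.perf_counter() - t0

ops_per_sec = N / elapsed
print(f"{N:,} additions in {elapsed:.3f}s "
      f"-> about {ops_per_sec / 1e6:.0f} million ops/s")
# On typical machines this lands in the tens of millions of ops/s,
# while the silicon itself can do billions: a gap on the order of
# the 100x-1000x the thread describes.
```

Running this on almost any recent machine shows the same shape of result: the mental model calibrated on interpreted or layered software is off from the hardware by orders of magnitude.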
-
This will result in software that's too slow, obviously. But it also affects what one thinks is possible, what one dares to imagine doing. That is the more important part. Humans are very example-based, and if our examples are wrong, where they lead us will be wrong too.
-
I think you're overestimating our ability to truly grasp the "time warp" a modern processor creates.
-
I completely agree this is a big problem. I've been trying to educate people in my company about the insane speeds of modern computers compared with what we used 10 or 20 years ago, which the older people in our group still remember using to do cool things.
-
2/2 But even these older people have simply forgotten, due to the passage of time and the drip, drip of slow software becoming the norm. Perhaps Jon can help come up with a real example running on older vs. newer hardware, and then compare it with comparable modern, badly written code.
-
Original Opera browser! And back on Android 2.something, before it had a JIT and used just an interpreter, I used to open so many tabs on the phone, with resources to spare, that it is still unbeaten by the likes of Chrome on the desktop.
-