Only a tiny minority of professional programmers have a clear picture in their minds of how fast modern computers are. 99.9% have next to no idea. How does this affect what software is even conceived? (Ignoring, for a moment, what is actually built, which we know is very slow.)
How big of a problem is it that we have this crucial craft, on which we are knowingly staking the future, and almost none of its practitioners understand the fundamental tool they are using?
(For the record, I don’t place myself in the top tier re understanding of speed or anything else. I am somewhere in the middle of that gradient between the 99.9% and the People Who Really Know.)
We see all this bad rhetoric claiming “system X is only 2x slower than native code, therefore it’s fast”... but one must ignore rationalizations and look at the actual output, which is several orders of magnitude less efficient than it could be. Few people are willing to put 2 and 2 together here.
The most common objection to these points is "we write slow software because it lets us build things faster and more easily". I agree this is the common belief, but it's wrong. If development is so much easier, why is productivity approaching 0 over time?
Replies seem to be rat-holing on the old well-understood concept that software is slow. Yeah, we know, I have said that many times (and said to ignore that this time). What I am highlighting here is a deeper issue: programmers don't really know what computers are any more.
Speed is one dimension of understanding that's lacking; the picture of speed in programmers' heads is 2-4 orders of magnitude too slow. It's easy to see and understand this, which is why I brought it up. But it's not the only dimension of missing understanding.
To make the speed point again, for an attempt at clarity: Programmers have a picture of their computer, in their minds, that they use to figure out what to do. For 99.9%+, that picture is inaccurate: the imagined computer is 100x-1000x slower than the real computer.
This will result in software that's too slow, obviously. But it also affects what one thinks is possible, what one dares to imagine to do. That is the more important part. Humans are very example-based, and if our examples are wrong, where they lead us will be wrong too.
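A minimal sketch to calibrate that mental picture, assuming a POSIX system and a C compiler at -O2 (names and numbers here are illustrative, not from the thread): it sums 1 GiB of integers and reports the throughput. On a typical recent desktop this finishes in a small fraction of a second, i.e. billions of additions per second while streaming memory at several GB/s; exact figures vary by machine.

```c
/* Sketch: how much work one core does in a fraction of a second.
   Assumes POSIX clock_gettime; build with e.g. gcc -O2. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    const size_t n = 1u << 28;                 /* 268,435,456 ints = 1 GiB */
    uint32_t *a = malloc(n * sizeof *a);
    if (!a) return 1;
    for (size_t i = 0; i < n; i++) a[i] = (uint32_t)i;   /* touch every page */

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    uint64_t sum = 0;
    for (size_t i = 0; i < n; i++) sum += a[i];          /* the measured work */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) * 1e-9;
    printf("sum=%llu  %.3f s  %.2f GB/s\n",
           (unsigned long long)sum, secs, (n * sizeof *a) / secs / 1e9);
    free(a);
    return 0;
}
```

If one's imagined computer needs seconds or minutes for a loop like this, the imagined computer is the 100x-1000x-too-slow one described above.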
Replying to @Jonathan_Blow
I suppose underestimating CPU speed relative to, say, network speed or RAM speed must also lead to some design patterns that make software slow and bad, e.g. using a client/server model to communicate between different parts of the same program.
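A minimal sketch of that cost gap, assuming a POSIX system (the "service", process layout, and counts are illustrative): the same trivial operation invoked as a direct function call versus as a request/response round trip to a child process over pipes, as a stand-in for a client/server split inside one program. Exact numbers depend on the machine, but the round trip is typically thousands of times more expensive than the call.

```c
/* Sketch: direct call vs. client/server round trip within one program.
   Assumes POSIX (pipe, fork, clock_gettime); build with e.g. gcc -O2. */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <time.h>

static int add_one(int x) { return x + 1; }   /* the "service" */

static double now(void) {
    struct timespec t;
    clock_gettime(CLOCK_MONOTONIC, &t);
    return t.tv_sec + t.tv_nsec * 1e-9;
}

int main(void) {
    enum { N = 100000 };
    int to_child[2], to_parent[2];
    if (pipe(to_child) || pipe(to_parent)) return 1;

    if (fork() == 0) {                        /* "server" process */
        int x;
        while (read(to_child[0], &x, sizeof x) == (ssize_t)sizeof x) {
            x = add_one(x);
            write(to_parent[1], &x, sizeof x);
        }
        _exit(0);
    }

    /* direct in-process call, N times */
    double t0 = now();
    volatile int v = 0;
    for (int i = 0; i < N; i++) v = add_one(v);
    double direct = now() - t0;

    /* same work as a request/response round trip, N times */
    t0 = now();
    int x = 0;
    for (int i = 0; i < N; i++) {
        write(to_child[1], &x, sizeof x);
        read(to_parent[0], &x, sizeof x);
    }
    double ipc = now() - t0;

    close(to_child[1]);                       /* let the child exit */
    printf("direct: %.0f ns/call   ipc: %.0f ns/round-trip\n",
           direct / N * 1e9, ipc / N * 1e9);
    return 0;
}
```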