Only a tiny minority of professional programmers have a clear picture in their minds of how fast modern computers are. 99.9% have next to no idea. How does this affect what software is even conceived? (Ignoring, for a moment, what is actually built, which we know is very slow.)
(For the record, I don’t place myself in the top tier re understanding of speed or anything else. I am somewhere in the middle of that gradient between the 99.9% and the People Who Really Know.)
We see all this bad rhetoric claiming “system X is only 2x slower than native code, therefore it’s fast”... but one must ignore the rationalizations and look at the actual output, which is several orders of magnitude less efficient than the hardware allows. Few people are willing to put 2 and 2 together here.
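To put illustrative numbers on that gap (mine, not from the thread): a single ~3 GHz core retiring even one simple instruction per cycle does about three billion operations per second. An app that takes 300 ms to filter a 10,000-row list has spent roughly a billion cycles, about 100,000 cycles per row, where a handful of cycles per row is attainable. That is four-plus orders of magnitude of headroom that a "2x slower than native" comparison never touches, because the native baseline itself is usually nowhere near what the machine can do.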
The most common objection to these points is "we write slow software because it lets us build things faster and more easily". I agree this is the common belief, but it's wrong. If development is so much easier, why is productivity approaching zero over time?
Replies seem to be rat-holing on the old, well-understood point that software is slow. Yeah, we know; I have said that many times (and said to ignore it this time). What I am highlighting here is a deeper issue: programmers don't really know what computers are any more.
Speed is one dimension of understanding that's lacking; the picture of speed in programmers' heads is 2-4 orders of magnitude too slow. It's easy to see and understand this, which is why I brought it up. But it's not the only dimension of missing understanding.
To make the speed point again, for an attempt at clarity: Programmers have a picture of their computer, in their minds, that they use to figure out what to do. For 99.9%+, that picture is inaccurate: the imagined computer is 100x-1000x slower than the real computer.
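To give that real computer a concrete scale, here is a minimal sketch (my illustration, not from the thread; it assumes a POSIX system and a roughly 3 GHz core): it times a billion additions in a naive scalar loop, which typically finishes in around a second.

```c
/* A minimal sketch of the scale involved: time a naive loop of simple
   additions on one core. The machine figures in the comments are
   assumptions about a typical ~3 GHz desktop CPU, not measurements.
   Compile with: cc -O2 adds.c && ./a.out (POSIX assumed). */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    const int64_t n = 1000000000;  /* one billion iterations */
    volatile int64_t sum = 0;      /* volatile keeps the loop from being optimized away */
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int64_t i = 0; i < n; i++)
        sum += i;
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (double)(t1.tv_sec - t0.tv_sec)
                + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    /* Even this scalar, single-threaded loop lands on the order of a
       billion adds per second; SIMD and multiple cores multiply that
       by another one or two orders of magnitude. */
    printf("%lld adds in %.2f s (%.2f billion adds/sec)\n",
           (long long)n, secs, n / secs / 1e9);
    return 0;
}
```

If your mental model says a loop like this should take minutes, the imagined machine is exactly the 100x-1000x-too-slow computer being described.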
This will result in software that's too slow, obviously. But it also affects what one thinks is possible, what one dares to imagine doing. That is the more important part. Humans are very example-based, and if our examples are wrong, where they lead us will be wrong too.
IMO you have to (a) inspire people as to why it is important, (b) teach without shame, and (c) maybe not be nostalgic about the future that never was.
It's amazingly common, too. People are taught to write code, not to program... or whatever terms you want to use to define the difference there.
It's a big problem; consider the amount of energy wasted by PCs running inefficient code every time someone uploads a picture to Instagram, extracts a quarterly report, etc. Add to that the amount of life wasted waiting for things to boot or complete some trivial task.
I think it is a really big problem. A lot of inefficiency and wasted resources.
The problem is only getting worse. Computer hardware is even more heterogeneous now than in the past: CPU, GPU, and now neural processing units. So there is much more to know.