Only a tiny minority of professional programmers have a clear picture in their minds of how fast modern computers are; 99.9% have next to no idea. How does this affect what software even gets conceived? (Ignoring, for a moment, what actually gets built, which we know is very slow.)
-
-
This will result in software that's too slow, obviously. But it also affects what one thinks is possible, what one dares to imagine doing. That is the more important part. Humans are very example-based, and if our examples are wrong, where they lead us will be wrong too.
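As a rough, editorial illustration of the scale of raw speed being discussed, here is a minimal single-threaded sketch that just sums a large array and reports elements processed per second. The array size is an arbitrary choice and the result varies by machine; on a modern desktop core it typically lands in the billions of elements per second.

```cpp
// Minimal throughput sketch (illustrative only): sum 100 million 32-bit ints
// and report how many elements a single core chews through per second.
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <numeric>
#include <vector>

int main() {
    const std::size_t n = 100'000'000;      // ~400 MB of data; arbitrary size
    std::vector<std::uint32_t> data(n, 1);

    auto t0 = std::chrono::steady_clock::now();
    std::uint64_t sum = std::accumulate(data.begin(), data.end(), std::uint64_t{0});
    auto t1 = std::chrono::steady_clock::now();

    double secs = std::chrono::duration<double>(t1 - t0).count();
    std::printf("sum=%llu, %.0f million elements/s\n",
                static_cast<unsigned long long>(sum), n / secs / 1e6);
}
```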
-
-
-
Definitely. I've met graduates and veterans who don't get how powerful machines are these days. But show them example code and they claim it's a toy problem. It's as if boids is a joke to them, because they won't believe their code isn't really that complex. It's what I tried to expl…
-
…to explain in the data-oriented design book: "Your problems aren't really that complex; you've just made them complex." I think I got through to a few. I still hold that it's because so few are taught to really think about the data model.
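To make the "think about the data model" point concrete, here is a hedged sketch (the names and layout are mine, not from the book) of a boids-style update written so the hot loop only touches the data it actually needs:

```cpp
// Hypothetical boids update contrasting data layouts; illustrative only.
#include <cstddef>
#include <vector>

// Array-of-structs: each boid drags along everything, needed or not.
struct BoidAoS { float px, py, vx, vy, colour[4], age; };

// Struct-of-arrays: the integration step only reads positions and velocities.
struct BoidsSoA {
    std::vector<float> px, py, vx, vy;
};

void integrate(BoidsSoA& b, float dt) {
    // Contiguous, branch-free loop over exactly the data this step needs.
    for (std::size_t i = 0; i < b.px.size(); ++i) {
        b.px[i] += b.vx[i] * dt;
        b.py[i] += b.vy[i] * dt;
    }
}
```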
End of conversation
New conversation -
-
-
Seems backwards. Programmers are supremely optimistic about: a) the necessity and quality of many abstraction layers, and b) the brute-force capacity of modern hardware to handle heavy abstractions. Whatever they think hardware is capable of, they can't imagine another way to code.
-
Let's not forget: c) the principle of premature optimization being the root of all evil. Programmers are discouraged from optimization in schools today, not encouraged; and the motivations to do so no longer exist because of a) and b). Complacency is the root of all evil.
End of conversation
New conversation -
-
-
Just to clarify, I fully agree with you about how fast computers are and how slow software is. My disagreement is over engineers at companies like Twitter/FB (or myself at Thread) being unproductive or trending less productive.
-
-
-
I didn't see it mentioned: do you have a reference for the specific numbers you're using?
-
-
-
I can't imagine the 99.9% number being accurate. Even ignoring everything else (like a lot of developers having CS degrees, where they were taught how computers work), the embedded, driver, and other low-level devs surely have to constitute more than 0.1% of the overall field?
-
-
-
What facts is your opinion based on? You may have an inaccurate mental model of what other people are thinking. Admit it, you are just making these numbers up.
New conversation -
-
-
It's not like people "program computers"; they just create software. And I'm not saying that in a bad way. Though I program most of my day, whenever I, or someone else, says that I "program computers", it just doesn't feel true. (But at least I acknowledge that.)
-
-
-
That's not the only way in which said picture is usually wrong. Programmers will eagerly optimize, making assumptions based on said picture without benchmarking, and end up with uglier *and* slower code. Hence the famous "premature optimization is the root of all evil."
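A minimal "measure before you assume" sketch in that spirit; the two variants and the array size are invented for illustration, and a real comparison would also control for compiler flags and warm-up:

```cpp
// Time two candidate implementations instead of guessing which is faster.
#include <chrono>
#include <cstdio>
#include <vector>

// "Clever" strength-reduced variant vs. the straightforward one.
long sum_shift(const std::vector<long>& v) { long s = 0; for (long x : v) s += x << 1; return s; }
long sum_mul  (const std::vector<long>& v) { long s = 0; for (long x : v) s += x * 2;  return s; }

template <class F>
double time_it(F f) {
    auto t0 = std::chrono::steady_clock::now();
    volatile long keep = f();   // keep the result so the call isn't optimized away
    (void)keep;
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
}

int main() {
    std::vector<long> v(50'000'000, 3);
    std::printf("shift: %.3fs  mul: %.3fs\n",
                time_it([&]{ return sum_shift(v); }),
                time_it([&]{ return sum_mul(v); }));
}
```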
-