My experience has been that processors have gotten better but programmers have failed to keep up.
My experience is that they've gotten faster, but JavaScript and .NET haven't kept up.
Transistor counts are still increasing, but single-core performance has flattened out. And that turns out to be pretty important. pic.twitter.com/PXi0QaAPmI
I love this GIF
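An aside on why "that turns out to be pretty important": if the growing transistor budget buys more cores rather than faster cores, Amdahl's law bounds how much the extra cores can help, because the serial part of any workload only runs as fast as one core. A minimal illustrative sketch (the 90% parallel fraction is an assumed example, not a figure from the thread):

```python
# Amdahl's law: ideal speedup on n cores when a fraction p of the work
# is parallelizable. The serial fraction (1 - p) is untouched by extra
# cores, which is why flat single-core performance matters.

def amdahl_speedup(p: float, n: int) -> float:
    """Speedup = 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - p) + p / n)

# p = 0.9 is an assumed example value, not a figure from the thread.
for cores in (1, 2, 4, 8, 16, 64):
    print(f"{cores:>3} cores: {amdahl_speedup(0.9, cores):.2f}x")
# With a 10% serial fraction the speedup can never exceed 10x,
# no matter how many cores the transistor budget buys.
```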
Trust me, we support people who run open source Ruby code on random cloud servers, and we are *keenly* aware of just how much per-thread perf has advanced in the last 7 years.
I think your perception is roughly right -- a 10-year-old CPU (Sandy Bridge) is still totally usable today for the vast majority of users. But if you look at the 10 years from, say, 1983 to 1993, you're looking at an 8 MHz 286 w/ 128 kB of RAM vs. a 66 MHz P5 w/ 8 or 16 MB of RAM.
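To put numbers on that decade, here is a quick back-of-the-envelope sketch in Python using only the figures quoted in the tweet:

```python
# Back-of-the-envelope ratios for the 1983 -> 1993 comparison above,
# using only the figures quoted in the tweet. Clock speed alone
# understates the real 286-to-P5 gap (IPC also improved), but the
# ratios make the point.

clock_1983_mhz = 8.0                      # 8 MHz 286
clock_1993_mhz = 66.0                     # 66 MHz P5 (Pentium)
ram_1983_kb = 128.0                       # 128 kB
ram_1993_kb = (8 * 1024.0, 16 * 1024.0)   # 8 MB or 16 MB

print(f"clock: {clock_1993_mhz / clock_1983_mhz:.2f}x")    # 8.25x
print(f"RAM:   {ram_1993_kb[0] / ram_1983_kb:.0f}x to "
      f"{ram_1993_kb[1] / ram_1983_kb:.0f}x")               # 64x to 128x
```

The contrast with 2011 to 2021, where a Sandy Bridge chip remains "totally usable," is the point of the comparison.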
We've become much more GPU-reliant in the past 10 years, and it shows in the transistor budget allocation.
As @kchoudhu alluded to, compute has a flexible demand curve. Some websites are now larger than a 1980s hard drive, etc.
The last 5 seconds of the video explain perception vs. reality: by 2018 none of the top CPUs are in consumer devices, except the Apple chip.
We've probably increased the amount of work we're making them do at a faster rate than hardware performance has improved, a bit like induced demand and highway congestion.