Given that neural networks run well with sloppy <= 8 bit precision, I wonder what kind of analog or optical computational arrangements could outperform digital hardware.
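A minimal sketch of the 8-bit claim (not from the thread, all numbers illustrative): quantize a dot product, the core operation of a neural-network layer, to signed 8-bit integers with simple uniform symmetric quantization and measure the error against full precision.

```python
import random

# Hedged illustration: how much does rounding to 8-bit integers cost a
# 256-element dot product? Uniform symmetric quantization is assumed;
# real quantization schemes (per-channel scales, etc.) are fancier.
random.seed(0)
weights = [random.uniform(-1.0, 1.0) for _ in range(256)]
acts = [random.uniform(0.0, 1.0) for _ in range(256)]

def quantize(xs, bits=8):
    """Map floats to signed integers so that x ~= q * scale."""
    qmax = 2 ** (bits - 1) - 1            # 127 for 8 bits
    scale = max(abs(x) for x in xs) / qmax
    return [round(x / scale) for x in xs], scale

qw, sw = quantize(weights)
qa, sa = quantize(acts)

exact = sum(w * a for w, a in zip(weights, acts))
approx = sum(qwi * qai for qwi, qai in zip(qw, qa)) * sw * sa

# Each reconstructed value is within half a quantization step of the
# original, and the accumulated dot-product error stays small even
# across 256 terms.
roundtrip_err = max(abs(q * sw - w) for q, w in zip(qw, weights))
dot_err = abs(approx - exact)
```

The point of the exercise: the per-element error bound is scale/2 regardless of how the errors accumulate, which is the kind of slack an imprecise analog multiplier could live inside.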
Replying to @ID_AA_Carmack
Carver Mead's book "Analog VLSI and Neural Systems" came out three decades ago, and there have been people working on it ever since, with reasonable success (though obviously not taking over the world).
Replying to @NYarvin
Maybe time for a breakout, executing GPU-trained deep learning models.
Replying to @ID_AA_Carmack
The thing is, transistors that can do digital well have been getting smaller and smaller, but transistors that can do analog well are still pretty large. (Something like 14 nm versus 130 nm, though don't take that too seriously.) So the trend isn't favorable.
Replying to @NYarvin @ID_AA_Carmack
Is that a case of "digital makes money so we invest in digital", or is analog just harder to make small?
Replying to @zakedodead @ID_AA_Carmack
It's mainly that transistors that just need to do on/off can be cruder than transistors that deliver accurate analog performance. But yes, while analog makes plenty of money, applications for millions of analog transistors are scarce, so there's less point in miniaturizing.
Replying to @NYarvin @zakedodead
The interesting point is questioning how much accuracy and precision is necessary; it seems to be less than most guesses, which may open up unconventional avenues.
Even "good" VLSI transistors are lousy compared to discrete transistors. So there's already considerable art that goes into using them for neural networks in such a way as to make the overall error reasonable.
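A toy illustration of that "considerable art" point, under assumptions not in the thread: model per-transistor mismatch as a fixed, bounded random offset on each stored weight (a crude stand-in for analog device variation) and check how far a 256-input neuron's output drifts.

```python
import random

# Hypothetical sketch: each "analog transistor" stores its weight with
# its own fixed error, drawn once per device. MISMATCH is an assumed
# +/-2% full-scale variation, not a measured figure.
random.seed(1)
N = 256
MISMATCH = 0.02

weights = [random.uniform(-1.0, 1.0) for _ in range(N)]
acts = [random.uniform(0.0, 1.0) for _ in range(N)]

# Per-device mismatch baked into the stored weights.
noisy = [w + random.uniform(-MISMATCH, MISMATCH) for w in weights]

ideal = sum(w * a for w, a in zip(weights, acts))
analog = sum(w * a for w, a in zip(noisy, acts))
err = abs(analog - ideal)

# Worst case, the output error is MISMATCH * sum(|activations|); with
# independent random signs it is typically far smaller, since the
# per-device errors largely cancel across the summation.
worst_case = MISMATCH * sum(acts)
```

The cancellation across a large summation is one reason a network can tolerate individually lousy devices, though real analog design also has to fight offsets that are correlated rather than independent.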