Given that neural networks run well with sloppy ≤8-bit precision, I wonder what kind of analog or optical computational arrangements could outperform digital hardware.
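A minimal sketch of why that observation holds, assuming plain int8 post-training quantization with per-tensor scales (the names and numbers are illustrative, not from the thread):

```python
# Quantize weights/activations to int8, do the matmul in integers,
# rescale at the end. Hypothetical values throughout.
import numpy as np

def quantize(x, bits=8):
    """Map floats to signed integers with a per-tensor scale."""
    qmax = 2 ** (bits - 1) - 1            # 127 for int8
    scale = np.abs(x).max() / qmax or 1.0
    return np.round(x / scale).astype(np.int32), scale

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 128))        # weights
a = rng.standard_normal(128)              # activations

qw, sw = quantize(w)
qa, sa = quantize(a)

y_int8 = (qw @ qa) * (sw * sa)            # integer matmul, then rescale
y_fp32 = w @ a

# The relative error stays small, which is why cheap low-precision
# (or noisy analog) multiply-accumulate hardware looks attractive.
print(np.abs(y_int8 - y_fp32).max() / np.abs(y_fp32).max())
```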
-
Maybe it's time for a breakout: analog hardware executing GPU-trained deep learning models.
-
The thing is, transistors that can do digital well have been getting smaller and smaller, but transistors that can do analog well are still pretty large. (Something like 14 nm versus 130 nm, though don't take that too seriously.) So the trend isn't favorable.
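Taking the tweet's rough numbers at face value, the gap compounds because device area scales roughly with the square of the feature size; a back-of-the-envelope sketch:

```python
# Illustrative only: if analog-friendly transistors sit around ~130 nm
# while digital ones are at ~14 nm, the digital device wins roughly the
# square of the linear ratio in area, and keeps improving with each node.
digital_nm, analog_nm = 14, 130
linear_ratio = analog_nm / digital_nm
area_ratio = linear_ratio ** 2
print(f"linear: ~{linear_ratio:.1f}x, area: ~{area_ratio:.0f}x")  # ~9.3x, ~86x
```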