Evergreen tweet: https://twitter.com/fchollet/status/1040400696187580416
Btw GPUs have 10-15B transistors these days, growing by 10x in ~5 years. So GPUs should be brain-scale in less than 5 years.
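For a sense of the arithmetic behind that extrapolation, here is a minimal sketch in Python. The ~86B figure for neurons in the human brain and the steady 10x-per-5-years growth rate are assumptions of mine, not claims from the tweet:

```python
import math

# Back-of-the-envelope check of the tweet's extrapolation.
# Assumed (not from the tweet): ~86 billion neurons in the human brain,
# and transistor counts that keep growing 10x every ~5 years.
transistors_now = 15e9   # upper end of the tweet's 10-15B range
brain_neurons = 86e9

# Solve transistors_now * 10 ** (t / 5) == brain_neurons for t.
years = 5 * math.log10(brain_neurons / transistors_now)
print(f"Transistor count reaches neuron count in ~{years:.1f} years")
# ~3.8 years with these numbers, consistent with "less than 5 years".
```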
New conversation
This Tweet is unavailable.
End of conversation
New conversation
Probably it doesn't even make sense to compare two NNs by the number of their neurons/layers. Even in a simple MLP, an optimizer can find countless permutations of the hidden neurons that work equally well for a given task. They're just nodes/parameters.
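That permutation symmetry is easy to check numerically. The NumPy sketch below (shapes and names are illustrative, not from the thread) permutes the hidden units of a one-hidden-layer MLP and verifies that the reshuffled parameters compute exactly the same function:

```python
import numpy as np

rng = np.random.default_rng(0)

# A one-hidden-layer MLP: x -> relu(x @ W1 + b1) @ W2 + b2
W1 = rng.normal(size=(4, 8)); b1 = rng.normal(size=8)
W2 = rng.normal(size=(8, 2)); b2 = rng.normal(size=2)

def mlp(x, W1, b1, W2, b2):
    h = np.maximum(x @ W1 + b1, 0.0)  # ReLU hidden layer
    return h @ W2 + b2

# Permute the hidden units: reorder W1's columns, b1's entries, and
# W2's rows consistently. Different parameter vector, same function.
perm = rng.permutation(8)
W1p, b1p, W2p = W1[:, perm], b1[perm], W2[perm, :]

x = rng.normal(size=(5, 4))
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2))
print("Permuted MLP computes the identical function.")
```

Even this toy network has 8! = 40,320 functionally identical weight configurations, which is the "countless permutations" point.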
That is correct; in particular, it doesn't make sense across different architectures.
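One way to see why such comparisons break down across architectures (a hedged illustration; the layer shapes below are chosen only so the counts land close together): a convolutional layer and a dense layer can have nearly identical parameter counts while computing structurally very different functions.

```python
# Hypothetical layer shapes, picked to make the counts nearly match.
def conv2d_params(in_ch, out_ch, k):
    return in_ch * out_ch * k * k + out_ch   # weights + biases

def dense_params(in_dim, out_dim):
    return in_dim * out_dim + out_dim        # weights + biases

conv = conv2d_params(64, 64, 3)   # 3x3 conv, 64 -> 64 channels
dense = dense_params(192, 192)    # 192 -> 192 fully connected

print(conv, dense)  # 36928 vs. 37056: nearly the same count
# The conv layer shares its weights across every spatial position,
# the dense layer does not, so equal "size" says little about what
# the two layers actually compute.
```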
End of conversation