Also, @xaqlab would be very sad to hear that you think no one is still trying to figure out how the brain does belief propagation.
And PGMs have been heavily explored, while backprop was ignored for a long time... so I think things are OK
But yes... grounding things as directly as possible in ground-truth neuro observations would be very nice...
New conversation
Gradient descent wasn't ignored. It was just an unremarkable part of many learning algorithms. What always struck me as weird was how neural network people deified it as, like, the One True Learning Algorithm.
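The point above, that gradient descent is just an ordinary component inside many learning algorithms, can be illustrated with a minimal sketch (a hypothetical example, not from the thread): a generic update loop `x <- x - lr * grad(x)` applied to a simple quadratic.

```python
# Minimal sketch of plain gradient descent as a generic building block.
# The function names and the quadratic objective are illustrative assumptions.

def grad_descent(grad, x0, lr=0.1, steps=100):
    """Generic gradient-descent loop: repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Objective f(x) = (x - 3)^2, so f'(x) = 2 * (x - 3); the minimum is at x = 3.
x_min = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 3))  # converges toward 3.0
```

Nothing in the loop is specific to neural networks; swapping in a different `grad` callable turns the same few lines into the inner step of many other optimization-based learners.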