Great read. Neural nets seem inherently limited by the quality of the datasets they are trained on and can't be expected to extrapolate to novel features absent from the training set, much like people are limited by their unique experiences.
-
Yeah, but that's like saying: "every function is linear when the input is represented in a space of high-enough dimension", or: "every dynamical system can be approximated by a Markov chain." It's both true and considerably less relevant in practice than one might think.
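The first quip ("every function is linear in a high-enough-dimensional space") can be made concrete with a minimal sketch, assuming nothing beyond NumPy: a target that is nonlinear in the raw input becomes exactly linear once the input is lifted into a richer feature space. The feature map `phi(x) = (x, x^2)` here is an illustrative choice, not anything from the thread.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=200)
y = x**2  # nonlinear as a function of the raw input x

# Lift the input into a higher-dimensional feature space: phi(x) = (x, x^2).
Phi = np.stack([x, x**2], axis=1)

# Ordinary least squares in the lifted space: the target is now a linear map.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ w
print(np.allclose(pred, y))  # True: y is exactly linear in the lifted features
```

The catch the tweet is pointing at: the statement is trivially true once you allow the feature space to grow, which is exactly why it says so little by itself.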
-
It's relevant when it leads to better algorithms, as both of those have (kernel machines and sequential Monte Carlo methods, respectively).
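Kernel machines are the algorithmic payoff of that first observation: the high-dimensional space is never materialised, only inner products in it. A minimal kernel ridge regression sketch in plain NumPy (the RBF kernel, `gamma`, and the ridge term `1e-6` are illustrative choices, not anything specified in the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 60)[:, None]
y = np.sin(x).ravel()  # a nonlinear target

def rbf(a, b, gamma=1.0):
    # RBF kernel: inner products in an implicit infinite-dimensional space.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf(x, x)
# Kernel ridge regression: solve (K + lambda*I) alpha = y.
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(x)), y)
pred = K @ alpha
print(np.max(np.abs(pred - y)))  # small: the nonlinear target is fit closely
```

The same pattern holds for the Markov-chain quip: sequential Monte Carlo turns "every dynamical system is approximately a Markov chain" into usable filtering algorithms.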
-
I don't know if it's the intention of your tweet, but I feel like there has recently been **a lot** of "who's the original inventor of xyz" going on in ML.