I just love the cognitive fluency of this animation. Great job!
-
-
-
-
-
Sideways FTW! Think of a system with 64 GPUs and a model with 64 layers. Each GPU handles one layer, passes activations forward to the next GPU AND receives gradients flowing back, then computes its backward pass using its CURRENT activations, which can be up to 64 steps newer than the gradient (like frames sitting in other stages of a pipeline).
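A minimal sketch of the staleness this reply describes, assuming a toy NumPy simulation rather than any real pipeline-parallel framework (stage count, update rule, and all names here are illustrative): each stage keeps pushing new micro-batches forward while the gradient for an older micro-batch only arrives several steps later.

```python
# Toy simulation of asynchronous pipeline parallelism with stale activations.
# Pure NumPy; illustrative only, not anyone's actual training code.
import numpy as np

N_STAGES = 4   # stands in for the 64 GPUs / 64 layers in the tweet
STEPS = 8      # micro-batches pushed through the pipeline

rng = np.random.default_rng(0)
weights = [rng.normal(size=(8, 8)) * 0.1 for _ in range(N_STAGES)]

forward_queue = [[] for _ in range(N_STAGES)]   # activations waiting per stage
backward_queue = [[] for _ in range(N_STAGES)]  # gradients waiting per stage

for step in range(STEPS):
    # A new micro-batch enters stage 0 every step.
    forward_queue[0].append((step, rng.normal(size=8)))

    # Forward phase: iterate stages back-to-front so an activation advances
    # only one stage per step, as in a real pipeline.
    for s in reversed(range(N_STAGES)):
        if forward_queue[s]:
            t, act = forward_queue[s].pop(0)
            out = np.tanh(weights[s] @ act)
            if s + 1 < N_STAGES:
                forward_queue[s + 1].append((t, out))
            else:
                # Last stage: treat the output as a stand-in loss gradient.
                backward_queue[s].append((t, out))

    # Backward phase: iterate front-to-back so a gradient also moves only one
    # stage per step. Each stage applies it with its *current* weights and
    # activations, which is exactly the staleness the reply points at.
    for s in range(N_STAGES):
        if backward_queue[s]:
            t, grad = backward_queue[s].pop(0)
            weights[s] -= 1e-3 * np.outer(grad, grad)  # toy update, not real backprop
            if s - 1 >= 0:
                backward_queue[s - 1].append((t, weights[s].T @ grad))
            print(f"step {step}: stage {s} applied grad from micro-batch {t} "
                  f"(staleness {step - t})")
```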
-
-
-
I think we can leverage this in a GAN's generator if, during training, we use slow interpolation between noise vectors instead of independent random sampling.
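A rough sketch of what that sampling change might look like, assuming plain linear interpolation in latent space (the reply doesn't specify a scheme; all names and constants here are hypothetical, and the generator itself is out of scope):

```python
# Hypothetical sketch: feed a GAN generator latent codes that drift slowly
# between random endpoints instead of drawing a fresh z every training step.
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 128
SEGMENT_STEPS = 100   # training steps spent moving from z_start to z_end

def slow_latent_stream():
    """Yield latent vectors that interpolate slowly between random endpoints."""
    z_start = rng.normal(size=LATENT_DIM)
    while True:
        z_end = rng.normal(size=LATENT_DIM)
        for i in range(SEGMENT_STEPS):
            t = i / SEGMENT_STEPS
            # Linear interpolation; slerp would better preserve the Gaussian
            # norm, but this keeps the sketch short.
            yield (1.0 - t) * z_start + t * z_end
        z_start = z_end

stream = slow_latent_stream()
for step in range(5):
    z = next(stream)
    # fake_images = generator(z)  # the actual GAN training step would go here
    print(step, z[:3])
```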
-
-
-
Okay... seeing what I'm seeing, an ultrafine "turbo" on backprop... Just that the quantum application "spoke" for some reason, as I 'saw' a "divisional bit stride" in your "multipass combine". May not be current, but perhaps a pre-settle advantage in the algorithmics for that processor.
-
-
-
Do you people realize that the adult brain can perceive any object instantly without previously learning a model of it? IOW, the brain doesn't even learn patterns. It can see them instantly & invariantly. DL & backprop are not it. Yes, AGI is coming but not from DeepMind.


-
Oh, I forgot to say something important. Deep learning is to AGI what climbing a tree is to landing on the moon. hahaha...HAHAHA...hahaha...


-
-
You tell us. Can you?
