This is fantastic! What has changed since the May version of the paper?
Multiple unsupervised and RL models are now outperforming supervised learning. Maybe a stretch, but I'm curious whether these results could take us to the next stage, i.e., form the building blocks of AGI. In the near term, this creates many distillation opportunities.
We have seen similar results for 3D point cloud databases: https://youtu.be/fkiOyPSSYfs
This can be applied to a variety of problems:
1. Derive some context using implicit labels.
2. Train/create embeddings from similar/dissimilar items.
3. Use the embedding for the end task.
So these results seem expected. We've seen this in NLP, and we'll start seeing it everywhere.
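The three-step recipe above can be sketched end-to-end. This is a hypothetical toy example (not from the thread): the "implicit label" is the cluster that generated each point, the embedding is a linear map trained with a margin-based contrastive loss, and the end task is nearest-neighbour lookup in embedding space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: toy data with implicit labels -- points sharing a cluster are "similar".
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
labels = np.repeat([0, 1], 50)
x = centers[labels] + rng.normal(scale=0.5, size=(100, 2))

# Step 2: learn a linear embedding with a margin-based contrastive loss
# (pull similar pairs together, push dissimilar pairs beyond a margin).
w = rng.normal(size=(2, 2))
lr, margin = 0.05, 2.0
for _ in range(200):
    i, j = rng.integers(0, 100, size=2)
    v = x[i] - x[j]
    d = v @ w                      # pair difference in embedding space
    dist = np.linalg.norm(d) + 1e-9
    if labels[i] == labels[j]:
        grad = np.outer(v, d)      # gradient of 0.5 * dist**2 (pull together)
    elif dist < margin:
        grad = -np.outer(v, d) * (margin - dist) / dist  # push apart
    else:
        continue                   # dissimilar and already far enough apart
    w -= lr * grad

# Step 3: use the embedding for the end task (1-nearest-neighbour lookup).
z = x @ w
dists = np.linalg.norm(z[1:] - z[0], axis=1)
pred = labels[1:][np.argmin(dists)]
```

After training, same-cluster points sit close together in embedding space, so the nearest neighbour of a query shares its implicit label, which is the whole point of the recipe.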
Do you think VQ-VAEs/BiGANs could ever catch up, as far as representation learning goes, or is contrastive-predictive going to be it from now on?