Always wanted to know how to abstract relational knowledge from sensory input and transfer it to new worlds? This paper is for you :)
Are you fed up with data papers that present a theory in figure 6? Always wishing there was a theory paper with some data in figure 6? This one’s for you!
[New conversation]
Very nice work. Would it be fair to say the non-spatial relationships accounted for in this model are still specifically between stimuli that are visually experienced, or is the scope intended to include e.g. Aronov et al. (2017) auditory data & even non-sensory relationships?
It can do non-spatial and even non-Euclidean relationships. The modality of the input doesn't matter.

[New conversation]
Very interesting paper! Not sure you saw our paper/poster at CCN this year, but it looks closely related.
@rvrikhye https://twitter.com/dileeplearning/status/1166517919099977728?s=20
Thanks v much! Yes definitely thinking about similar problems!
[New conversation]
Impressive study! There's a similar approach in papers from vicariousai. I've worked in past years on similar representations for knowledge/analytics projects, and I'm in awe of your work.
Thanks for sharing!
Is your code/data released?
Thanks v much! The code needs some tidying up, but
@jcrwhittington is planning to release it, probably when the paper is accepted into a journal.