Thread on universally-quantified one-to-one mappings, which I still see as a key bottleneck to future progress: https://twitter.com/GaryMarcus/status/1200637240214347776
Replying to @GaryMarcus
This looks pretty good though, no? pic.twitter.com/v92RFxrgRz
Replying to @AdamMarblestone
I am not sure it is a general solution, but building in operations over variables, as NTMs do, is the crux of what I have been arguing for. IIRC, I mentioned that specific work approvingly in my Deep Learning: A Critical Appraisal and the follow-up on Medium (In Defense of Skepticism).
Replying to @GaryMarcus
Yes, and FYI I think transformers with “relative position encoding” also arguably build in a syntactic/content independent variable binding mechanism https://arxiv.org/abs/1803.02155
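The mechanism referenced above (Shaw et al., arXiv:1803.02155) adds a learned embedding for the clipped relative distance j - i into the attention logits, so the score between two tokens depends on their relative position independently of content. A minimal single-head NumPy sketch, under my own simplifications (values left unprojected, no batching; all names are mine, not the paper's):

```python
import numpy as np

def relative_attention(x, w_q, w_k, rel_emb, max_rel=4):
    """Single-head self-attention with relative position embeddings.

    x:       (n, d) token representations
    w_q,w_k: (d, d) query/key projections
    rel_emb: (2*max_rel+1, d) embeddings for relative distances
             j - i clipped to [-max_rel, max_rel]
    """
    n, d = x.shape
    q = x @ w_q                          # (n, d) queries
    k = x @ w_k                          # (n, d) keys
    logits = q @ k.T                     # content-content term, (n, n)
    # Content-position term: q_i . a_{clip(j - i)}
    idx = np.clip(np.arange(n)[None, :] - np.arange(n)[:, None],
                  -max_rel, max_rel) + max_rel        # (n, n) indices
    logits = logits + np.einsum('id,ijd->ij', q, rel_emb[idx])
    logits = logits / np.sqrt(d)
    # Softmax over keys, then attend (values = x for brevity)
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ x
```

Because `rel_emb` is indexed only by relative offset, the same binding pattern applies at every absolute position, which is the content-independent quality the tweet is pointing at.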
Meanwhile, if the brain needs a dedicated variable binding mechanism, possibilities could include https://www.biorxiv.org/content/10.1101/304980v1.full or, better, https://arxiv.org/abs/1611.03698v3