But it doesn't capture others (e.g., between "dog" and the smell of a wet dog, the feeling of touching a dog's fur, the problem of cleaning up dog hair, etc.). So "dog" is partly grounded for Google Translate, and it could be more grounded for a robot with touch and smell. 2/
Which they learned as infants. https://medium.com/intuitionmachine/the-stepping-stones-of-agi-inspired-by-infant-cognitive-development-35ea0af61d23
We can give robots that 'intuition' by embedding simulation-based internal models. The simulator models physics, collisions, etc., so the robot can figure out that it can't walk through a wall. See https://www.frontiersin.org/articles/10.3389/frobt.2017.00074/full and related papers; our simulator also models cause and effect.
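The idea above can be sketched as a toy internal model: before executing an action, the robot steps a simplified simulation and rejects moves that would collide with an obstacle. This is a minimal illustration, not the simulator from the linked paper; all names (`Wall`, `simulate_move`) are hypothetical.

```python
# Toy "internal model": the robot predicts the outcome of a move by
# simulating it against a simple 2D world, instead of discovering the
# collision physically. Illustrative only -- not the paper's simulator.

from dataclasses import dataclass

@dataclass(frozen=True)
class Wall:
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        # Axis-aligned bounding-box test for a point.
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def simulate_move(pos, step, walls, n_substeps=10):
    """Step the internal model along the move in small increments.

    Returns (feasible, final_position). If any intermediate point lies
    inside a wall, the move is infeasible and the robot 'imagines'
    stopping just before the collision.
    """
    x, y = pos
    dx, dy = step
    for i in range(1, n_substeps + 1):
        nx = x + dx * i / n_substeps
        ny = y + dy * i / n_substeps
        if any(w.contains(nx, ny) for w in walls):
            # Collision predicted: report the last safe position.
            return False, (x + dx * (i - 1) / n_substeps,
                           y + dy * (i - 1) / n_substeps)
    return True, (x + dx, y + dy)

walls = [Wall(2.0, 3.0, -5.0, 5.0)]   # a wall blocking x in [2, 3]

# Walking "through" the wall is rejected by the internal model...
ok_through, stop_at = simulate_move((0.0, 0.0), (5.0, 0.0), walls)
# ...while a short move in open space is predicted to succeed.
ok_short, end_at = simulate_move((0.0, 0.0), (1.0, 0.0), walls)
print(ok_through, ok_short)  # False True
```

The same pattern scales up: a richer physics engine plays the role of `simulate_move`, letting the robot test candidate actions "in its head" before committing to them.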
Yes, and how do we develop the 'symbolic grounding' that gives us that intuitive understanding of forces and material strength? We learn it from experience. Other, simpler animals learn it from evolution (which is also learning from experience).