Imitating the human brain is not more likely to result in AGI than mathematical approaches, because human brains are not generally intelligent after all — Ben Goertzel, #AGI2019
Replying to @Plinz
Surely if brains are not generally intelligent, that is quite strong evidence that general intelligence (at least what's meant by that definition) is not possible at all?
I would at least be curious to hear how the human brain can expect to recognize "general intelligence" if it is not itself generally intelligent.
Replying to @TheVeryInternet @jprwg
We can represent many more functions than we can discover. For us, discovering new ideas takes so long that understanding the universe takes many generations. But we can accumulate the ideas of many generations in a single mind.
I think you misunderstand the question. It's not "how will we make an AGI". It's "how will we even know we have one". No matter how much effort blind people put in over the centuries, a blind person will not be able to tell green things apart from red.
There may not be full generality, because all minds are finite, unless it turns out that the number of relevant models needed to describe possible universes is tightly bounded. But there can probably be "good enough" generality, including enough to figure out physics and build minds.