This belief seems to follow the (highly attractive) cognitive fallacy that there must be one weird secret trick to solve any given hard multidimensional problem that would otherwise require hard work. You can sell a lot of books based on this idea.
-
Is it like saying you will cure cancer?
-
More like you will cure medicine.
New conversation
I am curious: what do you think about @singularity_net's approach of using OpenCog to run specific processes? It seems to me more like an orchestration of subservices, but it seems like a smart way to solve complex problems in a way that would look like AGI.
-
Exactly right. General intelligence is local.
-
You can model general intelligence in the brain. Solving AI is akin to perfectly modelling that. A simpler model might not be a perfect representation, but you could still claim you have started to solve AI. The hard part is working out the model's underlying principles anyway.
-
The Human Brain Project has failed spectacularly in this respect.
New conversation
I disagree with this -- but I more strongly disagree with the idea that DL has this cracked and AGI is only a matter of time. I think the real problem is that there is not a single person in the world, past or present, who has been able to define what intelligence actually is.
-
Intelligence, at the most basic level, equals self-awareness and everything it implies.
New conversation
What leads you to the conclusion that AGI isn’t a single problem?