@Plinz Do many people in AI work on the problem of getting an AI to share its understanding with humans?
Replying to @MagpieMcGraw
In relative numbers, no; in absolute numbers, yes.
Replying to @Plinz
I read an article about DeepMind, and they seem to think that if you make a really smart AI, it can use its smarts to solve big human problems. I'm uncomfortable with that, if the knowledge of how to solve the problems doesn't get stored (and understood) in human brains.
Replying to @MagpieMcGraw
It seems inevitable that we have to make better models than humans can. But we will also outmodel humans, the same way humans outmodel dogs, and that makes it possible to generate translations that humans can comprehend to the best of their ability.
Replying to @Plinz
Not sure what you mean by outmodel. Humans have to do a lot of thinking, and hold a lot of knowledge, to take care of ourselves. If we outsource the thinking and the knowledge to a machine we don't understand, that machine breaking would be a disaster on a species-wide level.
Replying to @MagpieMcGraw
We already did, and it already is. (And to outmodel means that you understand an agent so well that you can explain and predict its activity better than the agent itself can. Everything you do, you do for a reason, which you often don't understand yourself.)
Replying to @Plinz
I guess I fear it could get unimaginably worse. Have you seen that show Dollhouse?
Replying to @MagpieMcGraw
Things will get unimaginably worse at some point for sure, if we don't use AI.
Replying to @MagpieMcGraw
Governance, finance, ecology, econ, infrastructure, foundational sciences