With the potential for AI to swoop in and take our jobs, the masses fear not exploitation, but something far worse: irrelevance. [3/11]
The referendums and elections that liberal democracies are characterised by are about human feelings. “If democracy were a matter of rational decision-making, there would be absolutely no reason to give people equal voting rights…some people are far more knowledgeable than others.” [4/11]
But this equality of human authority was justified by the liberal story. Now, our feelings can be hacked, and the tech revolution may bring about the authority of algorithms. What then becomes of human freedom? [5/11]
“If we invest too much in developing AI and too little in developing human consciousness, the very sophisticated AI of computers might only serve to empower the natural stupidity of humans.” Instead, we must consider our long-term needs as conscious beings. [6/11]
Property is a precursor to inequality, and data is property too. If controlling the data controls our feelings and therefore our behaviour, then whoever controls the data will control the future. The key challenge of this century is regulating ownership of this data. [7/11]
“As more and more data flows from your body and brain to the smart machines via the biometric sensors, it will become easy for corporations and government agencies to know you, manipulate you, and make decisions on your behalf.” [8/11]
The richest 100 people own more wealth than the poorest 4 billion. In the future, the rich will be able to take advantage of bioengineering, and we might see the species divide itself into different biological castes. [9/11]
“Globalisation will unite the world horizontally by erasing national borders, but it will simultaneously divide humanity vertically.” If AI leaves those at the bottom without political power or economic value, the state might lose its incentive to invest in their health and education. [10/11]
Replying to @cosimia_
If an unbiased AI, one not prone to parasitic tendencies like greed, gains political and economic control, then I feel it would treat all life as a valuable resource, so long as that life poses no existential threat to the AI. Unless the AI turns out to adopt a radical nihilism.
Replying to @PhilosophyMeds
If we leave AI to study the world around it and adopt values accordingly, it’s easy to see how it could favour greed and nihilism: our economic system rewards greed as a means of getting ahead, and it takes no account of the future welfare of the planet.
So better to program good values than to hope it’s nice to us :P