Amazon built an AI to rate job applications. It analyzed 10 years of (male-dominated) hires. Then it started penalizing resumes that included the word “women’s,” downgrading graduates of all-women's colleges, and rating aggressive language highly. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
Replying to @broderick @ClaraJeffery
The AI learns from observed behavior, the same as a child would. If the behavior it encounters is so thoroughly biased, then we have to address that before allowing the AI to learn through observation.
6 replies 33 retweets 477 likes
Also, who programmed the AI? Men? Only men? There is a huge difference in linguistics between men and women, and this inherent difference translates into how we communicate and maybe even how we program.
@CarnegieMellon @MIT
12 replies 18 retweets 253 likes
Replying to @TigzyRK @broderick and
Seriously? Heuristic programming results in the training, which leads to deep learning.
2 replies 0 retweets 7 likes
Replying to @souzou_no @broderick and
You are wrong; there's no difference between men and women in programming. The bias seen in this context comes from the dataset used to train the AI, not from the code.
1 reply 0 retweets 3 likes
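The dataset point above can be sketched in a few lines of Python. This is a minimal illustration with invented resumes and tokens, not Amazon's actual system: a naive per-token scorer whose code never mentions gender, yet which learns to penalize the token "womens" purely because the historical hiring labels are skewed.

```python
from collections import defaultdict

# Hypothetical, deliberately skewed hiring history: (resume tokens, hired?).
history = [
    (["chess", "club", "captain"], True),
    (["football", "team"], True),
    (["debate", "society"], True),
    (["womens", "chess", "club", "captain"], False),
    (["womens", "debate", "society"], False),
]

def token_scores(data):
    """For each token, the fraction of past resumes containing it that were hired."""
    seen, hired = defaultdict(int), defaultdict(int)
    for tokens, was_hired in data:
        for tok in set(tokens):
            seen[tok] += 1
            hired[tok] += int(was_hired)
    return {tok: hired[tok] / seen[tok] for tok in seen}

scores = token_scores(history)
# The scorer has no notion of gender, but the biased labels push the
# score for "womens" to the bottom of the vocabulary.
```

Nothing in `token_scores` is gendered; swap in a balanced history and the penalty disappears, which is exactly why the blame lands on the data rather than the code.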
Replying to @TigzyRK @broderick and
Debiasing algorithms is an actual issue. https://medium.com/coinmonks/ai-doesnt-have-to-be-conscious-to-be-harmful-385d143bd311
4 replies 2 retweets 13 likes
Replying to @souzou_no @broderick and
In this case, "algorithm" doesn't mean the code. It means the behavior at runtime, and if you read carefully you'll understand that.
2 replies 0 retweets 2 likes
No she won't