Super interesting. This wasn't meant to necessarily place blame on Google - just pointing out how bias in society is reflected in AI (1/3)
However, I do think it is our responsibility as designers/developers/users to be aware of this bias and do our best to correct it. (2/3)
Btw, this was a slide from a great talk by @neelima_jadhav, who gave some other interesting examples of bias in AI (race, class, etc.) (3/3)
Makes me wish I could package up the effort I made to train my brain to deal with singular 'they' and publish it to GitHub.
Just remember that this isn't Google's fault.
They wrote the code, they trained the algorithms, they failed to recognize and correct for the bias present in their training data.
He was a babysitter
She was a doctor
Can I make it any more obvious?
O bir bebek bakıcısı ("They were a babysitter")
O bir doktor ("They were a doctor")
What more can I say?
Can't pull up the article. Can you summarise why it happens, if possible? If not, then that's okay too.
That's the point tho - we hand over decisions to an algorithm, then when the algorithm gives biased output, we act like that bias is "objective."
Well, basically the gender information is lost once the phrase is translated into Turkish. There's no way to get it back.
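A toy sketch of that information loss, for anyone curious about the mechanism. This is purely illustrative - it is not Google Translate's actual algorithm, and the corpus counts below are invented for the demo. The idea: English "he"/"she" both collapse to the gender-neutral Turkish "o", so translating back, a statistical system has to guess, and it guesses whichever pronoun its (biased) training data pairs most often with the context.

```python
# English -> Turkish: both gendered pronouns collapse to neutral "o",
# so the gender distinction is destroyed at this step.
def en_to_tr(pronoun: str) -> str:
    return "o" if pronoun in ("he", "she") else pronoun

# Hypothetical (invented) corpus statistics: how often each pronoun
# co-occurs with a profession in some biased training data.
CORPUS_COUNTS = {
    "doctor":     {"he": 90, "she": 10},
    "babysitter": {"he": 5,  "she": 95},
}

# Turkish -> English: the gender is gone, so the system falls back on
# whichever pronoun is statistically more frequent for the context.
def tr_to_en(pronoun: str, profession: str) -> str:
    if pronoun != "o":
        return pronoun
    counts = CORPUS_COUNTS[profession]
    return max(counts, key=counts.get)

for original in ("he", "she"):
    for job in ("doctor", "babysitter"):
        back = tr_to_en(en_to_tr(original), job)
        print(f"{original} was a {job} -> o bir ... -> {back} was a {job}")
```

Note that "she was a doctor" comes back as "he was a doctor" regardless of the input pronoun: once the information is gone, the round trip can only reflect the prior baked into the data.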
The original experiment: translate an English phrase into Turkish, a gender-neutral language, then translate that same Turkish phrase back to English.