ML systems are biased when data is biased. This face-upsampling system makes everyone look white because the network was pretrained on FlickrFaceHQ, a dataset that mainly contains pictures of white people. Train the *exact* same system on a dataset from Senegal, and everyone will look African. https://twitter.com/bradpwyble/status/1274380641644294150
-
Thank you. Question on extremes: doesn't this imply that if I have only one Inuit woman in my dataset, I can't exactly equalize? Further thought: aren't humans continuous? 1/2
-
To do this properly, wouldn't we need to sequence everyone's DNA and plot by maternal and paternal DNA markers?
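The equalization worry above can be made concrete with inverse-frequency sample weights, a common way to rebalance an imbalanced dataset. This is a minimal sketch with made-up group labels (the groups and counts are illustrative, not from any real dataset); it shows why a group with a single example produces an extreme weight, which is why "exact" equalization from one sample is so fragile:

```python
from collections import Counter

# Hypothetical demographic labels for a toy dataset (illustration only).
labels = ["group_a"] * 900 + ["group_b"] * 99 + ["inuit"] * 1

counts = Counter(labels)
n, k = len(labels), len(counts)

# Inverse-frequency weights: each group contributes equally in expectation.
weights = {group: n / (k * c) for group, c in counts.items()}

# The singleton group gets a weight hundreds of times larger than the
# majority group, so the whole "equalized" estimate for that group
# rests on one sample.
print(weights)
```

Reweighting fixes the expected group proportions but not the variance: the singleton group's entire contribution hinges on whether that one example is representative, which is the commenter's point.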