Biased data in -> biased predictions out
Maybe not racism, but it does illustrate the lack of diversity in tech and the consequences that can result from it.
New conversation
Another thing that will get blacks wrongfully killed.
New conversation
Common issue with face recog. systems. MS had similar issues w/the Kinect & tanned folk. I've actually trained ANNs to fix this in my MSEE & did similar for VR in my PhD. Strange that AMZN didn't get it right out the door - it's not hard. Must have used a biased training set.
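The "biased training set" diagnosis above is usually addressed by rebalancing the data before training. A minimal sketch of naive oversampling with NumPy (the group labels and counts are invented toy data, not anything from Amazon's system):

```python
import numpy as np

# Hypothetical toy labels: 90 samples of group 0, 10 of group 1,
# mimicking a face dataset that under-represents darker-skinned subjects.
rng = np.random.default_rng(0)
labels = np.array([0] * 90 + [1] * 10)

# Naive oversampling: repeat minority-group indices until groups match.
minority = np.flatnonzero(labels == 1)
extra = rng.choice(minority, size=90 - len(minority), replace=True)
balanced = np.concatenate([np.arange(len(labels)), extra])

counts = np.bincount(labels[balanced])
print(counts)  # both groups now have 90 samples
```

Raw duplication like this is only a toy fix; a common alternative is to augment the minority group with varied lighting and exposure instead of exact copies.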
Yes, that is part of it: a data set with few examples. But it may also be a CNN issue, like using max pooling on a photo where all the grayscale pixels are almost saturated. You have a PhD and I'm just starting with machine learning, but do you think that could cause this kind of issue, or not?
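The max-pooling worry above can be made concrete: on an almost-saturated grayscale patch, max pooling keeps only the local maxima, so what little variation remained mostly vanishes. A toy NumPy sketch (pixel values invented for illustration):

```python
import numpy as np

# A 4x4 grayscale patch that is almost saturated (values near 255),
# as in an over-exposed photo.
patch = np.array([[250, 252, 251, 253],
                  [254, 251, 250, 252],
                  [253, 250, 255, 251],
                  [252, 254, 250, 253]])

# 2x2 max pooling via reshape: axes 1 and 3 index the pixels
# inside each 2x2 block, so taking max over them pools the block.
pooled = patch.reshape(2, 2, 2, 2).max(axis=(1, 3))
print(pooled)  # [[254 253] [254 255]]: almost flat, detail is gone
```

The pooled output spans only 2 gray levels where the input spanned 5, so downstream layers see an even flatter, less informative signal.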
If you're aware of the quality of the image, all you do is set the weights to compensate. Remember, the AI uses pixel shade differences. It's a numerical value. So the programmer has to be capable enough to know where to set the edge-detection threshold from pixel to pixel.
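The pixel-to-pixel thresholding idea described above can be sketched in a few lines. Assuming a fixed difference threshold tuned on well-lit images (all values here are invented), the same edge that is found in a bright scanline is missed in an under-exposed one:

```python
import numpy as np

# Two toy 1-D scanlines with an edge in the same place but different
# exposure. The well-lit line has a large jump at the edge; the
# under-exposed one has the same shape squeezed into dark values.
bright = np.array([40, 42, 41, 200, 205, 203], dtype=float)
dark = np.array([10, 11, 10, 30, 32, 31], dtype=float)

# Edge detection by thresholding the pixel-to-pixel difference.
# A threshold tuned on bright images misses the edge in the dark one.
threshold = 50
print(np.abs(np.diff(bright)).max() > threshold)  # True: edge found
print(np.abs(np.diff(dark)).max() > threshold)    # False: edge missed
```

This is why a fixed threshold "set by the programmer" bakes in the exposure assumptions of whatever images it was tuned on.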
New conversation
Not really surprising. But work out your software issues before you try to market it anywhere. We’ve already had way too many innocent black men convicted. The last thing we need is some unnecessary invention aiding in more wrongful convictions.
This isn’t news.
I saw that even in those cell phone apps that transform you into a zombie, where they asked me to indicate where the eyes and mouth were. But believe me, there is no bad intention there, just technical issues... but, of course, that cannot be understood by racists.
Yes, it is a technical issue. In white faces the pixel values change more, so it is easier to get lines and shapes. It is not just a biased data set.
End of conversation
New conversation
Nothing is done right for blacks
Not only the case. It is partly due to bias in the data set, with few photos of Black people, but due to the neural network used there is also a technicality that makes highly saturated photos hard to recognize: the pixels in a white face have more difference in the light areas compared to the dark areas.
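One standard mitigation for the narrow-range problem described above is to normalize intensities before detection, for example with a min-max contrast stretch. A toy sketch (invented values; not Rekognition's actual pipeline):

```python
import numpy as np

# Under-exposed toy scanline: all values packed into a narrow dark range.
dark = np.array([10, 11, 10, 30, 32, 31], dtype=float)

# Min-max contrast stretch to the full 0-255 range. After stretching,
# the pixel-to-pixel jump around the edge is much larger.
stretched = (dark - dark.min()) / (dark.max() - dark.min()) * 255

print(np.abs(np.diff(dark)).max())       # small jump before stretching
print(np.abs(np.diff(stretched)).max())  # much larger jump after
```

Stretching recovers relative structure but also amplifies sensor noise, which is one reason fixing the training data matters as much as fixing the preprocessing.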
I know
End of conversation
New conversation
@nytimes @ACLU @GrabYourWallet Considering the current plight of minorities, could this trigger public backlash that ultimately results in a black eye for the $AMZN brand? pic.twitter.com/AqQJW9dwGf
Hand dryers and sinks do not see black people. Racist bathrooms!