-
In part I fully agree, but there's an important point to be made: statistical asymmetries conducive to racism and discrimination are inevitable for any AI trained on mass public data. But unlike (nice) humans, these systems have no cognitive counterweights.
-
"Inevitable" sounds like a weasel word used to shift blame from developers to "the world". Data scientists should have thought about how the dataset might be biased and accounted for it (with a better-curated dataset) before going to scale. In this case, they should have had more people of color in the training data.
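One common way to "account for" a skewed dataset, as the reply above suggests, is to audit subgroup representation and reweight samples so under-represented groups contribute equally during training. A minimal sketch (the subgroup labels here are a hypothetical toy example, not any real dataset):

```python
from collections import Counter

def subgroup_weights(labels):
    """Per-sample weights inversely proportional to subgroup frequency,
    so each subgroup contributes equal total weight during training."""
    counts = Counter(labels)
    n_groups = len(counts)
    total = len(labels)
    # w_g = total / (n_groups * count_g); weights sum to `total` overall
    return [total / (n_groups * counts[g]) for g in labels]

# Toy skewed dataset: 90 samples from group "A", 10 from group "B"
labels = ["A"] * 90 + ["B"] * 10
weights = subgroup_weights(labels)
```

With these weights, the 10 "B" samples carry the same aggregate weight as the 90 "A" samples, which is one standard counterweight to representation bias (better curation, as the tweet says, is another).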
-
Yeah, the company's own hiring policies could have solved this: hire a more representative, diverse set of humans.