Redlining laws are outdated. If an ML algorithm selects zip code as a predictor of credit risk, it's because zip code actually predicts credit risk, not because the algorithm is using it as cover for racism. Duh!
-
Of course. Imagine having rich parents is correlated with being good at statistics, and we have no real stats features. But imagine you are a great statistician with poor parents. The algorithm chooses someone worse than you because they have rich parents. Of course that is unfair, no?
-
Something like that happened in the UK after university entrance exams were cancelled. An algorithm was used that considered the past success of students at a given school, and as a result exceptional students from poor schools were not selected.
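The proxy argument above can be sketched in a toy simulation. This is a hypothetical illustration, not a model of the actual UK algorithm: we assume the quality we care about (skill) is unobserved, and the model ranks candidates only by a correlated proxy (parental wealth), so a highly skilled candidate with poor parents loses out to less skilled candidates with rich parents.

```python
import random

random.seed(0)

# Hypothetical population: wealth is observed, skill is not.
# Assumption: skill partly tracks wealth (tutoring, better schools)
# plus individual talent the model never sees.
population = []
for _ in range(1000):
    wealth = random.gauss(0, 1)
    skill = 0.5 * wealth + random.gauss(0, 1)
    population.append((wealth, skill))

# The "model" ranks candidates purely by wealth, its only feature,
# and admits the top 100.
ranked_by_wealth = sorted(population, key=lambda p: p[0], reverse=True)
admitted = ranked_by_wealth[:100]

# A great statistician with poor parents (wealth, skill):
poor_star = (-2.0, 3.0)

# Their skill beats many admitted candidates, yet their low wealth
# rank would keep them out of the top 100.
worst_admitted_skill = min(s for _, s in admitted)
print("star's skill:", poor_star[1])
print("weakest admitted candidate's skill:", worst_admitted_skill)
```

Under these assumptions the printed comparison typically shows the excluded candidate outperforming several admitted ones: the proxy is informative on average but unfair to individuals who break the correlation.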