@schock says: algorithms should not be “color blind” (equality model that works best for ppl in power). Algorithms should be just (equity model that takes history & intersectionality into account). #datajustice18 pic.twitter.com/JweD79JhNi
This will be harder to do, as just blindly training algorithms on various data sets won't be enough. But I think it is necessary and worth doing. Otherwise, stereotypes could become entrenched in decision-making and analytics algorithms and create negative feedback loops.
A good example: say a machine learning algorithm is fed data and predicts the likelihood of defaulting on a loan. If this algorithm is just blindly fed data, it could inadvertently develop a stereotype against African Americans, as a larger percentage are likely to be of lower socioeconomic status than the general population. Such a stereotype buried in a machine learning algorithm would then be self-perpetuating if it were used in banks' decision-making processes. We need to be vigilant against such biases showing up in our software.
You don't even need an example that brings SES into it (not in any direct way, at least), as demonstrated by this huge fail: https://gizmodo.com/why-cant-this-soap-dispenser-identify-dark-skin-1797931773
I think we can define two different failure modes based on how complex a working solution would need to be. Problems like a soap dispenser not recognizing dark skin, or facial recognition not working for non-majority ethnicities, can be fixed by using more diverse training data.
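The "more diverse training data" fix for this first failure mode can be approximated even without collecting new data, by resampling so that under-represented groups appear as often as the majority. A minimal sketch, assuming a labelled `group` column on a synthetic, heavily imbalanced dataset (all names and numbers here are illustrative, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical imbalanced dataset: group 0 is heavily over-represented.
group = np.concatenate([np.zeros(900, dtype=int), np.ones(100, dtype=int)])

# Give each example a sampling weight inversely proportional to the size
# of its group, so each group contributes equally in expectation.
counts = np.bincount(group)
weights = 1.0 / counts[group]
weights /= weights.sum()

# Resample with replacement using those weights to build a balanced set.
idx = rng.choice(len(group), size=len(group), p=weights)
balanced = group[idx]

print(np.bincount(balanced))  # roughly equal counts per group
```

The same idea carries over to real training pipelines as per-example loss weights instead of literal resampling; either way, the point is that the model stops treating the minority group as statistical noise.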
Problems like a loan-default prediction algorithm having an implicit bias are harder to solve, because the biases the algorithm would develop accurately reflect the real world (i.e., low-SES minorities are more likely to default). But ethically, the algorithm needs to have these biases corrected for.
Making the algorithm "race-blind" isn't enough, because it would end up tracking race as a hidden variable (e.g., through home address or ethnic-sounding names). Interestingly enough, it seems like these problems would end up mirroring systemic racism in the real world.
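The hidden-variable effect is easy to demonstrate on synthetic data. A minimal sketch, assuming a made-up protected attribute `group`, a proxy feature `zip_code` that merely correlates with it, and a plain logistic regression fit by gradient descent (everything here is an illustrative assumption, not a real dataset or a specific bank's model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical protected attribute; deliberately NOT given to the model.
group = rng.integers(0, 2, n)

# A nominally neutral feature (think zip-code indicator) that matches
# group membership 80% of the time -- the "hidden variable" channel.
zip_code = np.where(rng.random(n) < 0.8, group, 1 - group)

# Outcome driven by group-correlated circumstances, not by group itself.
default = (rng.random(n) < 0.2 + 0.3 * zip_code).astype(float)

# Logistic regression on the "race-blind" features (intercept + zip only),
# fit by full-batch gradient descent.
X = np.column_stack([np.ones(n), zip_code])
w = np.zeros(2)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - default) / n

pred = 1.0 / (1.0 + np.exp(-X @ w))

# The model never saw `group`, yet its risk scores split along it anyway.
gap = pred[group == 1].mean() - pred[group == 0].mean()
print(f"mean predicted risk, group 1 minus group 0: {gap:.3f}")
```

Dropping the protected column removes nothing: the proxy carries the signal through, which is exactly why "blind" equality models fail where an equity model would intervene.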
Yes, they would reflect the real world descriptively, but without the real understanding of the world that a human has (or can have! Because, of course, some racists are genuine descriptive bigots).