So, let's talk about the ethics of data science in our new world.
-
I talk sometimes about an algorithm I built once. Its potential for harm is so high that I don't even talk publicly about what it does.
-
This is the reality we have to face *every day* doing our mundane work. I _never_ expected this algorithm to do what it does.
-
When we're pulling together data on our users, we need to ask ourselves "what could this be used for? What harm can come?"
-
In reliability engineering, there is a process called FMEA -- Failure Mode and Effects Analysis. It's a harm evaluation process.
-
You think of a failure that can happen, and score from 1 to 10 the likelihood of the failure, the severity of harm given failure, and the detectability.
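The scoring above can be sketched in code. This is a minimal illustration, not part of the thread: in standard FMEA the three scores are multiplied into a Risk Priority Number (RPN) to rank which failures to address first. The failure modes and scores below are made-up examples.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One hypothesized failure, scored 1-10 on the three FMEA axes."""
    description: str
    likelihood: int     # how likely the failure is to occur
    severity: int       # how much harm results if it occurs
    detectability: int  # 10 = very hard to detect, 1 = easy to detect

    def rpn(self) -> int:
        # Risk Priority Number: the standard FMEA product of the three scores.
        return self.likelihood * self.severity * self.detectability

# Hypothetical failure modes for a user-classification model.
modes = [
    FailureMode("Model misclassifies a user into a sensitive category", 6, 9, 7),
    FailureMode("Training data leaks through an exported report", 3, 8, 5),
]

# Triage: address the highest-RPN failure modes first.
for m in sorted(modes, key=lambda m: m.rpn(), reverse=True):
    print(f"RPN {m.rpn():>3}: {m.description}")
```

Note the direction of the detectability score: failures that are hard to notice get a *high* score, so silent harms rank as riskier.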
-
Data Scientists probably need to start doing this. "What is the harm that happens from classifying this object incorrectly?"
-
It's well-known among data scientists that if you work hard enough, you can find a pattern in the data that matches whatever you want it to.
-
And we now have an administration that will use its reach to classify people as they see fit. What is the harm in what we build?
-
I read yesterday about librarians destroying patron records after the Patriot Act was passed. Are we willing to destroy our algorithms?
-
Ask yourselves these questions: Are you willing to refuse an order? Is your manager sympathetic to user privacy?
-
These are moral touchstones that you need to be a data scientist now.
-
You might think, "nah, regulations prevent that sort of access." Watch those regulations evaporate overnight.
-
It's not about the math and coding anymore. Truthfully, it never was. But ask yourself if you want to be in charge of assembling the list.
-
Sometimes we might have no choice. The courts can compel us to act. Warrants, subpoenas.
-
There's only so much we can do to resist and to build trust, but there's one action I'm going to take.
-
Today I'm putting a canary in my profile (in a previous tweet I accidentally typed 'warrant').
-
The canary: that I have never been asked to deliver to a government a list of people to target based on behaviors.
-
As a trans woman, the incoming government has gone to extreme lengths to classify me as a sexual predator.
-
I'm sure if you try hard enough, you can build an algorithm to come to that conclusion for you. I'm not one, but algorithms don't care.
-
I hope the owners of the data on me care enough to refuse that order. If you're a data scientist, I hope you refuse that order, too.
-
Fascism ideologically relies on technology to give it strength. As a data scientist, YOU are the technology. Act on your conscience. Fin.