Let's hear them, then. The only problem I've heard you mention is that people don't want to publish their PhotoDNA-like algorithm. There are solutions to that: run it inside SGX, or send tamper-proof machines to trusted victim advocates to generate the hashes.
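For illustration, the kind of perceptual hashing being discussed can be sketched like this. This is a toy "average hash" on a tiny grayscale image, not the real PhotoDNA algorithm (which is proprietary and far more robust); all names here are hypothetical. The key property is that the hash can be generated on the victim's side and compared later, so the photo itself never needs to be handed over.

```python
def average_hash(pixels):
    """Toy perceptual hash of a grayscale image (2D list of 0-255 ints):
    each bit records whether a pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits; a small distance suggests the same image."""
    return sum(x != y for x, y in zip(a, b))

original = [[10, 200], [30, 220]]
# Slightly re-encoded copy: pixel values shift a bit, hash bits survive.
recompressed = [[12, 198], [29, 221]]
unrelated = [[200, 10], [220, 30]]

h = average_hash(original)
print(hamming(h, average_hash(recompressed)))  # 0 - treated as a match
print(hamming(h, average_hash(unrelated)))     # 4 - no match
```

Note that the hash reveals essentially nothing about the photo's content, which is the whole argument for generating it client-side (or inside an SGX enclave) rather than uploading the photo.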
There is no additional delay; the existing human-review system also requires moderation before removal. The point is that the hash solution is objectively better than the "preemptively send us all your nudes" system, not that it's perfect.
Alex is complaining that the press didn't understand his solution, but I don't understand it either. He says there are good reasons, but hand-waves away anyone asking what they are... Is it not fair to expect a good explanation when you're asking people to take this seriously?
I don't necessarily like the solution, but I think his point is clear:
- if you moderate when the victim uploads, there is a short delay, but thereafter you can remove the first occurrence immediately
- if you moderate on first occurrence, the photo remains up for a while
[4 more replies]

New conversation:
No, the existing system places the delay at the start. The point at which you can say "all future occurrences of this photo will be removed immediately" comes shortly after you upload the photo; in your system, it comes shortly after first detection, which is a very different promise.
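The timing difference being argued over can be made concrete with a toy model. This is a hypothetical illustration with arbitrary time units, not any platform's actual design; `review_delay` stands in for the human-moderation lag both sides agree exists.

```python
def first_blockable_time(model, submit_time, first_share_time, review_delay):
    """Return the time at which copies can start being blocked automatically."""
    if model == "hash_at_upload":
        # Victim submits the hash up front; review happens once, and from
        # then on every occurrence (including the first) is blocked on sight.
        return submit_time + review_delay
    if model == "review_at_first_occurrence":
        # Nothing happens until a copy is actually detected; moderation
        # then runs before the first removal.
        return first_share_time + review_delay
    raise ValueError(model)

# Victim submits at t=0; abuser first shares at t=10; review takes 2 units.
print(first_blockable_time("hash_at_upload", 0, 10, 2))              # 2
print(first_blockable_time("review_at_first_occurrence", 0, 10, 2))  # 12
```

In this sketch, whenever the victim submits before the abuser shares, the hash-at-upload model blocks from the moment of first sharing, while the review-on-detection model leaves the photo visible for the length of the review.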
No, it doesn't. If the photo is already being shared when you submit it, sharing doesn't stop until moderation completes; that is the same in both systems. If it hasn't been shared yet, it can be blocked first in both systems. The difference is that with the hash, no new people see your nudes unless abuse actually occurs.
[14 more replies]