You already have the image at that point, so no additional sharing has happened. With your solution, I can send you a picture that isn't a nude, and someone looks at it and sees it's not a nude. With this system, you wait for a match and then see it's not a nude. Right?
No, it doesn't. If the photo is already being shared when you submit it, it isn't stopped until moderation acts; that's the same in both systems. If it hasn't been shared yet, both systems can block it first. The difference is that with hashing, no new people see your nudes unless the system is abused.
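The hash-based blocking described here can be sketched minimally. This is an illustration only: the names are hypothetical, and an exact-match SHA-256 stands in for the perceptual hash a real system would need to survive re-encoding.

```python
import hashlib

# Hypothetical store of hashes submitted by victims (names illustrative).
reported_hashes: set[str] = set()

def report(image_bytes: bytes) -> None:
    # Victim submits only a hash, never the image itself.
    reported_hashes.add(hashlib.sha256(image_bytes).hexdigest())

def on_upload(image_bytes: bytes) -> str:
    # Block re-shares of a reported image before any new person sees it;
    # unreported images pass through unchanged.
    if hashlib.sha256(image_bytes).hexdigest() in reported_hashes:
        return "blocked"
    return "delivered"
```

The point of the sketch is the privacy property under discussion: moderation only ever sees a hash until a match occurs, so a non-match never exposes the picture to anyone new.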
-
yes, i am obviously assuming the image hasn't been shared yet. how does your system allow you to block on first sight? "never seen before" seems like a fickle criterion for blocking, but i might be missing something in your system.
-
Right, never seen before, but you can prove that you've seen it; I think that's good enough. Since the software has to be distributed in some protected form, you could also have it sign the hashes and attest that a classifier labeled them nudes before any automatic blocking, if you like.
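The sign-and-attest step mentioned above can be sketched roughly as follows. Everything here is an assumption for illustration: the key name, the HMAC scheme standing in for whatever signing the protected client would really use, and the boolean classifier verdict.

```python
import hashlib
import hmac

# Hypothetical key baked into the protected client distribution.
CLIENT_KEY = b"example-key-inside-protected-client"

def sign_report(image_hash: str, classifier_says_nude: bool) -> str:
    # Client attests: "I saw this hash, and my classifier gave this verdict."
    msg = f"{image_hash}:{int(classifier_says_nude)}".encode()
    return hmac.new(CLIENT_KEY, msg, hashlib.sha256).hexdigest()

def should_auto_block(image_hash: str,
                      classifier_says_nude: bool,
                      signature: str) -> bool:
    # Server auto-blocks only when the signature verifies AND the signed
    # verdict was "nude" -- a bare never-seen-before hash is not enough.
    expected = sign_report(image_hash, classifier_says_nude)
    return hmac.compare_digest(expected, signature) and classifier_says_nude
```

This captures the gate being proposed: automatic blocking requires both a verifiable report from the protected client and a classifier verdict, which narrows the "never seen before" criterion the previous reply objected to.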