if you wait until first detection to whitelist the photo, there's a delay during which you can't block the photo (because it could be an innocent photo being used for denial of service). this is bad for the victim (the photo isn't blocked immediately).
by whitelisting the photos upfront, you can tell the victim "a moderator verified it, and now we will immediately remove the photo when we detect it". in your scenario, you're telling the victim "when we first detect the photo, a moderator will decide whether or not to remove it"
1 reply 0 retweets 1 like
Replying to @saleemrash1d @alexstamos
In order to generate the hash you need the original photo. Facebook can immediately tell if they have a matching image already, and can moderate it then and there if so. If not, that's good evidence you're the original creator, so block and moderate when it's first seen.
1 reply 0 retweets 1 like
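The flow being described can be sketched roughly as follows. This is a toy illustration, not Facebook's implementation: a simple 64-bit "average hash" over an 8x8 grayscale thumbnail stands in for the real perceptual hash (reportedly PhotoDNA-style), and all function names are hypothetical.

```python
# Toy sketch: victim submits a hash upfront; every later upload is hashed
# and checked against the blocklist, so a match can be blocked immediately.

def average_hash(pixels):
    """pixels: 64 grayscale values (an 8x8 thumbnail). Returns a 64-bit int:
    one bit per pixel, set if that pixel is brighter than the average."""
    avg = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (1 if p > avg else 0)
    return h

blocklist = set()  # hashes submitted upfront by victims, after moderator review

def submit_hash(h):
    blocklist.add(h)

def on_upload(pixels):
    """Block immediately if the upload matches a pre-verified hash."""
    return "blocked" if average_hash(pixels) in blocklist else "allowed"

victim_photo = [10 * i % 256 for i in range(64)]   # stand-in pixel data
submit_hash(average_hash(victim_photo))
print(on_upload(victim_photo))                     # the same image is caught
print(on_upload([255 - p for p in victim_photo]))  # an unrelated image passes
```

Because only the hash is submitted, moderation can happen once, upfront; later matches need no human in the loop.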
Replying to @taviso @alexstamos
by moderating there and then, you have delayed the removal process. what if the victim wants to ensure the photo is immediately removed on the first occurrence?
1 reply 0 retweets 1 like
Replying to @saleemrash1d @alexstamos
There is no additional delay, the existing human-review system also requires moderation before removal. The point is the hash solution is objectively better than the "preemptively send us all your nudes" system, not that it's perfect.
2 replies 0 retweets 0 likes
Alex is complaining the press didn't understand his solution, but I don't understand it either. He says there are good reasons, but hand-waves away anyone asking what they are... is it not fair to expect a good explanation when you're asking people to take that seriously?
1 reply 0 retweets 0 likes
Replying to @taviso @alexstamos
i don't necessarily like the solution, but i think his point is clear:
- if you moderate when the victim uploads, there is a short delay but thereafter you can remove the first occurrence immediately
- if you moderate on first occurrence, the photo remains up for a while
1 reply 0 retweets 0 likes
Replying to @saleemrash1d @alexstamos
You're missing the key point: you have the hash and facebook has never seen the image before. That's good evidence you own that image, so temporarily block sharing while it's being moderated. Facebook employees do not have to view your nudes until they already have them.
1 reply 0 retweets 0 likes
Replying to @taviso @alexstamos
how is this not vulnerable to DoS? spray the system with images that Facebook hasn't seen before (either images you find online, or ones specially designed to cause false positives in the image hashing algorithm) and wreak havoc on the ability to send photos privately?
1 reply 0 retweets 0 likes
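The false-positive attack being asked about hinges on how perceptual hashes are compared: typically by Hamming distance under some threshold, not exact equality, so an image whose hash lands near a blocklisted hash can be flagged. A minimal sketch, with the threshold and hash values made up for illustration:

```python
# Sketch: perceptual-hash matching by Hamming distance. An attacker who can
# craft an image hashing within the threshold of a blocklisted hash causes a
# false positive; innocent uploads of that crafted image then get blocked.

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

THRESHOLD = 10  # hypothetical: hashes within 10 bits count as "the same image"

def matches(upload_hash, blocklist):
    return any(hamming(upload_hash, h) <= THRESHOLD for h in blocklist)

target = 0x0123456789ABCDEF           # a blocklisted 64-bit hash
crafted = target ^ 0b111              # an attacker's image only 3 bits away
unrelated = target ^ ((1 << 64) - 1)  # a maximally distant hash

print(matches(crafted, {target}))     # True  -> flagged as the victim's photo
print(matches(unrelated, {target}))   # False
```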
Replying to @saleemrash1d @alexstamos
Firstly, please understand that the human moderation queue is also vulnerable to DoS, and it requires people to see your nudes. You can send 1M pictures of puppies, and a team of 25 humans has to look at each one, delaying abuse victims from being helped.
2 replies 0 retweets 1 like
Secondly, you rate limit accounts to, say, 1k hashes - I don't know how many selfies most people have; whatever a fair ceiling is. Then, three strikes of submitting bad hashes and you're out. You can also require signing and PoW from the software.
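The mitigations in that tweet can be sketched as below. The cap and strike count are the tweet's illustrative numbers, not a real policy, and the class and method names are mine:

```python
# Sketch: per-account submission cap plus three-strikes ban, as described.

HASH_CAP = 1000   # illustrative ceiling on hash submissions per account
MAX_STRIKES = 3   # three bad submissions and you're out

class Account:
    def __init__(self):
        self.submitted = 0
        self.strikes = 0
        self.banned = False

    def submit(self, hash_ok):
        """hash_ok: whether moderation judged the submission legitimate."""
        if self.banned or self.submitted >= HASH_CAP:
            return "rejected"
        self.submitted += 1
        if not hash_ok:
            self.strikes += 1
            if self.strikes >= MAX_STRIKES:
                self.banned = True
        return "accepted"

acct = Account()
for _ in range(3):
    acct.submit(hash_ok=False)        # three bad submissions ...
print(acct.submit(hash_ok=True))      # ... and the account is out: "rejected"
```

Signing and proof-of-work would additionally raise the cost of creating throwaway accounts to evade the per-account limits.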