if you wait until first detection to whitelist the photo, there's a delay where you can't block the photo (because it could be an innocent photo used as a denial-of-service). this is bad for the victim (photo isn't blocked immediately).
by whitelisting the photos upfront, you can tell the victim "a moderator verified it and now we will immediately remove the photo when we detect it". in your scenario, you're telling the victim "when we first detect the photo, a moderator will decide whether or not to remove it"
Replying to @saleemrash1d @alexstamos
In order to generate the hash you need the original photo. Facebook can immediately tell if they have a matching image already, and can moderate it then and there if so. If not, that's good evidence you're the original creator, so block and moderate on first seen.
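The lookup described above can be sketched as follows. This is a minimal illustration, not Facebook's actual system: the function names and in-memory set are hypothetical, and a real deployment would use a perceptual hash (e.g. PhotoDNA) rather than SHA-256 so that re-encoded copies still match.

```python
import hashlib

# Hypothetical database of hashes from previously reported images.
reported_hashes = set()

def submit_report(image_bytes: bytes) -> str:
    """Victim submits the original photo; returns the action taken."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in reported_hashes:
        # A matching image is already known: moderate it then and there.
        return "match-found: send existing copies to moderation"
    # Never seen before: good evidence the reporter holds the original,
    # so block future uploads and moderate on first sight.
    reported_hashes.add(digest)
    return "queued: block and moderate on first upload"
```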
Replying to @taviso @alexstamos
by moderating there and then, you have delayed the removal process. what if the victim wants to ensure the photo is immediately removed on the first occurrence?
Replying to @saleemrash1d @alexstamos
There is no additional delay, the existing human-review system also requires moderation before removal. The point is the hash solution is objectively better than the "preemptively send us all your nudes" system, not that it's perfect.
Replying to @taviso @alexstamos
no, the existing system places the delay at the start. the point at which you can say "all future occurrences of this photo will be removed immediately" is shortly after you upload the photo - in your system, it is shortly after first detection which is a very different promise
Replying to @saleemrash1d @alexstamos
No it doesn't. If the photo is already being shared when you submit it, it doesn't stop until moderation. That is the same in both systems. If it hasn't been shared yet, it can be blocked first in both systems. The difference is no new people see your nudes until abuse with hash.
Replying to @taviso @alexstamos
yes, i am obviously assuming that the image hasn't been shared yet. how does your system allow you to block on first seen? "never seen before" would seem to be a fickle criterion for blocking, but i might be missing something in your system.
Replying to @saleemrash1d @alexstamos
Right, never seen before, but you can prove that you've seen it. I think that's good enough. The software has to be distributed in some protected form, so you can also sign the hashes and verify a classifier said they're nudes before automatic blocking if you like.
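The signed-hash idea above can be sketched roughly as follows, assuming a key provisioned into the protected client software. All names here are illustrative, and HMAC is a stand-in: a real system would use asymmetric signatures plus remote attestation of the TEE.

```python
import hashlib
import hmac

# Hypothetical key embedded in the protected client software / TEE.
CLIENT_KEY = b"provisioned-device-key"

def sign_report(image_bytes: bytes, classifier_says_nude: bool) -> dict:
    """Client side: hash the photo and sign the hash together with
    the local classifier's verdict."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    message = f"{digest}:{classifier_says_nude}".encode()
    tag = hmac.new(CLIENT_KEY, message, hashlib.sha256).hexdigest()
    return {"hash": digest, "nude": classifier_says_nude, "sig": tag}

def server_accepts(report: dict) -> bool:
    """Server side: auto-block only if the signature verifies and the
    signed verdict says the classifier judged the photo a nude."""
    message = f"{report['hash']}:{report['nude']}".encode()
    expected = hmac.new(CLIENT_KEY, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, report["sig"]) and report["nude"]
```

The point of signing the verdict alongside the hash is that the server never needs the photo itself: it only trusts hashes that the protected software vouches for.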
Replying to @taviso @alexstamos
this is a compelling idea (hash it locally after running a classifier in a trusted execution environment) and i think i missed this point in your original argument. is this practical yet? what needs to happen to make it so?
I believe it's practical and that I could build it, Alex says it isn't but handwaves away anyone asking why not. So, who knows...
Replying to @taviso @alexstamos
i think there are rough edges:
- if your classifier mistakenly decides that a photo shouldn't be automatically blocked, what happens?
- i guess you have to rate limit the classifier (depends on the limitations of your TEE); solved by making PoW an input to the classifier?
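The rate-limiting suggestion could look something like a hashcash-style proof of work attached to each report. This is a hypothetical sketch (the difficulty constant and function names are invented): the client must burn CPU per submission, making bulk abuse of the classifier expensive.

```python
import hashlib

DIFFICULTY = 12  # hypothetical: required number of leading zero bits

def check_pow(report_hash: str, nonce: int) -> bool:
    """Verify that SHA-256(report_hash || nonce) has DIFFICULTY
    leading zero bits. Cheap for the server to check."""
    h = hashlib.sha256(f"{report_hash}:{nonce}".encode()).digest()
    return int.from_bytes(h, "big") < (1 << (256 - DIFFICULTY))

def mine_pow(report_hash: str) -> int:
    """Brute-force a valid nonce (what the reporting client does;
    expected ~2**DIFFICULTY hash attempts)."""
    nonce = 0
    while not check_pow(report_hash, nonce):
        nonce += 1
    return nonce
```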