There has been some movement on the perceptual hashing front, as FB recently published new algorithms based on more modern techniques that should be a bit more robust. The biggest problem is adversarial reporting to trigger image censorship. https://www.google.com/amp/s/about.fb.com/news/2019/08/open-source-photo-video-matching/amp/
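As background for the thread below: the core idea of a perceptual hash can be sketched with a toy difference hash ("dHash"). This is an illustrative simplification only; Facebook's released algorithms (PDQ for photos, TMK+PDQF for video) are more robust, but share the key property that visually similar inputs yield hashes differing in few bits. The grid-of-pixels input and block sizes are assumptions for the sketch.

```python
# Toy difference hash ("dHash") over a grayscale pixel grid.
# Illustrative only: real systems such as PDQ use more robust
# transforms, but the property demonstrated is the same --
# near-duplicate images produce hashes with small Hamming distance.

def dhash(pixels, size=8):
    """64-bit hash: one bit per left/right brightness comparison
    on a grid downscaled to size rows by (size + 1) columns.
    Input must be at least size tall and (size + 1) wide."""
    h, w = len(pixels), len(pixels[0])

    def block_avg(row, col, rows, cols):
        # Average brightness of one block of the downscaled grid.
        r0, r1 = row * h // rows, (row + 1) * h // rows
        c0, c1 = col * w // cols, (col + 1) * w // cols
        cells = [pixels[i][j] for i in range(r0, r1) for j in range(c0, c1)]
        return sum(cells) / len(cells)

    bits = 0
    for r in range(size):
        for c in range(size):
            left = block_avg(r, c, size, size + 1)
            right = block_avg(r, c + 1, size, size + 1)
            bits = (bits << 1) | (left > right)  # bool coerces to 0/1
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")
```

Matching means comparing Hamming distance against a threshold rather than testing equality, which is what makes near-duplicates detectable, and also what the adversarial-match concern in this thread turns on.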
Replying to @alexstamos
You already have the image at that point, so no additional sharing has happened. Using your solution, I can send you a picture that isn't a nude, and someone looks at it and sees that it's not a nude. Using this system, you wait for a match and then see it's not a nude. Right?
2 replies 0 retweets 2 likes
Replying to @taviso
Except you have now blocked that image in every private chat on the platform during the content moderation latency. There is effectively an infinite space of perceptual hashes that will probabilistically match a single photo; how do you think this holds up against 8ch*n?
1 reply 1 retweet 8 likes
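A note on the "effectively infinite space" claim above: because a perceptual-hash matcher accepts any hash within some Hamming-distance threshold of a target, the number of distinct matching hash values grows combinatorially. A back-of-the-envelope count, assuming PDQ's 256-bit hash length and an illustrative threshold (not a claim about any platform's production settings):

```python
# Number of bit strings within Hamming distance d of one fixed
# n-bit target: sum over k = 0..d of C(n, k). The 256-bit length
# matches PDQ; the threshold of 31 is an illustrative operating
# point, not Facebook's actual configuration.
from math import comb

def hashes_within(bits, max_distance):
    """Count distinct hash values at Hamming distance <= max_distance
    from a single fixed hash of the given bit length."""
    return sum(comb(bits, k) for k in range(max_distance + 1))

count = hashes_within(256, 31)
print(f"{count:.3e} hashes match one 256-bit target at distance <= 31")
```

Even at a modest threshold the count is astronomical, which is the sense in which an attacker has a huge space of inputs that collide with a given target hash.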
Replying to @alexstamos
That doesn't make any sense. They can already submit infinite photos to your human team, do you block them as soon as the images arrive at nudes@fb.com, or do you wait for a moderator to confirm them first? If it's the second, then this is a nonsense excuse.
3 replies 1 retweet 3 likes
Replying to @taviso @alexstamos
Think about it, Alex. If you have to block before moderation, then I can send any image I want, right? It seems like either you don't really care about that latency, or you're already vulnerable to that attack anyway - either way, my system means you don't have to share nudes, right?
1 reply 0 retweets 5 likes
Replying to @taviso
FB isn’t going to block until an image has been confirmed. I expect that those images will be used to train classifiers to make proactive detection more likely next time.
2 replies 0 retweets 0 likes
Replying to @alexstamos @taviso
I think it would be great if most of this could be pushed client side. I’m actually putting together a whole workshop on client-side abuse detection after RWC.
2 replies 0 retweets 4 likes
Replying to @alexstamos @taviso
I’m not interested in going around and around with somebody with no desire to tackle the problem. Google is the absolute largest purveyor of NCII on the planet. If you really care about this issue, then ask why client-side hashes can’t be used to censor Google Image Search.
2 replies 0 retweets 3 likes
Replying to @alexstamos @taviso
It will turn out to be a really thorny nest of issues, which you would understand if you spent any time in this area. Project Zero can't build trustworthy software just with bug finding, and you can't reply-guy your way into mitigating really difficult society-wide issues.
2 replies 0 retweets 4 likes
Replying to @alexstamos
Hah, you're being disingenuous. You're saying "why won't the media just trust me that people need to send me their nudes", and you can't even explain to me why it's necessary.
1 reply 0 retweets 0 likes
If you're not being disingenuous, and you genuinely believed the censorship and latency excuses, then I'm already helping you by explaining the problem, no?
Replying to @taviso @alexstamos
I haven't worked on this problem, so certainly don't appreciate the thorny issues, but I do find this thread informative.
0 replies 0 retweets 0 likes