Have you or your employer made any headway on this issue in the last couple of years? Since you so helpfully shared your concerns on Twitter, I would have hoped you could have tried one of those options in the meantime.
Replying to @alexstamos
This is also the answer we got at the time: "Why don't you do it this way, so that people don't have to send your team their nudes?" "Why don't you do it for us?" ... I think that's not a very satisfying answer.
Replying to @taviso
There are a bunch of complications that arise when you actually try to implement this kind of mass image blocking at scale. It is easy to throw stones from the sidelines; if you spent time actually working on the problem you would realize the compromises weren’t arbitrary.
Replying to @alexstamos
Let's hear them then; the only problem I've heard you talk about is that people don't want to publish their PhotoDNA-like algorithm. There are solutions to that: use SGX, or send tamper-proof machines to trusted victim advocates to generate the hashes.
Replying to @taviso
There has been some movement on the perceptual hashing front, as FB recently published new algorithms based on more modern techniques that should be a bit more robust. The biggest problem is adversarial reporting to trigger image censorship. https://www.google.com/amp/s/about.fb.com/news/2019/08/open-source-photo-video-matching/amp/
Replying to @alexstamos
You already have the image at that point, so no additional sharing has happened. Using your solution, I can send you a picture that isn't a nude, and someone looks at it and sees that it's not a nude. Using this system, you wait for a match and then see it's not a nude. Right?
Replying to @taviso
Except you have now blocked that image in every private chat on the platform during the content moderation latency. There is effectively an infinite space of perceptual hashes that will probabilistically match a single photo; how do you think this holds up against 8ch*n?
Replying to @alexstamos
That doesn't make any sense. They can already submit infinite photos to your human team, do you block them as soon as the images arrive at nudes@fb.com, or do you wait for a moderator to confirm them first? If it's the second, then this is a nonsense excuse.
Replying to @taviso @alexstamos
Think about it, Alex. If you have to block before moderation, then I can send any image I want, right? It seems like either you don't really care about that latency, or you're already vulnerable to that attack anyway. Either way, my system means people don't have to share their nudes, right?
Replying to @taviso
FB isn’t going to block until an image has been confirmed. I expect that those images will be used to train classifiers to make proactive detection more likely next time.
That's possible if you block by hash as well. You wait for a hash match, send the matching image to the moderation queue, and train a classifier on the result to prioritize future moderation. If you can't explain to me why this isn't solvable, isn't it fair for the media to be snarky about the idea?
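The pipeline Tavis is describing can be sketched in a few lines. All names here are hypothetical, and this is one reading of the proposal, not anything FB has published: a hash match only enqueues the image for human review; nothing enters the blocklist until a moderator confirms it, and every confirmed or rejected outcome becomes a training label for prioritizing the queue.

```python
# Sketch of the match -> review -> block/train pipeline described in the
# thread. Hypothetical class and method names; not a real FB system.
from collections import deque

class ModerationPipeline:
    def __init__(self, reported_hashes, matcher):
        self.reported = set(reported_hashes)  # victim-submitted hashes
        self.matcher = matcher                # e.g. Hamming-threshold match
        self.queue = deque()                  # pending human review
        self.blocked = set()                  # moderator-confirmed only
        self.training = []                    # (image, label) pairs

    def on_upload(self, image, image_hash):
        """A hash match holds the image for review instead of blocking it.
        Returns True if the image is delivered, False if suppressed."""
        if any(self.matcher(image_hash, h) for h in self.reported):
            self.queue.append((image, image_hash))
        return image_hash not in self.blocked

    def review(self, is_abuse):
        """Human moderator rules on the oldest queued match; the verdict
        both updates the blocklist and labels data for a classifier."""
        image, image_hash = self.queue.popleft()
        if is_abuse:
            self.blocked.add(image_hash)
        self.training.append((image, is_abuse))
```

Under this reading, an adversarial false report costs the attacker a wasted moderator review rather than a platform-wide block, which is the crux of the disagreement about latency above.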
Replying to @taviso @alexstamos
I mean, you are asking people to send you their nudes.... You had better have a good story, and technical experts ready to confirm this is the best possible solution.