That doesn't make any sense. People can already submit unlimited photos to your human team. Do you block the images as soon as they arrive at nudes@fb.com, or do you wait for a moderator to confirm them first? If it's the latter, then this is a nonsense excuse.
Right, the server has never seen the image, but you can prove that the device has seen it. I think that's good enough. The software has to be distributed in some protected form anyway, so you can also sign the hashes and verify that a classifier said they're nudes before automatically blocking, if you like.
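A minimal sketch of that sign-and-verify flow, assuming a symmetric attestation key shared between the trusted execution environment and the server (a real deployment would use an asymmetric device key and a perceptual hash like PhotoDNA rather than SHA-256; `classify_is_nude` is a hypothetical stub for the on-device model):

```python
import hashlib
import hmac

# Hypothetical attestation key provisioned to the TEE. In practice this
# would be an asymmetric key whose public half the server holds.
TEE_KEY = b"example-attestation-key"

def classify_is_nude(image_bytes: bytes) -> bool:
    # Stand-in for the on-device classifier; a real system would run a
    # vision model inside the trusted execution environment.
    return image_bytes.startswith(b"NUDE")

def tee_hash_and_attest(image_bytes: bytes):
    """Inside the TEE: hash the image, run the classifier, and sign the
    (hash, verdict) pair so the server can trust both without ever
    seeing the image itself."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    verdict = classify_is_nude(image_bytes)
    message = f"{digest}:{int(verdict)}".encode()
    signature = hmac.new(TEE_KEY, message, hashlib.sha256).hexdigest()
    return digest, verdict, signature

def server_accept(digest: str, verdict: bool, signature: str) -> bool:
    """Server side: check the attestation, and only add the hash to the
    blocklist if the classifier actually said it was a nude."""
    message = f"{digest}:{int(verdict)}".encode()
    expected = hmac.new(TEE_KEY, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature) and verdict
```

The point of the signature is that the server never needs the image: it learns only a hash plus an attested claim that the device's classifier flagged it, so a forged or non-nude submission can't poison the blocklist.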
-
this is a compelling idea (hash it locally after running a classifier in a trusted execution environment) and i think i missed this point in your original argument. is this practical yet? what needs to happen to make it so?
-
I believe it's practical and that I could build it; Alex says it isn't, but handwaves away anyone asking why not. So, who knows...