WhatsApp, in its testimony: “we also use artificial intelligence (AI) and machine learning to proactively detect child nudity and previously unknown child exploitative content when they’re uploaded.”
Replying to @charlesarthur @pwnallthethings
I don't understand the point you're making here. Can you clarify?
Replying to @Pinboard @pwnallthethings
I was echoing @pwnallthethings’ point: that FB and WA do all the same things at any point they can. They’d like to do it on-device, I’d bet, because it’s a huge challenge, and the delta between FB and WA reports probably signals that they’re missing a lot.
Replying to @charlesarthur @pwnallthethings
My point is that it is very significant whether you do this server-side or on the device. A lot of people right now seem to be learning that major sites all scan for CSEM (since otherwise they'd be flooded with the worst stuff imaginable), and it's kind of derailing the conversation.
Replying to @Pinboard @pwnallthethings
Except that Apple’s point is that it scans *only if* the photo is about to be uploaded to iCloud Photo Library. So it’s like a gate to the server.
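For illustration, a minimal Swift sketch of the "gate" being described here: the on-device check runs only for photos that are about to be uploaded to iCloud Photos. Every name in it (isQueuedForICloudPhotos, matchesKnownCSAMHashes, attachSafetyVoucher) is invented for this sketch; it is not Apple's API or implementation, only an illustration of scanning gated on upload.

    import Foundation

    // Hypothetical sketch only; names are invented for illustration, not Apple API.
    struct Photo {
        let id: UUID
        let isQueuedForICloudPhotos: Bool
    }

    // Stand-in for the on-device hash match against the known-image database.
    func matchesKnownCSAMHashes(_ photo: Photo) -> Bool {
        return false // placeholder
    }

    func attachSafetyVoucher(to photo: Photo) {
        print("safety voucher attached for \(photo.id)")
    }

    func upload(_ photo: Photo) {
        print("uploading \(photo.id) to iCloud Photos")
    }

    func prepareForUpload(_ photo: Photo) {
        // The "gate": nothing is checked unless the photo is about to leave the device.
        guard photo.isQueuedForICloudPhotos else { return }

        if matchesKnownCSAMHashes(photo) {
            // In the design as described, a match produces a cryptographic voucher that
            // travels with the upload; the server acts only after a threshold of matches.
            attachSafetyVoucher(to: photo)
        }
        upload(photo)
    }

    prepareForUpload(Photo(id: UUID(), isQueuedForICloudPhotos: true))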
Replying to @charlesarthur @pwnallthethings
Yes, but it scans on the device. So there is now code on the phone that checks and reports content. I don't know enough about Apple's design and implementation to say anything meaningful beyond that, but the server/device divide is a very bright line, and Apple chose to cross it.
Once you put code on the device that flags illegal content, three questions follow: 1. What's the designed behavior and its impact? 2. What will Apple do in jurisdictions that demand different behavior? 3. What new avenues does it make available to people who hack the phone?
Replying to @Pinboard @pwnallthethings
Sure. And I think that the security people at Apple would have thought about those things. It’s been a long time in development.
Replying to @charlesarthur @pwnallthethings
These are the same security people who got us here. I have dear friends in Apple security, but even they can't do the impossible (build unhackable software). Your faith in their ability and thoughtfulness surprises me. https://www.washingtonpost.com/technology/2021/07/19/apple-iphone-nso/
Replying to @Pinboard @pwnallthethings
Sure, but hacking the CSAM system would lead to the iCloud account being investigated (as the explanation says), which would show whether it’s legit or not. Being sent CSAM is already the threat. Case in the UK: https://www.theguardian.com/uk-news/2019/nov/19/police-chief-convicted-for-having-child-sex-abuse-video-on-phone-robyn-williams
I think you misunderstand my point. The issue is adding lots of new moving parts to on-device crypto and image parsing, which necessarily means adding lots of new exploitable bugs to a part of Apple's OS that is already notorious for them.