my personal conclusion is that we're well past that point without deepfakes
For many people, definitely. I don't look forward to increasingly being unable to figure out objective reality myself, though. It's also a bit scary to think about how things like this are going to impact policing, trials, etc. when video increasingly can't be trusted.
former: i'm not immune to propaganda already. it gets quantitatively worse, but probably not qualitatively
latter: that's what bothers me most about them, yeah
i expect we'll get video sensors with built-in attestation. i've been meaning to ask you for a while whether you think that can be done in a remotely reliable way?
It's definitely possible to use attestation for this, even with existing technology like Android key attestation. However, as I mentioned in another thread we had about this, attestation based on chaining to a known intermediate or root is a weak form compared to strong pairing.
I also don't feel that it should be treated as something that's nearly impossible for someone with physical access to overcome. It would be expensive, and there's probably value in substantially raising the bar for this, but it could still be bypassed given enough money.
I'm not so sure that powerful and rich organizations still being able to do it while taking it away from the masses is a positive thing. That can already be done today with a camera app using Android key attestation and relying on the weak chaining to the known Google root.
The API supports chaining trust through the OS to the app. If you can exploit the OS, you can bypass OS-enforced checks, but the signed attestation data includes the patch level, which is a mitigating factor. It can definitely already be used for this today despite the weaknesses.
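A minimal sketch of what a camera app leaning on this might do, assuming Android key attestation: generate a key in the hardware keystore with a verifier-supplied challenge and hand back the resulting certificate chain, which ends at a known Google root and whose leaf's attestation extension carries fields like OS version and patch level. The alias and challenge handling here are illustrative, not anything from the thread.

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import java.security.KeyPairGenerator
import java.security.KeyStore
import java.security.cert.X509Certificate

// The challenge should come from whoever will verify the attestation,
// so the signed statement is fresh and tied to their request.
fun generateAttestedKey(challenge: ByteArray): List<X509Certificate> {
    val alias = "capture_signing_key" // illustrative name

    val spec = KeyGenParameterSpec.Builder(alias, KeyProperties.PURPOSE_SIGN)
        .setDigests(KeyProperties.DIGEST_SHA256)
        // Ask the keystore / TEE to embed the challenge in the attestation certificate.
        .setAttestationChallenge(challenge)
        .build()

    KeyPairGenerator.getInstance(KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore").run {
        initialize(spec)
        generateKeyPair()
    }

    // The returned chain terminates at a known Google attestation root; the leaf's
    // attestation extension includes properties such as OS version and patch level,
    // which is why patch level can serve as a (weak) mitigating signal for a verifier.
    val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
    return keyStore.getCertificateChain(alias).map { it as X509Certificate }
}
```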
A stronger way of doing it would be strong pairing based on a unique key provisioned for the device in advance, such as a camera given to a journalist. You'd also want devices to start taking this more seriously by permanently pairing the camera sensor with the HSM / TEE, etc.
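A sketch of the strong-pairing idea, under the assumption that a per-device key is generated inside the camera's HSM / TEE at provisioning time and its public half is handed directly to the verifying party (say, a newsroom). There is no certificate chain or vendor root involved; trust comes entirely from the prior pairing. Names and the placeholder hash are made up for illustration.

```kotlin
import java.security.KeyPairGenerator
import java.security.PrivateKey
import java.security.PublicKey
import java.security.Signature

fun signCapture(deviceKey: PrivateKey, frameHash: ByteArray): ByteArray =
    Signature.getInstance("SHA256withECDSA").run {
        initSign(deviceKey)        // on real hardware this stays inside the TEE / HSM
        update(frameHash)
        sign()
    }

fun verifyCapture(pairedPublicKey: PublicKey, frameHash: ByteArray, sig: ByteArray): Boolean =
    Signature.getInstance("SHA256withECDSA").run {
        // No chain validation, no vendor root: the verifier already holds this exact key.
        initVerify(pairedPublicKey)
        update(frameHash)
        verify(sig)
    }

fun main() {
    // Stand-in for provisioning; on a real device the private key never leaves the hardware.
    val pair = KeyPairGenerator.getInstance("EC").apply { initialize(256) }.generateKeyPair()
    val hash = ByteArray(32) { it.toByte() } // placeholder for a hash of the captured frame
    val sig = signCapture(pair.private, hash)
    println(verifyCapture(pair.public, hash, sig)) // true
}
```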
yeah, so what i think will happen is we'll get attestation whether we like it or not, because of inertia and desire to be able to keep relying on video evidence. not just journalism but surveillance cameras will need it
Provisioning a unique key per device and maintaining a database of them effectively puts a few corporations / governments into the position of being arbiters of the truth. I think it's the kind of thing that could happen and it's a bit troubling to think about the implications.
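A small sketch of why the registry itself becomes the gatekeeper: a verifier built on a central per-device key database can only ever call a capture "trusted" if the registry operator chose to list that device. The registry shape and device IDs here are hypothetical.

```kotlin
import java.security.PublicKey
import java.security.Signature

class DeviceKeyRegistry(private val keysByDeviceId: Map<String, PublicKey>) {
    fun publicKeyFor(deviceId: String): PublicKey? = keysByDeviceId[deviceId]
}

fun isTrustedCapture(
    registry: DeviceKeyRegistry,
    deviceId: String,
    frameHash: ByteArray,
    sig: ByteArray
): Boolean {
    // Unlisted devices are simply untrusted: whoever maintains the database decides
    // which cameras can produce "authentic" video, which is the arbiter-of-truth concern.
    val key = registry.publicKeyFor(deviceId) ?: return false
    return Signature.getInstance("SHA256withECDSA").run {
        initVerify(key)
        update(frameHash)
        verify(sig)
    }
}
```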
i think what's worse is that this scheme can't possibly work against an adversarial silicon vendor because they always have the ability to a) produce another chip with the same secret and b) steganographically leak the key using e.g. a weakened PRNG
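One concrete, deliberately blunt example of the PRNG point (not the thread author's specific scheme): if a weakened nonce generator ever repeats an ECDSA nonce, the device's private key falls out of any two published signatures. A kleptographic vendor would use a subtler, steganographic channel that only they can read, but the arithmetic below shows how little "randomness failure" it takes. Function names are mine.

```kotlin
import java.math.BigInteger

// ECDSA over a group of order n: s = k^-1 * (z + r*d) mod n, where k is the per-signature
// nonce, z the message hash, and d the private key. If the PRNG reuses k for two messages,
// both signatures share the same r and the key leaks.

// Recover the repeated nonce from two signatures (r, s1) on hash z1 and (r, s2) on hash z2.
fun recoverNonce(z1: BigInteger, z2: BigInteger, s1: BigInteger, s2: BigInteger, n: BigInteger): BigInteger =
    z1.subtract(z2).multiply(s1.subtract(s2).modInverse(n)).mod(n)

// With the nonce known, a single signature yields the private key: d = (s*k - z) / r mod n.
fun recoverPrivateKey(z: BigInteger, s: BigInteger, r: BigInteger, k: BigInteger, n: BigInteger): BigInteger =
    s.multiply(k).subtract(z).multiply(r.modInverse(n)).mod(n)
```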
the only way i can see this working is if the HSM is a white box macro that can be independently audited, synthesized using an equivalent of reproducible builds, and designed in a way that doesn't permit a netlist-level backdoor
this is a very tall order

