Conversation

you can use deepfakes to lie with audio or video that looks real, which is worse than what we have currently: lying with audio or video that's actually real by manipulating its context and maliciously cutting & pasting it. what i wonder is how much worse it would be
Replying to
People are going to increasingly lose trust in video as a form of evidence. If it doesn't conform to your biases, it must be fake. If it does fit what you think, it's real. It's another step towards people living in completely different realities based on how they see the world.
Replying to
For many people, definitely. I don't look forward to increasingly being unable to figure out objective reality myself, though. It's also a bit scary to think about how things like this are going to impact policing, trials, etc. when video increasingly can't be trusted.
Replying to
former: i'm not immune to propaganda already. it gets quantitatively worse, but probably not qualitatively.
latter: that's what bothers me most about them, yeah
Replying to
It's definitely possible to use attestation for this, even with existing technology like Android key attestation. However, as I mentioned in another thread we had about this, attestation based on chaining to a known intermediate or root is a weak form compared to strong pairing.
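A minimal sketch of what that chained form looks like in practice, assuming an Android app and treating the key alias and challenge bytes as placeholders: the app generates a hardware-backed key with an attestation challenge and reads back the certificate chain, which a verifier would then check against Google's published attestation roots.

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import java.security.KeyPairGenerator
import java.security.KeyStore
import java.security.cert.Certificate

// Sketch: create a hardware-backed signing key whose attestation
// certificate chain a verifier can check against the known Google root.
// "capture-key" and the challenge are placeholder values.
fun generateAttestedKey(challenge: ByteArray): Array<Certificate> {
    val spec = KeyGenParameterSpec.Builder(
        "capture-key",
        KeyProperties.PURPOSE_SIGN or KeyProperties.PURPOSE_VERIFY
    )
        .setDigests(KeyProperties.DIGEST_SHA256)
        // The challenge binds the attestation record to this request,
        // so an old certificate chain can't simply be replayed.
        .setAttestationChallenge(challenge)
        .build()

    val generator = KeyPairGenerator.getInstance(
        KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore"
    )
    generator.initialize(spec)
    generator.generateKeyPair()

    val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
    // The chain terminates at a Google attestation root; trusting only
    // that chain is the weak, chained form discussed here.
    return keyStore.getCertificateChain("capture-key")
}
```

Media captured by the app would then be signed with this key and shipped alongside the attestation chain for verification.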
Replying to and
I also don't feel that it should be treated as something that's nearly impossible to overcome by someone with physical access. It would be expensive, and there's probably value in substantially raising the bar for this, but it could still be bypassed given enough money.
Replying to and
I'm not so sure that powerful and rich organizations still being able to do it while taking it away from the masses is a positive thing. That can already be done today with a camera app using Android key attestation and relying on the weak chaining to the known Google root.
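To illustrate what relying on that weak chaining amounts to on the verifier's side (a simplified sketch, with GOOGLE_ROOTS as a placeholder for the pinned Google attestation root certificates): the check only establishes that some device with a Google-rooted keystore produced the chain, not that a specific trusted camera did.

```kotlin
import java.security.cert.X509Certificate

// Placeholder: the pinned Google attestation root certificates would be
// loaded from their published values; not shown here.
val GOOGLE_ROOTS: Set<X509Certificate> = emptySet()

// Weak, chain-based verification (simplified: no expiry, revocation, or
// attestation-extension checks). Anything that can obtain a valid chain
// from a Google-rooted keystore passes.
fun chainLooksGoogleRooted(chain: List<X509Certificate>): Boolean {
    for (i in 0 until chain.size - 1) {
        try {
            // Each certificate must be signed by the key in the next one up.
            chain[i].verify(chain[i + 1].publicKey)
        } catch (e: Exception) {
            return false
        }
    }
    return chain.last() in GOOGLE_ROOTS
}
```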
Replying to
i'm not sure if i would trust any of the existing SoCs here. attestation built into sensor silicon is a different question, but that raises more questions, like what do we do with compression?
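A toy illustration of the compression question, under the assumption that the sensor signs a hash of the raw frame: any lossy re-encode afterwards changes the bytes, so the attested hash no longer matches what actually gets stored or shared, and attestation would somehow have to extend to the processing pipeline as well.

```kotlin
import java.security.MessageDigest

fun sha256(bytes: ByteArray): ByteArray =
    MessageDigest.getInstance("SHA-256").digest(bytes)

fun main() {
    val rawFrame = ByteArray(1024) { it.toByte() }   // stand-in for raw sensor output
    val encoded = rawFrame.copyOfRange(0, 512)       // stand-in for a lossy encode
    val attestedHash = sha256(rawFrame)              // what sensor silicon would sign
    // false: a signature over the raw frame says nothing about the encoded file
    println(attestedHash.contentEquals(sha256(encoded)))
}
```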
Replying to
The existing functionality is definitely only able to provide a very weak implementation, but it's enough to rule out the masses being able to create fakes. I don't think attestation can stop this if someone is willing to invest the money in bypassing the physical security.
Replying to and
A stronger way of doing it would be a strong pairing based on a unique key provisioned for the device in advance, such as for a camera given to a journalist. You'd also want devices to start taking this more seriously by permanently pairing the camera sensor with the HSM / TEE, etc.
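A minimal sketch of the strong-pairing side, assuming the device's public key was exchanged in advance (e.g. when the camera was handed to the journalist): the verifier checks each capture's signature directly against that single pinned key instead of walking a chain to a vendor root. The class name and signature algorithm here are just illustrative choices.

```kotlin
import java.security.PublicKey
import java.security.Signature

// Strong pairing: the verifier already holds this specific device's
// public key, exchanged out of band before the camera went into the field.
class PairedDeviceVerifier(private val pinnedDeviceKey: PublicKey) {

    // True only if the capture was signed by the paired device's key;
    // no vendor intermediate or root certificate is trusted at all.
    fun verifyCapture(mediaBytes: ByteArray, signatureBytes: ByteArray): Boolean {
        val sig = Signature.getInstance("SHA256withECDSA")
        sig.initVerify(pinnedDeviceKey)
        sig.update(mediaBytes)
        return try {
            sig.verify(signatureBytes)
        } catch (e: Exception) {
            false
        }
    }
}
```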
Replying to
yeah, so what i think will happen is we'll get attestation whether we like it or not, because of inertia and the desire to keep being able to rely on video evidence. not just journalism but surveillance cameras will need it