you can use deepfakes to lie with audio or video that looks real, which is worse than what we have currently: lying with audio or video that's actually real, by manipulating its context and maliciously cutting and pasting it
what i wonder is how much worse it would be
Replying to
People are going to increasingly lose trust in video as a form of evidence. If it doesn't conform to your biases, it must be fake. If it does fit what you think, it's real. It's another step towards people living in completely different realities based on how they see the world.
Replying to
my personal conclusion is that we're well past that point without deepfakes
Replying to
For many people, definitely. I don't look forward to increasingly being unable to figure out objective reality myself, though. It's also a bit scary to think about how things like this will impact policing, trials, etc. when video increasingly can't be trusted.
my question is:
1. lying by pulling true recordings out of context
2. lying by making fake recordings that seem true
3. ???
Replying to
former: i'm not immune to propaganda already. it gets quantitatively worse, but probably not qualitatively
latter: that's what bothers me most about them, yeah
i expect we'll get video sensors with built-in attestation. i've been meaning to ask you for a while: do you think that can be done in a remotely reliable way?
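to make the attestation idea concrete, here's a minimal sketch of what a sensor could do: bind each frame to a device-held secret so a verifier can detect tampering. everything here is hypothetical illustration; a real design would use an asymmetric key in a secure element (so verifiers never hold signing power), plus timestamps and counters against replay. a symmetric HMAC is used below only to keep the sketch stdlib-only.

```python
import hmac
import hashlib
import os

# Hypothetical: stands in for a key provisioned into the sensor at manufacture.
DEVICE_KEY = os.urandom(32)

def attest_frame(frame_bytes: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Produce a tag binding this frame's pixels to the device key."""
    return hmac.new(key, frame_bytes, hashlib.sha256).digest()

def verify_frame(frame_bytes: bytes, tag: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Check the frame was attested by a holder of the key (constant-time compare)."""
    return hmac.compare_digest(attest_frame(frame_bytes, key), tag)

frame = b"\x00" * 1024          # placeholder pixel data
tag = attest_frame(frame)
print(verify_frame(frame, tag))              # unmodified frame verifies
print(verify_frame(frame + b"tamper", tag))  # any edit breaks the tag
```

the hard part isn't the crypto, it's the trust chain: keeping the key inside the sensor, and the fact that attestation only proves "this sensor captured these pixels", not that the scene in front of the lens was real.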
1
2