Wow - seems intractable. Even technological solutions, like a protocol forcing the creation of a digital “tell” when something has been manipulated or the like, would require international cooperation among manufacturers, developers and governments.
And even if that were possible, it also seems slightly dystopian. Any solutions suggested by the media industry?
On a different note, there must already be some interesting legal issues arising regarding the use of digital actors in movies, and the degree to which an actor's estate can give permission/license to use the likeness, even if the actor might never have taken that role to begin with!
I don't think anyone has an idea of a good solution yet. That's part of what makes it such a serious problem. Using someone's likeness without permission opens one up to a lawsuit. That checks movie studios and other companies, but not intel agencies or individuals on the internet.
End of conversation
New conversation
I think that's the appropriate reaction.
New conversation
"They tested it with Obama, because there's a lot of video of him." That's the key. Machine learning relies on analysis of extremely vast amounts of repetitive data. It's not quite as easy as this article makes it appear.
I disagree with that. Fake videos are easily detected; let's not contribute to the alt-right.