New: an app that uses neural networks to remove clothing from images of women, making them look realistically nude. The $50 app, called DeepNude, "dispenses with the idea that deepfakes were about anything besides claiming ownership over women's bodies": https://www.vice.com/en_us/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman
Update: the developer shut down the app after Motherboard's coverage and the resulting backlash: https://www.vice.com/en_us/article/qv7agw/deepnude-app-that-undresses-photos-of-women-takes-it-offline
He didn't say it wouldn't cause harm, just that it gives plausible deniability
He literally said it's good. As opposed to, you know, bad.
Still nothing there about it causing no harm, nor about it being bad. Next.
End of conversation
New conversation
In the past, when this sort of threat happened (the "do X or I will release the image/video/etc." threat), the person under attack had almost no recourse to deny it. Finally a woman (and hopefully soon men too) can at LEAST say "listen, that's not me" to her community, family, etc.
New conversation
Wonder what his position would be if every morning his mentions were full of deepfaked porn of his mom. Also I hope that never happens.
Contact: +44 20 8133 5190 · Wickr: josephcox · XMPP: jfcox@jabber.ccc.de · joseph.cox@vice.com