For example, while 34% of US judges are women, only 3% of the images generated for the keyword “judge” were perceived as women. For fast-food workers, the model generated people with darker skin 70% of the time, even though 70% of fast-food workers in the US are White. 🧵 4/13
Stability AI, the maker of Stable Diffusion, is working on an initiative to develop open-source models trained on datasets specific to different countries and cultures in order to mitigate the problem. But given the pace of AI adoption, will these improved models come out soon enough? 🧵 9/13
The popularity of generative AI tools like Stable Diffusion also means that AI-generated images potentially depicting stereotypes about race and gender are posted online every day. And those images are becoming increasingly difficult to distinguish from real photographs. 🧵 11/13