Not that this matters much, but 50% is higher than chance: whether somebody is gay isn't a 50/50 question, since gay people aren't 50% of the population.
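To put made-up numbers on that base-rate point: with a hypothetical 5% base rate (real survey figures vary), "chance" performance on an unbalanced sample is well above 50%.

```python
# Made-up base rate purely for illustration; real survey figures vary.
base_rate = 0.05

# "Chance" on an unbalanced sample: always guessing the majority class...
majority_acc = 1 - base_rate                    # 0.95
# ...or guessing randomly in proportion to the base rate.
random_acc = base_rate**2 + (1 - base_rate)**2  # ~0.905

print(f"majority baseline: {majority_acc:.3f}, "
      f"proportional guessing: {random_acc:.3f}")
```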
-
The data was balanced, so I think 57% is not much better than chance. That was using their "facial femininity" feature only.
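As an aside, one way to sanity-check the "not much better than chance" claim is a binomial test against the 50% baseline of a balanced set. The test-set size below is hypothetical, since the thread doesn't give it; and even a statistically significant result wouldn't make 57% practically useful.

```python
from scipy.stats import binomtest

# Hypothetical test-set size; the thread doesn't say how many cases were judged.
n_trials, n_correct = 1000, 570  # 57% accuracy

# Two-sided binomial test against the 50% chance baseline of a balanced set.
result = binomtest(n_correct, n_trials, p=0.5)
print(f"p-value vs. chance: {result.pvalue:.4g}")
print(f"95% CI for accuracy: {result.proportion_ci(confidence_level=0.95)}")
```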
-
Replying to @sir_deenicus @o_guest and
If I were to pick out a flaw, it would be doing 20-fold CV on a dataset of faces where the total number of images exceeds the number of people, so the same person's photos can land in both the train and test folds, inflating accuracy.
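For readers wanting the fix: splitting by person rather than by image avoids that leakage. A minimal sketch using scikit-learn's GroupKFold on placeholder data (the features, labels, and person IDs below are all made up):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold

# Placeholder data: 200 images of 50 people, 16 features per image.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
y = rng.integers(0, 2, size=200)
person_ids = rng.integers(0, 50, size=200)  # which person each image shows

# GroupKFold keeps every image of a given person inside a single fold, so no
# person appears on both the train and test side of any split.
gkf = GroupKFold(n_splits=20)  # matching the 20-fold CV discussed above
scores = []
for train_idx, test_idx in gkf.split(X, y, groups=person_ids):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))

print(f"mean person-level CV accuracy: {np.mean(scores):.3f}")
```

Plain KFold on the same data would let one person's images leak across folds and would typically inflate the estimate.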
-
Ah, I missed that they had equal numbers of "gay" and "not gay" people. Nice catch!
-
Replying to @o_guest @sir_deenicus and
Either way, the flaw isn't really in the dataset itself, of course.
-
Hmm, what do you mean? I don't think the result would hold if they took a larger sample, across subcultures and races.
-
I think it's inherently unethical research that doesn't reflect reality, but either way it's flawed.
-
I def agree on it being unethical. But I was also surprised you could get sexual orientation from looking at 2D faces alone. That seems unlikely.
-
Replying to @sir_deenicus @o_guest and
Like, the average human might not be able to label all of ImageNet, but we can at least tell the classes apart. Radiologists can tell malignancies apart.
-
Replying to @sir_deenicus @o_guest and
Surprised that humans can't tell sexual orientation but machines can. That's harmful both from an ethics view and from an "ML is magical" misuse mentality.
-
To be fair, it's possible there are expert queerdar people; they weren't used here.