Suppose a machine learning project intended to predict startup success explicitly factors in the gender of the startup's founders as a feature. Is there any non-problematic reason why they would do this? Wouldn't it just reinforce gender biases?
Without knowing the specifics, it seems rather worrying unless the feature is explicitly being used to counter existing biases. I know is pretty interested in these kinds of questions, but I'm unsure if she has time to respond directly.
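One way the "counter existing biases" use could look in practice is to hold the sensitive attribute out of training entirely and use it only to audit the model's predictions for group disparities. The sketch below is purely illustrative (synthetic data, hypothetical feature names) and assumes scikit-learn; it is not a claim about what the project in question actually does:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for startup data: two non-sensitive features
# plus a sensitive attribute (0/1) that is deliberately kept out
# of the training inputs.
n = 1000
X = rng.normal(size=(n, 2))          # e.g. funding raised, team size (illustrative)
gender = rng.integers(0, 2, size=n)  # sensitive attribute, held out of X
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

model = LogisticRegression().fit(X, y)  # gender never enters the model
pred = model.predict(X)

# Audit step: compare positive-prediction rates across groups
# (a simple demographic-parity check). A large gap would flag
# bias leaking in indirectly through correlated features.
rates = [pred[gender == g].mean() for g in (0, 1)]
gap = abs(rates[0] - rates[1])
print(f"positive-rate gap between groups: {gap:.3f}")
```

The distinction matters: using the attribute as a *predictor* can bake historical bias into decisions, while using it only for *evaluation* is a standard fairness-auditing practice.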

