This woman determines what you see via something called "algorithmic fairness." No, you were not imagining it.
> For example, imagine that a Google Image query for "CEOs" shows predominantly men... even if it were a factually accurate representation of the world, it would be algorithmic unfairness.
> My definition of fairness and bias specifically talks about historically marginalized communities. And that's who I care about. Communities who are in power and have traditionally been in power are not who I'm solving 'fairness' for.