Interesting. But that would still mean you need to determine the "correct" ratio and tune the corrective measures so they don't accidentally overcorrect.

Additionally, there's still the problem that we're reasoning about spherical cows: hiring policies don't exist in a vacuum. I'll talk about squares and triangles to keep it abstract.

Hiring only "the best of the best" is quite a popular strategy. Suppose the "shared average" model is correct: squares are overrepresented near the average of the applicant pool and underrepresented at the top and bottom. The "correct" ratio might be 60/40 overall, but the actual ratio in the top (and bottom) percentile might end up closer to 90/10.
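
To make the tail effect concrete, here's a minimal simulation sketch in Python under the "shared average" model. All the parameters (pool sizes, the 0.85/1.20 spreads) are invented for illustration; the exact tail split depends entirely on the assumed variance gap:

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented parameters: same mean ability, squares slightly less
    # spread out than triangles, 60/40 applicant pool.
    squares = rng.normal(0.0, 0.85, size=60_000)
    triangles = rng.normal(0.0, 1.20, size=40_000)

    scores = np.concatenate([squares, triangles])
    is_square = np.arange(scores.size) < squares.size

    top_cut = np.quantile(scores, 0.99)  # top percentile of the whole pool
    bot_cut = np.quantile(scores, 0.01)  # bottom percentile
    print(f"squares in top 1%:    {is_square[scores >= top_cut].mean():.0%}")
    print(f"squares in bottom 1%: {is_square[scores <= bot_cut].mean():.0%}")
    # Both come out far below the 60% pool share with these spreads.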

Now, depending on the size of the pool, those 10% of squares in the top percentile might be enough for one company to maintain its ratio while only hiring "the best of the best", or even for several companies. But at some point companies will have to either sacrifice their ratio and hire more triangles, or sacrifice their standards and hire squares who are weaker than some of the triangle candidates.
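
The depletion argument as back-of-the-envelope arithmetic, again with invented numbers (a 100,000-applicant pool, companies hiring only from the top percentile):

    pool = 100_000                    # hypothetical applicant pool
    top = pool // 100                 # top percentile = 1,000 candidates
    squares_in_top = int(top * 0.10)  # 90/10 tail split -> 100 squares

    target_share = 0.60               # the assumed "correct" 60/40 ratio
    max_hires = squares_in_top / target_share
    print(f"hires sustainable at 60/40: {max_hires:.0f}")  # ~167
    # Beyond that, companies must relax either the ratio or the bar.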

Note that so far I haven't even been talking about discrimination or perceived biases. This is what happens if perfectly rational actors with perfect knowledge of the market simply enact the policy "only hire the best of the best" under the constraint "try to maintain a ratio of 60/40".

You could argue that sometimes going for the weaker candidate is worth it to combat the chilling effects of perceived biases. But it should be obvious why it's naive to expect any company to do so voluntarily when it means competitors that don't follow the rule will get more of the stronger candidates.

And so far we've only been talking about a single property that is split fairly evenly across the greater population (even if the hiring pool is unbalanced). What if 60% of the population is yellow, 30% blue, 9% green and 1% red? Diversity programmes often aim for equal representation of minorities, not just proportional representation, so you want 25% red. But you also want this across both sets of polygons, so at a 50/50 ratio you need to hire 12.5% red squares, 12.5% red triangles, and so on for every color. Next imagine 20% of squares and triangles also have rounded corners, and 1% changed their number of corners at some point in their lives.
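
The combinatorics of those targets, assuming (purely for illustration) that color and shape are independent in the population:

    from itertools import product

    colors = {"yellow": 0.60, "blue": 0.30, "green": 0.09, "red": 0.01}
    shapes = {"square": 0.50, "triangle": 0.50}

    target = 1 / (len(colors) * len(shapes))  # equal representation: 12.5% per cell

    for color, shape in product(colors, shapes):
        base = colors[color] * shapes[shape]  # population share of this cell
        print(f"{color} {shape}: population {base:.2%}, target {target:.1%}, "
              f"needs {target / base:.1f}x overrepresentation")
    # Red squares are 0.5% of the population but 12.5% of the target: 25x.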

This is clearly a field that needs dispassionate empirical study. Yet gender studies are seething with ideological bias, taking their conclusions as self-evident a priori. And critics are lumped in with those who are ideologically opposed.

I have no idea what the actual distributions look like. I don't know what the correct ratio would be. I know sexism exists. I also know plenty of women are put off by far more benign aspects of the field. I also know plenty of men are put off as well.

For all its flaws, the Google Memo got one thing right: appealing to emotion (what its author mistakenly called "empathy") is not the way to further our understanding of the situation. Personal anecdotes are heartwarming or gut-wrenching, but anecdotes are not data. When scientific results don't match up with anecdotes, that shouldn't mean the science is wrong; it just means "this warrants further study". Maybe the science is wrong, in which case we can find out how that happened and do more science while preventing the same mistakes. But maybe the anecdotes, as important as they may feel, are outliers. Or maybe both are true: there are problems we need to address, but they distract from the actual cause.

It's not like climate change. With climate change, if we're wrong, addressing it just means wasting a lot of resources while still making the world better. With identity politics (assuming companies actually "lower the bar" for minorities to "fix" their ratios), if we're wrong, we've treated a lot of people unfairly just to end up with numbers that look fairer.

I think we should continue encouraging women and minorities to get into tech. I also think we should combat sexism and bigotry in the industry. But I also think we should not prejudge the conclusion when trying to understand the root cause of these disparities.


