Right. If it’s true that women statistically outperform men (with the same application documents), it’d be logical to prefer them on gender alone, because they’d likely turn out to be better.
Thanks for the voice of reason in this sea of hate.
From my pov it would be best to have completely anonymised applications and no involvement of AI in the hiring process.
For most jobs it’s hard to run a hiring process without in-person interviews, or at the very least video calls. So I’m not really sure how one could realistically get rid of biases there. But I completely agree that whenever there are too many applications to interview everyone individually, the initial screening of applicants should be completely anonymized and rely only on technologies whose biases can at least be understood.
For the final step, I’m afraid we’ll have to try to train people to be less prone to biased decision-making, which I agree is not a very promising path.
You’re welcome. I mean, it’s kind of a factual question: is gender an indicator on its own? If yes, then the rest is just how statistics and probability work… and that’s not really a controversy. The maths itself just works 🥹
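To make the statistics point concrete, here’s a minimal sketch. All the numbers are made up purely for illustration (there’s no real data behind the group means): it just shows that *if* one group has a higher conditional mean given identical documents, always picking that group yields a higher expected outcome than ignoring the label.

```python
import random

random.seed(0)

# Purely illustrative, assumed numbers: suppose that among applicants with
# identical documents, group A's on-the-job performance averages 0.55 and
# group B's averages 0.50 (on some 0-to-1 scale).
MEAN_A, MEAN_B = 0.55, 0.50

def performance(group):
    # Noisy individual outcome around the group mean, clipped to [0, 1].
    mean = MEAN_A if group == "A" else MEAN_B
    return min(1.0, max(0.0, random.gauss(mean, 0.15)))

n = 100_000
# Strategy 1: always prefer the group with the higher conditional mean.
prefer_a = sum(performance("A") for _ in range(n)) / n
# Strategy 2: ignore the group label and pick at random.
ignore = sum(performance(random.choice("AB")) for _ in range(n)) / n

print(f"expected performance, always pick A: {prefer_a:.3f}")
print(f"expected performance, ignore group:  {ignore:.3f}")
```

The gap between the two strategies is exactly the point being argued about: the maths is uncontroversial, but whether the assumed group difference is real, and whether acting on it is acceptable, is the actual debate.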
I’d also welcome it if we cut down on unrelated stuff, stereotypes, and biases: just pick what you want to optimize for and then do that, at least if you believe in the free market in that way. Of course hiring also has an impact on society and on people, and all of that is complex. And women and men aren’t really different, but at the same time they are. Statistics is more or less a tool; it highly depends on what you do with it and how you apply it. It’s like that with most tools. (And LLMs in their current form are kind of a shit tool for this, if you ask me.)