NYU professor says using AI to hire, fire employees ‘troublesome’


Journalist Hilke Schellmann discusses the impact of artificial intelligence (AI) on hiring, supervision, promotion and firing in her book “The Algorithm”.

In her book, Schellmann examines the accountability of AI and investigates its growing presence in the workplace.

Schellmann, an Emmy Award-winning investigative reporter and assistant professor of journalism at New York University, examines the impact of AI on workplace decisions and warns hiring managers to be more skeptical when using the technology.

She sat down with ABC News Live to discuss her new book.

ABC News Live: It should come as no surprise that companies large and small are using artificial intelligence platforms and software to help them make decisions that were once the domain of HR departments. Joining us in a moment is Emmy-winning investigative reporter and professor of journalism at NYU Hilke Schellmann.

Her new book, “The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired and Why We Need to Fight Back Now,” investigates automated tools that can screen out current and prospective employees based on questionable criteria, and explores the ways AI is impacting the employee lifecycle. There are no doubt many eyebrows raised here.

Could you explain what tools employers are using and why they warrant investigation?

Hilke Schellmann: Yes. So I think we see that, you know, most Fortune 500 companies use AI somewhere in the hiring pipeline. We see it a lot with resumes. So if you upload your application to any of the major job platforms, or apply directly on employers’ websites, there’s often a resume parser that sorts applications into a “yes” pile and a “no” pile.

And then we see a lot of one-way video interviews, and that obviously took off during the pandemic, where people get pre-recorded questions and there’s no one on the other side. And then we also see video games that job seekers are asked to play so employers can gauge their abilities and their personalities. You know, are they agile? Are they quick to learn? All the things that companies want to know.

And, you know, it’s all coming from a place where companies are getting inundated with millions and millions of applications, so they feel they need a technical solution. I’ve heard from a lot of employment lawyers who said, ‘Oh, yeah, we found gender discrimination in that tool. And we told the company not to use it, but the startup is still out there.’ So, yes, there are a lot of tools that unfortunately do more harm than good.

ABC News Live: We also want to ask you about some eye-opening tests you did on a certain tool that claims to predict personality and job suitability based on voice patterns. So walk us through this experiment and the surprising results.

Schellmann: Yes. So in this case, it was a one-way video interview where, you know, you get a bunch of questions and you’re asked to answer them and record yourself. So I always think about, what happens to people who might have an accent, or who have a speech impediment? What happens to their audio?

When their audio is transcribed into text, how will the AI tool score them? So I thought, you know, let’s put it to the test a little bit. So I talked to a tool in German, and I read an entry on Wikipedia about psychometrics. And I was surprised, you know, I got an email back that said, ‘Oh, you’re 73 percent qualified for this job.’ And I was like, I didn’t say a word in English. And when I looked at the transcript, it was just nonsense. I think that really bothers me. Like, you know, these are high-stakes decisions. It matters who gets the job and why.

ABC News Live: Right. Very clearly not a perfect science yet. Now, you spent five years on this book. You spoke to job applicants, employers and whistleblowers. So what steps will need to be taken to ensure the responsible and ethical use of AI in the field of human resources?

Schellmann: Yes. So I think, you know, there should be a lot of skeptical questions first, especially for developers. We need clarification: Why was someone rejected? Why was someone put into the next round? And often the developers of the tools are not even aware of it, because these are unsupervised AI models. So I find that very troubling. We must have clarity, transparency. So if a company is in front of a judge, it has to be able to explain why someone made it to the next round.

ABC News Live: Hilke Schellmann, this is an interesting topic that we will likely be grappling with for years to come. Thank you very much for your time today.

Schellmann: Yeah, thank you for having me.
