For this episode of What the Hell Just Happened?!, Paul Edwards discusses how artificial intelligence (AI) is working its way into the world of business ownership and management with CEDR Senior Solution Center Advisor Halisi Tambuzi. Can AI be unintentionally biased? If so, how would that affect your practice? How can you prepare for a world of AI integration? Listen as Paul and Halisi analyze the risk that AI can create if used without caution, as well as ways you may still be able to leverage this emerging technology to streamline HR processes at your practice.
Paul: Hello. My name’s Paul Edwards and welcome to the WTHJH podcast. You’re about to listen to an episode of “What the Hell Just Happened in HR?”.
I’m an HR nerd who loves to talk about HR with just about anyone who will listen. So, during each podcast, we’re going to delve into the solutions for dealing with a real-life HR issue. Plus, occasionally, we’re going to share some big-company HR strategy ideas.
Keep in mind that for every HR problem you solve, there are state, federal, and local laws that govern what we can and cannot do. And now, let’s get started.
Halisi: Hey, Paul.
Paul: Hey, Halisi!
Halisi: I wanted to talk to you about artificial intelligence, the use of artificial intelligence.
Paul: In HR?
Halisi: In HR.
Paul: Oooh, okay.
Halisi: And I wanted to get your thoughts on that.
Halisi: On the one hand, using AI to help improve systems, hiring, just processes within the workplace can be helpful.
Halisi: But then, if your artificial intelligence systems aren’t picking up on various biases… protected classes, where someone might be applying from, distance-wise… they could exclude…
Paul: Oh! Got it. Yeah.
Halisi: Could exclude certain groups of people.
Paul: So, hang on a second. For everybody’s benefit: the AI, artificial intelligence, which I think was a really bad name for what this is. The AI could pick up that, of the last 16 terminations, when it looked at all of the data on the people who got terminated or quit or just left, they lived more than 15 miles away. And it could turn that into a singular piece of data that says people who live more than 15 miles away should be excluded. We shouldn’t even interview them, right?
Paul: Okay, it could bring something like that in. Okay. Right. Keep going.
Halisi: And then the issues there start to kind of span out. So that’s the discriminatory side: protected class, national origin, what have you.
Halisi: But what should employers be kind of thinking of in order to kind of maybe check their systems?
Paul: Well, we already have a body of knowledge around this, which is employee testing. If you wanted to run a sort of intelligence test for employees, you couldn’t just create it yourself. It had to be certified by a group of professionals, some of whom, I think, had to be psychiatrists. And the test itself, the questions themselves, had to be examined and cleared by a third party to say that they were not creating bias within your workplace. So, we already have the body of knowledge for saying that when you bring a piece of data in, a piece of knowledge, a piece of understanding, you need to make sure it’s not applied in a way that would, like you said, adversely impact a specific group of people.
Halisi: A specific group of people, yeah. Because employers might do hiring via video.
Paul: Right. We used to say don’t do that.
Paul: Don’t ask for pictures, okay? That was always the question. So, here’s a piece of AI, a piece of intelligence you were asking for: “What does this person look like?” And we have to be realistic. There are all kinds of biases: “What’s attractive to me? What is the color of this person’s skin?” All things we could gain from looking at the photograph. So, we would say that in the beginning phase, you cannot be accused of using those factors to weed people out, because you don’t know them. When a resume comes in, you don’t know the nationality of the person, you don’t know their race, you don’t know how far away they live. Well, actually, you have their address, so I guess you could check, but anyway. You can’t use what you don’t have. AI, on the other hand, is using everything it can gather to try to present you with the best choices.
Halisi: So, on the one hand, we’re saying, “Hey, I want to be more efficient,” so I might use some type of AI system. But, on the other hand, there’s a particular caution. You want all the information?
Paul: You do.
Halisi: If you want all the information, there’s risk that comes with handling all that information.
Paul: And if you want to use all of the information.
Halisi: If you want to use all the information.
Paul: Yeah. Because once you have it, you have it, and people have gotten in trouble for giving the tests we talked about. You could say the test is designed not to weed people out but, in the end, you’re collecting that information, and you can look back and say, okay, now show me everybody who did not get the job. Oh, look!
Paul: They’re all a certain race! If you were of this race, or you were clearly of this religion, or whatever those things look like, you could not get a job here. And look at all of the same-looking people who work here. Which is biased.
Halisi: And one of the things with AI is that it will weed them out for you, so they don’t even come to your table.
Paul: And that’s not good. Yeah, that’s not good. Gosh, we could go all over the place on bias, but this session is not about bias; we’ll have a couple of episodes on that. This one is about AI. It can handicap you, too. Look, I think the future of AI in HR is a big deal. It’s already a big deal. People are using it, and misusing it, in some ways. But, you know, an intelligent chat that answers your employees’ simple HR questions? I love that idea. Like, “I’m getting ready to go on maternity leave. What do I need to do?” You don’t even have to interrupt a manager.
The bot says, “This is basically what we do.” The bots are based on what your policies are, and then the bot shows the policy to the employee. I just think that’s a wonderful use of AI. You know, predictive AI, using predictive measures internally. Maybe even good AI that is polling your employees and trying to figure out who’s happy and who’s sad, and why. We talk about exit interviews. I think, if you could get an ex-employee to participate in an AI exit interview process, you could probably glean more from the answers they’d give to, say, 20 questions that took them four minutes to answer than you might otherwise, including things they won’t tell you at all about why they left.
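The kind of policy bot Paul describes could be sketched very roughly like this. Everything here, the policy topics, the handbook text, the matching logic, is a hypothetical illustration; a real HR chatbot would sit on top of your actual handbook and a proper language model or search index rather than simple keyword matching.

```python
# Minimal sketch of a policy-answering HR bot: match an employee's
# question to a handbook topic and return that policy's text.
# All topics and policy text below are hypothetical examples.

POLICIES = {
    "maternity leave": "Notify your manager 30 days in advance and file the leave form with HR.",
    "pto": "PTO accrues each pay period; request time off through the scheduling portal.",
}

def answer(question):
    """Return the first policy whose topic appears in the question."""
    q = question.lower()
    for topic, text in POLICIES.items():
        if topic in q:
            return text
    return "I couldn't find a policy on that; please ask HR directly."

print(answer("I'm getting ready to go on maternity leave. What do I need to do?"))
```

The point of the design is the one Paul makes: the bot only surfaces what is already written in your policies, so the employee gets the policy itself without interrupting a manager.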
Halisi: That’s good to keep in mind: at the exit interview stage it’s more reasonable, on that tail end.
Halisi: Or you’re using it during the period when the employee is employed, to check in and see what concerns they may have. On the front end, it sounds like it could be a little riskier.
Paul: On the front end, I think it can be, especially when you’re using it in interviewing, and using it to filter who you’re going to interview. I don’t believe AI has it figured out at all. That’s based off my experience.
Halisi: So, making sure you’re, one, paying attention to the latest…
Paul: Disparate impact! I’ve been searching for that since we started. I’ve been trying to get that terminology to come up. So, when you have a test or an AI that knocks people out and you look back and you say, “Oh look, this was the impact it had on everyone,” even though you didn’t think it was, that’s identifying a disparate impact that it’s having on…
Halisi: Right, yes. No intent, but…
Paul: …a subset
Halisi: …the overall, kind of, inclusion of it. The impact.
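The look-back check Paul is describing has a standard formalization in hiring analytics: the EEOC’s “four-fifths rule,” under which a group whose selection rate falls below 80% of the highest group’s rate is a red flag for potential adverse impact. Below is a minimal sketch of that arithmetic. The group names and numbers are hypothetical, chosen to mirror Paul’s 15-mile example, and a real audit would involve legal review, not just this ratio.

```python
# Minimal sketch of a disparate-impact ("four-fifths rule") check on
# hiring outcomes. All group names and counts are hypothetical.

def selection_rate(selected, applicants):
    """Fraction of applicants from a group who were selected."""
    return selected / applicants if applicants else 0.0

def four_fifths_check(group_stats):
    """group_stats maps group name -> (selected, applicants).
    Flags each group whose selection rate is below 80% of the
    highest group's rate, the EEOC's rule-of-thumb threshold
    for potential adverse impact."""
    rates = {g: selection_rate(s, a) for g, (s, a) in group_stats.items()}
    top = max(rates.values())
    return {g: r / top < 0.8 for g, r in rates.items()}

# Hypothetical outcomes from an AI resume screen:
stats = {
    "within_15_miles": (40, 100),   # 40% selected
    "beyond_15_miles": (10, 100),   # 10% selected
}
flags = four_fifths_check(stats)
# The beyond_15_miles rate (0.10) is only 25% of the top rate (0.40),
# well under the 0.8 threshold, so that group is flagged.
```

This is exactly the “no intent, but look at the impact” point: nobody typed a discriminatory rule into the screen, yet the outcome data shows one group being knocked out at a disproportionate rate.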
Paul: Yeah. Look, I think AI is the way to go. I know larger businesses are using it. I think that, you know, we’re big advocates for smaller businesses, small-to-medium businesses. I believe that AI is going to play an important role. In the next, you know, 2 to 5 years, I think it’s going to show up in a way that really helps us out.
Halisi: Where would you fall, and I think I actually might know, on AI helping improve processes within the workplace in a way that could ultimately reduce your employee count? Where you might say, “You know what? We no longer need this position anymore, because technology is able to replace it.”
Paul: Yeah, I love the idea of AI being used for efficiency… I mean, it’s well known that per-employee productivity is far higher than it was 20, or especially 30, years ago. It probably takes 10 fewer employees to do a single job than it used to. Maybe I’m exaggerating, but maybe not by much. I think about what we do here with 35 employees across the country and everything we accomplish. And don’t get me wrong, folks, we have our own challenges here. But what we’re able to do with technology, and how we can organize, is amazing. And it wasn’t that long ago we were mailing y’all your employee handbooks.
Paul: I mean, we were printing them out, binding them, putting notes in there, all sorts of things. And while I miss that to a certain degree, we don’t do it that way anymore. AI is going to transform some of these processes. Yeah, for sure.
Halisi: You just made me have flashbacks with the…
Paul: You were here when we did that, right?
Halisi: With the handbooks.
Paul: The leather-bound handbooks.
Halisi: Leather-bound handbooks.
Paul: They were awesome.
Halisi: So, that’s great.
Paul: Alright, anything else? Anything else on AI?
Halisi: Nope. That was it.
Paul: Okay. Alright, everybody. That was What The Hell Just Happened in HR?! Not yet, but AI is coming.
Closing: Thanks for joining us for this week’s episode of What The Hell Just Happened?! If you have an HR issue or a question, you’d like us to discuss on this podcast, send it to podcast@WTHjusthappened.com. For more HR advice and insights from Paul and his team of experts, you can also join our private Facebook group, HR Base Camp, or visit HRbasecamp.com. Make sure you tune in next week. And remember, when you improve your workplace, you improve your life.
This podcast is sponsored by CEDR HR Solutions. Time is money. Is your timekeeping system costing you more than it’s saving you? If the time-keeping system you use for your office isn’t customized for your business, it isn’t working hard enough. CEDR’s PTO and time tracking saves you time and money by doing things that other timekeeping systems just can’t do. Track PTO as it accrues for your employees, run easy, detailed payroll reports, build your office schedule, limit overtime, prevent payroll mistakes, and more, all with the same user-friendly interface. Visit cedrsolutions.com/software to learn more.