How AI is adding more hurdles to DEI efforts

Diversity, equity and inclusion efforts are under fire in the workplace, and without a clear strategy, widely used AI tools could exacerbate discrimination.

In 2025, 78% of organizations surveyed worldwide reported using AI in at least one business function, up from 55% the year before, according to a recent report from McKinsey. While the integration of new technologies has had a positive impact on the workplace, the current political climate around DEI makes bias a bigger threat than ever before.

"DEI and AI are not mutually exclusive concepts," says Sara Gutierrez, chief science officer at talent management solution SHL. "We have to integrate the two concepts and the key to that is intentionality." 

Read more: Why benefit managers can't ignore tech and AI any longer

The recruiting process is where most organizations are concentrating their AI strategies. The Harvard Business Review estimates that as many as 86% of employers are limiting or eliminating human involvement in the initial stages of the interview process, replacing interviewers with artificial intelligence. Without proper oversight, however, AI can do more harm than good, according to Gutierrez.

This is especially apparent for DEI-specific roles: Since January 2025, more than 270 DEI jobs have been eliminated across several industries, according to an analysis by Revelio Labs. The absence of these roles will have trickle-down effects: Diversity executives typically implement strategies such as DEI training, awareness campaigns, blind resume reviews and standardized interview questions, and they review company data and promote open communication throughout the organization to ensure accountability. With those roles at risk, so are those strategies.

"If we're implementing AI tools without a DEI lens, you can undermine any hard-won progress you've had," Gutierrez says. "But if you're thinking about those goals and allowing them to help guide your selection and development of what tools you'll be using, it can potentially accelerate inclusion." 

How to make AI strategies diverse

Gutierrez urges organizations shopping for AI tools to look past marketing language and ask vendors pointed questions about how a tool was built, what data it relies on and how it is tested. In particular, leaders should ask where vendors source their training data, because tools trained exclusively on historical hiring outcomes will often replicate past biases. Benefit managers should instead focus on tools trained on objective performance data and assessments.
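
To give a sense of the kind of testing leaders can ask vendors about, the sketch below runs a simple adverse-impact check based on the four-fifths rule, which compares each group's selection rate to that of the highest-selected group. It is illustrative only: the function names, group labels and counts are hypothetical, and a real audit would be far more thorough than this single metric.

```python
# Illustrative adverse-impact check (four-fifths rule) for a screening tool.
# Group labels and counts are hypothetical.

def selection_rates(outcomes):
    """outcomes: dict of group -> (selected, applicants); returns selection rate per group."""
    return {group: selected / applicants
            for group, (selected, applicants) in outcomes.items()}

def adverse_impact_ratios(outcomes):
    """Ratio of each group's selection rate to the highest group's rate."""
    rates = selection_rates(outcomes)
    top_rate = max(rates.values())
    return {group: rate / top_rate for group, rate in rates.items()}

# Hypothetical screening results from an AI resume screener.
results = {
    "group_a": (48, 120),   # 40% selected
    "group_b": (30, 110),   # ~27% selected
}

for group, ratio in adverse_impact_ratios(results).items():
    flag = "review for adverse impact" if ratio < 0.8 else "within four-fifths threshold"
    print(f"{group}: impact ratio {ratio:.2f} -> {flag}")
```

A check like this only flags disparities in outcomes; it says nothing about why they occur, which is why Gutierrez also stresses questions about training data and ongoing validation.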

Read more: Demonized by Trump, DEI professionals go 'discreet' to find jobs

"Transparency is critically important when an organization is implementing any type of AI," Gutierrez says. "The best thing organizations can do is set up some form of official governance framework that enables them to show employees how the AI models work and explain why they're confident it has the least amount of bias." 

Overall, AI is improving the employee experience by automating tasks like email management, data entry, presentations and summarizing complex topics. These same time-saving tools can be applied to hiring and diversity efforts, as long as they're used responsibly.

"There is no current solution that exists that doesn't have some sort of bias included into it," Gutierrez says. "If we're building strategies by making conscious choices along the way, we have the opportunity to actually drive more fairness into these tools."
