
6 strategies for clients looking to source new recruitment technology

The time-consuming nature of searching for qualified candidates — sometimes with mixed results — has caused many organizations to invest in new tech. Employers often expect these technologies will be more efficient and predictive than traditional hiring methods, but there are a few issues they may need to be aware of.

Technology using artificial intelligence, machine learning and predictive analytics often promises to streamline operations and eliminate thorny challenges such as personal bias and subjective metrics in recruitment decisions. However, substituting optimized AI selection tools for human decision-making can raise serious legal concerns and must be carefully assessed before implementation.

[Image: A "Now Hiring" sign displayed during a Job News USA career fair at Papa John's Cardinal Stadium in Louisville, Kentucky, May 18, 2016. Photo: Luke Sharrett/Bloomberg]

Employers can take precautions to mitigate some of these risks. If you’re thinking about implementing an AI selection tool or have already begun the journey, here are some initial steps you should take.

Scrutinize the data to avoid and address hidden biases. Hidden biases can plague AI selection tools despite their capacity to rely on neutral and objective criteria. From the outset, it is critical to identify the data sources that will be used to train the algorithm. Data sets may be flawed as a result of limited sample sizes or the disproportionate representation of a single group. An algorithm's overreliance on prohibited or inherently discriminatory distinctions may also create bias. Employers should have supervisors scrutinize data sets and monitor for bias.
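
As a first-pass illustration, the sketch below summarizes how each group is represented in a set of historical hiring records and how often its members were selected, two places where the skew that can carry over into a trained model tends to hide. It assumes the records sit in a pandas DataFrame; the column names ("gender", "hired") and the audit function are hypothetical, not any particular vendor's tooling.

```python
import pandas as pd

def audit_training_data(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.DataFrame:
    """Summarize each group's share of the training sample and its
    historical selection rate."""
    summary = df.groupby(group_col).agg(
        count=(outcome_col, "size"),
        share_of_sample=(outcome_col, lambda s: len(s) / len(df)),
        historical_selection_rate=(outcome_col, "mean"),
    )
    return summary.sort_values("share_of_sample", ascending=False)

# Made-up records for illustration only.
records = pd.DataFrame({
    "gender": ["F", "M", "M", "F", "M", "M", "F", "M"],
    "hired":  [0, 1, 1, 0, 1, 0, 1, 1],
})
print(audit_training_data(records, group_col="gender", outcome_col="hired"))
```

A simple count like this cannot catch proxy variables such as ZIP codes or school names that correlate with protected characteristics, which is why human reviewers still need to examine the feature list itself.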

Conduct a proper validation study. Federal and state laws prohibit employers from using a selection tool that has a disparate impact unless it is job related and consistent with business necessity. The Uniform Guidelines on Employee Selection Procedures offer three methods for demonstrating job relatedness (criterion-related, content and construct validity), called validation studies, which involve a statistical analysis of the tool. Conducting a validation study not only helps insulate the technology from legal challenge, but also helps determine whether the tool is accurate, effective and performs as expected.
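
A full validation study is work for industrial-organizational psychologists and counsel, but the Uniform Guidelines' four-fifths rule offers a simple first-pass screen for disparate impact: a group whose selection rate is less than 80% of the highest group's rate is generally regarded as showing evidence of adverse impact. The sketch below computes that ratio; the group labels and numbers are invented for illustration.

```python
from collections import Counter

def impact_ratios(outcomes):
    """outcomes: iterable of (group_label, was_selected) pairs.
    Returns each group's selection rate, its ratio to the highest-rate
    group, and whether it clears the four-fifths (80%) threshold."""
    totals, hires = Counter(), Counter()
    for group, was_selected in outcomes:
        totals[group] += 1
        hires[group] += int(was_selected)
    rates = {g: hires[g] / totals[g] for g in totals}
    benchmark = max(rates.values())
    return {g: {"rate": rate,
                "ratio": rate / benchmark,
                "passes_4_5": rate / benchmark >= 0.8}
            for g, rate in rates.items()}

# Invented screening results from one hiring cycle:
outcomes = ([("A", True)] * 40 + [("A", False)] * 60 +
            [("B", True)] * 25 + [("B", False)] * 75)
print(impact_ratios(outcomes))
# Group B's 25% rate is 62.5% of group A's 40% rate, below the four-fifths mark.
```

A failing ratio does not by itself establish liability, but it signals that a closer review and a formal validation study are warranted.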

Continuously monitor the tool and conduct follow-up analyses. Employers adopting AI selection tools are often tempted to set it and forget it. But AI tools are constantly evolving, so employers must put processes in place for ongoing monitoring. Diligent employers will regularly conduct adverse impact analyses to minimize the risk that the technology disproportionately screens out protected groups.
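
One way to make monitoring routine is to re-run the same adverse-impact screen over a rolling window of recent decisions and flag anything that needs human review. The sketch below assumes a hypothetical load_decisions_since function that returns (group, was_selected) pairs recorded after a given date, and a screening callable such as the impact_ratios helper from the previous sketch.

```python
import datetime
import logging

logging.basicConfig(level=logging.INFO)

def monitor_adverse_impact(load_decisions_since, screen, window_days=90, threshold=0.8):
    """Re-run an adverse-impact screen over a rolling window of recent decisions.

    load_decisions_since: hypothetical callable returning (group, was_selected)
        pairs recorded after the given date.
    screen: a callable such as impact_ratios from the previous sketch, returning
        {group: {"rate": ..., "ratio": ..., "passes_4_5": ...}}.
    """
    since = datetime.date.today() - datetime.timedelta(days=window_days)
    outcomes = list(load_decisions_since(since))
    if not outcomes:
        logging.info("No decisions in the last %d days; nothing to audit.", window_days)
        return
    for group, stats in screen(outcomes).items():
        if stats["ratio"] < threshold:
            logging.warning(
                "Group %s selection rate is %.0f%% of the highest group's rate; "
                "flag for human review and a follow-up analysis.",
                group, stats["ratio"] * 100)
```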

Don’t forget disability accommodation. AI selection tools typically reside online, and employers should craft a plan to make their offerings accessible to applicants with disabilities. Website accessibility claims are on the rise, and such claims may spread to online or computer-based selection tools if employers do not take the necessary steps to accommodate applicants with disabilities.

Consider data security and applicable data privacy laws. The use of big data in employee selection procedures requires the compilation of personal, sensitive and often private data relating to workers or applicants. Employers using AI tools should evaluate the security of their own data, as well as anything collected and compiled by a third-party vendor. Additionally, limiting internal access to private data will minimize the risk that a current employee misuses it.
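
One concrete way to limit internal access is data minimization: strip direct identifiers from applicant records before they reach analysts or a vendor, keeping only a pseudonymous key for linking rows. The field names below are hypothetical, and the identifier list should come from counsel and the privacy laws that apply, not from this sketch.

```python
import hashlib

# Fields treated as direct identifiers in this illustration only.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "ssn", "home_address"}

def minimize_record(record: dict, salt: str) -> dict:
    """Return a copy of an applicant record with direct identifiers dropped and
    the applicant ID replaced by a salted one-way hash, so downstream users can
    link rows without seeing who the applicant is."""
    pseudonym = hashlib.sha256((salt + str(record["applicant_id"])).encode()).hexdigest()
    cleaned = {k: v for k, v in record.items()
               if k not in DIRECT_IDENTIFIERS and k != "applicant_id"}
    cleaned["applicant_pseudonym"] = pseudonym
    return cleaned

# Hypothetical applicant record:
record = {"applicant_id": 1042, "name": "Jane Doe", "email": "jane@example.com",
          "years_experience": 6, "assessment_score": 88}
print(minimize_record(record, salt="rotate-this-secret"))
```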

Know applicable state laws. State and local governments are beginning to respond to the automation of recruitment and onboarding processes by enacting their own regulations. Employers must closely monitor developments in their respective jurisdictions and comply with sometimes competing demands of federal, state and local laws.

For example, the Illinois legislature recently passed the Artificial Intelligence Video Interview Act, which, if signed into law by the governor, would create disclosure requirements for companies that use video interview technology dependent on AI. The act would require an employer to notify each applicant before the interview that AI may be used, provide applicants with information explaining how the AI will be used to assess their candidacy and obtain their written consent.

This article originally appeared in Employee Benefit News.