AI is reaching into nearly every aspect of business, and the HR department is no exception. Because AI's impact on hiring practices is evolving quickly, it's important to stay up to speed on how AI can and does lead to inadvertent discrimination against job seekers.
In part one of this two-part series, we examined algorithmic discrimination. Now, let's look at biometric discrimination.
What is biometric discrimination and how does it impact the talent acquisition process?
Artificial Intelligence (AI) is increasingly part of the human resources toolkit, and in some instances its use can result in unintended discrimination. One of those instances is when AI is used in video or audio conversations, typically interviews, to help recruiters determine whether a candidate is serious about wanting the job, truthful in their responses, and excited about the company and its mission.
A device that scans a person's face is part of a biometric identity system. In 2023, the Federal Trade Commission (FTC) described biometric information as data that depicts or describes physical, biological, or behavioral traits, characteristics, or measurements of or relating to an identified or identifiable person's body. Most of us are familiar with biometric systems from our smartphones, where they unlock the device, fill in passwords, and guard access to other secure information. Biometric scanning is also widely used in airport security, and it can be used in workplace surveillance.
How does biometric discrimination affect women in the workforce?
Biometric systems can be used during a video interview or conversation, or during any in-person interview or conversation that is monitored by cameras. How the AI is programmed is an important factor, perhaps the most important factor, in keeping biometric systems nondiscriminatory. Much of the AI behind biometric systems was trained on datasets built largely from the biomarkers of middle-aged white men.
As a result, biometric discrimination happens when the AI system cannot read different people's faces equally well. This has proven to be a problem for people of color: the darker a person's skin, the more trouble the AI has interpreting their biomarkers. AI also struggles to accurately read the biomarkers of people who have a disability or difference that affects their facial muscles and appearance. Studies have also shown that biometric systems can be biased against women and older people.
How can printing companies avoid biometric discrimination in their HR practices and hiring processes?
States are starting to respond to biometric discrimination by enacting laws and regulations that require employers to disclose their use of biometric systems in hiring and other human resources applications. A growing number of states also have laws governing the use of consumers' biometric data.
As an employer, carefully compare the conclusions a biometric system reaches with the interviewer's own feedback. Also monitor whether the system shows a pattern of favorable or unfavorable conclusions about particular groups of people, as in the sketch below. And above all, check local and state laws to make sure the company is compliant with all laws and regulations about AI and the collection of biometric information.
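One concrete way to monitor for such patterns, assuming you can export per-candidate outcomes from the screening tool, is an adverse impact check based on the EEOC's four-fifths rule: a group's rate of favorable outcomes should not fall below 80% of the highest group's rate. The Python sketch below is a minimal, hypothetical illustration; the group labels and outcome data are invented for the example, and a real audit should be designed with HR and legal counsel.

```python
from collections import defaultdict

def adverse_impact(records, threshold=0.8):
    """Compare each group's favorable-outcome rate against the
    four-fifths rule: flag any group whose rate falls below
    `threshold` times the highest group's rate.

    records: iterable of (group, favorable) pairs, where `favorable`
    is True if the biometric system rated the candidate positively.
    """
    totals = defaultdict(int)
    favorable = defaultdict(int)
    for group, ok in records:
        totals[group] += 1
        if ok:
            favorable[group] += 1
    rates = {g: favorable[g] / totals[g] for g in totals}
    best = max(rates.values())
    return {g: (rate, rate < threshold * best) for g, rate in rates.items()}

# Hypothetical outcomes exported from a biometric screening tool.
sample = ([("A", True)] * 40 + [("A", False)] * 10 +
          [("B", True)] * 25 + [("B", False)] * 25)

for group, (rate, flagged) in adverse_impact(sample).items():
    note = " (below the four-fifths threshold)" if flagged else ""
    print(f"group {group}: favorable rate {rate:.0%}{note}")
```

In this invented example, group A is rated favorably 80% of the time and group B only 50% of the time, so group B falls below four-fifths of group A's rate and would warrant a closer look at the system.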
How can job seekers in the printing industry protect themselves from biometric discrimination?
Avoid being subjected to discrimination by opting out of interviews or conversations where biometric systems are being used. If you do not opt out, ask the company to share the data collected during your interview or conversation and the conclusions the system drew from it.
If you'd like to dive deeper into HR policy topics like AI, pay transparency, and DEI, or to view my webinars, articles, and more, please visit PRINTING United Alliance's Center for Human Resources Support or follow me on LinkedIn.
This is part 2 of a 2-part series on how AI is impacting employment policy. Part 1, in a previous Women in Print Alliance newsletter, focused on algorithmic discrimination.