Behavioral Design Algorithms Show Promise and Peril in Hiring

A new technology has the potential to both reduce and exacerbate illegal bias in hiring.

The New York Times has reported that two start-up hiring platforms, Applied and Pymetrics, have created algorithms using artificial intelligence and neuroscience-based games that can level the playing field for gender, ethnic, and socioeconomic representation.

Age discrimination also is illegal, but it was not mentioned in the story. This is despite considerable evidence showing that employers currently are systematically discriminating against older workers by using computer software to screen out their resumes and divert them to a digital trash can. Research shows that older women are the most severely affected by hiring discrimination.

Spokespersons for Applied and Pymetrics said behavioral design algorithms are capable of analyzing hiring factors that are more predictive of performance and less biased than traditional resume-screening tools. The algorithms are tweaked until men and women, and people of different ethnic backgrounds, get similar scores to qualify for hire. A spokesperson for Applied cited a large test in which more than half of the people hired would not have been selected without the platform. A Pymetrics spokesperson said the company has been highly successful in improving gender, ethnic, and socioeconomic representation for clients like Accenture and Unilever.
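The article does not describe Applied's or Pymetrics's actual methods, but the idea of tweaking an algorithm until different groups qualify at similar rates can be sketched in a toy form: nudge a per-group score adjustment until the fraction of candidates clearing the hiring cutoff is roughly equal across groups (a simple demographic-parity check). All score values and names below are hypothetical.

```python
def selection_rate(scores, cutoff):
    """Fraction of candidates at or above the hiring cutoff."""
    return sum(s >= cutoff for s in scores) / len(scores)

def equalize(group_a, group_b, cutoff=0.5, tol=0.02, step=0.01, max_iter=1000):
    """Shift group_b's scores until both groups pass at similar rates.

    This is an illustrative 'tweak until scores are similar' loop,
    not any vendor's actual algorithm.
    """
    offset = 0.0
    for _ in range(max_iter):
        rate_a = selection_rate(group_a, cutoff)
        rate_b = selection_rate([s + offset for s in group_b], cutoff)
        if abs(rate_a - rate_b) <= tol:
            break
        # Nudge the under-selected group's scores toward parity.
        offset += step if rate_b < rate_a else -step
    return offset

# Hypothetical score distributions for two demographic groups.
group_a = [0.30, 0.45, 0.55, 0.60, 0.70, 0.80]
group_b = [0.20, 0.35, 0.40, 0.50, 0.55, 0.65]

offset = equalize(group_a, group_b)
```

The same loop, pointed the other way, also shows the peril the companies describe: an offset chosen to suppress one group's scores would magnify bias just as mechanically.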

The behavioral design companies say the technology is equally capable, in the wrong hands, of magnifying hiring bias.

Age discrimination in hiring is arguably one of the most important issues of our time. Yet, as the Times story indicates, the problem is invisible. Most major corporations do not consider age to be a diversity issue. While they compile gender and minority hiring statistics, they do not keep records of age in hiring. The EEOC has ignored age discrimination in hiring for more than a decade and even lags behind the business community in comprehending the significance of implicit age bias in hiring for "cultural fit."

In August, the director of the EEOC Office of Federal Operations, Carlton M. Hadden, Jr., issued at least two decisions finding no discrimination in cases where highly qualified applicants aged 60 and 48 were passed over for much younger applicants (including recent graduates) with far fewer objective qualifications. Hadden completely discounted the significance of superior objective qualifications in the hiring process. Then, in September, the EEOC filed a lawsuit claiming a group of CBS television stations in Dallas, TX, engaged in age discrimination in hiring by failing to consider the superior qualifications of a 42-year-old female weathercaster and, instead, hiring a 24-year-old woman for the job. Given these confused outcomes, it's anyone's guess how the EEOC will decide an issue involving implicit age bias in hiring.
