AI's Lack of Humanity and the Threat of Bias: A Legit Concern


Hi Recruiters,

Human-facing AI tools are front-and-center these days, like the shiny new gadget everyone's clamoring for. And while AI holds real potential for saving time and optimizing costs, we shouldn't lose our way in the hype. In the cold light of day, there's both risk and reward, and our discernment will make all the difference.

Just a few weeks ago, I made a prediction: Within the next 6-18 months, AI will likely be able to handle a lot of what recruiters do today. But, that doesn't necessarily mean that it will do the job well.

Nonetheless, my prediction is tracking. LinkedIn entered the agentic AI race this past week with its Hiring Assistant, which promises to offload much of the pre-hire workflow. I haven't seen the tool in action first-hand, but as TA tech continues down this road, my question will always be: “But does it do the job well?”

The true value of any AI implementation is going to be whether or not it's actually helpful, and if Redditors' candid takes on previous AI iterations in LinkedIn Recruiter are any indication, there's still a long way to go.

And, many remember that just a few years ago LinkedIn's job-matching AI was found to be biased against women—referring more men to jobs for which women were equally qualified.

The point is that new tech will always have flaws. Sometimes, those flaws will outweigh the potential value that a tool offers, and other times, the value is so great that the market can withstand a buffer period for the kinks to be worked out. When those kinks can produce particularly harmful results, it's not always best to be first to market.

Nuance in TA

While AI excels at processing structured data, it struggles with evaluating soft skills like communication and teamwork, which are critical for many roles. Nuance is essential to understand a candidate's experience, especially if their career path is unconventional.

AI alone will always operate with limited contextual understanding. Reducing how often it makes errors, like overlooking qualified candidates, will require the human element.

Remember the story, The Princess and the Pea? A lonely prince insists on only marrying a real princess. To test the royal claim of a young woman, the Queen places a tiny pea beneath a stack of twenty mattresses where she will sleep—because only a real princess is delicate enough to be disturbed by the pea under so many layers. The next morning, the young woman complains of a terrible night, proving her royalty.

Was she a real princess or just someone with sensitive skin? An algorithm trained to judge how qualified she is based on this test would not ask that question.

Employers adopting new hiring tech must not neglect human intervention. If they do, they'll only create an even more flawed, homogenized process for attracting talent, which will damage their employer brand, their talent prospects, and their business.

Arm-in-Arm with AI

At the end of the day, AI can be an incredible partner—but it's only as effective as the people guiding it. Just like any tool, it works best when we bring our unique perspectives, ethics, and insight to the table. Think of AI as a helpful apprentice in the hiring process, rather than the one calling the shots. It might streamline some aspects of the job, but we're the ones who bring depth and discernment.

First, human oversight isn't just a box to check. It's the backbone of any effective AI implementation. AI can help with tasks like sorting through applications or screening for skills, but it's up to us to review and adjust those decisions, spot biases, and ask questions only a human would think to ask. The technology might generate a shortlist, but our experience and judgment are what keep things on track.

And, if there's one thing that keeps AI grounded, it's transparency. When people don't understand how a system works, trust flies out the window. For AI to truly earn a place in the hiring toolkit, recruiters—and candidates—need insight into how it makes its decisions. When we're open about why certain resumes bubble to the top or how a recommendation was made, it not only builds trust but also gives us the chance to catch any inconsistencies.

Finally, data is the essential ingredient that feeds these tools. “Garbage in, garbage out” may be an old saying, but it's truer than ever with AI. If AI systems are trained on biased data, the results will carry that bias forward. By being diligent about the quality of our training data—checking for balanced representation and fairness—we can make sure that AI's recommendations reflect the diversity and quality that we're aiming for in our hiring.
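To make "checking for fairness" concrete, here's a minimal Python sketch of the four-fifths rule, a common baseline test for adverse impact in screening outcomes (a group's selection rate shouldn't fall below 80% of the highest group's rate). The group labels and pass/fail data below are entirely hypothetical, and a real audit would go well beyond this single check.

```python
from collections import Counter

def selection_rates(records):
    """Compute per-group selection rates from (group, selected) pairs."""
    totals, selected = Counter(), Counter()
    for group, was_selected in records:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def four_fifths_check(rates):
    """Flag groups whose selection rate falls below 80% of the top rate."""
    top = max(rates.values())
    return {g: (r / top) >= 0.8 for g, r in rates.items()}

# Hypothetical AI-screening outcomes: (group, passed_screen)
records = ([("A", True)] * 40 + [("A", False)] * 60
           + [("B", True)] * 20 + [("B", False)] * 80)

rates = selection_rates(records)   # A: 0.40, B: 0.20
print(four_fifths_check(rates))    # B fails: 0.20 / 0.40 = 0.5 < 0.8
```

A check like this won't catch every form of bias, but running it routinely on screening output is one practical way the human element can verify what the tool is actually doing.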

AI has tremendous potential, but it still needs us to lead the way. With a steady hand and clear eyes, we can guide this new technology toward making hiring better, faster, and—most importantly—more human.

What else is happening in hiring?
Dan's Corner

AI can speed things up, but when it comes to truly understanding people—their adaptability, resilience, and teamwork—it still needs us to step in. AI handles the basics, but we bring the depth that makes hiring more meaningful.

If you're interested in this balance, join us for An Unexpected Journey: How Durable Skills Shape Careers on November 21st. Jason Putnam, CRO at Plum, and the hireEZ team will share how these human skills have shaped their paths. Their stories might just inspire a new way to blend tech with the human touch.

Sign up now—let's make hiring smarter and more human!

Recommended articles

- hireEZ Partners With Leading Job Boards, Introduces 'Sourcing Hub' Feature
- hireEZ and IQTalent partner to include on-demand recruiting staff with sourcing and CRM platform
- hireEZ Launches CRM Platform to Empower TA Leaders and Recruiters
- hireEZ Announces GPT3-Powered Platform Feature for Candidate Outreach