MANAGEMENT & PERSONNEL SYSTEMS


The MPS Insight

AI in Hiring: The Good, the Bad, and the “Oh No, Not That Again”

11/23/2025

Artificial intelligence has officially joined the hiring party—and like any unpredictable party guest, it can either make things run smoothly or accidentally set something on fire. At MPS, we’ve spent decades helping organizations choose the right people in the right way, so we’ve seen the good, the bad, and the “please stop doing that immediately” side of AI in hiring.
Let’s start with the good. AI is great at the tasks nobody’s career aspirations are built around. Has anyone ever said, “My dream is to manually schedule 27 phone screens and rewrite job postings all day”? No. AI can automate these admin tasks beautifully. It can also help draft consistent, job-related interview questions—but only as long as a qualified human double-checks the output. Think of AI as a very fast, very literal intern. Smart, yes. Ready to run the show? Absolutely not.
Now for the bad—and this is where it gets serious.

The Black Box Problem
People who build AI systems admit they don’t always fully understand why an algorithm made a particular decision. That’s not a comforting sentence when those same algorithms are sometimes tasked with deciding who moves forward in a hiring process.
This “black box” issue has already shown up in the real world:
  • The EEOC has publicly warned that employers are still legally responsible for discrimination even if an algorithm caused it.
  • In an ongoing lawsuit, job applicants allege that automated screening tools disproportionately rejected applicants who were Black, older, or disabled.
  • Earlier cases found automated resume screeners learned to downgrade all female candidates or to prefer applicants with resumes that looked like the company’s historically male-dominated workforce.
  • According to the ACLU, a deaf candidate interviewed by a video platform received computer-generated feedback that she needed to “practice active listening.”
  • Some systems were caught rejecting people above a certain age because the model learned that resumes reading as “youthful” had historically been the ones hired.
None of these companies set out to discriminate or break the law. The machines learned it from historical patterns—and no one noticed until the damage was done.
Where We Draw the Line
AI should never replace structured interviews, score human traits from facial expressions, or decide who gets hired. If AI is acting as judge and jury in your hiring process, congratulations—you’ve invented a lawsuit generator.
The bottom line? AI can make hiring more consistent, more efficient, and a lot less chaotic. But it is not a substitute for human judgment, job analysis, or basic legal compliance.
At MPS, we’re excited about what AI can do—as long as humans stay firmly in the driver’s seat, and AI stays in the passenger seat, holding the map… not grabbing the wheel.

    Author

    Cheryl Frankeny

CONTACT US

(713) 667-9251

12946 Dairy Ashford, Suite 301
Sugar Land, TX 77478

[email protected]

