85% AI Bias: Hidden Discrimination in Resume Screening

Amazon's AI recruiting tool showed 85% bias against women. Discover how bias creeps into AI systems and how ARIAS eliminates it with skills-based evaluation.

The Amazon AI Disaster

In 2018, Amazon quietly scrapped an AI recruiting tool that showed 85% bias against women. The system, designed to automate resume screening, had learned to penalize resumes that included the word "women's" (as in "women's chess club captain") or mentioned women's colleges.

How did this happen? Amazon trained its AI on a decade of resumes submitted to the company, most of them from men. The AI concluded that male candidates were preferable and systematically downranked female applicants.

"Everyone wanted this holy grail. They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top five, and we'll hire those."

— Former Amazon Engineer

This wasn't a bug. It was a feature of how AI learns. And it's a cautionary tale for every company building or buying AI hiring tools.

How AI Bias Happens

1. Historical Bias in Training Data

AI models learn from historical data. If that data reflects past discrimination, the AI perpetuates it. Amazon's AI was trained on resumes from a male-dominated tech industry, so it learned that men = good candidates.
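
To make that concrete, here is a minimal sketch with synthetic data (illustrative numbers only, not Amazon's actual pipeline): a classifier trained on historical hiring labels that favored men learns to reward maleness directly.

```python
# Minimal sketch: a model trained on biased historical outcomes
# reproduces the bias. Synthetic, illustrative data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
skill = rng.normal(0, 1, n)        # true qualification signal
is_male = rng.integers(0, 2, n)    # protected attribute

# Historical hiring decisions: same skill bar, plus a "boost"
# for men, baking past discrimination into the labels.
hired = skill + 0.8 * is_male + rng.normal(0, 0.5, n) > 0.5

model = LogisticRegression().fit(np.column_stack([skill, is_male]), hired)
print(dict(zip(["skill", "is_male"], model.coef_[0].round(2))))
# is_male gets a large positive weight: equally skilled women score lower.
```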

2. Proxy Variables

Even when you remove protected attributes like gender and race, AI can infer them from proxies (see the sketch after this list):

  • University attended (historically Black colleges signal race)
  • Neighborhood (zip codes correlate with race and income)
  • Hobbies (many activities skew strongly by gender)
  • Name (first and last names carry ethnic and gender patterns)
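
Here is a minimal sketch of the proxy problem, again with synthetic data: the protected attribute is dropped entirely, but a correlated feature (think of a gendered hobby keyword) inherits its weight.

```python
# Minimal sketch: remove the protected attribute and a correlated
# proxy absorbs the bias instead. Synthetic, illustrative data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
skill = rng.normal(0, 1, n)
is_male = rng.integers(0, 2, n)

# A proxy that agrees with the protected attribute ~85% of the
# time, like a hobby keyword that skews heavily by gender.
proxy = (is_male + (rng.random(n) < 0.15)) % 2

# Same biased historical labels as before.
hired = skill + 0.8 * is_male + rng.normal(0, 0.5, n) > 0.5

# Train WITHOUT the protected attribute: only skill and the proxy.
model = LogisticRegression().fit(np.column_stack([skill, proxy]), hired)
print(dict(zip(["skill", "proxy"], model.coef_[0].round(2))))
# The proxy picks up a large positive weight: the bias survives.
```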

3. Feedback Loops

If biased AI recommends certain candidates, and humans hire them, that reinforces the bias in future training data. The system becomes more biased over time.
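
Here is a toy simulation of that loop (all numbers are illustrative assumptions, not a real screening system): each round, the screener's picks become the next round's training signal, and a small initial skew compounds.

```python
# Minimal sketch of a feedback loop: retraining on your own biased
# picks amplifies the bias. Purely illustrative numbers.
import numpy as np

rng = np.random.default_rng(2)
bias = 0.1  # small initial learned preference for group A

for round_num in range(1, 6):
    skill = rng.normal(0, 1, 1000)
    group_a = rng.integers(0, 2, 1000)
    score = skill + bias * group_a            # biased screening score
    hired = score > np.quantile(score, 0.9)   # recommend the "top 10%"

    # Retraining on the hires nudges the model toward whichever
    # group was over-selected, so the skew grows each round.
    share_a = group_a[hired].mean()
    bias += 0.5 * (share_a - 0.5)
    print(f"round {round_num}: group A share of hires = {share_a:.0%}")
```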

The Research Is Clear

Studies of resume screening, by humans and algorithms alike, show the bias is pervasive:

  • 50% fewer callbacks for Black-sounding names vs. white-sounding names
  • 32% bias against candidates over age 40
  • Gender bias in keyword weighting (leadership = male, collaborative = female)

How ARIAS Solves This

Skills-Based Evaluation

ARIAS doesn't look at resumes. It evaluates candidates through live interviews focused purely on skills and competencies. No names, no photos, no universities—just performance.

Blind Hiring by Default

Demographic information is never fed into our evaluation algorithms. The AI assesses communication, problem-solving, and technical skills without knowing gender, race, or age.
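
As a minimal sketch of the blind-by-default idea (the field names here are hypothetical, not ARIAS's actual schema), demographic and demographic-adjacent fields are stripped before anything reaches scoring:

```python
# Minimal sketch: strip demographic and demographic-adjacent fields
# before evaluation. Field names are hypothetical.
BLOCKED_FIELDS = {"name", "photo_url", "gender", "age", "university", "address"}

def redact(candidate: dict) -> dict:
    """Keep only fields that are safe to score on."""
    return {k: v for k, v in candidate.items() if k not in BLOCKED_FIELDS}

candidate = {
    "name": "Jane Doe",
    "university": "Example U",
    "interview_transcript": "...",
    "code_sample": "...",
}
print(redact(candidate))  # only the skills-relevant fields remain
```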

Standardized Rubrics

Every candidate is evaluated on the same criteria. Adaptive questioning maintains depth while ensuring fairness.
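
As a sketch of what a standardized rubric means in practice (the criteria and weights below are hypothetical, not ARIAS's actual rubric), every candidate's score comes from the same weighted criteria:

```python
# Minimal sketch: one fixed rubric applied to every candidate,
# so scores are directly comparable. Hypothetical criteria/weights.
RUBRIC = {"communication": 0.3, "problem_solving": 0.4, "technical": 0.3}

def score(ratings: dict) -> float:
    """ratings: criterion -> 0-5 rating from the structured interview."""
    return sum(weight * ratings[criterion] for criterion, weight in RUBRIC.items())

print(score({"communication": 4, "problem_solving": 3, "technical": 5}))  # 3.9
```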

Continuous Bias Audits

We regularly audit our AI for disparate impact across demographic groups and adjust algorithms to ensure equity.
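
One standard check such an audit can run is the EEOC's "four-fifths" rule: every group's selection rate should be at least 80% of the highest group's rate. A minimal sketch with illustrative numbers:

```python
# Minimal sketch of a disparate-impact audit using the four-fifths
# rule. The counts below are illustrative, not real audit data.
def four_fifths_violations(outcomes: dict) -> dict:
    """outcomes maps group -> (selected, total); returns failing groups."""
    rates = {g: sel / tot for g, (sel, tot) in outcomes.items()}
    best = max(rates.values())
    return {g: round(r / best, 2) for g, r in rates.items() if r / best < 0.8}

audit = {"group_a": (62, 400), "group_b": (52, 400), "group_c": (30, 400)}
print(four_fifths_violations(audit))  # {'group_c': 0.48} -> flag for review
```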

Eliminate Bias from Your Hiring

See how ARIAS creates fair, skills-based evaluations

Start Free Trial