
Hired by an Algorithm, Fired by Bias: When AI Screening Tools Discriminate

You applied for a job. You never got an interview. You never talked to a human. You were just…rejected.

Welcome to hiring in 2026. By some estimates, over 80% of employers now use automated systems to screen applicants before a human ever looks at a resume. These tools promise efficiency and objectivity. What they often deliver is discrimination at scale.

Here’s what nobody tells you: the same civil rights laws that protect you from a biased manager also protect you from a biased algorithm. The computer doesn’t get a pass just because it’s a computer.

How Algorithmic Discrimination Actually Works

These systems screen resumes for keywords, score personality tests, analyze video interviews for facial expressions and tone of voice, and rank candidates based on predictive models. They automatically reject anyone who falls below a threshold. No human involved.

The sales pitch is that algorithms remove human bias. The reality is the opposite. Algorithms are trained on historical data – and historical data reflects historical discrimination. If past hiring favored younger employees, or men, or people without disabilities, or people who weren’t pregnant, the algorithm learns to do the same thing.

And here’s the part that should concern you: an algorithm doesn’t need to “know” your age, disability, pregnancy status, or race to discriminate against you. It can infer protected characteristics through proxies. Employment gaps? Could be pregnancy, disability, or caregiving. Graduation dates? That’s age. Zip codes? That’s often race. Speech patterns or facial movements in video interviews? That can screen out neurodivergent candidates.
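To make the proxy problem concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the synthetic data, the feature names, the choice of a simple logistic regression), not any vendor’s actual system. The model is never shown the applicant’s age, but because it is trained on age-biased historical decisions and graduation year is a near-perfect age proxy, it learns to score older applicants lower anyway.

```python
import random

from sklearn.linear_model import LogisticRegression

random.seed(0)

# Synthetic "historical hiring" data. Age is never recorded, but past
# decisions favored recent graduates regardless of skill.
X, y = [], []
for _ in range(1000):
    grad_year = random.randint(1980, 2024)
    skill = random.uniform(0.0, 10.0)
    hired = 1 if grad_year >= 2010 and skill > 3.0 else 0
    X.append([grad_year - 2000, skill])  # year centered for numeric stability
    y.append(hired)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Two equally skilled applicants; the only difference is graduation year.
print(model.predict_proba([[18, 9.0]])[0][1])  # class of 2018: high "hire" score
print(model.predict_proba([[-5, 9.0]])[0][1])  # class of 1995: low "hire" score
```

Swap graduation year for zip code or employment gaps and the same mechanism screens on race, pregnancy, or disability instead.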

The discrimination is laundered through “objective” data. But the outcomes are the same.

Pregnancy, Maternity Leave, and the Algorithm Problem

This is one I see all the time, and it’s getting worse.

Performance reviews that look at trailing 12-month metrics. Stack ranking systems that compare employees against each other. Sales quotas that don’t pause during leave. AI-driven layoff selections based on productivity scores.

What do these have in common? They all punish women who took maternity leave. If you were out for three months, your numbers are lower than someone who worked all year. The algorithm doesn’t know you were on protected leave. It just sees lower output and targets you.
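A simplified worked example (the numbers are invented) shows how a raw trailing-12-month total encodes the leave itself, while a rate pro-rated for months actually worked does not:

```python
# Invented numbers: three salespeople with identical monthly output,
# one of whom took three months of protected leave.
employees = {
    "worked_all_year": {"sales": 120, "months_worked": 12},
    "also_full_year":  {"sales": 120, "months_worked": 12},
    "took_leave":      {"sales": 90,  "months_worked": 9},   # out 3 months
}

# A naive trailing-12-month metric ranks on raw totals, so the person
# who took protected leave lands at the bottom...
ranked = sorted(employees, key=lambda e: employees[e]["sales"], reverse=True)
print(ranked)  # ['worked_all_year', 'also_full_year', 'took_leave']

# ...but pro-rating by months actually worked shows identical performance.
for name, record in employees.items():
    print(name, record["sales"] / record["months_worked"])  # 10.0 for everyone
```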

A journalist from Bloomberg reached out to me recently asking about exactly this. It’s a live issue. Companies are using “objective” metrics to push out new mothers, and the paperwork looks clean because the computer made the decision.

Age and Disability Discrimination by Algorithm

Some of the worst harms fall on older employees and people with disabilities. Video interview software can penalize neurodivergent candidates. Timed online assessments assume you can see the screen, use a mouse, and think in a particular way. Personality tests often resemble clinical diagnostic tools – which means they can screen out people with anxiety, depression, ADHD, or other conditions that have nothing to do with job performance.

In 2023, the EEOC reached its first settlement in an AI hiring discrimination case, with iTutorGroup, whose application software automatically rejected female applicants aged 55 and older and male applicants aged 60 and older. No human was involved. The discrimination was still illegal.

That case put employers on notice: algorithmic bias is now a federal enforcement priority.

What the Law Says

New York City’s Local Law 144, in effect since July 2023, is one of the first laws specifically regulating AI hiring tools. Employers using automated employment decision tools must conduct annual bias audits, publish the results, and notify applicants that automated systems are being used.

But that law has limits. The audits only cover race, ethnicity, and sex, not disability or age. And compliance with the audit requirement doesn’t mean the tool is legal. If it’s producing discriminatory outcomes, you still have claims.

More importantly, the traditional civil rights laws still apply – and they cover everything. Title VII. The ADA. The Age Discrimination in Employment Act. The New York State and City Human Rights Laws. The Pregnant Workers Fairness Act. If an automated tool screens you out because of a protected characteristic, or has a disparate impact on a protected group, that’s discrimination. Period.
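For readers who want the mechanics, disparate impact is often screened with the EEOC’s four-fifths rule of thumb: compute each group’s selection rate, divide by the highest group’s rate, and treat ratios below 0.8 as a red flag. It is an evidentiary guideline rather than a bright-line legal test, and New York City’s bias audits use a similar impact-ratio calculation. Here is a minimal sketch with invented numbers:

```python
# Four-fifths (80%) rule sketch with invented applicant counts.
groups = {
    "under_40": {"applied": 200, "selected": 80},
    "over_40":  {"applied": 200, "selected": 30},
}

# Selection rate = selected / applied; impact ratio = a group's rate
# divided by the highest group's rate. Ratios under 0.8 are a red flag.
rates = {name: g["selected"] / g["applied"] for name, g in groups.items()}
highest = max(rates.values())
for name, rate in rates.items():
    ratio = rate / highest
    flag = "potential adverse impact" if ratio < 0.8 else "ok"
    print(f"{name}: rate {rate:.2f}, impact ratio {ratio:.2f} ({flag})")
# over_40: rate 0.15, impact ratio 0.38 (potential adverse impact)
```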

And here’s the kicker: New York State just codified disparate impact protections in December 2025. So if an algorithm is producing discriminatory outcomes, even without intent, that’s now explicitly covered under state law.

Warning Signs

You might be dealing with algorithmic discrimination if you’re repeatedly rejected without ever getting an interview despite strong qualifications, if video interviews appear to be scored with no human review, if requests for accommodations are ignored or the system simply can’t process them, or if you’re told decisions are “system-generated” and can’t be reviewed.

Lack of transparency is not a defense. Neither is outsourcing – if an employer uses a vendor’s discriminatory tool, the employer is still liable.

What You Should Do

If you think an algorithm screened you out, document everything. Save the job posting. Screenshot the application portal. Note the dates. If you got rejection emails, keep them. If you took a video interview or assessment, write down what you remember.

If you’re still employed but think algorithmic performance reviews or layoff criteria are being used against you, do the same. Document the metrics, document the timing, document who else is affected.

And as always: don’t quit. #DONTQUIT. If you’re being pushed out, get advice first.

The Bottom Line

Algorithmic hiring tools don’t create discrimination. They accelerate it. When biased systems operate at scale, entire categories of employees can be quietly excluded from economic opportunity without explanation.

But the law still applies. Technology changes. Civil rights protections don’t.

If you were denied an opportunity by a system you never got to question, or if you’re being pushed out by “objective” metrics that seem designed to hurt people like you—it might be time to talk to someone.

We offer free, confidential evaluations. You don’t have to navigate this alone.

Contact Tuckner Sipser Weinstock & Sipser, LLP at 212-766-9100 or email us at info@womensrightsny.com.
