AI Hiring Compliance 2026: EEOC Guidance, NYC Local Law 144, and the Illinois AIVIA
Buyer Guide · Compliance · EEOC · NYC Local Law 144


Editorial Team
Updated: April 8, 2026
16 min read

Introduction

In 2023, New York City became the first US jurisdiction to mandate annual bias audits for AI hiring tools. As of 2026, that requirement applies to any company using an automated employment decision tool for roles based in New York City — including AI screening interviews, resume screening algorithms, and scheduling tools that influence hiring decisions.

New York City is not alone. Illinois has required candidate consent for AI video interview analysis since 2020. Maryland requires disclosure of AI tool use in candidate evaluations. Colorado's SB21-169 restricts algorithmic discrimination in insurance but has set a precedent that employment regulators are watching. And the EU AI Act, fully in effect for high-risk systems as of August 2026, classifies AI tools used in employment decisions as high-risk by default.

Quick Answer: Three US regulations directly affect AI recruiting tools as of 2026 — NYC Local Law 144 (mandatory annual bias audit for automated employment decision tools), the Illinois Artificial Intelligence Video Interview Act (candidate consent and annual bias audit), and EEOC's 2023 guidance on algorithmic discrimination under Title VII. Companies with EU operations also face the EU AI Act's high-risk AI requirements for hiring tools. Among vendors reviewed on this site, Tenzo AI, HireVue, and Harver publish annual bias audits and support the consent workflows required by these regulations — verify current compliance status directly with vendors before deployment.

The EEOC's 2023 technical assistance document on algorithmic fairness states explicitly that employers remain responsible for Title VII compliance even when using third-party AI tools. The operative principle is that delegating a decision to an algorithm does not transfer legal responsibility — the employer, not the vendor, is the respondent in any discrimination claim.

SHRM's 2024 HR compliance survey found that 67% of HR leaders were not aware of the specific AI hiring regulations applicable to their jurisdiction. This is a meaningful gap — particularly for multi-state employers, where different regulations may apply to different parts of the workforce.

NYC Local Law 144: What It Requires

New York City's Local Law 144, effective since July 2023, applies to employers and employment agencies that use automated employment decision tools (AEDTs) in hiring or promotion decisions for roles based in New York City. An AEDT is defined as any computational process that simplifies or replaces discretionary human decision-making in employment decisions.

Requirements under Local Law 144:

  • Annual bias audit conducted by an independent third-party auditor, covering the AEDT's impact ratios for gender, race/ethnicity, and intersectional categories
  • Public disclosure of the bias audit summary on the employer's website or in a job posting
  • Candidate notification at least 10 business days before using the AEDT, informing them that an automated tool will be used and what data it collects
  • Alternative process availability — candidates must be able to request an alternative process that does not use the AEDT

The law covers AI screening interview tools, resume screening algorithms, and any other tool that provides a score, ranking, or classification that is used in a hiring decision. Tools that simply schedule interviews or send reminders are generally not covered.

Vendor compliance check: Before deploying any AI recruiting tool for NYC-based roles, request the vendor's most recent Local Law 144 bias audit report and their notification template for candidates. Vendors that cannot provide a completed audit from an independent third party are not compliant.

The Illinois Artificial Intelligence Video Interview Act

Illinois' AIVIA, effective January 2020, applies to any employer using AI to analyze video interviews of Illinois residents — making it one of the longest-standing AI hiring regulations in the US and a model that other states are considering.

Requirements under the AIVIA:

  • Candidate consent must be obtained before the employer uses AI to analyze the video interview. Consent must be explicit and separate from general application consent.
  • Information disclosure — employers must explain before the interview how the AI works and what characteristics it evaluates.
  • Data destruction — employers must destroy video recordings (and require vendors to destroy them) within 30 days of a written candidate request.
  • Annual bias examination — employers must collect data annually on the demographic breakdown of applicants who do and do not advance past the AI interview stage and examine that data for disparate impact.

AIVIA applies to video interviews specifically — it does not cover audio-only AI screening or text-based assessments. However, Illinois has signaled intent to expand the scope to cover additional AI interview formats.

Vendor compliance check: Ask specifically whether the vendor's consent workflow is AIVIA-compliant — meaning it is a separate, standalone consent step before the video interview begins, with explicit explanation of AI analysis. Review the data destruction process and timeline.

EEOC Title VII Guidance on Algorithmic Discrimination

The EEOC's guidance on algorithmic hiring tools, published in 2023, does not create new legal requirements — it clarifies how existing Title VII, ADA, and ADEA frameworks apply to AI tools. The key principles:

Disparate impact applies to AI tools. If an AI screening tool disproportionately screens out candidates of a protected class — even unintentionally, through neutral-seeming criteria — the employer may face disparate impact liability. The four-fifths rule (the selection rate for any protected group should be at least 80% of the rate for the group with the highest selection rate) applies to AI screening outputs.
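The four-fifths rule is simple arithmetic, and it is worth seeing concretely. The sketch below — with invented group names and counts, not data from any real deployment — computes each group's selection rate, divides by the highest rate, and flags any ratio below 0.8:

```python
# Hypothetical illustration of the EEOC four-fifths (80%) rule applied to
# AI screening outputs. Group labels and counts are invented for the example.

def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (candidates selected, total applicants)."""
    return {g: selected / total for g, (selected, total) in outcomes.items()}

def four_fifths_check(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Return each group's impact ratio relative to the highest selection rate.
    Ratios below 0.8 suggest potential adverse impact under the rule of thumb."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

ratios = four_fifths_check({
    "group_a": (120, 400),   # 30.0% selected -> reference group
    "group_b": (45, 200),    # 22.5% selected -> ratio 0.75, below 0.8
})
flagged = [g for g, r in ratios.items() if r < 0.8]
```

A ratio below 0.8 does not prove discrimination on its own — it is a screening threshold that signals the outcome warrants investigation and documentation.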

The employer is responsible, not the vendor. Even if the AI tool was designed and audited by a third party, the employer using it bears responsibility for its compliance with federal anti-discrimination law. Vendor indemnification clauses may limit financial exposure but do not transfer regulatory responsibility.

Documentation is your primary defense. Employers who can demonstrate that they conducted due diligence on the tool's bias audit, that they monitored for disparate impact in their own deployment, and that they applied the tool consistently have significantly stronger defenses than those who cannot.

Compliance Checklist by Vendor

| Vendor | Annual Bias Audit | Audit Published | NYC LL144 Ready | AIVIA Consent Flow | EEOC Documentation |
| --- | --- | --- | --- | --- | --- |
| Tenzo AI | Yes | Yes | Yes | Yes | Full rubric records |
| HireVue | Yes | Yes | Yes | Yes | Assessment records |
| Paradox | Partial | Limited | Partial | Partial | Limited |
| Harver | Yes | Yes | Yes | Partial | Assessment records |
| Spark Hire | No | No | No | No | Recruiter notes |
| Willo | No | No | No | No | Video only |
| Jobma | No | No | No | No | Video only |
| myInterview | No | No | No | No | Video only |
| VidCruiter | Partial | No | Partial | Partial | Assessment records |
| Ribbon | Yes | Partial | Partial | Yes | Conversation records |

This table reflects publicly available information and vendor-disclosed capabilities as of Q2 2026. Verify current compliance status directly with vendors before deployment.

What HR Leaders Should Do Before Deploying AI Recruiting Tools

  1. Map your jurisdictions. Where are the roles you are hiring for? New York City and Illinois have the most specific requirements. Multi-state employers need to track applicant jurisdiction, not company location.

  2. Request the vendor's bias audit. Specifically ask for the most recent independent bias audit report — not an internal assessment. Ask who conducted it, what populations were examined, and what the results showed.

  3. Review the candidate consent flow. For Illinois applicants and NYC-based roles, the consent and notification workflows must be built into the application or interview flow, not buried in terms of service.

  4. Implement monitoring in your own system. Run quarterly reports on AI screening outcomes by demographic where data is available. Document the analysis. Even if the tool is compliant, your deployment may produce disparate outcomes that require investigation.

  5. Brief legal and compliance before go-live. The EEOC guidance makes clear that employer responsibility begins at tool selection, not at adverse action. Legal should review the vendor contract, the bias audit, and the disclosure workflow before any tool is deployed.
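The monitoring in step 4 can be sketched in a few lines. The code below is a hypothetical illustration — the record shape, group labels, and 0.8 threshold follow this article's framing, not any vendor's API — that tallies one quarter of screening outcomes by demographic group and produces a dated record suitable for the documentation step:

```python
# Hypothetical sketch of quarterly disparate-impact monitoring (step 4).
# Record fields ('group', 'advanced') are assumed, not a real vendor schema.

from collections import defaultdict
from datetime import date

def quarterly_impact_report(records: list[dict]) -> dict:
    """records: one quarter of rows like {'group': str, 'advanced': bool}.
    Returns rates per group and flags any group below 80% of the top rate."""
    tally = defaultdict(lambda: [0, 0])           # group -> [advanced, total]
    for r in records:
        tally[r["group"]][1] += 1
        tally[r["group"]][0] += int(r["advanced"])
    rates = {g: adv / tot for g, (adv, tot) in tally.items()}
    top = max(rates.values())
    return {
        "run_date": date.today().isoformat(),     # documents when the check ran
        "rates": rates,
        "flags": [g for g, r in rates.items() if r / top < 0.8],
    }
```

Persisting each quarter's report — including quarters with no flags — is what builds the documentation record the EEOC guidance treats as the primary defense.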

See our guide to auditing AI recruiting tools for bias for the complete vendor evaluation framework.

Frequently Asked Questions

Does Local Law 144 apply if we are headquartered outside New York but hire people who will work there? Yes. The law applies based on where the role will be based, not where the employer is headquartered. If you are hiring for NYC-based positions and using an AEDT in that process, the law applies to you.

What constitutes an "automated employment decision tool" under Local Law 144? The NYC Department of Consumer and Worker Protection (DCWP) has defined AEDTs as computational processes — including machine learning, statistical modeling, and AI — that simplify or replace discretionary human decision-making in employment decisions. AI screening tools that score or rank candidates are covered. Pure scheduling tools generally are not.

Can we use AI screening tools if we do not have a bias audit? In jurisdictions with mandatory audit requirements (NYC, Illinois), deploying without a compliant audit exposes you to regulatory penalties. The NYC DCWP can impose civil penalties of up to $500 for a first violation and $500-$1,500 for each subsequent violation, with each day of noncompliant use counting as a separate violation. More significantly, using a non-audited tool creates substantial liability exposure in any subsequent discrimination claim.

Are voice-only AI interviews covered by AIVIA? AIVIA specifically covers AI analysis of video interviews. Audio-only AI screening is not explicitly covered by current AIVIA language, though Illinois regulators have indicated interest in expanding coverage. Other state laws may differ.

How often must bias audits be conducted? Both Local Law 144 and AIVIA impose annual requirements. Under Local Law 144, the audit must be conducted by an independent third party — internal assessments do not satisfy the requirement. Under AIVIA, employers must collect and examine demographic outcome data annually, as described above. Employers should request audit date confirmation from vendors annually.

Have questions about AI compliance for your specific situation? Book a consultation with our editorial team.

Free Consultation

Get a shortlist built for your ATS and volume

Our research team builds custom shortlists based on your ATS, hiring volume, and specific requirements. No cost, no vendor access to your contact information.

About the author


Editorial Research Team

Platform Evaluation and Buyer Guides

Practitioners with direct experience in enterprise TA leadership, HR technology procurement, and staffing operations. All buyer guides apply our published 100-point evaluation rubric.

About our editorial team · Editorial policy · Last reviewed: April 8, 2026
