A novel lawsuit against AI recruiting platform Eightfold AI could reshape how companies use hiring technology—and expose employers to new liability.
Why it matters: If the plaintiffs succeed in arguing that AI screening tools are "consumer reports" under the Fair Credit Reporting Act, companies may need to rethink how they use AI in hiring.
Driving the news: Job applicants filed suit in California, claiming Eightfold's candidate scoring system (which ranks applicants 1-5 using data drawn from more than 1 billion profiles) must comply with the FCRA's disclosure requirements.
- The 1970 Fair Credit Reporting Act covers data used for "employment purposes," requiring transparency and dispute mechanisms.
- Plaintiffs want companies to disclose what data is collected and allow candidates to correct errors.
- Critically, the plaintiffs allege that Eightfold scrapes the internet, without candidates' consent, for information it uses to compare them and predict future performance (Eightfold denies this).
The vendor perspective: Eightfold and other vendors argue that AI screening tools are fundamentally different from credit reporting, so the FCRA doesn't apply.
- AI tools automate what human recruiters do: ranking candidates into tiers of desirability.
- The Consumer Financial Protection Bureau under Trump rescinded 2024 guidance that supported the plaintiffs' interpretation (that hiring scores are consumer reports).
What to do now:
- Audit your vendors: Understand exactly how your AI screening tools work and what data they collect (Fisher Phillips has published a checklist of questions to ask).
- Increase transparency: Consider giving candidates more feedback on screening decisions, even where it isn't yet legally required.
- Watch developments: A final decision in the case could take years, and it may leave employers required to follow FCRA procedures for all AI screening tools.
Bottom line: The technology is evolving faster than the law, so proactive compliance is a safer bet than reactive litigation.