
More than the rejections, Erin Kistler recalls the quiet. She applied to hundreds of jobs, many of them routed through Eightfold's software, and hardly ever received a response. Despite a flawless resume and twenty years of tech experience, she couldn't even get a foot in the door.
It was more than just annoying. Something behind the screen, silent and unexplained, seemed to be working against her.
| Fact | Detail |
|---|---|
| Defendant | Eightfold AI, Inc. |
| Plaintiffs | Erin Kistler and Sruti Bhaumik |
| Filed In | Contra Costa County Superior Court, California |
| Legal Grounds | Fair Credit Reporting Act (FCRA) and California’s Investigative Consumer Reporting Agencies Act |
| Alleged Practice | Secretive candidate scoring system called “Match Score” |
| Plaintiffs’ Argument | Lack of transparency, no way to dispute scores, potential career impact |
| Companies Using Eightfold | Microsoft, PayPal, Salesforce, Bayer |
| Legal Representation | Outten & Golden LLP, Towards Justice |
In recent months, that suspicion became a legal battle. Kistler and Sruti Bhaumik filed a proposed class-action lawsuit against Eightfold AI, a venture-backed hiring technology company whose client list includes Bayer, Microsoft, PayPal, and Salesforce.
By anchoring their claims in the Fair Credit Reporting Act, a law drafted long before AI became widespread, the plaintiffs are arguing that existing protections can still apply, even to tools that seem entirely novel.
At its core, the dispute centers on a number: the "Match Score" Eightfold assigns to candidates. The score, which ranges from zero to five, is generated by an algorithm that sifts through resumes, online profiles, and other data signals to predict a person's likelihood of succeeding in a role.
The issue for candidates is not the score itself, but that they are never told it exists.
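The complaint describes the score as the product of a proprietary model weighing many hidden signals. As a purely hypothetical illustration of how such an opaque score might be assembled, consider the toy Python sketch below; every feature, weight, and cap here is invented for demonstration and does not reflect Eightfold's actual method.

```python
# Hypothetical illustration only: a toy 0-5 "match score" of the kind
# described in the lawsuit. All features and weights are invented.

def match_score(candidate: dict) -> float:
    """Combine a few resume-derived signals into a 0-5 score."""
    # Invented weights -- the candidate never sees these.
    weights = {
        "years_experience": 0.15,  # experience capped at 10 years below
        "skill_overlap": 2.5,      # fraction of required skills matched (0-1)
        "title_similarity": 1.5,   # similarity to the target job title (0-1)
    }
    score = 0.0
    score += min(candidate.get("years_experience", 0), 10) * weights["years_experience"]
    score += candidate.get("skill_overlap", 0.0) * weights["skill_overlap"]
    score += candidate.get("title_similarity", 0.0) * weights["title_similarity"]
    return round(min(score, 5.0), 2)

candidate = {"years_experience": 20, "skill_overlap": 0.6, "title_similarity": 0.8}
print(match_score(candidate))  # the applicant never learns this number
```

The point of the sketch is not the arithmetic but the asymmetry: every input, weight, and cap is visible to the system's operator and invisible to the person being scored.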
As job hunting has grown more automated over the past decade, AI systems like Eightfold's have been adopted to streamline hiring, promising more accurate candidate matching and drastically shorter hiring timelines.
For job seekers like Kistler and Bhaumik, however, the experience has been opaque. They were neither told that their applications would be scored algorithmically nor given a chance to contest the evaluation.
That lack of transparency anchors the legal complaint, which argues that Eightfold acts as a consumer reporting agency. If the courts agree, the company would be subject to the same regulations that govern credit reporting companies.
By invoking a federal law designed to prevent unfair or inaccurate decisions, the lawsuit aims to pry open what is commonly called the "black box" of AI hiring.
Eightfold, a Santa Clara, California-based company, says it follows the law and ethics. Data is either shared by corporate clients or supplied directly by candidates, according to a company spokesperson; it is not scraped from public profiles on platforms like GitHub or social media.
Nevertheless, the lawsuit claims that the system projects future job titles, assesses the “quality of education,” and generates predictive personality tags like “team player” or “introvert” without the applicant’s knowledge or consent.
For Bhaumik, a seasoned project manager, the outcome was equally disheartening. She was rejected within two days of applying for a position at Microsoft; the application had been processed through Eightfold's platform.
In their filings, the plaintiffs contend that automation covertly stripped away their rights as job seekers.
One detail stopped me as I read about the lawsuit: Kistler's applications had a success rate of just 0.3 percent. It struck me not merely as a statistic but as a sign of how remote the human element in hiring has become.
Leading the legal charge is David Seligman, executive director of Towards Justice. He notes that there is "no AI exemption to our laws," and his group wants the courts to require that career-related technologies be transparent, accountable, and open to challenge.
Since these tools arrived, many businesses have praised their ability to handle high volumes of applications. But that efficiency carries a significant trade-off, often at the cost of clarity and fairness.
In hiring, a Match Score can quietly shape a person's career path. Without knowing the score exists or having access to it, a candidate may be disqualified instantly, with no opportunity to update, correct, or clarify outdated information.
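The disqualification mechanism itself is simple to picture. The hypothetical sketch below shows automated screening of the kind the plaintiffs describe: applications below a score cutoff never reach a human reviewer. The cutoff, names, and scores are all invented for illustration.

```python
# Hypothetical sketch of automated screening: applicants below a score
# cutoff are dropped before any human sees them. Data and cutoff invented.

def screen(applicants: list[dict], cutoff: float = 3.0) -> list[dict]:
    """Return only applicants whose precomputed score meets the cutoff."""
    return [a for a in applicants if a["score"] >= cutoff]

pool = [
    {"name": "A", "score": 4.1},
    {"name": "B", "score": 2.7},  # silently dropped; B is never told why
    {"name": "C", "score": 3.3},
]
print([a["name"] for a in screen(pool)])  # ['A', 'C']
```

A single comparison decides the outcome, and the rejected applicant sees neither the score nor the threshold.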
The case also draws on a larger movement: legal and policy voices are increasingly pushing back against the unchecked spread of AI decision-making across industries. Hiring has become an early flashpoint because of its combination of high stakes and little oversight.
Nor is this the first lawsuit to target AI hiring tools. In 2023, Workday faced similar claims that its system was inadvertently excluding Black, older, and disabled applicants. That case is still pending, but its trajectory suggests these challenges are becoming more common.
By pursuing class-action designation, the Eightfold lawsuit could draw in thousands, if not millions, of job seekers affected by similar practices.
If successful, the result could change how AI companies communicate their practices and engage with the individuals whose lives they evaluate. For the first time, applicants may be granted the legal right to view and contest their algorithmic profile.
This would be especially helpful for early-career job seekers, offering a path to visibility in processes that are often automated from start to finish.
The knock-on effects may also change how employers select and deploy hiring software. Many may need to reconsider not just their tools but their accountability frameworks.
With legal momentum behind it and public attention growing, the case is a reminder that technology's reach need not be blind.
Long-standing principles of justice, consent, and due process can serve as a gentle but firm guide.
In taking this to court, Kistler and Bhaumik are suing more than a tech company. They are posing a broader question: don't people have a right to know how machines are evaluating them?
We’ll probably hear this question more frequently in the years to come.
And job seekers who browse postings and click "apply" with hope may one day find that their stories aren't dismissed before they begin, and that their scores are no longer kept secret.
