Judge Gives the OK to Proceed With a Class Action Against Workday For Alleged Old Fart AI Bias in Hiring

old guy being told he's too old to be hired

As the profession’s only news rag written by and for millennials (all are welcome, of course), we feel it’s relevant to bring you this news from the courts as it affects people over 40 and, ugh, that includes us now.

HR Dive reported earlier this week that a federal judge in the U.S. District Court for the Northern District of California has given the green light to a class action suit against Workday brought by five people over the age of 40 who say they were unfairly filtered out by Workday’s application screening systems in violation of the Age Discrimination in Employment Act. Plaintiff Derek Mobley and the four opt-in plaintiffs named in his request say they applied for more than 100 positions each and, in some cases, received swift automated rejections.

From Mobley v. Workday, which you can read in full at HR Dive:

Derek Mobley brings this action for employment discrimination against Workday, Inc., alleging that Workday’s artificial intelligence (“AI”)-based applicant recommendation system discriminated against job applicants on the basis of race, age, and disability. Mobley is joined by four other plaintiffs over the age of forty, who allege that they too have applied for hundreds of jobs via Workday and have been rejected almost every time without an interview, allegedly because of age discrimination in Workday’s AI recommendation system.

Interestingly, Workday said a big hurdle to the case will be identifying affected people, as they could number in the hundreds of millions. “If the collective is in the ‘hundreds of millions’ of people, as Workday speculates, that is because Workday has been plausibly accused of discriminating against a broad swath of applicants. Allegedly widespread discrimination is not a basis for denying notice,” ruled Judge Rita Lin.

With the OK from the court, plaintiff Mobley will now notify “similarly situated individuals” to give them the opportunity to opt in to the suit. After additional discovery, Workday will then have an opportunity to present evidence that the collective is not, in fact, similarly situated, according to court documents. These individuals are “all individuals aged 40 and over who, from September 24, 2020, through the present, applied for job opportunities using Workday, Inc.’s job application platform and were denied employment recommendations.”

Mobley et al. have a tough job ahead of them, as court documents explain:

The proposed collective is similarly situated because Mobley has substantially alleged the existence of a unified policy: the use of Workday’s AI recommendation system to score, sort, rank, or screen applicants. The critical issue at the heart of Mobley’s claim is whether that system has a disparate impact on applicants over forty. That issue is susceptible to common proof—it cuts across the proposed collective, regardless of the degree to which particular employers place weight on those recommendations, the extent to which the system makes discriminatory recommendations across different employers, or the natural variations in the qualifications or rejection rate for particular members of the proposed collective. At this stage, that is sufficient. The proposed collective need not be identical in all ways, because its members are alike in the central way that matters: they were allegedly required to compete on unequal footing due to Workday’s discriminatory AI recommendations.

As Duck Phillips would say: “That’s a sticky wicket.”

Workday naturally denies the allegations. “This is a preliminary ruling at an early stage of this case, and before the facts have been established. We’re confident that once those facts are presented to the court, the plaintiff’s claims will be dismissed,” a spokesperson told HR Dive.

Here’s how the ruling summarizes Workday’s arguments:

Workday advances several arguments for why it is impossible for any collective to be similarly situated under Mobley’s theory of the case. First, Workday claims that it “does not offer ‘employment recommendations,’ so no one belongs in this collective, including [Mobley].” Second, Workday argues that the policy Mobley has identified is not uniform, defeating preliminary collective certification. Finally, Workday suggests that the natural variation in the proposed collective members’ qualifications for the jobs to which they applied, number of jobs applied to, and rejection rate mean that no collective could ever be similarly situated.

The court had this to say, citing Workday’s own website:

Citing its interrogatory responses, Workday argues that “Workday does not recommend, screen out, or otherwise assess or predict applicants’ likelihood of success in a role.” Therefore, Workday claims that no one has ever been “denied employment recommendations” by Workday and the proposed collective has no members. Id. But this position is contrary to the well-pled allegations in the complaint (incorporating Workday’s statements on its own website), as well as Workday’s interrogatory responses. For example:

  • Workday’s website states that “[o]ur skills intelligence foundation helps you build diverse teams by expanding candidate pools with equitable, AI- and ML-driven job recommendations.”
  • In discovery, Workday stated that: “Workday customers who purchase Workday Recruiting have access to Candidate Skills Match (‘CSM’), which they can choose to turn on or off. If a customer chooses to turn on CSM, CSM utilizes artificial intelligence to parse an employer’s job posting and an applicant’s application and/or resume; extract skills in the employer’s job posting, on the one hand, and skills from the application and/or resume on the other hand; and determine the extent to which the applicant’s skills match the role to which they applied. The results of CSM are reported as ‘strong,’ ‘good,’ ‘fair,’ ‘low,’ ‘pending,’ and ‘unable to score.’”
  • Plaintiff alleges that Workday’s “algorithms are only trained on incumbent employees at a company, allowing the pymetrics Workday Assessment Connector to build a homogenous workforce not representative of the applicant pool.”

Workday fails to explain why the conduct described above does not constitute “recommend[ing], screen[ing] out, or otherwise assess[ing] or predict[ing] applicants’ likelihood of success in a role.” Workday appears to take the position that because its AI recommendation system supposedly cannot auto-reject applicants without some degree of participation by the employer, Workday does not “recommend.” But Mobley’s disparate impact claim is based on the theory that Workday’s AI “participat[es] in the decision-making process,” a concept that is broader than auto-rejections. Therefore, even if Workday is taken at its word that its AI recommendation system cannot auto-reject an applicant, Workday is incorrect that the proposed collective is memberless. An individual who received a “low” CSM score, for example, could qualify as having been denied an employment recommendation.
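For readers wondering what a Candidate Skills Match label of “strong,” “good,” “fair,” or “low” could even mean mechanically, here is a deliberately crude sketch of the general idea described in the discovery response above: pull skills out of a posting and a resume, measure the overlap, and bucket the result into a label. To be clear, this is not Workday’s code or anything from the case record; the function names, skill list, and thresholds are invented purely for illustration.

```python
# Illustrative sketch only. NOT Workday's implementation; the skill list,
# thresholds, and labels below are hypothetical.

def extract_skills(text: str, known_skills: set[str]) -> set[str]:
    """Naive skill extraction: keep any known skill that appears in the text."""
    lowered = text.lower()
    return {skill for skill in known_skills if skill in lowered}

def candidate_skills_match(posting: str, resume: str, known_skills: set[str]) -> str:
    """Score how much of the posting's skills the resume covers, then bucket into a label."""
    wanted = extract_skills(posting, known_skills)
    have = extract_skills(resume, known_skills)
    if not wanted:
        return "unable to score"
    coverage = len(wanted & have) / len(wanted)
    if coverage >= 0.75:
        return "strong"
    if coverage >= 0.50:
        return "good"
    if coverage >= 0.25:
        return "fair"
    return "low"

# Hypothetical example:
skills = {"python", "sql", "excel", "gaap"}
print(candidate_skills_match(
    "Senior accountant: GAAP, Excel, SQL required",
    "10 years of GAAP reporting and Excel modeling",
    skills,
))  # covers 2 of the 3 requested skills -> "good"
```

The point of the sketch is only that whatever happens inside such a scorer, the applicant on the receiving end sees a label, not an explanation.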

This case could prove interesting for anyone who’s been auto-rejected by an automated application system, not only people over 40 who suspect the AI filtered them out due to age. That likely includes just about everyone who’s applied for a job through one of these systems in the last several years.

5 thoughts on "Judge Gives the OK to Proceed With a Class Action Against Workday For Alleged Old Fart AI Bias in Hiring"

  1. I have been negatively affected by Workday’s AI programming, having been screened out due to age and disability.

    1. Genuine question here, not being snarky… How do you know you were screened out due to age and disability? I have no doubt it’s happening, and I’ve experienced bias of my own in hiring (or at least I think I have), but it’s a difficult thing to prove unless you have a direct line to the hiring side.

  2. A direct line or hundreds of applicants permitting a “differential analysis”.
    For example: suppose Workday is hired by ABC company to help select candidates for a position.
    Workday has 1,000 candidates in its database that appear to meet the criteria for the job before any screening as to the strength of the “CSM”.
    Now 200 of the candidates are over 40 and, for the arithmetically challenged, 800 are 40 or under.
    The CSM says 80 of the 800 are a “good match” for the position, but only 1 of the 200. A simple statistical test can “shift the burden” to Workday to explain the result (see the sketch below the comments).

  3. I share the same question as another reader—how can someone be certain they were filtered out specifically due to age? With the volume of applications employers receive today—often hundreds or even thousands per job—it’s difficult to pinpoint a single factor behind an applicant not moving forward.

    There are many legitimate reasons why someone might not be selected: a lack of directly relevant experience, location misalignment, or simply that other applicants more closely matched the role’s criteria. Employers often assess a combination of qualifications, including skills, tenure, and overall fit.

    Without access to the algorithms or selection data, attributing rejection solely to age—especially in systems influenced by numerous variables—is extremely hard to verify. That said, the allegations in this case raise important questions about fairness and transparency in AI-driven hiring systems, and I’ll be watching how it unfolds.
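The back-of-the-envelope numbers in comment 2 above are hypothetical, but the kind of test the commenter is gesturing at is easy to run. Here is a minimal sketch using those made-up figures (80 of 800 under-40 applicants rated a “good match” versus 1 of 200 applicants over 40); the Fisher exact test and the EEOC “four-fifths rule” comparison are standard tools for this, but none of the numbers come from the actual case.

```python
# Illustrative sketch only: hypothetical numbers from the comment above,
# not figures from the case record.
from scipy.stats import fisher_exact

# Under-40 group: 80 "good match" out of 800 applicants
# 40-and-over group: 1 "good match" out of 200 applicants
table = [
    [80, 800 - 80],  # under 40: matched, not matched
    [1, 200 - 1],    # 40 and over: matched, not matched
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")

rate_under_40 = 80 / 800  # 10.0% selection rate
rate_over_40 = 1 / 200    # 0.5% selection rate
impact_ratio = rate_over_40 / rate_under_40  # 0.05, far below the EEOC four-fifths (0.8) benchmark

print(f"Fisher exact p-value: {p_value:.2e}")
print(f"Selection-rate ratio (over-40 / under-40): {impact_ratio:.2f}")
```

With numbers that lopsided, the p-value comes out vanishingly small and the selection-rate ratio sits far below the 0.8 benchmark, which is roughly the sort of showing a plaintiff would point to when asking a defendant to explain the disparity.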
