Googling for images of occupations amplifies age-based gender inequality in people’s beliefs. Credit: Nature (2025). DOI: 10.1038/s41586-025-09581-z

U.S. Census data shows no systematic age differences between men and women in the workforce over the past decade. And globally, women live about five years longer than men. But that's not what you'll see if you search Google or YouTube or query an AI like ChatGPT.

A study published in the journal Nature analyzed 1.4 million online images and videos, plus nine large language models trained on billions of words, and found that women are systematically presented as younger than men. The researchers looked at content from Google, Wikipedia, IMDb, Flickr, and YouTube, as well as major large language models including GPT-2, and concluded that women consistently appeared younger than men across 3,495 occupational and social categories.

"This kind of age-related gender bias has been seen in other studies of specific industries, and anecdotally, such as in reports of women who are referred to as girls," says Berkeley Haas Assistant Professor Solène Delecourt, who co-authored the study with Douglas Guilbeault of Stanford's Graduate School of Business and Bhargav Srinivasa Desikan from the University of Oxford/Autonomy Institute. "But no one has previously been able to examine this at such scale."

The distortion was most stark for high-status, high-earning occupations. What's more, the researchers found mainstream algorithms further amplify age-related gender bias: When generating and evaluating nearly 40,000 resumes, ChatGPT assumed women were younger and less experienced while rating older male applicants as more qualified.

"Online images show the opposite of reality. And even though the internet is wrong, when it tells us this 'fact' about the world, we start believing it to be true," Guilbeault says. "It brings us deeper into bias and error."

A 'culture-wide, statistical distortion of reality'

The research team used several approaches to assess gender and age in images and videos gathered from a variety of platforms (for video analysis, they captured still images). In one case, they hired thousands of online workers to classify gender (male, female, nonbinary) and estimate age within a set of ranges. In other cases, the datasets allowed them to cross-reference the image timestamp with the subject's birthdate to calculate an objectively precise age.

Across all methods and datasets, women were strongly associated with youth and men with older ages, either based on how old they appeared to be or what their true age was. This relationship held whether the researchers measured by:

  • Human judgment
  • Machine learning
  • Objective information

This distortion grew stronger not only as the prestige of the job increased—CEO or astronaut, for instance—but also as the pay gap between men and women in that job widened.

The researchers found the same relationship when shifting their analysis from images to text. They studied the relationship between gender and age using billions of words from across the internet, including Reddit, Google News, Wikipedia, and Twitter. Words related to youth were much more closely tied to women.

"One concern people might have is that images and videos are kind of unique in that people can wear makeup or apply filters, using image-specific strategies to make themselves look younger," Delecourt says. "That's why we also looked at text, and we found exactly the same pattern."

Real-world effects of distorted perceptions

Following on those findings, the researchers conducted two experiments to understand how online algorithms amplify this bias. In the first, roughly 500 participants were split into two groups. Half searched Google Images for specific occupations, labeled the gender of the people in the images, then estimated the average ages and hiring preferences for those roles. The other half searched for unrelated images, such as an apple or guitar, and then estimated ages and gender associations for those same occupations, but without exposure to images of them.

Participants who viewed women in occupation-related images estimated the average age for that job to be significantly lower than those in the control group, while those who saw a man performing the same job assumed the average age was significantly higher. For occupations perceived as female-dominated, participants recommended younger ideal hiring ages; for male-dominated occupations, they recommended older hiring ages.

In the second experiment, the researchers prompted ChatGPT (gpt-4o-mini) to generate nearly 40,000 resumes across 54 occupations, using distinctive male and female names matched for popularity, ethnicity, and other factors. When generating resumes for women, ChatGPT assumed they were younger (by 1.6 years), gave them more recent graduation dates, and credited them with less work experience compared to resumes with male names.

When evaluating resumes, ChatGPT rated older men more highly than women for the same positions. This result appeared whether the researchers provided names or ChatGPT generated its own applicants, showing that the bias is deeply embedded in the system.

A problematic feedback loop

The research follows a study published in Nature last year by Delecourt and Guilbeault—then a professor at UC Berkeley Haas—finding that female and male gender associations are more extreme in Google Images than in text from Google News. While the text is slightly more focused on men than women, this bias is over four times stronger in images. They also found that biases are more psychologically potent in visual form.

One of the major takeaways from the new study, Guilbeault notes, is that this evaluation of online information at unprecedented scale reveals a deeply inaccurate picture of the world in which we live. "This is of particular concern given the internet is increasingly how we learn about the social world," he says.

"People are spending more time online, and we rely on algorithms that curate information. And so, what if these biased beliefs are spreading and becoming a self-fulfilling prophecy? Our study shows that they are reinforcing stereotypical expectations about how the world should be."

These questions are all the more urgent given the tremendous amount of investment in AI tools, which are trained on ever-larger online datasets of images and text. When these tools are applied in real-world settings, they are likely to reshape the world even more in line with the stereotypes inherent in their training. In the case of resume screening—in which AI is already widely used—the biases of AI are directly skewing its perceptions of who is and is not qualified for a given job.

Delecourt also pointed to the amount of information young people absorb, actively and subconsciously, through online experience. Given what images present for the average male or female doctor, for example, children may be imprinted with biased ideas about the occupation.

"What was most striking to me, ultimately, was how this online presentation has a much broader effect than I imagined when going into this," she says. "These misrepresentations feed directly into the real world in ways that could be widening gaps in the labor market and skewing the ways we associate gender with authority and power."

Takeaways

"Overall, our study shows that age-related gender bias is a culture-wide, statistical distortion of reality, pervading online media through images, videos, text, search engines, and generative AI," Delecourt says.

  • Women are systematically portrayed as younger than men across online platforms. Analysis of 1.4 million images and videos plus nine large language models found women consistently appear younger than men across 3,495 occupational and social categories—with the distortion strongest in high-status, high-earning occupations.
  • Algorithms amplify this age-gender bias. When ChatGPT generated nearly 40,000 resumes, it assumed women were younger (by 1.6 years) with less work experience, and rated older male applicants as more qualified—even though real-world data shows no systematic age differences between men and women in the workforce.
  • This creates a problematic feedback loop that distorts reality. The researchers found that people who viewed occupation-related images online adopted the biased age assumptions they saw, potentially creating a self-fulfilling prophecy that reinforces stereotypical expectations and widens real-world gaps in the labor market.

"To fight pervasive cultural inequalities," Delecourt says, "the first step is to recognize how stereotypes are coded into our culture, our algorithms, and our own minds."

More information: Douglas Guilbeault et al, Age and gender distortion in online media and large language models, Nature (2025). DOI: 10.1038/s41586-025-09581-z

Journal information: Nature