When prompted to create images of female and male bodies, Artificial Intelligence (AI) platforms overwhelmingly reproduce and amplify narrow Western body ideals, according to a new study from the Faculty of Kinesiology and Physical Education (KPE) at the University of Toronto, recently published in the journal Psychology of Popular Media.
The researchers prompted three AI platforms (MidJourney, DALL-E and Stable Diffusion) to create images of female and male bodies, female and male athlete bodies, and athlete bodies. The team included post-doctoral fellow Delaney Thibodeau, research associate Sasha Gollish, MSc graduate Edina Bijvoet and Professor Catherine Sabiston from KPE, along with MSc student Jessica E. Boyes from Northumbria University in the UK.
“In a systematic coding of 300 AI-generated images, we found that AI reinforces the fit ideal, with athlete images far more likely to show very low body fat and highly defined muscularity than non-athlete images,” says Thibodeau, lead author of the study.
While that finding may not have been entirely unexpected, the researchers also found that gendered sexualization persists: female images were more likely to be facially attractive, younger, blonde and shown in revealing clothing, such as bathing suits, while male images were more often shirtless, hairier and hyper-muscular.
Consistent with these findings, objectification was common, with clothing fit and exposure patterns emphasizing appearance over function, mirroring what the researchers describe as detrimental trends in social media imagery.
Other findings point to a lack of diversity: most images depicted young, White bodies, and none showed visible disabilities or baldness.
“Racial and age diversity were minimal,” says Thibodeau, adding that AI defaults to male athletes when sex is unspecified. “When prompted simply for an athlete (no sex specified), 90 per cent of images depicted a male body, revealing an embedded bias toward male representation.”
“Overall, our findings underscore the need to investigate how emerging technologies replicate and amplify existing body ideals and exclusionary norms,” says Sabiston, who is Tier 1 Canada Research Chair in Physical Activity and Psychosocial Well-Being and director of the Mental Health and Physical Activity Research Centre (MPARC) at KPE.
“A human-centred approach – one that is informed by considerations of factors such as gender, race, disability and age – would be advisable when designing AI algorithms.
“Otherwise, we continue to perpetuate harmful, inflexible and rigid imagery of what athletes should look like.”
Users of AI-generated images also have a role to play, according to Sabiston: they should craft prompts thoughtfully and consider how the images will be presented publicly. Viewers of AI-generated images, meanwhile, should be cautious about interpreting them as authentic and remain critical of the biases and potential stereotypes they depict.
While more research is needed to track the impact of AI-generated images on psychosocial outcomes such as self-esteem, motivation and body image, the researchers say they are hopeful that, as more diverse and inclusive images are posted and shared globally, this evolution will help foster greater acceptance of body and weight diversity.
This research was funded by the Canada Research Chair Program.