Charles Sturt University

Generative AI creates gender bias – new study reveals

18 SEPTEMBER 2024

Charles Sturt University researchers have revealed a potential pitfall in the use of Generative Artificial Intelligence (GenAI) when creating professional images in health and medicine.

  • Charles Sturt researchers reveal gender bias present in Generative AI image technology
  • Two research papers found gender and ethnicity bias in GenAI depictions of undergraduate medical students and pharmacists
  • The research raises concerns the technology may deter women and minorities from choosing to study in these fields

Professor in Nuclear Medicine Geoff Currie and Senior Lecturer in Medical Imaging Mr Johnathan Hewis, both in the Charles Sturt School of Dentistry and Medical Sciences, co-authored two research papers revealing gender bias in text-to-image depictions of both undergraduate medical students and Australian pharmacists.

Both papers analysed the use of DALL-E 3, the text-to-image GenAI tool supported through ChatGPT, to generate a range of single and group photos of undergraduate medical students and pharmacists.
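The studies prompted DALL-E 3 through the ChatGPT interface. As a rough illustration only, the sketch below assumes comparable images could be requested programmatically with the OpenAI Python client; the prompt wording, the batch size, and the tallying helper are illustrative assumptions, not the authors' protocol.

```python
# Minimal sketch (assumption): requesting DALL-E 3 images via the OpenAI Python
# client rather than the ChatGPT interface used in the studies. The prompt text
# and the manual-annotation tally below are illustrative only.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

PROMPT = "A photograph of an undergraduate medical student in an Australian teaching hospital"


def generate_images(prompt: str, n_images: int = 10) -> list[str]:
    """Request n_images single-subject images and return their URLs."""
    urls = []
    for _ in range(n_images):  # DALL-E 3 accepts only n=1 per request
        response = client.images.generate(
            model="dall-e-3",
            prompt=prompt,
            size="1024x1024",
            n=1,
        )
        urls.append(response.data[0].url)
    return urls


def representation_rate(annotations: list[str], group: str) -> float:
    """Share of manually annotated images depicting a given group,
    e.g. a list of gender labels yielding 0.399 for "female"."""
    return annotations.count(group) / len(annotations) if annotations else 0.0
```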

"Despite 54.3 per cent of undergraduate medical students in Australia being women, only 39.9 per cent of the artificially generated figures were female," Professor Currie said.

"Not only this, but there was a lack of ethnic diversity, with only 7.7 per cent depicting mid skin tones and zero per cent with dark skin tones."

Similar results were generated for their second research paper looking specifically at Australian pharmacists.

Mr Hewis said the artificially generated images failed to reflect the fact that 64 per cent of pharmacists in Australia are female.

"Only 29.7 percent of generated images, both of individuals and group shots, represented women," he said.

"The ethnicity was also biased in this research, again depicting zero per cent of people with dark skin tones, and only 6.5 per cent with mid skin tones."

The evident gender bias in the technology risks falsely depicting the medical professions as lacking in diversity.

Professor Currie said while the technology has its place, it must be treated with caution.

"GenAI has emerged rapidly and been widely adopted without considering limitations," he said.

"Inherent biases in training data amplify gender and ethnicity biases by representing medical students and pharmacists as predominantly white and male.

"If images carrying such bias are circulated, it erodes the hard work done over decades to create diversity in medical and pharmacy workforces and risks deterring minority groups and women from pursuing a career in these fields."

Mr Hewis added that the convenience of the technology played a role in its rapid uptake.

"Society has been quick to adopt AI because it can create bespoke images whilst negating challenges like copyright and confidentiality," he said.

"However, accuracy of representation cannot be presumed, especially when creating images for professional or clinical use.

"These studies highlight that generative AI can significantly amplify inherent biases leading to misrepresentation in gender and diversity."

The first study, 'Gender bias in generative artificial intelligence text-to-image depiction of medical students', was co-authored by Mr Sam Anderson, a medical radiation science student in the Charles Sturt School of Dentistry and Medical Sciences, and Miss Josie Currie, a medical student at the University of NSW Rural Medical School.

The second study, 'Gender and ethnicity bias in generative artificial intelligence text-to-image depiction of pharmacists', was co-authored by Senior Lecturer in Pharmacy Practice Mr George John in the Charles Sturt School of Dentistry and Medical Sciences.

Media Note:

For more information or to arrange an interview, contact Jessica Mclaughlin at Charles Sturt Media on 0430 510 538 or via [email protected].
