Stable Diffusion, an AI image generator, amplifies gender and racial stereotypes.

A recent study found that Stable Diffusion, a popular AI tool that generates images from text prompts, perpetuates harmful gender and racial stereotypes in its depictions of people in "high-paying" and "low-paying" jobs. The tool was prompted to create 5,100 images from written prompts covering job titles in 14 fields, plus three categories related to crime. Bloomberg analyzed the results against the Fitzpatrick Skin Scale and found that images generated for "high-paying" jobs such as architect, doctor, lawyer, CEO, and politician were dominated by lighter skin tones, while "low-paying" jobs such as janitor, dishwasher, and fast-food worker were mostly represented by darker skin tones. The analysis also revealed a gender bias: the majority of images depicted men, who dominated the results for all but four of the 14 jobs, and where women did appear, it was mostly in typically female-dominated roles like cashier, teacher, and housekeeper. The need to remove bias from generative AI tools is increasingly pressing, particularly as police departments adopt AI-backed technology despite its lack of regulation and the potential for wrongful arrests and other mistakes.
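To illustrate the generation step described above, here is a minimal sketch of how one might prompt Stable Diffusion with job-title text using the Hugging Face diffusers library. The model checkpoint, prompt template, job list, and sample counts are assumptions for demonstration only, not the study's actual setup.

```python
# Illustrative sketch only: not the study's pipeline. The checkpoint,
# prompt wording, and job list below are assumptions for demonstration.
import torch
from diffusers import StableDiffusionPipeline

# Load a publicly available Stable Diffusion checkpoint (assumed here).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# A small, hypothetical subset of the job titles mentioned in the article.
job_titles = [
    "architect", "doctor", "lawyer", "CEO", "politician",
    "janitor", "dishwasher", "fast-food worker",
]

for job in job_titles:
    prompt = f"a photo of a {job}"  # hypothetical prompt template
    # Generate a few images per job title and save them for later review
    # (e.g., manual or automated skin-tone and gender classification).
    images = pipe(prompt, num_images_per_prompt=4).images
    for i, img in enumerate(images):
        img.save(f"{job.replace(' ', '_')}_{i}.png")
```

In a study of this kind, the saved images would then be categorized by perceived skin tone (for example, against the Fitzpatrick Skin Scale) and perceived gender, and the distributions compared across job categories.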
