N-Gram House

Tag: AI gender bias

Benchmarking Bias in Image Generators: How Diffusion Models Reinforce Gender and Race Stereotypes


Diffusion models like Stable Diffusion amplify racial and gender stereotypes in generated images, underrepresenting women in leadership roles and overrepresenting Black individuals in low-status ones. These biases are already causing real-world harm in hiring and education.

Categories

  • Machine Learning (67)
  • History (50)
  • Software Development (7)
  • Business AI Strategy (6)
  • AI Security (5)

Recent Posts

  • Benchmarking Bias in Image Generators: How Diffusion Models Reinforce Gender and Race Stereotypes — Aug 2, 2025
  • Data Privacy for Large Language Models: Principles and Practical Controls — Mar 11, 2026
  • How to Build and Run AI Ethics Boards for Development Decisions — Apr 28, 2026
  • Synthetic Data Generation with Multimodal Generative AI: Augmenting Datasets — Jan 11, 2026
  • Building a Community of Practice for Vibe Coding: Peer Reviews and Office Hours — Apr 13, 2026

Menu

  • About
  • Terms of Service
  • Privacy Policy
  • CCPA
  • Contact

© 2026. All rights reserved.