As conversations around AI bias grow, researchers and technologists have pointed out that generative AI tools can unintentionally mirror long-standing stereotypes. Image generation tools often reveal this quietly. Prompts such as ‘nurse’ frequently produce images of women, while ‘CEO’ tends to return images of men.
Agencies are not only storytellers but also contributors to the cultural imagery that shapes public perception. Recognising this responsibility, 21N78E Creative Labs has introduced UnbAIsed, an initiative aimed at challenging these patterns.
Timed around International Women’s Day, the launch was accompanied by a film to spread awareness about the issue and encourage participation in the project.
Artificial Intelligence is often seen as neutral, simply processing information and generating responses. In reality, AI learns from the internet, which reflects human history, culture and, inevitably, our biases.
The bias may seem subtle, but it highlights how easily historical patterns carry into new technologies, a concern particularly relevant for advertising and the creative industries. At the centre of the initiative is a gender-neutral AI image library designed to move beyond these stereotypes.
The repository aims to feature diverse representations of people across professions, roles and everyday life. It is also designed as an open, collaborative effort, inviting students, creators and brands to contribute gender-neutral AI-generated images with clear tagging.
By keeping the archive open and free to use, the initiative hopes to reshape how AI interprets prompts. Since most AI image models learn from large internet datasets, the patterns within those datasets directly shape how AI visualises the world.
When datasets are skewed, outputs tend to be skewed as well.
Through UnbAIsed, the agency hopes to introduce alternative visual references that challenge stereotypes and present people beyond traditional roles. Over time, these references could influence how future AI models represent the world.
Neeraj Rajeev, senior copywriter, 21N78E Creative Labs, said, “Even though we were aware of the bias, our understanding of it took a sharp turn when we saw its extent in image generation. That’s when we realised the impact of what we had thought of was far wider. From that moment onwards, it became our sole mission to kickstart this initiative.”
Sudhir Nair, founder and CEO, 21N78E Creative Labs, added, “At 21N78E, we’ve always believed that technology should be a mirror of our progress, not our prejudices. AI is an incredible tool, but it lacks the lived experience to know when it’s repeating an old mistake. With UnbAIsed, we aren’t just building a library; we’re attempting to give AI a better set of memories to learn from. It’s our way of ensuring that the digital future remains as diverse and nuanced as the real world we live in.”
Viren Mahendra, national creative director, 21N78E Creative Labs, said, “As creators, we use images to build worlds, but if our AI tools only show us a world of the past, we’re limited in what we can imagine for the future. UnbAIsed is our way of adding more inclusive, honest colours to that digital palette. It’s about ensuring that when we look into the AI mirror, we see a reflection that is as diverse and nuanced as the reality we live in every day.”
Nikhil Shahane, COO, 21N78E Creative Labs, added, “The challenge with generative AI isn’t just the output; it’s the data loops that reinforce it. By building UnbAIsed, we’re moving from passive users to active contributors in the model-training ecosystem. We’ve utilised Gemini for the architecture and OSS generation models to seed the library, but the goal is to create a cleaner, more diverse dataset that ‘un-teaches’ the systemic biases found in older, unrefined crawls. It’s about leveraging the right tech stack to ensure the visual intelligence of tomorrow is built on a more accurate representation of today.”