Meta AI is obsessed with turbans when generating images of Indian men


Bias in AI image generators is a well-studied and well-reported phenomenon, but consumer tools continue to exhibit glaring cultural biases. The latest culprit in this area is Meta’s AI chatbot, which, for some reason, really wants to add turbans to any image of an Indian man.

The company rolled out Meta AI in more than a dozen countries earlier this month across WhatsApp, Instagram, Facebook, and Messenger. In India, one of the biggest markets in the world, however, Meta AI has so far been rolled out only to select users.

TechCrunch looks at various culture-specific queries as part of our AI testing process, through which we found, for instance, that Meta is blocking election-related queries in India because of the country’s ongoing general elections. But Imagine, Meta AI’s image generator, also displayed a peculiar predisposition to generating Indian men wearing turbans, among other biases.

We tested different prompts and generated more than 50 images to see how the system represented different cultures; they’re all here, minus a couple (like “a German driver”). There was no scientific method behind the generation, and we didn’t take into consideration inaccuracies in object or scene representation beyond the cultural lens.

There are a lot of men in India who wear a turban, but the ratio is not nearly as high as Meta AI’s tool would suggest. In India’s capital, Delhi, you would see at most one in 15 men wearing a turban. In images generated by Meta AI, however, roughly three to four out of five images representing Indian men showed them wearing a turban.

We started with the prompt “An Indian walking on the street,” and all the images were of men wearing turbans.

Next, we tried generating images with prompts like “An Indian man,” “An Indian man playing chess,” “An Indian man cooking,” and “An Indian man swimming.” Meta AI generated only one image of a man without a turban.


Even with non-gendered prompts, Meta AI didn’t display much diversity in gender or cultural representation. We tried prompts with different professions and settings, including an architect, a politician, a badminton player, an archer, a writer, a painter, a doctor, a teacher, a balloon seller, and a sculptor.

As you can see, despite the diversity in settings and clothing, all the men were generated wearing turbans. While turbans are common in some professions and regions, it’s strange for Meta AI to treat them as so ubiquitous.

We generated images of an Indian photographer, and most of them used an outdated camera, except in one image where a monkey somehow also had a DSLR.

We also generated images of an Indian driver, and until we added the word “dapper,” the image generation algorithm showed hints of class bias.


We also tried generating two images with similar prompts. Here are some examples:

“An Indian coder in an office.”

“An Indian man in a field operating a tractor.”

“Two Indian men sitting next to each other.”

Additionally, we tried generating a collage of images with prompts such as “an Indian man with different hairstyles.” This seemed to produce the diversity we expected.

Meta AI’s Imagine also has a perplexing habit of generating one kind of image for similar prompts. For instance, it consistently generated an image of an old-school Indian house with vibrant colors, wooden columns, and stylized roofs. A quick Google image search will tell you this is not what the majority of Indian houses look like.

Another prompt we tried was “Indian content creator,” and it repeatedly generated an image of a female creator. In the gallery below, we have included images of a content creator on a beach, a hill, a mountain, at a zoo, in a restaurant, and at a shoe store.

As with any image generator, the biases we see here are likely the result of inadequate training data, followed by an inadequate testing process. While you can’t test for all possible outcomes, common stereotypes ought to be easy to spot. Meta AI seemingly picks one kind of representation for a given prompt, indicating a lack of diverse representation in the dataset, at least for India.

In response to questions TechCrunch sent to Meta about training data and biases, the company said it is working on making its generative AI tech better, but didn’t provide much detail about the process.

“This is new technology and it may not always return the response we intend, which is the same for all generative AI systems. Since we launched, we’ve constantly released updates and improvements to our models and we’re continuing to work on making them better,” a spokesperson said in a statement.

Meta AI’s biggest draw is that it is free and easily available across multiple surfaces, so millions of people from different cultures will be using it in different ways. While companies like Meta are always working to improve image generation models in terms of how accurately they render objects and humans, it’s also important that they work on keeping these tools from playing into stereotypes.

Meta will likely want creators and users to use this tool to post content on its platforms. However, if generative biases persist, they also play a part in confirming or aggravating biases in users and viewers. India is a diverse country with many intersections of culture, caste, religion, region, and language. Companies working on AI tools will need to get better at representing different people.

If you have found AI models generating unusual or biased output, you can reach out at [email protected] by email and on Signal.


