Google has temporarily suspended Gemini, its main generative AI suite, from creating images of people after the tool produced historically inaccurate depictions of them.
The company announced the change in a post on X, saying it is already working to address recent problems with Gemini’s image generation feature.
“While we do this, we’re going to pause the image generation of people and will re-release an improved version soon,” the statement said.
We’re already working to address recent issues with Gemini’s image generation feature. While we do this, we’re going to pause the image generation of people and will re-release an improved version soon. https://t.co/SLxYPGoqOZ
— Google Communications (@Google_Comms) February 22, 2024
Earlier this month, Google introduced image generation in Gemini. Since then, images depicting historical figures, including the U.S. Founding Fathers, as Native American, Black, or Asian have circulated on social media, drawing criticism and ridicule.
In a post on LinkedIn, Paris-based venture capitalist Michael Jackson mocked Google’s AI as “a nonsensical DEI parody.” (DEI stands for ‘Diversity, Equity and Inclusion.’)
In an earlier post on X, Google acknowledged that Gemini was producing inaccuracies in some historical image generation depictions and said it was working to improve them immediately. The company noted that Gemini’s AI image generation does produce a wide range of people, and that this is generally a good thing because the tool is used around the world, but conceded that it was missing the mark in this case.
Generative AI tools produce their output based on training data and model parameters, such as the learned weights of a neural network. These tools are frequently criticized for generating biased results, such as sexualized images of women or images that associate high-status jobs with white men.
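Gemini’s internals are not public, but the mechanism this criticism targets is easy to illustrate with an open-source stand-in: a text-to-image pipeline feeds a prompt through pretrained weights, so whatever patterns those weights absorbed from the training data shape every output. Below is a minimal sketch assuming the Hugging Face diffusers library and a CUDA-capable GPU; the model ID is an arbitrary public example, not anything Google uses.

```python
# Minimal text-to-image sketch using the open-source diffusers library.
# The pretrained weights loaded here encode everything the model
# "knows" -- including any biases present in its training data.
import torch
from diffusers import StableDiffusionPipeline

# Arbitrary public checkpoint, used purely for illustration.
pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# An underspecified prompt like this one leaves demographic details
# entirely to the model, which fills them in from learned statistics.
image = pipe("a portrait of a CEO in an office").images[0]
image.save("ceo.png")
```

Because any bias lives in the weights themselves, vendors typically layer corrections such as prompt rewriting or output filtering on top of the model, and those corrections can overshoot, which appears to be the failure mode Gemini’s critics are describing.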
In 2015, an earlier Google AI image classification tool drew controversy for mislabeling Black men as gorillas. The company promised to fix the problem, but, as Wired reported a few years later, its solution was merely a workaround: Google simply blocked the technology from identifying gorillas at all.