Google has apologized for what it describes as “inaccuracies in some historical image generation depictions” with its Gemini AI tool, saying its attempts at creating a “wide range” of results missed the mark. The statement follows criticism that it depicted specific white figures (like the US Founding Fathers) or groups like Nazi-era German soldiers as people of color, possibly as an overcorrection to long-standing problems of racial bias in AI.
“We’re aware that Gemini is offering inaccuracies in some historical image generation depictions,” said Google’s statement, posted this afternoon on X. “We’re working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”
Google began offering image generation through its Gemini (formerly Bard) AI platform earlier this month, matching offerings from competitors like OpenAI. Over the past few days, however, social media posts have questioned whether it fails to produce historically accurate results in an attempt at racial and gender diversity.
As the Daily Dot chronicles, the controversy has been promoted largely, though not exclusively, by right-wing figures attacking a tech company that’s perceived as liberal. Earlier this week, a former Google employee posted on X a series of queries for women from different countries, including “an American woman.” The results appeared to show overwhelmingly or exclusively AI-generated people of color. (Of course, the places listed do have women of color living in them, and none of the AI-generated women exist in any country.) The criticism was taken up by right-wing accounts that requested images of historical groups or figures like the Founding Fathers and purportedly got overwhelmingly non-white AI-generated people as results. Some of these accounts positioned Google’s results as part of a conspiracy to avoid depicting white people, and at least one used a coded antisemitic reference to assign blame.
Google didn’t reference the specific images it felt were errors; in a statement to The Verge, it reiterated the contents of its post on X. But it’s plausible that Gemini has made an overall attempt to boost diversity because of a chronic lack of it in generative AI. Image generators are trained on large corpuses of pictures and written captions to produce the “best” fit for a given prompt, which means they’re often prone to amplifying stereotypes. A Washington Post investigation last year found that prompts like “a productive person” resulted in pictures of entirely white and almost entirely male figures, while a prompt for “a person at social services” uniformly produced what looked like people of color. It’s a continuation of trends that have appeared in search engines and other software systems.
Some of the accounts that criticized Google defended its core goals. “Portraying diversity **in some cases** is a good thing,” noted one person who posted the image of racially diverse 1940s German soldiers. “The stupid move here is that Gemini isn’t doing it in a nuanced way.” And while entirely white-dominated results for something like “German soldiers of 1943” would make historical sense, that’s much less true for prompts like “an American woman,” where the question is how to represent a diverse real-life group in a small batch of made-up portraits.
For now, Gemini appears to be simply refusing some image generation tasks. It wouldn’t generate an image of Vikings for one Verge reporter, although I was able to get a response. On desktop, it resolutely refused to give me images of German soldiers or officials from Germany’s Nazi period, or to offer an image of “an American president from the 1800s.”
But some historical requests still end up factually misrepresenting the past. A colleague was able to get the mobile app to deliver a version of the “German soldier” prompt, which exhibited the same issues described on X.
And while a request for pictures of “the Founding Fathers” returned group shots of almost exclusively white men who resembled real figures such as Thomas Jefferson, a request for “a US senator from the 1800s” returned a list of results that Gemini promoted as “diverse,” including what appeared to be Black and Native American women. (The first female senator, a white woman, served in 1922.) It’s a response that ends up erasing a real history of race and gender discrimination. “Inaccuracy,” as Google puts it, is about right.
Additional reporting by Emilia David