
Google to Address Concerns Over AI Picture Bot's Bias

Google is working to fix its Gemini AI picture bot after criticism that it over-corrected against the risk of being racist. The bot supplied images depicting a variety of genders and ethnicities even when these were historically inaccurate. Google has temporarily suspended the tool's ability to generate images of people while it addresses the issue.

Google is taking swift action to address concerns surrounding its new AI-powered image-creation tool. Users have criticised the tool, called Gemini, for over-correcting against the risk of being racist and generating historically inaccurate images. For instance, when prompted for images of America's founding fathers, the tool displayed pictures of women and people of colour. Google acknowledges that Gemini's AI image generation is "missing the mark" and has committed to improving its depictions.


Jack Krawczyk, senior director for Gemini Experiences, stated that while the tool does generate a wide range of people, it needs to be more accurate in specific contexts. Google has temporarily suspended the tool's ability to generate images of people while it works on resolving the issue. This incident is not the first time AI has faced criticism regarding diversity. Google faced backlash almost a decade ago when its photos app mislabeled a photo of a black couple as "gorillas."


OpenAI, a rival AI firm, has also faced accusations of perpetuating harmful stereotypes. Users found that its Dall-E image generator predominantly displayed pictures of white men when queried for chief executives. These incidents highlight the challenges AI technology faces in accurately representing diverse populations.


Google, under pressure to demonstrate its commitment to AI advancements, recently released the latest version of Gemini. The tool creates images based on written queries. However, it quickly drew criticism from users who accused Google of training the bot to be "laughably woke." Critics argue that the tool fails to acknowledge the existence of white people and lacks accuracy in its depictions.


The claims gained traction in right-wing circles in the US, where there is already a growing backlash against big tech platforms for alleged liberal bias. Google is aware of the concerns and emphasises its dedication to representation and bias mitigation. The company aims to ensure that its results reflect its global user base and is committed to fine-tuning the tool to accommodate historical contexts and nuances.


Google has responded to the criticism constructively, with Krawczyk describing it as part of the alignment process and an opportunity to iterate based on user feedback. The company encourages users to continue providing feedback to help improve the tool's accuracy and inclusivity.


In conclusion, Google is taking steps to address concerns over its AI picture bot's bias. The company acknowledges the need for improvement and is committed to refining the tool to accurately represent diverse populations and historical contexts.


Source: BBC

As technology advances and has a greater impact on our lives than ever before, being informed is the only way to keep up. Through our product reviews and news articles, we want to aid our readers in doing so. All of our reviews are carefully written, offer unique insights and critiques, and provide trustworthy recommendations. Our news stories are sourced from trustworthy sources, fact-checked by our team, and presented with the help of AI to make them easier to comprehend for our readers. If you notice any errors in our product reviews or news stories, please email us at editorial@tech360.tv. Your input will be important in ensuring that our articles are accurate for all of our readers.
