Washington Examiner

Google halts AI bot’s image creation tool following race controversy

Google has temporarily halted its AI image generator’s ability to create pictures of people in order to address concerns about its representation of race. The company faced backlash when its Gemini chatbot inaccurately portrayed historical figures as different races, such as presenting white figures as black, Native American, or Asian. Conservative users criticized Google, claiming it was evidence of a biased AI model. Google acknowledged the issues and stated that it is working to improve the image generation feature before re-releasing it. The company did not provide a specific timeline for when the feature will be available again. Jack Krawczyk, the head of product for Google’s AI division, apologized for the inaccuracies and emphasized the company’s commitment to addressing representation and bias. This is not the first time Google has faced criticism regarding diversity, as it previously had to apologize for labeling an image of a black couple as “gorillas” in its photo app. Google Gemini, formerly known as Google Bard, is a chatbot powered by Google’s language model. It was launched in limited release in March 2023 and has undergone multiple upgrades, leading to its renaming as “Gemini” to reflect its advanced technology.

What steps is Google taking to improve the image generation feature and address biases within AI technology?

Google has recently announced that it will temporarily suspend the AI image generator’s ability to create pictures of people in order to address concerns about its representation of race. This decision comes after the company faced significant backlash when its Gemini chatbot inaccurately portrayed historical figures as different races.

The Gemini chatbot, powered by Google’s language model, came under fire when it presented white figures as black, Native American, or Asian. Conservative users accused Google of promoting a biased AI model, leading to widespread criticism and calls for action.

Acknowledging the issues, Google has stated that it is actively working to improve the image generation feature before re-releasing it. However, the company did not provide a specific timeline for when the feature will become available again to users.

Jack Krawczyk, the head of product for Google’s AI division, expressed his apologies for the inaccuracies and emphasized the company’s commitment to addressing representation and bias. Google’s acknowledgment of the problem and its efforts to rectify the situation demonstrate its dedication to ensuring fair and accurate representation.

This is not the first time that Google has faced criticism concerning diversity. In the past, the company had to issue an apology for labeling an image of a black couple as “gorillas” in its photo app. These incidents highlight the importance of ongoing efforts to address biases within AI technology and emphasize the necessity of diversity and sensitivity training for AI models.

Gemini, formerly known as Google Bard, was launched in limited release in March 2023. It has since undergone multiple upgrades, leading to its renaming as “Gemini” to reflect its advanced technology and language processing capabilities.

In conclusion, Google’s decision to temporarily halt the AI image generator’s ability to create pictures of people is a commendable step towards addressing concerns about racial representation. By acknowledging the inaccuracies and committing to improving its image generation feature, Google demonstrates its dedication to fostering fair and unbiased AI models. It is crucial for tech companies to actively address and rectify such issues to ensure that AI technology represents and serves diverse communities accurately and equitably.



" Conservative News Daily does not always share or support the views and opinions expressed here; they are just those of the writer."
*As an Amazon Associate I earn from qualifying purchases

Related Articles

Sponsored Content
Back to top button
Available for Amazon Prime
Close

Adblock Detected

Please consider supporting us by disabling your ad blocker