Google's AI depicts Obama as a Muslim president.

Hallucinations | Search errors draw widespread mirth online

Is Barack Obama as amused as the users of Google's AI overviews?

Glue cheese onto pizza with non-toxic adhesive: that is one of the suggestions offered by Google's new AI search software in the US. The technology's debut has exposed some glaring inaccuracies, and the software appears oblivious to critical distinctions.

Google recently added AI-generated summaries to its search engine in the US, and they have quickly become a source of amusement and concern thanks to a series of missteps ranging from the bizarre to the downright false. One example is the recommendation to glue cheese onto a pizza. Other users found that the software claimed dogs have played in the NBA and NFL, and that Obama was the first Muslim president of the US.

These AI summaries, known as "AI overviews," are designed to give users quick, direct answers to their queries instead of a list of links. Several start-ups are betting on this kind of service to challenge Google's dominance in the search market, and Google is now moving in the same direction by offering longer answers to factual questions for which it previously only linked to websites.

The summaries are to be rolled out in other countries by the end of this year. Many website owners and media outlets worry that users will be satisfied by the summaries alone and visit their sites less often, hurting their businesses. Google, however, insists that the summarized information will drive more traffic to the source websites. The long-term impact remains uncertain.

When AI Takes Satire Seriously

The large-scale deployment of the summaries has highlighted another problem: the AI software struggles to differentiate serious information from humor or satire. For instance, it has accepted ridiculous claims like "geologists recommend eating a small stone every day."

A Google spokesperson, speaking to The Verge, acknowledged the errors, saying they are "generally due to very unusual requests and are not what most people experience." The company added that these "isolated examples" would be used to improve the product.

In February, Google drew mockery over another AI program, Gemini, which generated images of non-white soldiers in Nazi-era uniforms and non-white American settlers. Google acknowledged that it had failed to build in exceptions for such cases. Since then, Gemini has temporarily stopped generating images of people.


Source: www.ntv.de
