
Google's AI depicts Obama as a Muslim president.

Hallucinations: Search errors draw widespread mirth online

Is Barack Obama as amused as the users of Google's AI overviews?


Attach cheese to pizza with non-toxic glue: that is one of the suggestions made by Google's new AI software in the US. Its rollout has exposed some glaring inaccuracies, and the software appears unable to make critical distinctions.

Google recently added AI-generated summaries to its search engine in the US, and several missteps have made them a subject of both amusement and concern. The errors range from the bizarre to the downright incorrect: one recommendation suggested gluing cheese onto a pizza. Other users found the software claiming that dogs play in the NBA and NFL, and that Barack Obama was the first Muslim president of the US.

These summaries, known as "AI Overviews," are designed to give users quick, direct answers to their queries instead of a list of links. Several start-ups offer a similar service in a bid to challenge Google's dominance in the search market. Google is now moving in the same direction, providing longer answers to factual questions that it previously would simply have linked to websites.

The summaries are due to be introduced in other countries by the end of this year. Many website owners and media outlets worry that users will be satisfied by the summaries alone and no longer click through, cutting traffic and hurting their businesses. Google, however, insists that the summarized information will drive more traffic to the source websites. The long-term impact remains uncertain.

AI and Unconventional Content

The large-scale deployment of the summaries has highlighted another problem: the AI software struggles to differentiate serious information from humor or satire. For instance, it has accepted ridiculous claims like "geologists recommend eating a small stone every day."

A Google spokesperson, speaking to The Verge, acknowledged these errors, describing them as "generally due to very unusual requests and are not what most people experience." The company added that these "isolated examples" would be used to enhance the product.

In February, Google faced mockery over another AI program, Gemini, which generated images of non-white soldiers in Nazi uniforms and non-white American settlers. Google admitted it had failed to program exceptions for cases where such depictions were clearly inappropriate. Gemini has since temporarily stopped generating images of people.


Source: www.ntv.de
