Google's AI search feature, which the company calls AI Overviews, is now available to millions of users. While it aims to help you find what you're looking for quickly, it doesn't always offer accurate or meaningful responses. For example, when people started tinkering with it, they found it telling them to eat rocks and to add non-toxic glue to pizza to make the cheese stick better.
The good news is that Google acted swiftly and has now removed some of these weird and inaccurate AI results. The company claims they stemmed from "data voids" combined with people asking AI Overviews odd questions.
Liz Reid, Google's head of search, further argues that AI Overviews don't generally "hallucinate." However, Reid admits that AI searches can misinterpret information that is available on the internet. Her blog post also compared the "accuracy rate" of the generated results with that of featured snippets, describing the two as "on par."
The blog post also explains that Google is putting more restrictions in place to ensure fewer of these weird responses appear in the future. With these restrictions, AI Overviews are less likely to pop up when queries are "nonsensical" or satirical.