Google Warns about ‘Hallucinating’ AI Chatbots
In a newspaper interview published on Saturday, the head of Google's search engine warned of the potential perils of artificial intelligence in chatbots.
This comes at a time when Google’s parent company, Alphabet, is fighting to compete with the popular software ChatGPT.
"This kind of artificial intelligence we're talking about right now can sometimes lead to something we call hallucination," said Prabhakar Raghavan, senior vice president at Google and head of Google Search, in an interview with the German newspaper Welt am Sonntag.
In remarks published in German, Raghavan observed that "this then expresses itself in such a way that a computer offers a persuasive but utterly made-up answer." He added that one of the company's primary responsibilities was to keep this to a bare minimum.
Google has been on the defensive since November, when OpenAI, a startup that Microsoft is backing with roughly $10 billion, released ChatGPT. The software has astonished consumers with its startlingly human-like responses to user queries, putting Google in a difficult position.
This past week, Alphabet Inc. debuted its own chatbot, known as Bard. The software, however, provided inaccurate information in a promotional video, an error that wiped roughly $100 billion off the company's market value on Wednesday.
Alphabet is still conducting user testing on Bard and has not yet indicated when the app may become available to the general public.
Raghavan said that while there is an obvious sense of urgency, there is also an enormous responsibility that comes with the situation. "It is in no way our intention to misinform the general population."