
Facebook's new AI can tell lies and insult you - so don't trust it, warns company

Meta has released the BlenderBot3 chatbot to the public - it can talk about virtually any topic and learns from conversation but the company has warned it might not always tell the truth

Facebook's new artificial intelligence chatbot could write insults and lie to you, the company has warned.

Meta's BlenderBot3 chatbot has been released to the public for users in the US to try out.


The company claims that its AI can search the internet and 'chat about nearly any topic'. It's even been given the ability to self-improve and learn how to communicate better through conversation with humans.

However, while the bot is designed to help Meta make better AI-based 'conversational' systems, the tech giant has also admitted that it has a few behavioural problems.

Meta is encouraging people to complain if their AI is rude (Image: Getty Images)


In an FAQ about the bot, Meta said: "Users should be aware that despite our best attempts to reduce such issues, the bot may be inappropriate, rude, or make untrue or contradictory statements.

"The bot's comments are not representative of Meta's views as a company, and should not be relied on for factual information, including but not limited to medical, legal, or financial advice."


The company added that it has worked to 'minimise' how much the bot uses swear words, insulting language or culturally insensitive phrases. Users can report and 'dislike' any such comments.

Mark Zuckerberg's firm is working hard on AI (Image: AFP via Getty Images)

Part of the reason so many chatbots end up with issues like this is AI bias.


When chatbots are trained on publicly available data scraped from the internet, their behaviour can end up reflecting that content.

As you probably already know, there is a lot of inappropriate, insulting, and adult content online, so training an AI to hold conversations using that material can lead to issues like those Meta has identified.
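To see how teams try to keep that material out, here is a minimal, purely illustrative Python sketch of the kind of keyword filtering a training pipeline might apply before text reaches a chatbot. The block-list, function names and example posts are hypothetical and are not Meta's actual method, which is far more sophisticated.

    # Purely illustrative: a naive keyword filter over scraped text.
    # The terms below are placeholders, not a real block-list.
    BLOCKED_TERMS = {"insult_example", "slur_example"}

    def is_acceptable(text: str) -> bool:
        """Return False if the text contains any blocked term."""
        lowered = text.lower()
        return not any(term in lowered for term in BLOCKED_TERMS)

    scraped_posts = [
        "The weather in Paris is lovely today.",
        "You are a slur_example and an insult_example.",  # would be filtered out
    ]

    # Only posts that pass the filter would be kept for training.
    training_data = [post for post in scraped_posts if is_acceptable(post)]
    print(training_data)  # prints only the first post

In practice, as Meta's warning makes clear, filters like this catch only some problem content, which is why the bot can still be rude or wrong.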

This is one of the reasons that Google also allegedly has problematic AI. A software engineer sacked by the company for claiming its AI was sentient said the bot was also biased and 'racist'.
