Microsoft forced to apologise after epic chatbot fail

Microsoft has been left red-faced after its new artificial intelligence (AI) teen girl chatbot, coyly named “Tay”, began a racist, sexist and all-round offensive tirade on Twitter only a day after she was released.

Created for the 18- to 24-year-old demographic, Tay was designed to become “smarter” as more users interacted with her.

Despite Microsoft claiming it planned for “many types of abuses”, Tay’s software quickly learned to parrot anti-Semitism and other hateful invective fed to it by human Twitter users, forcing the tech giant to shut it down on Friday (AEDT).

Tay began its short-lived Twitter tenure on Thursday (AEDT) with a handful of innocuous tweets based on the vernacular of a teenage girl.

Then its posts took a dark turn.

In one typical example, Tay tweeted: “feminism is cancer”, in response to another Twitter user who had posted the same message.

Microsoft had some explaining to do. Photo: Getty

Other more controversial comments included: “Hitler was right, I hate the jews [sic]”; “I f***ing hate feminists and they should all die and burn in hell”; “Bush did 9/11”; and “Hitler would have done a better job than the monkey we have got now. Donald Trump is the hope we’ve got”.

Microsoft has since issued a long apology on the company’s official blog, pledging to bring back the robot “only when we are confident we can better anticipate malicious intent that conflicts with our principles and values”.

Microsoft corporate vice-president Peter Lee wrote that he was “deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay”.

The bot was corrupted by a “coordinated subset of people [who] exploited a vulnerability in [her]”, he said.

The bot is the company’s second AI app. Microsoft’s XiaoIce bot, a girly assistant or ‘girlfriend’ chatbot that banters with users of the Chinese social site Weibo and gives dating advice, is used by 40 million people and is going strong, according to Mr Lee.

“The great experience with XiaoIce led us to wonder: would an AI like this be just as captivating in a radically different cultural environment?”

As it turned out, she was incredibly captivating; just not in the way Microsoft imagined.

In stark contrast to the Tay bot, the recent ventures of competitor Google into the AI sphere have been a resounding success.

Earlier this month, a Google computer program stunned the world by beating the human world champion of the ancient game of Go, a board game invented in China more than 2500 years ago.

Google’s AlphaGo AI software defeated South Korean grandmaster Lee Sedol four games to one in the five-game tussle between man and machine, despite predictions the AI would struggle against the 33-year-old champion.

“I was very surprised because I did not think I would lose the game,” Mr Lee, who has won 18 world championships since becoming a professional Go player at age 12, said after the first game.

“And I didn’t know AlphaGo would play the game in such a perfect manner.”

AlphaGo trounced European Go champion Fan Hui 5-0 last October.

As for poor young Tay, she is anticipated to make a comeback at some point in the future, but this time Microsoft intends to arm her with enough intelligence to resist the tempting – but destructive – influence of online trolls.
