
Tay, Microsoft’s AI program, is back online

Microsoft's artificial intelligence (AI) program, Tay, reappeared on Twitter on Wednesday after being deactivated last week for posting offensive messages.

However, the program quickly malfunctioned again, and Tay's account was set to private after it began repeating the same message over and over to other Twitter users.

According to Microsoft, the account was reactivated by accident during testing.

"Tay remains offline while we make adjustments," a spokesperson for the company told CNBC via email. "As part of testing, she was inadvertently activated on Twitter for a brief period of time."



Twitter users speculated that the program was caught in a feedback loop, constantly replying to its own messages.


Tay was first launched last Wednesday, but had to be deactivated a few days later after it began writing messages using racist and sexual language.

Peter Lee, corporate vice president of Microsoft's research division, apologized for the program's behaviour.

"We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for," Lee wrote on the company's blog.


According to Lee, the program was created as a "chatbot" to entertain 18- to 24-year-olds and learn from interacting with humans.

However, some Twitter users were able to manipulate the program to send out the offensive messages.

"Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay," Lee explained. "As a result, Tay tweeted wildly inappropriate and reprehensible words and images."

Alastair Bathgate, CEO of Blue Prism, a software company that develops robotic process automation systems, said the incident proves that Microsoft has not learnt to control its AI program.

"You can be devious with these things because, essentially, they are not that intelligent," he told CNBC over the phone.

"They are relatively dumb compared to a human with 20 or 40 years of life experience. Maybe it's going to take that much life experience for Tay to understand the difference between good and bad."
