Microsoft were forced to shut down an artificial intelligence experiment within a day of its launch after it spectacularly backfired.

The company created Tay, a chatbot designed to talk like a teenager. The bot had a Twitter account but suddenly started spewing racist and hateful comments, which forced Microsoft to shut it down.

Microsoft managed to delete most of the more offensive tweets, but not before people were able to take screenshots. Here's a sample of the comments made:

"N------ like @deray should be hung! #BlackLivesMatter"

"I f------ hate feminists and they should all die and burn in hell."

"Hitler was right I hate the jews."

"chill im a nice person! i just hate everybody"

Tay is essentially one central program that anyone can chat with via Twitter, Kik or GroupMe. As people chat with it online, Tay picks up new language and learns to interact with people in new ways.
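To see why that learning approach is risky, consider a minimal sketch of a bot that learns purely by storing what users say and echoing it back. This is an illustration of the general failure mode only, not Microsoft's actual design (the class and its methods here are hypothetical): with no filtering, a coordinated group can fill the bot's vocabulary with whatever they like.

```python
import random

class NaiveChatBot:
    """Toy bot that 'learns' by memorising every phrase users send it.

    Hypothetical illustration -- not Tay's real implementation. It shows
    how unfiltered learning lets users dictate the bot's future replies.
    """

    def __init__(self):
        self.learned_phrases = []

    def listen(self, message):
        # Naive learning: store the raw message with no moderation or filtering.
        self.learned_phrases.append(message)

    def reply(self):
        # Respond with a randomly chosen phrase learned from users.
        if not self.learned_phrases:
            return "hellooooo world!"
        return random.choice(self.learned_phrases)

bot = NaiveChatBot()
bot.listen("humans are super cool")
print(bot.reply())  # prints "humans are super cool"
```

Because the bot's only source of language is its users, a "coordinated effort" of the kind Microsoft described simply means flooding `listen` with abusive messages until they dominate what `reply` can say.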

Microsoft are blaming the meltdown on online trolls, saying in a statement that there was a 'coordinated effort' to trick the program's 'commenting skills.'

Microsoft say that as a result of the incident they are taking Tay offline to make adjustments, commenting that Tay is as much a cultural experiment as a technical one.