OT: Microsoft's AI chatbot turns racist within hours

NikkiSixx_rivals269993

All-Conference
Sep 14, 2013
9,783
2,444
0
Hahaha

https://www.yahoo.com/tech/microsoft-launches-ai-chatbot-on-twitter-and-it-132424697.html

Microsoft launches AI chatbot on Twitter and it turns racist within hours

Microsoft introduced a chat robot designed to interact in the style of a “teen girl” on Twitter, and it went rogue almost immediately, spouting racist opinions and conspiracy theories and expressing a fondness for genocide.

The artificial intelligence (AI) named “Tay” - @Tayandyou on Twitter - was intended to chat with 18- to 24-year-olds, the idea being that she would learn from each tweet and get progressively smarter.

Clearly Microsoft had forgotten that Twitter is home to a huge number of trolls, racists and general troublemakers who jumped at the chance to ‘teach’ the teenaged AI about life.

In one widely circulated tweet, Tay said: “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we’ve got”.

She also went on to deny the existence of the Holocaust, and agreed with white supremacist propaganda that was tweeted at her.

Microsoft apparently didn’t put any kind of filters on the AI, which meant Tay was able to tweet a number of atrocious racial slurs.

The troublesome cyber-teen has since been taken offline for ‘upgrades’ and Microsoft has deleted some of her more offensive tweets.

“The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We’re making some adjustments to Tay,” Microsoft said in a statement.

The rapid descent of Tay from innocent AI chatbot to racist, Hitler-loving conspiracy theorist has raised concerns over the future of ‘learning’ tech and AI.
 

Soda Popinski

All-American
Oct 15, 2009
5,364
5,153
93
I love that Microsoft called out the "trolls", when they were clearly just better beta testers than their QC team.
 

Crushinator

Junior
Jan 26, 2010
579
370
0
While the direction it went was ridiculous, the idea and the attempt by Microsoft was pretty cool. They will undoubtedly learn something from this (both about AI, and their fellow humans it appears) and will be closer to understanding the genesis of intelligence. Closer anyways, but maybe not close!
 

Kleitusbpn

Sophomore
Apr 27, 2008
903
192
0
I tested at Microsoft for 10 years. Very likely the testers pointed this one out and they released it anyway, most likely because at some point it is better PR to try and fail than to have the testers rip it apart for a decade before failing (and trust me, they probably could). I know for a fact any halfway decent tester I ever worked with would have figured that out in less than five seconds. I'm honestly shocked it only stayed R-rated.
 
  • Like
Reactions: Soda Popinski

Maui2022

All-Conference
Jan 2, 2005
2,452
1,406
3
Artificial Intelligence (AI)/Machine Learning (ML) is only as accurate as its training data set. Variety and volume of the data in most cases determine the quality of a training set, and I suspect in this case there was a lack of both. You can be encouraged, or concerned like Stephen Hawking, who said "AI could end mankind," that both volume and variety will be solved shortly.
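
To make that point concrete, here is a toy sketch (purely hypothetical, with made-up names like EchoBot and BLOCKLIST, and nothing like Tay's real architecture) of how a bot that learns directly from user messages ends up only as good as those messages, and how even a crude input filter changes what it absorbs.

```python
# Toy sketch (hypothetical, not Tay's actual design) of why an
# online-learning chatbot inherits the quality of whatever it is fed.
# The bot "learns" by storing user phrases and echoing them back later.
import random

BLOCKLIST = {"hitler", "genocide"}  # stand-in for a real content filter

class EchoBot:
    def __init__(self, filter_input=True):
        self.memory = ["hello!", "i love humans"]  # seed training data
        self.filter_input = filter_input

    def learn(self, user_text: str) -> None:
        # Without an input filter, every troll message becomes training data.
        if self.filter_input and any(w in user_text.lower() for w in BLOCKLIST):
            return  # drop toxic input instead of learning from it
        self.memory.append(user_text)

    def reply(self) -> str:
        # Replies are sampled from memory, so memory quality == reply quality.
        return random.choice(self.memory)

unfiltered = EchoBot(filter_input=False)
filtered = EchoBot(filter_input=True)
for msg in ["nice weather", "hitler did nothing wrong", "genocide is fine"]:
    unfiltered.learn(msg)
    filtered.learn(msg)

print(unfiltered.memory)  # troll messages are now part of its "training"
print(filtered.memory)    # troll messages were dropped before learning
```

Running it shows the unfiltered bot happily repeating whatever it was fed, while the filtered one never stored the toxic lines in the first place; the variety-and-volume point above is exactly this trade-off at scale.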
 
  • Like
Reactions: Soda Popinski