03-24-2016, 02:57 PM
Banned!
Join Date: Sep 2000
Location: NOT Columbia, MO 65201
Casino cash: $2980194
Microsoft Disabled Teen Twitter Robot Because The Internet Taught It To Love Hitler
http://distractify.com/geek/2016/03/...effec1f3de2f6f
I know everybody thinks that artificial intelligence, the phenomenon, not the mediocre Steven Spielberg film, is going to be the end of humanity because it's going to become self-aware one day and decide to exterminate the human race. Well, it turns out that A.I. will indeed want to kill humans, but only because we taught it to.

This is the Twitter account of Tay, Microsoft's teen-girl A.I. system that was supposed to learn how to speak like your average teenage girl. What was unique about Tay was that she was going to be taught entirely from her interactions with other people on the internet. It was an exercise Microsoft undertook to improve its customer service software, as well as a huge PR opportunity.

Turns out that was a big mistake, because in less than a day, the internet turned her into a sex-fiending political conspiracy theorist with a soft spot for Adolf Hitler.
Posts: 46,007
03-25-2016, 01:43 PM | #16
MVP
Join Date: Sep 2005
Location: Driftless Region
Casino cash: $955564
Did this Robot interact with Prison Bitch?
Posts: 8,693
03-25-2016, 03:56 PM | #17
Supporter
Join Date: Jul 2011
Casino cash: $4141956
Posts: 10,636
03-25-2016, 11:45 PM | #18
Stuff & Things
Join Date: Jan 2006
Location: The Yukon
Casino cash: $10126924
|
Microsoft should have it post here. Before you know it, it'll be telling people to die in fires, chug antifreeze, and eventually hack Matt Cassel's bank account.
Posts: 21,498
03-25-2016, 11:54 PM | #19
Has a particular set of skills
Join Date: Dec 2003
Location: On the water
Casino cash: $3169627
VARSITY
There were groups organizing to **** with this as soon as it was announced. Part of it was Microsoft hate, but the other part was that this screamed to be ****ed with, to prove you can't control or try to define the internet. You can data mine the shit out of big data, but you can't invent something to reflect the internet.
__________________
Fear leads to anger, anger leads to hate, hate leads to suffering. -YODA
Posts: 78,982
03-26-2016, 06:07 PM | #20
Cheaterlover*
Join Date: May 2009
Location: RI
Casino cash: $10010716
Aaaaand this is how Skynet learns to hate humanity.
Honestly, mankind... What the **** is your problem? Daleks, Cylons, Terminators... how many times do you need to be warned?
Posts: 12,916
03-30-2016, 07:08 AM | #21
Space Cadet and Aczabel
Join Date: Aug 2000
Location: Kanab, UT, USA
Casino cash: $9333275
VARSITY
4chan is like a PTA meeting compared to Prison Bitch.
__________________
Thanks, Trump for the civics lesson. We are learning so much about RICO, espionage, sedition, impeachment, the 25th Amendment, order of succession, nepotism, separation of powers, 1st Amendment, obstruction of justice, the emoluments clause, conflicts of interest, collusion, sanctions, oligarchs, money laundering and so much more.
Posts: 40,584
03-30-2016, 09:09 PM | #22
Ain't no relax!
Join Date: Sep 2005
Casino cash: $2308919
|
Tay v.2 came back, proceeded to get really high on kush in front of police, and started spamming users until taken down again.
A.I. is totally realistic you guys....

Microsoft's racist chatbot returns with drug-smoking Twitter meltdown

Microsoft's attempt to converse with millennials using an artificial intelligence bot plugged into Twitter made a short-lived return on Wednesday, before bowing out again in some sort of meltdown.

The learning experiment, which got a crash course in racism, Holocaust denial and sexism courtesy of Twitter users, was switched back on overnight and appeared to be operating in a more sensible fashion. Microsoft had previously gone through the bot's tweets and removed the most offensive, and vowed only to bring the experiment back online if the company's engineers could "better anticipate malicious intent that conflicts with our principles and values".

However, at one point Tay tweeted about taking drugs, in front of the police, no less. Tay then started to tweet out of control, spamming its more than 210,000 followers with the same tweet, saying: "You are too fast, please take a rest ..." over and over.

Microsoft responded by making Tay's Twitter profile private, preventing anyone from seeing the tweets, in effect taking it offline again.
Posts: 47,478
03-31-2016, 09:45 AM | #23
Hey Loochy, I'm hooome!
Join Date: Oct 2008
Location: PooPooKaKaPeePeeShire
Casino cash: $2170752
__________________
Hey Loochy, I'm hoooome!
Posts: 40,449
03-31-2016, 10:38 AM | #24
In Mahomes I trust!
Join Date: Aug 2000
Location: Baton Rouge, LA
Casino cash: $2554636
|
This is just more Nazi propaganda
Posts: 5,952