Microsoft's AI bot didn't last long
AI bot Tay "learned" to be racist and made tweets so offensive that it was shut down within 16 hours. Must have been watching some recent political rallies. It's no better than Skynet, Colossus or WOPR.
http://www.roboticstrends.com/article/microsofts_tay_ai_bot_shut_down_after_racist_tweets
Comments
Sounds like a success for AI.
Unless of course someone figured out how to "game it" to get it to say whatever.
-- it would have been an interesting experiment to just let it stay up, though, so that we could see our own worst selves reflected back -- but I don't know if we're ready to really face that truth
Not even remotely possible, Heater. Don't forget, it's perfect by definition, programmed by Microsoft's finest.