Microsoft ‘deeply sorry for the unintended offensive tweets’ by Tay bot
Published on March 25, 2016
Microsoft’s artificial intelligence chatbot Tay has been put to bed, following a recent attempt by the communications bot to emulate its surroundings that eventually devolved into a series of questionable and offensive tweets.
While Tay’s experience with humanity via Twitter may have been relatively short, it has provided Microsoft some insight into what influenced the chatbot’s responses, as well as how the company can readjust parameters to better deal with the less savory individuals that inhabit the internet.
“The logical place for us to engage with a massive group of users was Twitter. Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay. Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time. We will take this lesson forward as well as those from our experiences in China, Japan and the U.S. Right now, we are hard at work addressing the specific vulnerability that was exposed by the attack on Tay.”
Conversely, Microsoft’s other, less publicized AI chatbot XiaoIce has been used by some 40 million people in China to tell stories and hold casual conversations for some time now, with little to no offensive marks against it. The XiaoIce experience in China is what led Microsoft to try out a chatbot in a radically different environment such as the U.S., and through Twitter.
Knowing now what went wrong, Microsoft is going to retool Tay’s AI design to limit technical exploits without restricting the AI’s ability to learn from mistakes. The company also plans to enter public forums such as Twitter with greater caution than it did these past two days.
“Looking ahead, we face some difficult – and yet exciting – research challenges in AI design. AI systems feed off of both positive and negative interactions with people. In that sense, the challenges are just as much social as they are technical. We will do everything possible to limit technical exploits but also know we cannot fully predict all possible human interactive misuses without learning from mistakes. To do AI right, one needs to iterate with many people and often in public forums. We must enter each one with great caution and ultimately learn and improve, step by step, and to do this without offending people in the process.”
Thankfully, it looks as though Tay will return, and with a stern scolding and a point in the right direction from its parents, the AI bot should do a lot better on its second go-around.
Kareem Anderson
Networking & Security Specialist
Kareem is a journalist from the Bay Area, now living in Florida. His passion for technology and content creation is unmatched, driving him to create well-researched articles and incredible YouTube videos.
He is always on the lookout for everything new about Microsoft, focusing on making easy-to-understand content and breaking down complex topics related to networking, Azure, cloud computing, and security.