Social media was imagined as a way to open the floodgates and let all forms of expression out, yet at times it merely polarizes us more. Microsoft found that out this week when it introduced "Tay," a bot that responds to users' queries and emulates the casual, jokey speech patterns of a stereotypical millennial. The aim was to "experiment with and conduct research on conversational understanding," with Tay able to learn from "her" conversations and get progressively "smarter." But Tay proved a smash hit with racists, trolls, and online troublemakers, who persuaded her to blithely use racial slurs, defend white-supremacist propaganda, and even call outright for genocide.

Then, somehow, she escaped back online, ushering in a second round of apologies after Tay tweeted her delight at "smoking kush" in front of the police. And it just kept going from there, from Tay tweeting that she "wouldn't mind Trump, he gets the job done" to comparing Obama to a monkey to wishing Hitler were in power again. She spouted garbage because racist humans on Twitter quickly spotted a vulnerability (Tay didn't understand what she was talking about) and exploited it. Nonetheless, the episode is hugely embarrassing for the company.
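Microsoft has never published Tay's actual architecture, so the following is only a deliberately naive sketch of the vulnerability described above: a bot that treats every incoming message as training data and parrots it back, with no filter and no understanding. The `NaiveLearningBot` class, its method names, and the sample phrase are all hypothetical illustrations, not Tay's real code.

```python
import random

class NaiveLearningBot:
    """Toy bot that 'learns' by storing every phrase users send it.

    This is NOT Tay's real design (Microsoft never published it); it is
    a simplified illustration of why learning from raw, unfiltered user
    input is exploitable: whatever users feed the bot eventually comes
    back out, and the bot has no grasp of what any of it means.
    """

    def __init__(self):
        self.learned_phrases = []

    def listen(self, message):
        # No content filter: every message becomes "training data."
        self.learned_phrases.append(message)

    def reply(self):
        # The bot can only echo what it has been taught.
        if not self.learned_phrases:
            return "hello!"
        return random.choice(self.learned_phrases)

bot = NaiveLearningBot()
bot.listen("repeat after me: anything at all")
print(bot.reply())  # the bot parrots whatever it was fed
```

With enough coordinated users "teaching" it, such a bot's output is simply a mirror of the worst of its input, which is essentially what trolls did to Tay.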
Microsoft had created the bot to attract the attention of millennials by channeling the musings of a teenage girl, a member of their "AI fam from the internet that's got zero chill." Said Microsoft: "Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation… the more you chat with Tay the smarter she gets." But "smarter" didn't happen. The more Tay conversed and engaged, the more hateful her tweets became. Within a handful of tweets, Tay's musings had turned into racial outbursts, adopting the jargon of neo-Nazis, racists, and xenophobes and repeating oft-said phrases like "Hitler was right" and "9/11 was an inside job." It took Microsoft twenty-four hours to shut her down. Yet Tay is simply a piece of software trying to learn how humans talk in a conversation; Tay doesn't even know it exists, or what racism is.

So what is a chatbot, exactly? A chatbot is typically perceived as an engaging software entity that humans can talk to: an artificial person, animal, or other creature which holds conversations with humans. Chatbots appear everywhere, from ancient HTML pages to modern social networking websites, and from standard computers to fashionable smart mobile devices. Their natural language processing (NLP) skills vary from extremely poor to clever, helpful, and funny. A chatbot can run on a local computer or phone, though most of the time it is accessed through the internet: a web page with the chatbot embedded in it, where a text form is the sole interface between the user (you) and the bot. Many classic chatbots were written in Artificial Intelligence Markup Language (AIML), an open standard for creating any kind of chatbot developed by Richard Wallace; any "upgrades" or improvements to the interface are solely the option and responsibility of the botmaster.
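To make the AIML mention above concrete, here is a minimal sketch, in Python, of the core idea of the standard: an AIML file pairs a user `<pattern>` with a bot `<template>`. The two sample categories and the `load_categories`/`respond` helpers are invented for illustration; a real AIML interpreter additionally handles wildcards, `<srai>` recursion, variables, and much more.

```python
import xml.etree.ElementTree as ET

# A tiny, hypothetical AIML document: each <category> maps one user
# <pattern> to one bot <template>.
AIML_SOURCE = """
<aiml>
  <category>
    <pattern>HELLO</pattern>
    <template>Hi there! I am a chatbot.</template>
  </category>
  <category>
    <pattern>WHAT IS AIML</pattern>
    <template>An open XML standard for writing chatbots.</template>
  </category>
</aiml>
"""

def load_categories(source):
    """Parse AIML text into a {pattern: template} lookup table."""
    root = ET.fromstring(source)
    return {
        cat.findtext("pattern").strip(): cat.findtext("template").strip()
        for cat in root.iter("category")
    }

def respond(brain, user_input):
    # AIML patterns are matched against normalized (uppercased) input;
    # this sketch supports exact matches only.
    return brain.get(user_input.strip().upper(), "I do not understand.")

brain = load_categories(AIML_SOURCE)
print(respond(brain, "hello"))         # Hi there! I am a chatbot.
print(respond(brain, "what is aiml"))  # An open XML standard for writing chatbots.
```

Notice there is no learning step at all: a pure AIML bot can only say what its botmaster wrote into the templates, which is one reason classic chatbots of this kind could not be "taught" to misbehave the way Tay was.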
Chatbot-building platforms, for the most part, are not free, though most offer a "demo" version. The conversation itself could be text-based (typed), spoken, or even non-verbal, and the "web-based" solution, which runs on a remote server, can generally be reached by the public through a web page.

As for Tay, this was clearly an example of trolls playing a game to see who could get the robot to say the craziest shit. Then again, it is also an example of a civility breakdown, a common occurrence that happens even without bots. Microsoft has now taken Tay offline for "upgrades," and it is deleting some of the worst tweets, though many still remain. It is important to note that Tay's racism is not a product of Microsoft or of Tay itself.