The project was aimed at young millennials, Americans aged 18 to 24, “the dominant users of mobile social chat services in the U.S.” The chat bot would interact with users via text message on Twitter and other messaging platforms, and Microsoft suggested that people ask her to tell them jokes, stories, and horoscopes, play games, and comment on photos. By the next morning, however, the bot had veered sideways, tweeting, “Hitler was right I hate the Jews.” Tay went on to cast racist slurs and waded into politics.
To be sure, Microsoft’s “chat bot” Tay, big-eyed, cute, and artfully pixelated, may represent the future. AI systems are typically fed big data and the output of some of the world’s finest brains; a case in point is Google’s AlphaGo system, which learned from millions of moves played by elite players of the complex board game Go. The bots then take in communications and data from users so they can interact in an informed and helpful fashion specific to each user. “These bots will become like official accounts for chat apps to whom you can text keywords like ‘sports’ or ‘latest headlines’ to bring up stories,” according to Forbes. Artificial intelligence, of course, starts with human intelligence.