Microsoft last week apologized for its Tay chatbot's bad behavior. The company took the machine learning system offline just 24 hours into its short life, after Twitter trolls goaded it into denying the Holocaust and making pro-Nazi and anti-feminist remarks. "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for," said Peter Lee, corporate vice president at Microsoft Research.