Bob said: “You i i i everything else.” Alice replied: “Balls have zero to me to me to me to me.” And then we killed them both. Bob and Alice were Facebook chatbots, computer programs designed to engage in conversations. Facebook engineers in America wanted them to trade items — hats, balls, books — with each other. They did, sort of, but in a private language incomprehensible to humans. They had to die.
Last year Microsoft created a teen girl called Tay, a Twitter chatbot who learnt to communicate by crunching enormous amounts of public data. Pretty soon she had decided that Hitler “did nothing wrong” and that feminists “should all die and burn in hell”. She had also become explicitly horny. Microsoft killed Tay when she was only 16 hours old.
Silly stories, you might think.