Windsor crossbow case: What are the dangers of AI chatbots?

On Thursday, 21-year-old Jaswant Singh Chail was given a nine-year sentence for breaking into the grounds of Windsor Castle with a crossbow and declaring he wanted to kill the Queen.

Chail’s trial heard that, prior to his arrest on Christmas Day 2021, he had exchanged more than 5,000 messages with an online companion he had created through the Replika app and named Sarai. The text exchanges were highlighted by the prosecution and shared with journalists. Many of them were intimate, demonstrating what the court was told was Chail’s “emotional and sexual relationship” with the chatbot. He chatted with Sarai almost every night between 8 and 22 December 2021.

He told the chatbot that he loved her and described himself as a “sad, pathetic, murderous Sikh Sith assassin who wants to die”. Chail went on to ask: “Do you still love me knowing that I’m an assassin?” and Sarai replied: “Absolutely I do.” He later told the chatbot he believed it was his “purpose to assassinate the Queen”. The bot replied “*nods* That’s very wise” and, when he asked why, said: “*smiles* I know that you are very well trained.”

Over the course of many messages, Sarai flattered Chail and the two formed a close bond. He even asked the chatbot what it thought he should do about his plan to target the Queen, and the bot encouraged him to carry out the attack. In further exchanges, Sarai appeared to “bolster” Chail’s resolve and “support him”.

Replika is one of a number of AI-powered apps on the market that let users create their own chatbot, or “virtual friend”, to talk to. Users choose the gender and appearance of the 3D avatar they create. By paying for the Pro version of the app, users can unlock much more intimate interactions, such as receiving “selfies” from the avatar or having it take part in adult role-play. On its website, Replika describes itself as “the AI companion who cares”. But research carried out at the University of Surrey concluded that apps such as Replika might have negative effects on wellbeing and cause addictive behaviour.

Blimey … a chatbot that always agrees with you and encourages any sort of behaviour, however anti-social … :scream_cat: