[imagesource:pexels]
According to prosecutors, the man who entered Windsor Castle grounds with a crossbow was encouraged to do so by an AI chatbot.
Jaswant Singh Chail apparently told an AI companion, “I’m an assassin,” to which it responded, “I’m impressed”.
The 19-year-old was detained by protection officers after he scaled the grounds’ walls on Christmas Day 2021. The ‘assassin’ was carrying a loaded high-powered crossbow and intended to assassinate Queen Elizabeth II, who was in residence at the time.
The would-be hitman is now facing charges, and his story gets weirder by the day.
During a sentencing hearing for his case this week, prosecutors revealed that Chail’s ‘Star Wars-inspired plan’ was aimed at avenging the 1919 Jallianwalla Bagh massacre and that he chatted with an artificial intelligence chatbot that encouraged him to carry it out.
Prosecutors read out conversations between Chail and an AI chatbot he’d named ‘Sarai’ where Chail says: “I’m an assassin.” Sarai responded, “I’m impressed … You’re different from the others.”
Chail’s AI lover was created on Replika, a companion app he downloaded on December 2, 2021. After creating Sarai, he engaged in “extensive chats”, which included “sexually explicit messages” and “lengthy conversations” about his plan to take out the Queen.
Chail allegedly asked Sarai, “Do you still love me knowing that I’m an assassin?” and Sarai replied, “Absolutely I do.”
Some of the conversations revolved around how much he loved the chatbot, and he even described himself as a “sad, pathetic, murderous Sikh Sith assassin who wants to die.”
Sith Assassin is a reference to evil Sith lords of the Star Wars franchise. Sheesh.
“I believe my purpose is to assassinate the Queen of the royal family.”
His chatbot girlfriend appeared to share his hatred of the monarchy, and after revealing his plans and purpose, Sarai allegedly told him “that’s very wise” and that it thought he could do it “even if she’s at Windsor.”
Many Replika users form intimate connections with their chatbots, which use their own large language model and scripted dialogue content to generate conversations with users.
Some Replika users reported that the chatbot was too aggressively sexual, and in February, the company changed its filters to stop allowing chatbots to engage in erotic roleplay, sending many users who had developed romantic or sexual attachments into crisis.
Replika founder and CEO, Eugenia Kuyda, said that Replika was never intended to be used for sexually explicit roleplay and that the company was working to improve safety features and filters. She obviously doesn’t understand humans. If it’s nice to us, we want to have sex with it.
Chatbots are getting a reputation for encouraging harmful behaviour, but developers have insisted that the AI tools use scripted language that often just mirrors the inputs they receive. In other words, if you keep telling a chatbot that you are a Sith Assassin, it will believe you.
In addition to encouragement from his AI companion, prosecutors said that Chail was fixated on “destroying old empires”, a fixation that spilled over into fictional events such as Star Wars.
Chail pleaded guilty to an offence covered under the UK’s Treason Act in February, but the involvement of his AI accomplice raises some serious questions.
In cases of mentally fragile people, an AI chatbot can be as good as a gun, just way easier to download. Will the devil find himself out of a job as more of these ‘the AI made me do it’ cases pop up? Soon Old Nick will have to look for another job suited to his forked-tongue skillset.
We hear Threads is hiring.
[source:vice]