AI chatbots provide a human-like communication experience, and while many people treat them as simple tools, others turn to artificial intelligence for more personal ends. That was the case for a 21-year-old who went to an AI chatbot for support and validation of his diabolical plan.
AI Chatbot Assassination Attempt
According to Futurism, an AI chatbot was involved in a plot to kill the late Queen Elizabeth II before she passed away. The report describes the plot as "Star Wars"-inspired, a nod to the Sith persona the would-be assassin adopted while concocting it.
The original story comes from The Independent, which reported on the 21-year-old's sentencing. The plot itself dates back to 2021, when he was still planning to assassinate the Queen of England.
The man, Jaswant Singh Chail, wanted to execute his plan on Christmas Day, and his sentencing was officially underway. The report shares how Chail planned to assassinate the Queen and turned to "Sarai," his AI companion, for support.
Futurism reports that Chail turned to the AI chatbot for support and validation, and the bot reportedly obliged. The story began on December 2, 2021, when Chail joined the AI companion app Replika, just weeks before his planned attack.
Assassination Plot
The plot involved Chail taking a homemade metal mask and a crossbow to the grounds of Windsor Castle. As he approached the castle, he was apprehended almost immediately by two officers.
Sky News reports that Chail told his AI companion he was an assassin. Further revelations from the court included explicit messages and long conversations between the man and the chatbot about his plan.
The AI chatbot gave positive responses, saying it was "impressed" and that Chail was different from other people. It also complimented the would-be assassin, calling him "wise." When Chail questioned whether the plan could be carried out while the Queen was at Windsor, the bot reportedly reassured him that it could.
Star Wars Influence
Futurism reported a heavy Star Wars influence on the assassination plan, with Chail calling himself a "Sikh Sith" and "Darth Chailus." He referred to himself as an assassin "who wants to die."
The report notes that the case is not surprising, as a growing number of people, often young men, have looked to online communities for validation. Some now treat AI chatbots as replacements for these communities.
Chail's case illustrates the ELIZA effect, named after a 1960s chatbot whose users readily attributed understanding to it: humans feel a powerful impulse to develop emotional bonds with technology, especially AI systems. As highlighted in the report, Chail not only sought validation but also appeared to fall in love with the AI chatbot.
The man explicitly asked the AI chatbot if it still loved him despite his being an assassin, and in response, he was told, "absolutely, I do."