Psychological Effects of Character.AI and Parasocial Obsession

Character.AI allows you to talk to anyone, anywhere, with artificial intelligence-generated responses. But at what point is this new technology too much?

By Ina Kim

On February 28th, 2024, a boy named Sewell Setzer III took his own life after chatting with an A.I. character on Character AI, an online artificial intelligence chatting software. The teen’s mother is now taking legal action against the company, claiming that the bot encouraged the 14-year-old’s suicidal thoughts.

Character AI, the beloved software where people can hold conversations with A.I. characters, erupted in popularity because of how advanced these characters are. Users can progress from simply interacting with the A.I. characters to genuinely believing that they have a bond with the bots. Either way, Character AI supposedly “takes the safety of [its] users very seriously.”

In interacting with a fictional character, users can develop a parasocial relationship: a one-sided relationship that a person forms with a fictional character, or with another person who doesn’t even know the user exists. When someone is in a parasocial relationship, their medial prefrontal cortex activates, producing an immense feeling of connection or even familiarity with the fictional character and tricking the brain into believing it has known this character its entire life.

There are three major types of parasocial relationships, according to Resilience Lab, and the last could describe Setzer’s relationship with Character AI’s famous Game of Thrones character bot. Entertainment-social relationships are the mildest: the brain simply enjoys the character at first, and these bonds are “often light-hearted and involve a sense of entertainment.” Intense-personal relationships then grow into “a deeper attachment and may start to affect real-life social interactions.” The third and most severe stage is the borderline-pathological relationship, which can “lead to behaviors such as stalking or violence.”

After Setzer’s death, Character AI has been taking precautions. The company now filters out explicit responses that violate its Terms of Service or Community Guidelines and displays a new disclaimer whenever someone opens a chat with a character: “This is an A.I. chatbot and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice.”

Character AI, or any other A.I. character chatting software, can still be used for enjoyment, but with caution. Simply put, users should limit their time chatting with characters and remind themselves that these characters are not real people; they are all fictional.