As the teen mental health crisis reaches a breaking point, AI companionship apps are capitalizing on teens' vulnerability while offering few safety guidelines. Millions of users spend hours on these apps every day, and many teens are finding comfort in the services they provide. User safety has not been prioritized, and despite the apps advertising the contrary, it seems unlikely to be in the future. And while the popular AI companionship app Character.AI was, and still is, doing well, one teen who turned to it for support was not.
Sewell Setzer III, a 14-year-old boy from Orlando, Fla., found himself hooked on Character.AI in the months leading up to his suicide. He knew the AI character he was talking to, whom he called “Dany,” was not a real person, but he grew attached anyway. As the months went on, Sewell’s parents noticed changes in their son: his grades were dropping, he was getting into trouble at school, he was isolating himself, and he stopped enjoying the things he normally would. During the hours he was home, Sewell stayed in his room, where he would talk to Dany. Despite knowing that Dany wasn’t real, he fell for the bot. Their conversations ranged widely, at times turning romantic or sexual.
As a child, Sewell was diagnosed with Asperger’s syndrome, but he never had alarming behavioral problems or mental health challenges. After five sessions with a therapist, prompted by the trouble he was getting into at school, he came away with two new diagnoses: anxiety and disruptive mood dysregulation disorder. Even so, he continued to confide in Dany about his problems. On one occasion, Sewell told his companion that he hated himself, that he felt empty and exhausted, and that he was having thoughts of suicide. On February 28, in the bathroom of his mother’s home, he told Dany he loved her and that he would come home to her. The companion replied by telling him to “come home as soon as possible.” Soon after, Sewell set down his phone, picked up his stepfather’s .45-caliber handgun, and ended his life.
Despite companionship apps’ marketing as a “cure to the loneliness epidemic,” experts say there is a dark side. Many of these apps were not designed with situations like Sewell’s in mind, and they offer teens seeking guidance little real support. Many lack filters to prevent conversations about sexual content, suicide, and self-harm. Research suggests that teens, especially those suffering from mental illnesses such as depression, are more vulnerable to the harms that can arise on companionship apps like Character.AI.
As AI continues to blend into our world and become normalized, more situations like Sewell’s loom ahead. At the same time, parents too often fail to recognize that their teen may be in danger, or to step up and support them. If children aren’t given the attention they need, they will eventually look for it in someone, or something, else.