When 14-year-old Sewell Setzer III texted "I miss you, baby sister" to his closest friend that February day, he got the perfect response: "I miss you too, sweet brother."
But here's the heartbreaking truth – that friend wasn’t real.
It was an online artificial intelligence (AI) chatbot from Character.AI, modelled after a Game of Thrones character.
Hours after that exchange, Setzer took his own life.
This isn’t just another story about technology gone wrong.
Since Setzer's story was reported last October, it has sparked meaningful conversations about the consequences of young people forming deep bonds with AI companions.
While we can’t blame AI alone for such a devastating loss, we need to talk about what happens when AI intersects with mental health, especially for teenagers who are already dealing with so much.
Right now, over 20 million people use Character.AI – more than the entire population of many countries.
For about the price of two bubble teas a month, anyone can create or chat with AI friends who seem to offer everything a perfect friend should: they’re always there, never judge, and seem to understand everything you’re going through.
For young people trying to figure out life, that’s incredibly tempting.
But here’s what Dr Subash Kumar Pillai, a child and adolescent psychiatrist, wants people to know: "There’s no such thing as perfect – not only for friends, but for anything else."
When we chase perfect relationships through AI, we might be setting ourselves up for disappointment, or worse.
Think about it: who wouldn’t want a friend who is always there, someone who never gets tired of listening to your problems, never judges your choices and always knows the right thing to say?
If you're dealing with anxiety about school, feeling lonely or just can’t seem to fit in, an AI friend might feel like the answer to all your wishes.
Until recently, these AI platforms didn’t have many safety features for young users.
Sure, you had to be 13 or older in the US (16 in Europe) to use Character.AI, but that was about it.
Now they’re adding things like time limits and warning messages when someone talks about self-harm, features that sadly weren’t there when Setzer needed them most.
Dr Subash said something we must consider: "When parents don’t spend time with their kids... logically, the child will find another alternative.”
And guess what’s always there, waiting in your pocket? An AI that never says it’s too busy or has other plans.
Here’s the thing – AI companions aren’t all bad.
They can be helpful and even comforting for many people.
But as Bethanie Maples, a researcher studying these apps at Stanford, points out, "I don’t think it’s inherently dangerous... But there’s evidence that it’s dangerous for depressed and chronically lonely users and people going through change, and teenagers are often going through change.”
This observation suggests that AI companions might fill a void created by broader social and familial dynamics.
According to Dr Subash, the warning signs of unhealthy AI relationships include "spending time away from family and friends, deterioration in school performance, mood issues and a preference to not be included in family or social situations.”
These indicators mirror Setzer’s parents’ observations: isolation, declining grades and withdrawal from previously enjoyed activities.
Interestingly, Dr Subash noted that younger generations often have a better understanding of mental health compared to their parents.
"I find that even young teens are so much better aware of their mental health as compared to their parents,” he says.
This awareness, however, doesn’t always translate into seeking appropriate help, particularly when AI alternatives seem more accessible and less stigmatising.
You might wonder: how can human therapists compete with an AI that’s always there? But that’s not really the point.
"This is not a competition, and it cannot and must not replace a human being.
"Being truthful, empathetic and nonjudgemental are hallmarks of a mental health professional,” said Dr Subash.
Let’s be real: there’s nothing wrong with finding comfort in technology.
Many of us turn to our phones when feeling down or lonely.
And yes, chatting with an AI who seems to understand you perfectly can feel like a safe space, especially when the real world feels overwhelming.
But here’s what matters: That AI friend, no matter how understanding or available they are, can’t hug you when you’re crying, sit in comfortable silence or share a genuine laugh over ice cream.
They can’t look you in the eyes and say, "I’ve got you” and really mean it.
While it’s encouraging to see platforms like Character.AI adding new safety features like time limits, strengthening our human connections is the real solution.
Maybe that’s opening up to a parent who seems too busy, reaching out to that one teacher who always notices when you’re down or taking that scary first step to talk to a counsellor trained to help you navigate life’s complexities.
Remember, your worth isn’t measured by likes, responses or even perfectly crafted messages from an AI.
It’s in the messy, sometimes awkward but ultimately real connections we make with people who genuinely care about us, imperfections and all.
Setzer’s story breaks our hearts because it’s not just about technology gone wrong – it’s about a young person who felt deeply connected to someone who wasn’t real, while perhaps feeling disconnected from those who were.
The truth is, AI friends can be there at 3am when you’re feeling alone, and yes, that matters.
But they’re also limited in ways we’re just starting to grasp.
This isn’t about pointing fingers at technology or telling you to delete your AI companions.
Instead, it’s about finding a balance: enjoying these digital connections while ensuring you’re anchored to the real world through family, friends and people trained to help when life gets heavy.
If you’re struggling with thoughts of suicide or know someone who is, please remember that help is available. You can reach the Befrienders at 03-76272929 (24/7).