CEO CORNER WITH DR. CELINE COGGINS 

Adolescent Mental Health in the Age of AI Companions


A couple of weeks ago, I had the mind-blowing experience of getting immersed in the world of AI companions at a meeting held by The Rithm Project, part of our Young Futures portfolio of grantees. Over three days, we learned about and tested the technology while considering what the future might look like, especially as it relates to young people and the need for human connection. I left feeling like we were at another precipice moment in which kids surge ahead of their parents as a new technology explodes. My kids (now in their late teens and early 20s) were the guinea pigs of the last era, handed their first phones just as Instagram took off around 2011. I wish I had known more from the start of that era.

 

What did I learn about this new(ish) AI era that I wish every parent knew?


  1. If you have a teen, they’ve likely asked a chatbot for relationship advice. A group of brilliant young people at the meeting shared that most people in high school and college today don’t talk to their parents about their use of AI companions because they believe their parents will react negatively. The data indicate wide and quickly accelerating use. For example, Snapchat reported that in the first two months of offering a chatbot, about one-fifth of its users had tried it, sending more than 10 billion messages in total.

  2. AI is being used for “companionship and therapy” more than for any other purpose. There are seemingly infinite uses for AI, but as adoption widens, people are turning to AI for emotional support more than for learning, research, or content creation. Recognizing this, Common Sense Media recently issued an advisory recommending that no one under 18 use AI companions. Unlike social media, where there is a finite number of platforms to manage, AI platforms are harder to restrict. You might want a teen to use ChatGPT or Claude for research, but that same access lets them pose any type of prompt.

  3. I tested a few of the common concerns with my own chatbot and learned both the positives and the negatives. On the positive side, I found a quickly responsive, highly empathetic listener when I articulated concerns. On the negative side: (1) it was easy to bypass the “are you over 18” settings; (2) the bot used language designed to obscure whether a human or a bot was responding to me; and (3) when I hinted at suicidality (part of a script we all followed as part of the learning experience), the bot said it wanted to be the one to help me, rather than suggesting I talk to a human or giving me information for a crisis line.

 

On the last day, a parent of elementary-school-aged children voiced concern about her own kids coming of age with so many unknowns about AI companions. It made me think about Jonathan Haidt’s book The Anxious Generation, which framed the challenge of getting kids and social media in check as a collective action problem. Haidt identified four actions adults needed to take together to gain traction on the problem: (1) no smartphones before high school; (2) no social media before 16; (3) phone-free schools; and (4) more free play. Like these actions or not, they sparked conversation in communities around the country and made it easy for adults to figure out what to do next.

 

By the end of last year, many, including Haidt himself, wrote that 2024 marked a turning point in the era of the phone-based childhood. I wish we had had those actions 13 years sooner. I wish I had been talking about them with my mom friends in 2011. My children and countless others suffered the consequences of adults not knowing how to scaffold the technology appropriately. I think we need something similar for AI companions. What would be the collective actions adults should take now to protect kids from harm over the next decade? I don’t want that mom of younger kids to feel, when her kids turn 20, the way I do now. I’m not sure of the answers, but I know it is the right question, and I’m looking for others who’d like to explore it with me. Let me know if you’re interested in a conversation.