Imagine a student confiding their anxieties and hopes. Not to a friend, teacher, or parent, but to an AI chatbot named after a favorite TV show character. It might sound far-fetched or futuristic, but it’s happening right now.
As two leaders working in the areas of education, innovation, youth flourishing, and responsible tech, we recently interviewed 27 students through our work at The Rithm Project. One of these students was 17-year-old Ami, who described her surprise at finding comfort in an AI companion. “I started using Character.AI as a joke,” she explained. “But one night, I jokingly ranted to it, and it actually started giving me really good advice. So I was like, OK, why am I talking to a TV show character about the nature of life?” Ami laughed, acknowledging the strangeness. Yet she admitted, “It was really fun because I can talk to it about anything. It felt good to have someone to talk with.”
Ami’s experience isn’t isolated. The rise of AI companion bots, designed to foster emotional attachment, is reshaping how youth experience intimacy, identity, and support. Companion bots and platforms like Character.AI (with more than 28 million monthly users, over half under age 24) represent one of the fastest-growing consumer technology markets. Surveys show that 45 percent of high school students already use GenAI platforms, such as ChatGPT, to deal with friendships, relationships, and mental health issues (Brookings Institution, 2025).
At the same time, today’s students are coming of age amid rising loneliness and social disconnection. According to the CDC, nearly 40 percent of high school students report persistent feelings of sadness or hopelessness. Over the past decade, daily in-person social engagement with friends has dropped from 125 minutes to 42 minutes (Kannan & Veazie, 2022). In 2020, 44 percent of high school youth reported having no source of supportive relationships, whether with adults or peers; such support had fallen by half over the previous decade (Margolius et al., 2020).
The collision of these two forces—the decline in human connection and the arrival of generative AI—is cause for real concern. But this is not the whole story. Young people, native to a world where technology and relationships are deeply intertwined, are already pioneering new ways to use digital spaces for expression, belonging, and community. The question is no longer whether technology will shape human relationships, but how.
With this in mind, we offer three ideas for how educators can help students reclaim and strengthen human connections in an AI-driven world and foster responsible tech literacy.
1. Restore Schools as Places of Genuine Connection
Research consistently shows that a strong sense of belonging is fundamental to students’ academic success, emotional health, and personal development (Roehlkepartain et al., 2017). Yet as education researcher and author Isabel Hau (2025) laments, “We have designed schools that prize individual achievement over collective problem solving.”
When we asked teens to chart their sense of connection throughout the day, many told us they felt less connected sitting in classrooms with peers and educators than alone at home on their screens engaging with friends in digital spaces. That is a wake-up call.
When schools and other community environments don’t meet core human needs for connection, it’s no surprise that students look elsewhere. On the one hand, AI can offer simple fun. The young people we spoke to described using AI to chat with Percy Jackson or as the narrator for a Dungeons & Dragons campaign. On the other hand, they described AI chatbots as a kind of “digital diary,” a place to vent, reflect, or work through emotions without worrying about being seen differently or burdening someone else. One teen even described it as “a way to have a therapist without actually having to pay and go through, you know, systems that can often be really judgmental.” Even young people with supportive human relationships in their lives noted that it can be uniquely comforting to talk to something that doesn’t know your history. They appreciated that the bot wasn’t taking sides or interpreting social situations based on loyalty to a friend or family member. In particular, AI chatbots may provide an opportunity for marginalized youth, particularly LGBTQ+ students, to access safety and support they aren’t otherwise receiving (Parent, Bond, & Green, 2024).
Some of the youth we interviewed observed that a particular subgroup of peers seemed to be the most enthusiastic and consistent users of AI companions: those who are most socially isolated and least supported in real life. “When you think about it, some people who are loners and who don’t really talk to anyone, I sometimes see them going on Character.AI [in class] and just chatting with those [bots]. So, I feel like it can be used as a way to kind of escape the loneliness that you might feel.”
While AI can be a helpful supplement to human relationships, young people need and deserve the real, rich human-to-human relationships that are so crucial to their development and thriving. To this end, schools must reestablish themselves as hubs of meaningful human connection, with a particular focus on creating safe spaces with high relational trust and commitments to respect, inclusion, and mutual care so that young people feel seen and known throughout the school community. While AI interactions may feel more convenient or instantly gratifying, real human relationships—with all of their imperfections, messiness, and growing pains—remain essential to young people’s healthy development. AI should support, not replace, these connections.
Academic work must move beyond the traditional classroom design that stresses independent work in rows of desks and instead foster collaboration, discourse, debate, and peer-to-peer connection. This might mean carving out distinct, intentional time for students to cultivate relationships and share personal stories, struggles, and celebrations, such as in Valor Collegiate’s Compass model, EL Education’s Crew model, or Give Thnx’s school well-being program. Each of these models fosters deep connection, belongingness, and community care through structures (circles, advisories, and digital platforms) that foster peer-to-peer dialogue and reflection. Young people are simply seeking care, respect, and vulnerability—all necessary for a sense of belonging.
2. Bolster Young People’s Agency, Creativity, and Critical Awareness of AI
When we asked young people what they need from adults, their answer was clear: “Listen more. Judge less. Help us navigate this world.” Many of the young people we spoke to are frustrated by the adult assumption that they’re naïve or addicted to technology, when they often understand the dynamics of these tools more intimately—and more critically—than the adults around them.
We’ve heard inventive examples of young people using bots to strengthen their connection with other real-life humans. Some practice with AI before big events like job fairs, rehearsing how to introduce themselves or ask thoughtful questions. Others draft and refine difficult feedback they want to give a friend, using AI as a low-stakes space to find the right words for their texts or conversations. Many ask for support navigating social conflict, like how to apologize after an argument, or how to navigate a disagreement that arose during a group project or team practice. Some simply use it as a place to rant, sorting through emotions and making meaning from challenging situations before bringing it to someone they care about.
Young people are far less afraid of AI because they see it as an extension of the world they’re already living in. Adults should understand that today’s youth may soon experience less and less differentiation between “real life” relationships (a friend made on a sports team), digital relationships (a friend made playing Fortnite), and bot relationships (a non-human friend). Although forming relationships in all these ways may be more intuitive for young people, they still hunger for opportunities to make meaning of these evolutions.
When given the space to dialogue, many of the students we talked to reflected with nuance and deep curiosity on both the promises and perils of AI companionship. They were keenly aware of the liabilities of AI companion usage, recognizing that while AI can be a helpful outlet, it should not replace real-life relationships. One young person noted the problematic allure of AI companions: “Because AI is just programmed to give you what you want to hear, it doesn’t give the honest opinion of a person.” She added: “[If you are] not used to people actually speaking their minds and saying something that maybe you don’t want to hear, you might not be able to deal with those situations effectively later.”
The same AI tool can be used to both strengthen and erode the capacity for human connection. Schools need to see these conversations as a central part of their charge; otherwise, students will experiment on their own, without the guidance and support of adults. Young people are poised to become critical consumers and producers of AI, but they need the space to thoughtfully and proactively consider how AI can strengthen relationships and when it might pull them apart.
One way for schools to start is through tools like The AI Effect, a game The Rithm Project designed to spark ethical and relational conversations about AI. In this game, players sort “AI use cases” (e.g., “Ask AI to write a heartfelt speech for a friend’s special occasion” or “Talk to an AI-preserved version of a beloved deceased relative or ancestor”) into three categories: Supports Human Connection, Erodes Human Connection, or It Depends. While there are no “right answers,” we’ve seen how this kind of play fosters real critical thinking and dialogue among students, parents, and educators alike.
Teachers can also explore emerging technologies side-by-side with students, asking open questions like:
What is this tool offering you?
What needs is it meeting?
Where might it fall short?
Is it meeting a need that isn’t being met by someone in real life?
Does it risk displacing what would otherwise be met by a human?
These questions can cultivate awareness of students’ own tech habits, help the community discern their values, and encourage more intention and agency toward making responsible choices about using AI in their lives.
3. Cultivate the Will and Skill for Human Relationships
Amid social pressure and feelings of alienation, we cannot take for granted that this generation of young people will have the same longing for human connection, or that their existing skills won’t atrophy or erode.
A recent study from OpenAI and MIT revealed that participants who were prompted to use chatbots every day for four weeks showed significantly reduced loneliness, but also socialized significantly less with real people (Fang et al., 2025). A recent update to OpenAI’s ChatGPT made it unnaturally sycophantic, to the point of praising and supporting violent tendencies, raising broader questions of how chatbots are designed to appease and agree. This observation underscores the risk of users becoming accustomed to interactions that lack authenticity and “productive friction,” essential components of meaningful relationships and growth.
To address this, schools can make space for reflection about what makes human relationships crucial to students’ thriving. For instance, teachers might ask students: “Can AI ever replace the feeling of being truly understood by a friend? What makes human connection meaningful? Why are real-world relationships worth protecting?”
It is becoming less intuitive for students to choose messy, complex human relationships over affirming, efficient, always-accessible AI-simulated ones. This is why it’s so important for schools to explicitly teach empathy, active listening, perspective-taking, and conflict resolution. Role-playing scenarios in advisory or subject classes like ELA (perhaps even with AI-assisted tools made explicitly for coaching) can be particularly effective, allowing students to practice successfully navigating social interactions like “how to introduce yourself to other people” or “how to offer constructive feedback.” We need to reframe tension and discomfort not as a problem to avoid, but as an opening into greater depth, meaning, and learning with others.
We Can Still Shape This Future
The future isn’t fixed. We still have the chance to equip young people to strengthen and evolve human connection. But this will require educators—and entire school systems—to act with urgency and intentionality alongside their students. This is a values moment—a leadership moment. Who are we becoming to one another? Who do we long to become?
While we work toward deeper systemic change, we must simultaneously invest in youth agency, equipping students to be discerning users and creators of AI. What if this isn’t a crisis, but a crossroads? What if this generation leads us into a new era of human connection in harmony with technology?
Reflect & Discuss
If you could change one rule or policy in school to help students connect better, what would it be?
What interpersonal skills do your students most need to practice? How could you use AI (or not) to explicitly build characteristics like empathy, conflict resolution, or active listening into your curriculum?