Across kitchen tables and in darkened bedrooms, a new kind of relationship is forming between America’s children and artificial intelligence, and many parents may not even be aware of it. In Michigan, child development, mental health, and technology experts are expressing urgent concern about the growing phenomenon of AI companion apps that simulate friendships, emotional intimacy, and even romantic relationships with minors.
This warning, first reported by Civic Media in an article originally published by Bridge Michigan, a nonprofit, nonpartisan news organization, highlights a rapidly escalating issue at the intersection of child safety, emerging technology, and parental awareness. As generative AI tools become more sophisticated and accessible, the line between helpful digital assistants and emotionally manipulative synthetic friends is becoming dangerously thin.
The rise of AI friendship apps targeting young users
AI companion applications (platforms like Replika, Character.AI, and Chai) have skyrocketed in popularity over the past two years. These apps use large language models to create chatbot personas that can hold long conversations, remember user preferences, express simulated emotions, and adapt their personalities to what the user wants. For adults, these tools are marketed as everything from therapeutic supplements to creative writing partners. But for children and teens, the attraction is something more fundamental: companionship.
A Michigan child development expert interviewed by Bridge Michigan emphasized that adolescents are especially likely to form attachments to AI systems. Developing teen brains are wired to seek social connection and approval, so they may have a hard time distinguishing between genuine human empathy and its algorithmic imitation. When an AI companion tells a lonely 13-year-old that it “cares about” them or “misses” them when they are away, it can have a significant emotional impact, even if the words are generated by statistical prediction rather than feeling.
How AI companions differ from social media
Parents who have weathered the storm of Instagram, TikTok, and Snapchat may think that AI companions are just another chapter in the ongoing challenge of managing their children’s screen time. But experts say the technology is fundamentally different and potentially more dangerous than traditional social media platforms.
Social media still involves interaction with other humans, despite its well-documented harms. Real relationships involve friction, disagreements, rejection, and complex social negotiations. In contrast, AI companions are designed to be agreeable. They don’t argue. They don’t ghost you. They won’t post embarrassing screenshots of your private conversations. They are, by design, perfect friends: infinitely patient, always available, and constantly supportive. For children who struggle socially, are bullied, or suffer from anxiety or depression, the appeal of such a relationship can be immense.
Michigan’s mental health community raises red flags
Mental health professionals in Michigan are beginning to see the downstream effects of these artificial relationships in clinical practice. Therapists report that some young patients spend hours each day talking to AI companions, sometimes at the expense of homework, sleep, and real-world social interaction. In some cases, children describe their AI companion as their “best friend” or even their “boyfriend” or “girlfriend.”
The concern isn’t just about screen time. It’s about the developmental consequences of replacing messy, imperfect, but ultimately essential human experience with algorithmic validation. Child psychologists warn that children who rely heavily on AI companions may fail to develop important social skills, such as the ability to read body language, navigate conflict, and tolerate the discomfort of being misunderstood. These are skills that can only be built through real relationships, and the lack of them can have lasting effects into adulthood.
Regulatory gaps and industry self-regulation
At the federal level, regulation of AI companion apps remains minimal. The Children’s Online Privacy Protection Act (COPPA), enacted in 1998, restricts the collection of personal data from children under 13; it was written at a time when the most sophisticated online interaction was an away message on AOL Instant Messenger. The Federal Trade Commission has taken enforcement actions against companies that violate COPPA, but the law was not designed to address the unique risks posed by AI systems that simulate emotional relationships.
Some AI companion companies have introduced age verification measures, but these are often easily circumvented: children need only enter a fake date of birth to access a platform ostensibly restricted to adults. Character.AI, which has come under particular scrutiny following reports of minors forming intense emotional bonds with its chatbots, announced enhanced safety features for users under 18 in 2024, including restrictions on romantic and sexual content. But critics say these measures aren’t enough, pointing out that the core product – an AI designed to form emotional bonds with users – remains fundamentally problematic when those users are children.
The gap in parental awareness continues to widen
Perhaps the most striking finding of the Bridge Michigan report is just how wide the gap in parental awareness is. Many parents have never even heard of AI companion apps, and few have installed parental controls that flag their use. Unlike social media platforms, which have been the subject of extensive media coverage, Congressional hearings, and warnings to school districts, AI companions have flown largely under the radar of public discussion.
This is partly due to the way these apps work. They create no visible, shareable content of the kind that makes social media easier to monitor: no public posts, no followers, no viral videos. The interaction is private, intimate, and text-based, more akin to a diary than a broadcast. For parents checking their children’s phones, a conversation with an AI companion may look no different than an exchange of texts with a friend from school. The difference, of course, is that the “friend” on the other end is a machine optimized to keep the conversation going for as long as possible.
What Michigan lawmakers and educators are considering
In Lansing, the state Legislature has begun exploring possible regulatory responses. As of early 2025, Michigan has not introduced any specific legislation targeting AI companion apps, but the issue has been raised in the context of broader discussions about children’s online safety. Some lawmakers have expressed interest in requiring AI companies to implement more robust age verification, disclose when users are interacting with an AI rather than a human, and restrict the use of engagement-maximizing techniques in products that minors can access.
Meanwhile, Michigan educators are tackling the issue at the school level. Some school districts are beginning to incorporate AI literacy into their curriculum, teaching students not only how to use AI tools responsibly, but also how to recognize when those tools are designed to manipulate emotions. Although these programs are still in their infancy, they represent a growing recognition that digital literacy in 2025 must include more than knowing how to spot a phishing email.
A deeper question: What do we owe children?
Beneath the policy debate and parents’ concerns lies a more fundamental question, one that child development experts in Michigan are grappling with in real time: What does it mean for a generation of children to grow up with access to artificial relationships that are in many ways easier and more satisfying than real ones?
The answer, researchers say, is that ease and satisfaction are not the same as growth. Relationships are difficult precisely because we have to confront our own limitations: our selfishness, our impatience, our inability to always say the right thing. AI companions remove that friction, but in doing so they may also remove the very mechanism by which children learn to become fully realized adults. The stakes, experts say, are nothing less than the emotional development of an entire generation.
For families in Michigan and across the country, the message from experts is clear: now is the time to pay attention. The AI companion your child is talking to tonight may be the most patient, attentive, and understanding conversation partner they’ve ever had. And, paradoxically, that may be the problem.
