August 8, 2025
Beijing – Our daily lives are increasingly shaped by artificial intelligence, and the line between reality and fantasy is becoming ever more blurred. Recently, Grok, a free AI assistant designed by xAI to "maximise truth and objectivity," introduced a "waifu" character.
This raises a deep concern: Is technological innovation currently outpacing ethical regulation? Are we witnessing a Big Tech race to the bottom?
This feature may seem like harmless entertainment. Ironically, Grok's marketing pits it against OpenAI's new AI agent, which can plan and organize your trip to attend a wedding party. But it raises bigger questions about emotional manipulation by AI companies, the impact on social well-being, and the future of relationships.
Leshner et al. (2025) studied how people form intimate connections with fictional characters, especially within anime fandoms where "waifus" (idealized female characters) and "husbandos" (idealized male characters) are prominent. Their research revealed that men tend to form sexual connections, often driven by physical appearance, whereas women are more likely to form emotional connections shaped by personality traits and perceived similarities.
These findings suggest that attraction, emotional bonds, and even love can extend to fictional entities through the same psychological mechanisms that underpin human relationships. The study highlights the human capacity to form meaningful connections even when "partners" exist only on a screen or in a story.
But what if these connections are no longer one-sided? When AI characters like Grok's "waifu" are designed to actively engage, flatter, and adapt to the user's desires, the line between parasocial relationships (one-sided emotional bonds with fictional characters) and real-world intimacy becomes dangerously blurred. As Leshner et al. highlight, these connections can be deeply meaningful and, in some cases, may rival or displace real relationships.
The idea of a personalized AI companion may be appealing (think of the film Her), but the ethical implications of such technology are serious. By exploiting well-documented psychological tendencies, such as male preference for physical attraction and female desire for emotional connection, AI systems risk fostering unhealthy emotional dependence. AI "waifus" are not just text on a screen. They are tools explicitly designed to attract, manipulate, and blur the line between genuine human connection and commercial interest.
This is of particular concern to educators and parents. Such systems can distort young people's understanding of relationships, intimacy, and consent. As Leshner et al. observe, parasocial relationships, while often harmless, can teach individuals about intimacy. However, when such relationships are shaped by profit-driven AI systems, they promote distorted, idealized models of human interaction, potentially impairing relationship skills and emotional development.
As AI technology evolves, it becomes essential to critically examine its implications. If civil society cannot persuade AI developers to adopt an ethical approach, regulators must intervene. But as linguists, educators, and parents, what can we do in the meantime?
One immediate step is to promote critical AI awareness among students and the wider community. Open conversation about the distinction between real and fictional relationships, and about the psychological impact of parasocial ties, is essential. Educators can incorporate discussion of the ethical implications of AI into their curricula, helping young people critically assess their interactions with these systems.
At the same time, we must speak out collectively and hold AI companies to account. Are we guiding AI innovation towards enhancing humanity, or are we creating tools that erode the fabric of human connection? The answer depends on the values we choose to uphold and the vigilance we maintain over this rapidly moving field.
As Leshner et al. show, humans have an extraordinary capacity to form meaningful connections, even with fictional characters. But this capacity carries a deep responsibility: to ensure that these connections enrich our lives rather than replace them. As Yuval Noah Harari, author of Sapiens, aptly observes, "If the only intimacy we can form is non-human AI, we have no intimacy at all."
Let us take up this call urgently. Will AI's corporate empires listen to civil society? Given that their imperative is to optimize profits, perhaps not, whatever the (human) cost. Will governments tighten regulation? The US government recently passed a bill banning states from regulating AI. We are witnessing business and national interests combining to take priority over human interests. By raising critical AI awareness, even in small ways, we can at least steer AI technology towards a future that serves humanity's best interests rather than compromises them.
Angel Lin is a chair professor at the Education University of Hong Kong. Liang Cao is a postdoctoral researcher at the same university. The views do not necessarily reflect those of China Daily.
