AI costs more than just empathy



OpenAI's CEO may have been speaking lightly when he quipped that saying "please" and "thank you" adds millions of dollars to OpenAI's costs. But the truth behind that comment runs deep. In a digital world driven by algorithms and machine learning, empathy is no longer treated as a strength. It is treated as an inefficiency.

This is not just about language. It's about what we are asked to quietly give up in the name of convenience and who will pay the price for it.

At the Digital Empowerment Foundation (DEF), our work with marginalized last-mile communities across India makes one thing clear: when a digital system cannot recognize the full range of human expression, those already on the margins are the first to be excluded. As AI systems become more widespread, they don't just shape how we interact. They reshape who we are allowed to become.

From users to machine subjects

Today's AI tools – voice assistants, recommendation engines, chatbots – reward clarity, brevity, and emotional neutrality. Everything outside that norm is filtered out. This sounds like harmless efficiency until you find yourself discouraged from expressing yourself naturally, especially in cultures with rich oral traditions, local dialects, or nonlinear storytelling practices.

Over time, people begin to adapt to what the machine "understands." They speak with less emotion, avoid idioms, and simplify their identities. In effect, users are being trained to sound like machines, not the other way around.

For many of the communities DEF works with across the digital ecosystem through its Just AI initiative, this creates a new kind of exclusion – not only from a platform, but from full participation in digital life.

Algorithmic identity and cultural erasure

AI does not see context; it sees patterns. It cannot hear a mother explaining her health concerns in metaphor; it hears statements it cannot parse. It does not understand the tone, phrasing, and social cues of rural youth; it marks them as irrelevant data.

Worse, this simplification of people into "users" flattens cultural expression into mere input. And when people – especially young women and marginalized users – begin to change how they talk, dress, and express themselves online to suit what platforms reward, it becomes a deeper identity crisis.

I have seen this in my work on DEF's digital literacy programs. Young people increasingly curate their behavior to chase algorithmic visibility – not because they want to, but because it is the only way to be seen.

The illusion of empathy

AI may sound polite. It may say "sorry" or use friendly emojis. But it feels nothing; it does not understand pain, pleasure, or context.

What worries us is users' growing preference for machine interactions over human ones – not because they are more meaningful, but because they are faster, less messy, emotionally detached. This reprograms how people relate to one another. It risks recasting care and compassion as inefficiencies, mere obstacles to be skipped past in the rush to get things done.

Human futures, not machine templates

This trajectory is not inevitable. It is the result of choices made by developers, tech companies, and policymakers about which behaviors are worth recognizing and which expressions are discarded.

DEF continues to advocate for inclusive, people-centric design. We need digital systems that recognize not only data but also dignity. Technology must adapt to people, rather than force people to flatten themselves for the machine.

The burden of making oneself legible to AI should not fall on those already excluded from mainstream systems. We must ask this now, because AI is being embedded in everything from education and healthcare to public welfare.

Empathy is not a luxury. It is not a cost to be cut. It is a right. And it must remain at the heart of our digital future.

Dr. Arpita Kanjilal is Head of Research and Advocacy at the Digital Empowerment Foundation.
