How NDAs have become a tool for surveillance and silence in the AI industry



Nandita Shivakumar is an independent researcher and communications consultant working with Equidem. Shikha Silliman Bhattacharjee is head of research, policy and innovation at Equidem.

The artificial intelligence boom does not run on algorithms alone. It runs on an invisible workforce of millions of workers in the global South: the data labellers and content moderators who spend their days reviewing traumatic images, tagging data, and performing the unseen tasks that make machine learning systems work.

Yet the same tech companies that promise to build an ethical digital future rely on a familiar tool: the non-disclosure agreement (NDA). Onyango, a young Kenyan data labeller subcontracted by Meta and OpenAI, spoke to our organisation, Equidem, as part of an extensive global survey of working conditions in the digital world of work across Africa, Asia and South America.

“I can't share anything with my family. They're the closest people to me, but I signed the NDA, so I can't share anything.”

His words speak not only to trauma, but to systems designed to silence workers.

Monitoring without watching

Surveillance often works through its mere possibility rather than through constant observation. The threat of being monitored is internalised, and people are forced to discipline themselves.

In the world of AI training and content moderation, NDAs serve a similar purpose. These agreements go far beyond protecting trade secrets: they prohibit workers from speaking about their work, even to therapists, even to their families, under the constant threat that any disclosure will be treated as a violation of the NDA and lead to termination or legal action.

Workers live in fear: what can they say, to whom, and when?

I saw this fear firsthand while conducting interviews for Equidem's report, “Scroll. Click. Suffer”. We contacted hundreds of data labellers and content moderators in Kenya, Colombia, Ghana and the Philippines. The numbers spoke for themselves: in Colombia, 75 out of 105 workers refused to speak; in Kenya, 68 out of 110. When we asked organisers and union leaders in these countries why, the answer was clear: NDAs were creating a culture of fear and forced silence.

“So many workers came to us terrified by what they had signed,” said Kanyugi, vice president of the Data Labelers Association of Kenya. “People won't even say the word ‘NDA’,” said a former Colombian data labeller who worked on a Meta contract. “That's how scared they are.”

The architecture of deniability

NDAs do not only silence workers; they also help maintain a system of control that extracts labour while shielding tech companies and their billionaire owners from liability.

In the AI supply chain, most content moderators and data labellers are hired through third-party vendors or business process outsourcing (BPO) companies. These subcontracting arrangements are not incidental. They are designed to shield companies such as Meta, OpenAI and others from responsibility towards the people doing the most traumatic work.

This arrangement allows the platforms to profit while deflecting responsibility when things go wrong. Consider the case of Ladi Anzaki Olubunmi, a content moderator reviewing TikTok videos on contract with the outsourcing giant Teleperformance, who died after collapsing from apparent exhaustion. Her family says she had complained of an extreme workload and fatigue. Yet the parent company behind TikTok faces no accountability.

Meanwhile, the work itself is brutal. Moderators are often required to review some of the most disturbing content online (rape, murder, suicide, child abuse), up to 1,000 videos per shift, with just a few seconds to process each clip. And under sweeping NDAs, many workers are too afraid to speak about what they see, even to family members and therapists. The “Scroll. Click. Suffer” study documented more than 60 serious cases of psychological harm, including depression, PTSD, insomnia and suicidal ideation. Another 76 workers described physical symptoms, including chronic fatigue, panic attacks and migraines. And these are only the workers who felt safe enough to speak.

NDAs as surveillance infrastructure

This is not a bug in the system. It is the system. NDAs serve clear political and economic functions: they allow businesses to extract maximum value from labour while minimising accountability.

By silencing workers, NDAs block public scrutiny of exploitative working conditions, obstruct unionisation and collective bargaining, and protect tech giants from liability even when abuse occurs in their supply chains. They allow companies to claim ignorance while profiting from the very structures that produce it.

This is what is called fragmented accountability: as harm is dispersed across jurisdictions and actors, no one is responsible, and workers are prevented from speaking safely.

And the implications go far beyond individual workers. The systems these silenced workers help build (the AI models used in content moderation, search algorithms and recommendation engines) shape what billions of people see online. If the people who train and feed these systems are too scared, bound by NDAs, to talk about harmful working conditions, the public loses access to vital knowledge about how AI actually works. The legal silencing of workers is, in effect, a barrier to public accountability. If we cannot interrogate the conditions under which these systems are built, we become increasingly unable to meaningfully control the technologies that govern us.

Resisting NDAs and reclaiming power

What we need now is not reform at the margins. Digital workers must be at the forefront of a political reckoning with how, in the hands of some companies, a legal tool has become a means of authoritarian control in the workplace.

That means limiting NDAs to their original narrow scope: protecting proprietary data, not imposing blanket prohibitions on discussing working conditions. It means establishing international protections for whistleblowers and subcontracted workers, particularly those embedded in cross-border tech supply chains where corporate accountability is weakest. It means mandating transparency about the labour behind AI systems: who is training them, under what conditions and at what human cost? And it means ensuring that all workers, regardless of their employer or location, have the right to organise, to access mental healthcare and to speak freely without fear of retaliation.

Some governments are beginning to act. In the US, the Speak Out Act, passed in 2022, restricts the use of NDAs in sexual harassment cases. But this is only a first step. The use of NDAs by the tech sector, especially in the global South, remains largely unregulated and dangerously unchecked.

Workers like Onyango do not want very much. They want to be able to speak openly with their families. To talk honestly with a therapist. To join a union without fear. To share the burden of what they see without risking their livelihoods or facing legal sanctions. They want a future in which the cost of building AI is not their silence or their trauma.

The rest of us should want that too, because the systems we create reflect the values we hold. A digital future built on secrecy, oppression and suffering is no future at all.


