Hollywood is suing yet another AI company. However, there may be a better way to resolve copyright conflicts



This week, Disney, Universal Pictures and Warner Bros Discovery jointly sued Minimax, a Chinese artificial intelligence (AI) company, alleging copyright infringement.

The three Hollywood media giants claim Minimax (which runs Hailuo AI and is reportedly valued at US$4 billion) used their copyrighted material without permission.

The lawsuit is the latest in a growing list of copyright infringement cases involving AI. Plaintiffs in these cases include authors, publishers, newspapers, music labels and independent musicians around the world.

Disney, Universal Pictures and Warner Bros Discovery have the resources to mount fierce lawsuits and shape future precedents. They are seeking damages and injunctions against the continued use of their material.

Such cases suggest a general approach of "scrape first, deal with the consequences later". Other ethical and legal methods of acquiring data are urgently needed.

One approach some are beginning to explore is licensing. So, what exactly does that mean? And is it really a solution to the growing copyright problem?

What is a license?

A license is a legal mechanism that allows creative works to be used under agreed, usually paid, terms. It typically involves two parties: copyright holders (for example, film studios) and users of creative works (such as AI companies).

Generally, a non-exclusive license grants the user permission to exercise certain rights in exchange for fees, while the copyright holder retains ownership of the work.

In the context of generative AI, a non-exclusive license could permit AI companies to use copyright holders' material for training purposes in exchange for fees, rather than simply scraping it without consent.

There are several licensing models, some of which are already used in AI contexts. These include voluntary, collective and statutory licensing.

What are these models?

Voluntary licenses occur when a copyright holder directly permits an AI company to use its work, normally in exchange for payment. This works for large, high-value transactions. For example, the Associated Press has licensed its archive to OpenAI, the owner of ChatGPT.

However, when thousands of copyright owners are involved, each with a small number of works, this method is slow, tedious and expensive.

Another issue is that once a generative AI company makes a copy of a work under license, it is hard to know whether that copy will be used for other purposes. And because training requires huge datasets, voluntary licensing is difficult to scale to AI training.

This makes separate agreements with each copyright owner impractical. Determining who owns the rights, what needs to be cleared and how much to pay is complicated. Additionally, licensing fees can be prohibitive for small AI companies, and individual copyright holders may not earn much revenue from the use of their works.

Collective licenses allow copyright holders to have their rights managed by an organization known as a collecting society. The society negotiates with users and distributes license fees to copyright holders.

This model is already commonly used in the publishing and music industries. In theory, if extended to the AI industry, it could give AI companies access to large catalogs of data more efficiently.

There are already a few examples. In April 2025, a collective license for generative AI use was announced in the United Kingdom. Earlier this month, another was announced in Sweden.

However, this model raises questions about fee structures and how use is measured. How are prices calculated? How much will copyright holders be paid? What constitutes "use" in AI training? It is also unclear whether copyright holders with small catalogs will benefit as much as major players.

A statutory (or compulsory) licensing scheme is another option. It already exists in other Australian contexts, such as education and government use. Under such a model, the government can allow AI companies to use copyrighted works for training without requiring permission from each copyright holder.

Fees are paid into a central scheme at a specified rate. This approach ensures copyright owners are paid while AI companies gain access to training data. However, it removes copyright owners' ability to say no to the use of their work.

The risk of domination

In reality, these licensing models sit on a spectrum. Together, they represent possible ways creators' rights could be reconciled with AI companies' hunger for data.

Various forms of licensing offer potential opportunities for both copyright holders and AI companies. But licensing is not a silver bullet by any means.

Voluntary agreements are slow, fragmented and may not produce much revenue for copyright owners. Collective schemes raise questions about fairness and transparency. The statutory model risks undervaluing creative work and leaving copyright owners powerless over the use of their work.

These challenges highlight a much larger issue raised whenever copyright confronts new technologies: how to balance the interests of stakeholders while promoting both equity and innovation.

Without such a careful balance, there is a risk of domination by a few powerful AI companies and media giants.

