Artificial Intelligence (AI) is a hot topic right now, raising interesting legal and regulatory issues, many of which relate to intellectual property rights. In particular, “generative AI” (i.e., AI that generates text or images from prompts entered by a human user) exposes works protected by intellectual property rights to the risk of infringement (e.g., where unlicensed copyrighted material is used to train a generative AI and generate its output), and creates potential uncertainty as to who owns the works produced by the generative AI itself.
Licensing risks posed by generative AI
Images created by generative AI software are taking the media by storm and can be as weird and wonderful as ‘Harry Potter characters as models for Balenciaga’ or ‘Kermit arrested’. Generative AI is trained on existing content and materials to produce compelling images. These materials are usually “scraped” from online sources across all corners of the internet and are almost always protected by copyright. In an ongoing high-profile lawsuit, stock image firm Getty Images is suing Stability AI, creator of the generative AI software Stable Diffusion, for alleged copyright infringement. Getty claims that Stability AI unlawfully processed and copied millions of Getty’s copyrighted images to train its AI software. Getty Images suspected that its images were being used unlawfully by Stability AI because the Getty Images grey watermark remained visible on the finished outputs.
Scraping data and images from the internet is the most efficient and cost-effective way to gather the wealth of information on which algorithms and learning models are based, but it inherently raises copyright and licensing issues. There is a copyright exception in the UK under which “text and data mining for non-commercial research” does not constitute infringement. However, on the current wording of the law, this is unlikely to cover web scraping for the purpose of training generative AI, because researchers relying on the exception need lawful access to the material in the first place. This may not be the case where mass internet scraping inadvertently captures works that are:
- Subject to access restrictions;
- Subject to terms under which any use in AI research or training requires a licence (some stock image sites offer specific licences for that purpose); or
- Expressly stated as not being available to the public.
Where generative AI is deployed for commercial purposes, the text and data mining exception is not available (at this time). The fact that OpenAI’s GPT-4 model, which powers its ChatGPT tool, is now a paid offering suggests a move by AI software companies to commercialise their software. Plans to extend the text and data mining exception to cover commercial use had been under consideration but, following strong objections to the proposals, UK government ministers have indicated that they will not be implemented.
In practice, rights holders may find it difficult to enforce their copyright. In the Getty Images case it was possible to trace the sources allegedly used to train the Stability AI tool, but there is often a lack of transparency about the origins of training data, and “following” such data within the AI software itself can be very difficult. Getty only suspected that its images had been used because its famous watermark appeared on the final works; without such distinguishing features, rights holders may not even realise that their work is being used without permission. This can have a “chilling” effect on human artistry, with creators reluctant to disseminate their work online for fear of it being used without permission to train AI software.
Copyright risks posed by AI-generated works
Questions have also been raised as to who (or what) owns the copyright in AI-generated works, given that AI is not recognised as a legal entity capable of holding and exercising proprietary rights. The Copyright, Designs and Patents Act 1988 defines a “computer-generated” work as one “generated by computer in circumstances such that there is no human author of the work”.
English law therefore seems to foresee the possibility of computers providing at least limited creative input to works that might attract copyright protection. This is difficult to apply to generative AI, which generates images from photographs or text prompts entered by a human. In such situations there is arguably at least some degree of human authorship, since it is the human who devises the text prompt, selects the images and uses the generative AI as a tool to create the work, much like a pencil used for drawing. Furthermore, where generative AI makes use of existing images taken by human photographers and artists, those photographers and artists may also have copyright interests in the AI-generated work.
Because human and AI contributions are mixed in generative AI, it is difficult, for copyright protection purposes, to specify where the human’s copyright ends and the AI’s copyright “starts”. The law does not yet provide clear guidance on the threshold that either a human or an AI must meet to be considered to have contributed enough to a work to be regarded as its author. As AI becomes more sophisticated, it may even start generating works without human intervention; at that point, the law may need to recognise that AI software can own intellectual property rights.
Generative AI also raises the issue of originality. It is currently unclear whether computer-generated works (within the meaning of the legislation) can be truly original, irrespective of human input. The British requirement for originality is “sufficient skill, effort and judgment”, qualified by the condition that “the author has created it through his own efforts, rather than slavishly copying from the work produced by the efforts of others”. The 2009 Infopaq case further narrowed the originality requirement to “the author’s own intellectual creation”, which suggests an element of human input. The Court of Justice of the European Union has refined this requirement further, stating that “by making free and creative choices, the author expresses his creative capacity in a unique way and thus imprints his own ‘personal touch’”.
Given that generative AI can create an image in just a few seconds, how do AI-generated artworks measure up against the UK or Infopaq formulations of originality? Can it really be said that there is sufficient effort, skill and judgment in an image generated in a few seconds, or, where it is unclear where the human author ends and the AI author begins, that the work is “the author’s own intellectual creation”? Even where AI software has contributed to the work in a material way, it is difficult to argue that the result constitutes the AI’s own “intellectual creation”, given that AI is not capable of thinking and understanding ideas independently in the way that humans do.
AI Whitepaper
On 29 March 2023, the UK government published its AI white paper, ‘A pro-innovation approach to AI regulation’, outlining plans for the future regulation of AI in the UK. In the white paper, the government proposes to retain the existing sector-by-sector approach to AI regulation in the UK, while introducing a cross-sector framework of five overarching principles:
- Safety, security and robustness.
- Appropriate transparency and explainability.
- Fairness.
- Accountability and governance.
- Contestability and redress.
Although the five principles are not yet on a statutory footing, the government intends to impose a legal obligation on sector regulators to give due consideration to the principles and to apply them to AI within their remits when exercising their functions.
While the white paper provides only limited commentary on intellectual property rights in relation to AI, it is notable that the government has agreed to take forward the recommendations made by Sir Patrick Vallance in his March 2023 review for the UK government on the pro-innovation regulation of technologies. Vallance recommended that the government announce a clear policy position on the relationship between intellectual property law and generative AI in order to give confidence to innovators and investors. It is hoped that rules will be developed clarifying the extent to which, and the conditions under which, AI developers may use copyrighted works as training data; this is expected by the summer of 2023.
Comment
It is clear that the government aims to position the UK as an attractive and competitive place to invest in AI technology, and is wary that regulating too strictly at this stage could deter investment. European jurisdictions appear to be taking a more cautious approach. Differing approaches across jurisdictions could create regulatory headaches for companies operating across borders and, especially given the borderless nature of technology and AI, pressure on jurisdictions to harmonise their approaches is likely to increase in the future.
One thing is certain: AI is a rapidly changing technology. Just a year ago ChatGPT did not exist, yet people now use it to pass exams, write poetry and music, and make their businesses faster and more efficient. The next year will be a big one for companies, lawyers and regulators working in the AI space. When it comes to regulating AI, developments in the UK as well as globally will need to be closely monitored as different jurisdictions decide what approach to take.
For more information on this topic, contact Celes Wynn Davis or Bella Phillips by phone (+44 20 7418 8250) or email ([email protected], [email protected] or [email protected]). The Pinsent Masons website can be accessed at www.pinsentmasons.com.