Holding AI accountable: NTIA seeks public input to formulate policy

April 17, 2023

As artificial intelligence (AI)-powered applications continue to grow in popularity, the National Telecommunications and Information Administration (NTIA) is soliciting comments and public input with the aim of producing a report on AI accountability.

Given the recent rise in popularity of AI-powered applications such as ChatGPT, government and corporate officials have begun to express concerns about the potential dangers and risks associated with such technology, including its use to commit crimes, infringe intellectual property rights, spread misinformation, and perpetuate harmful biases. In light of this, regulators in multiple countries are beginning to consider ways to ensure that AI-powered applications are used in a legal, effective, ethical, safe, and trustworthy manner.

On March 16, 2023, the U.S. Copyright Office launched an initiative to examine the copyright law and policy issues raised by AI technology, including the scope of copyright in works generated using AI tools and the use of copyrighted materials for machine learning purposes. The UK government announced its AI regulatory framework on April 4, 2023. The NTIA has now issued a Request for Comments (RFC) on AI accountability, soliciting more general feedback from the public on AI accountability measures and policies.

Request for comments

With the RFC, the Biden administration is taking a step toward potentially regulating AI technology. This could include certification processes that AI-powered applications must complete prior to release. The RFC states that the NTIA is seeking feedback on "what policies can support the development of AI audits, assessments, certifications and other mechanisms to create earned trust in AI systems." Specifically, the announcement indicates that the NTIA is seeking input on the following topics:

  • Types of data access required to conduct audits and assessments
  • How regulators and other actors can encourage and support trustworthy assurance of AI systems along with other forms of accountability
  • The different approaches that may be required in different industry sectors, such as employment and healthcare

The RFC lists 34 more focused questions, including:

  • What is the purpose of AI accountability mechanisms such as certifications, audits, and ratings?
  • What AI accountability mechanisms are in use today?
  • How often should audits or assessments be conducted, and what factors inform these decisions?
  • Should AI systems be released with quality assurance certification, especially if they are high risk?
  • What are the most significant barriers to effective AI accountability in the private sector, including barriers to independent AI audits, whether cooperative or adversarial? What are the best strategies and interventions for overcoming these barriers?
  • What is the role of intellectual property rights, terms of use, contractual obligations, or other legal rights in fostering or hindering a robust AI accountability ecosystem? For example, do non-disclosure agreements or trade secret protections hinder the evaluation and auditing of AI systems and processes, and if so, what legal or policy developments are needed to ensure an effective accountability framework?

Next steps

The NTIA notes that the RFC's questions are not exhaustive and that commenters are not required to answer every question posed.

In the RFC, the NTIA said it will rely on these comments and other public input on the topic to draft and publish a report on the development of AI accountability policy, with a particular focus on the AI assurance ecosystem.

The RFC was published in the Federal Register on April 13, 2023, and written comments must be submitted by June 12, 2023.

Our Intellectual Property team is available to assist with NTIA submissions in response to RFCs.
