AI systems promise efficiency but invite regulatory scrutiny

Federal regulators are stepping up enforcement related to the automated systems many companies now use. On April 25, the Equal Employment Opportunity Commission, Consumer Financial Protection Bureau, Department of Justice, and Federal Trade Commission issued a joint statement committing to use existing laws to protect the public from bias in automated systems.

Companies deploying emerging technologies should expect this pledge to draw on laws such as Title VII of the Civil Rights Act of 1964, Section 5 of the FTC Act, the Children’s Online Privacy Protection Act, and various consumer financial protection laws.

The automated systems described in the joint statement include “software and algorithmic processes” that are “used to automate workflows and assist people in completing tasks or making decisions,” systems that are “sometimes marketed as ‘artificial intelligence.’”

Regulators emphasized that automated systems may lack transparency and may be trained on historical or biased datasets that fail to account for context.

Many businesses today use automated systems to support legally significant decisions, such as promoting employees, granting (or denying) loans to customers, evaluating medical information, and offering (or denying) insurance. Businesses should focus on three points made in the joint statement:

  • The definition of an automated system is very broad and applies to any technology aimed at increasing efficiency, ranging from an email filter that automatically sorts mail into folders to a program that estimates property values based on recent sales of similar properties.
  • Automated systems used solely to assist human decision-making, rather than serving as the sole basis for a decision, still fall within the definition.
  • Licensees of automated systems must ensure the systems are used under appropriate circumstances.

The joint statement joins a growing chorus of regulation of automated systems. Article 22 of the EU General Data Protection Regulation has long given individuals the right not to be subject to a solely automated decision that produces a “legal effect” on them or “similarly significantly affects” them.

The privacy laws of Colorado and Virginia allow individuals to opt out of “profiling in furtherance of decisions that produce legal or similarly significant effects,” such as the provision or denial of financial and lending services, housing, insurance, access to education, criminal justice, employment opportunities, health-care services, or basic necessities. Connecticut’s privacy law provides a similar opt-out right, but it is limited to “solely automated decisions.”

However, the joint statement is notably broader than this existing legislation, given its expansive definition of “automated systems” and its application beyond decisions made solely through automation. The statement also emphasizes the responsibility of licensees of automated system technology to monitor proper use.

This includes ensuring that the ability to opt out of fully automated decisions is honored where required by law. That becomes a complex task when automated systems are also used to aid human judgment.

To reduce enforcement risk, companies should start by reviewing product lines with legally significant consequences, such as decisions related to housing, health care, and financial services.

Once such products or lines of business are identified, develop a comprehensive list of the relevant automated systems, from the point of data collection to the point at which decisions are made. At the end of this first step, you will have an inventory of the automated systems used to make legally significant decisions.
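One way to record the resulting inventory is a simple structured register. The sketch below is purely illustrative; the field names, product name, and vendor are hypothetical and not drawn from the joint statement.

```python
# Minimal sketch of an automated-system inventory entry.
# All names and example values are hypothetical.

from dataclasses import dataclass

@dataclass
class AutomatedSystem:
    name: str                   # internal or vendor product name
    vendor: str                 # who licenses the system to you
    business_line: str          # e.g., lending, housing, health care
    data_collected: list[str]   # inputs gathered at the point of collection
    decision_supported: str     # the legally significant decision it feeds
    human_in_loop: bool         # assists a human vs. fully automated

inventory = [
    AutomatedSystem(
        name="LoanScreen",      # hypothetical product
        vendor="Acme Analytics",
        business_line="lending",
        data_collected=["income", "credit history", "zip code"],
        decision_supported="consumer loan approval",
        human_in_loop=True,
    ),
]
```

Tracking whether a human remains in the loop matters here, because opt-out rights under some state laws attach only to solely automated decisions.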

Next, businesses should conduct a privacy impact assessment-like exercise. This assessment should document the data collected and how it is processed by each automated system used to make the final decision.

In particular, it is important to understand how automated systems evaluate data inputs. When software vendors describe their proprietary methods only vaguely, outside expertise may be needed to understand them.

With your automated system asset inventory complete, it’s time to do everyone’s favorite task: vendor contract reviews. Some automated system providers used by businesses may have contracts that go back ten years and auto-renew forever.

Legal and market-standard language regarding representations and warranties, indemnification, and limitations of liability has changed dramatically. It may be worth renegotiating these terms to pass more of the risk back to the vendor, who is best positioned to understand its own automated systems.

Finally, conduct regular (semiannual or annual) external audits of your automated systems to test whether, holding all other inputs constant, the results are biased toward or against certain demographic groups. It is also important to incorporate the context of the business and its consumers into the audit to ensure a holistic review. This may include considering new types of data points for assessing consumers to ensure decisions are justified.
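One common screening technique such an audit might apply is the four-fifths (80%) rule, a rough adverse-impact test long used in the employment context: a group whose selection rate falls below 80% of the highest group's rate warrants closer review. The sketch below is illustrative only; the group labels and outcome data are invented, and the rule is a screen, not a legal conclusion.

```python
# Hypothetical adverse-impact screen for an automated decision system,
# using the four-fifths (80%) rule. Groups "A" and "B" and the outcome
# counts below are made up for illustration.

from collections import defaultdict

def selection_rates(outcomes):
    """outcomes: iterable of (group, approved: bool) pairs."""
    totals = defaultdict(int)
    approved = defaultdict(int)
    for group, ok in outcomes:
        totals[group] += 1
        if ok:
            approved[group] += 1
    return {g: approved[g] / totals[g] for g in totals}

def four_fifths_check(rates):
    """True if a group's rate is at least 80% of the best group's rate."""
    best = max(rates.values())
    return {g: (r / best >= 0.8) for g, r in rates.items()}

outcomes = [("A", True)] * 80 + [("A", False)] * 20 \
         + [("B", True)] * 50 + [("B", False)] * 50
rates = selection_rates(outcomes)   # A: 0.80, B: 0.50
flags = four_fifths_check(rates)    # B flagged: 0.50 / 0.80 = 0.625 < 0.8
```

In this invented example, group B's approval rate is only 62.5% of group A's, so the audit would flag the system for the kind of contextual review the joint statement contemplates.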

The Joint Regulatory Statement was intended to be a declaration to the market. The legal landscape of automated systems is changing like a Jackson Pollock painting. But companies that assess the systems in use and take proactive steps to mitigate the risks associated with them can transform paintings into something more Monet-like.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author information

Sarah Hutchins is a partner at Parker Poe and leads the firm’s cybersecurity and data privacy team.

Robert Botkin is an Associate at Parker Poe, focused on data privacy and security, AI, and technology regulation.
