TAC Explains: AI/ML Glossary in Aircraft Certification

The following terms appear in The Air Current's special report on the certification of artificial intelligence/machine learning in aviation and are collected here as a quick reference, along with some related topics. Definitions are drawn from various sources and from TAC's report, and may differ slightly in wording from definitions published elsewhere.

agent: A software system that uses AI to autonomously perform tasks according to predetermined goals. Agents are typically developed through a machine learning process called reinforcement learning, where the algorithm rewards positive outcomes and punishes negative outcomes.

algorithm: A set of rules that defines a set of operations. In machine learning, it refers to the steps used to identify patterns in training data and train machine learning models to make predictions on new data.

ARP4754: Aerospace Recommended Practices (ARP) document “Civil Aircraft and Systems Development Guidelines” published by the standards organization SAE. It provides a process for identifying and minimizing the impact of development errors in aircraft system design and is accepted by regulatory authorities as a means of complying with system functional and safety regulations.

ARP4761: The “Guidelines for Conducting the Safety Assessment Process on Civil Aircraft, Systems, and Equipment” is an SAE recommended practice that provides a process to ensure the safe design of aircraft systems. It is accepted by regulators as a means of complying with system safety regulations.

ARP6983/ED-324: The “Process Standard for Development and Certification/Approval of Aeronautical Safety-Related Products Implementing AI” is a recommended practice currently being developed by the SAE/EUROCAE G-34/WG-114 joint working group. It could provide a process to support the certification of aircraft systems and equipment that incorporate machine learning. The first version of the standard is limited to frozen ML models developed through supervised learning.

Artificial intelligence (AI): The ability of machine-based systems to perform tasks normally associated with human intelligence. “AI” is also often used to refer to specific technologies used to perform these tasks, such as expert systems and machine learning.

Artificial neural network: A computational algorithm consisting of connected nodes or “neurons” that define the order in which operations are performed on inputs. Artificial neural networks identify patterns in training data and apply them to new data.
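To make the "connected nodes" idea concrete, here is a minimal sketch (an illustration only, not drawn from TAC's report) of a feed-forward network with one hidden layer. The weights, bias values, and inputs are arbitrary placeholders; a real model would learn them from training data:

```python
import math

def neuron(inputs, weights, bias):
    # A single "node": weighted sum of inputs passed through a
    # sigmoid activation, squashing the result into (0, 1)
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def forward(inputs):
    # Two hidden neurons feed one output neuron; the connections
    # define the order in which operations are applied to the inputs.
    # All weights here are arbitrary, fixed for illustration.
    h1 = neuron(inputs, [0.5, -0.3], 0.1)
    h2 = neuron(inputs, [0.2, 0.8], -0.2)
    return neuron([h1, h2], [1.0, -1.0], 0.0)

print(forward([1.0, 2.0]))  # a value between 0 and 1
```

Training would adjust the weights until the outputs match patterns in the training data; here the structure alone is the point.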

Deep neural network: Artificial neural networks with multiple hidden layers, related to “deep learning.”

Deterministic: Regarding computer programs, a system that always produces the same output given a particular input. “Deterministic” is sometimes used to refer to traditional logic-based software, as opposed to machine learning, but “frozen” ML models implemented on appropriate hardware can also be deterministic. Early aviation certification efforts are focused on this type of frozen model, which does not continue learning once it is implemented on an aircraft.
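A frozen model's determinism can be shown with a toy example (invented here for illustration): because the weights are fixed after deployment, the model is a pure function of its input, and identical inputs always produce identical outputs.

```python
# A "frozen" model: weights are fixed and no learning occurs after
# deployment, so the output depends only on the input.
WEIGHTS = [0.4, -1.2, 0.7]

def frozen_model(features):
    # Pure function of its input: a simple weighted sum
    return sum(w * x for w, x in zip(WEIGHTS, features))

a = frozen_model([1.0, 2.0, 3.0])
b = frozen_model([1.0, 2.0, 3.0])
assert a == b  # deterministic: same input, same output, every time
```

A model that continued learning in service would violate this property, which is one reason early certification efforts exclude it.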

Development assurance: A set of processes that ensures complex aircraft systems perform their intended functions and are free of unacceptable unintended behavior.

Development Assurance Level (DAL): The level of development rigor required for an aircraft function (FDAL) or item (IDAL). The five levels range from the most stringent, DAL A (associated with the potential for catastrophic safety effects), to the least stringent, DAL E (no safety effects).

DO-178: “Software Considerations in Airborne Systems and Equipment Certification” is a core document published by the standards organization RTCA and used to ensure the safety of airborne software.

DO-254: “Design Assurance Guidance for Airborne Electronic Hardware” is an RTCA standard used to ensure the safety of airborne electronic hardware.

Expert system: The field of AI concerned with developing systems that use logical rules to manipulate input and emulate the decision-making abilities of human experts. This form of AI was popular in the 1970s and 1980s.

Functional hazard assessment: An assessment that identifies each function installed on the aircraft and the potential impacts associated with its loss or failure.

G-34/WG-114: SAE/EUROCAE joint working group developing ARP6983/ED-324.

Generative AI: A type of AI that learns underlying patterns in training data and uses them to generate new data, such as text, images, and audio.

Industry consensus standard: A document developed collaboratively by members of relevant industries that outlines best practices or generally accepted requirements in a particular field.

item: In development assurance, hardware or software installed on an aircraft to satisfy established requirements.

Large language model: An ML model trained on large amounts of text and designed for natural-language processing tasks such as language generation. Large language models power chatbots such as ChatGPT.

Learning assurance: A set of activities introduced by the European Union Aviation Safety Agency (EASA) aimed at providing confidence that the training data for ML models is correct and complete, and that the model can perform well on unseen data beyond the data on which it was originally trained.

Machine learning (ML): The field of AI concerned with developing algorithms that allow computers to learn from data rather than following hard-coded rules.

ML component: A concept first introduced by EASA and incorporated into the draft version of ARP6983/ED-324. Defined as a bounded collection of hardware and/or software items, at least one of which contains an ML model. It modifies traditional development assurance practice by occupying an intermediate level between systems and items.

ML model: A computer program trained using machine learning algorithms. It receives input data and outputs predictions or decisions.

Means of compliance: Acceptable methods for complying with regulations. Industry consensus standards can serve as a means of compliance if accepted by regulators.

Requirements: In development assurance, what an aircraft, system, or item is required to do. There are different classes of requirements (such as functional requirements and installation requirements), and not all classes apply at every level.

S-18/WG-63: Joint SAE/EUROCAE committee developing and maintaining both ARP4754B and ARP4761A. Both are widely considered landmark documents in aircraft system safety certification and are recognized by regulators as a means of complying with system safety regulations.

Safety assessment: An evaluation at the aircraft or system level that identifies safety objectives based on the proposed architecture (preliminary safety assessment) and verifies that those objectives are met in the implemented product (final safety assessment).

Supervised learning: A form of machine learning that uses labeled training data. This consists of sample data points and correct output or answers. For example, training data for a visual traffic detection system may include images of objects that have been correctly labeled by humans as airplanes, birds, or drones.
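The labeled-data idea can be sketched with a toy example (the feature values and labels below are invented for illustration, not taken from any real detection system). Each training sample pairs inputs with the correct answer, and a trivial nearest-neighbor classifier then predicts labels for new inputs:

```python
# Labeled training data: (feature vector, correct label) pairs.
# The two features (say, size and speed) are purely hypothetical.
training_data = [
    ((9.0, 8.5), "airplane"),
    ((0.5, 1.0), "bird"),
    ((1.0, 0.2), "drone"),
]

def predict(features):
    # 1-nearest-neighbor: return the human-assigned label of the
    # closest training example (squared Euclidean distance)
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(training_data, key=lambda item: dist(item[0], features))
    return label

print(predict((8.0, 9.0)))  # prints "airplane"
```

Real supervised learning generalizes from many such labeled examples rather than memorizing them, but the structure of the data is the same: inputs paired with correct outputs.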

system: In development assurance, the level of aircraft design organized around functions, such as providing power to the aircraft. Systems allocate requirements to items, which implement the hardware and/or software that supports those functions.

Training data: Data used to train machine learning models to make useful predictions.
