By Russ Nelson
As unmanned aerial vehicles (UAVs), or drones, become increasingly important for military, government, and civilian applications, the machine learning techniques used to train these devices typically rely on paradigms that require all raw data to be sent from the UAV to a centralized server. This approach may not be practical due to privacy concerns, bandwidth limitations, and the latency it introduces. Now, researchers at the University of Alabama in Huntsville (UAH) College of Engineering have been awarded a $599,830 National Science Foundation grant to leverage collaborative artificial intelligence (AI) to address these challenges. The effort pioneers the use of secure multimodal federated learning (MFL) over UAV networks.
Dr. Dinh Nguyen, an assistant professor in the Department of Electrical and Computer Engineering at UAH, part of the University of Alabama System, will be the principal investigator on the initiative, which is scheduled to run through September 2028.
“Secure multimodal federated learning (MFL) over UAV networks can certainly enhance defense and security missions, but its impact goes far beyond military operations,” Nguyen explains. “This project has a particular focus on private and public sector applications such as disaster response, environmental monitoring, infrastructure inspection, and cybersecurity.”
These drones collect a variety of real-time data, from images to environmental measurements, which is used to train intelligent systems that can detect hazards, assess damage, and monitor conditions. Distributed AI involves training multiple AI models across different devices or servers without exchanging raw data.
“The importance of this project is to advance a new paradigm of decentralized AI in UAV systems that integrates security, adaptability, and efficiency,” Nguyen said. “By enabling drones with different sensors and data types to learn collaboratively under limited communication and computational resources, this project establishes the foundation for resilient, privacy-preserving, and intelligent aviation networks.”
MFL is a distributed machine learning technique that allows multiple clients to jointly train a single global model using data in different formats, without having to share each drone’s local private data.
“Simply put, multimodal federated learning allows groups of drones to ‘learn together’ without sending all raw data to a single server,” Nguyen explains. “Each UAV collects different types of data, such as video, temperature, and network signals, to train a small local model on its own data and only shares model updates, not the original data. These updates are combined to improve the shared global model. This ultimately improves the resilience and reliability of distributed AI in time-sensitive, resource-constrained, and hostile environments.”
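The "learn together" idea Nguyen describes can be illustrated with a minimal federated-averaging sketch. This is not the project's actual system; it is a toy example in which each simulated UAV fits a small linear model on its own private data and shares only the resulting weights, which a server averages into a global model.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])  # ground-truth relationship the drones are learning

def local_update(n_samples):
    """One simulated UAV: generate private local data and fit a model on it.

    Only the fitted weights leave the device; the raw samples stay local.
    """
    X = rng.normal(size=(n_samples, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n_samples)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Server side: collect model updates from five UAVs and average them
# into one shared global model (the federated-averaging step).
updates = [local_update(50) for _ in range(5)]
global_w = np.mean(updates, axis=0)
print(global_w)  # close to the true weights [2.0, -1.0]
```

Because only two-element weight vectors travel over the network instead of hundreds of raw samples, the sketch also shows where the bandwidth savings come from.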
To protect privacy, this approach employs a technique called modality-aware differential privacy, an enhancement that adds carefully designed “noise” to model updates so that sensitive information cannot be reconstructed from them. The researchers also use secure aggregation methods that keep individual UAV contributions confidential. To combat “model poisoning,” which occurs when a compromised drone attempts to corrupt the shared model, the framework integrates attack detection mechanisms that recognize and mitigate malicious updates.
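The two defenses described above can be sketched in miniature. This is an illustrative stand-in, not the project's method: each update is norm-clipped and perturbed with Gaussian noise in the spirit of differential privacy, and the server aggregates with a coordinate-wise median, a simple robust statistic that resists a single poisoned update far better than a plain mean.

```python
import numpy as np

rng = np.random.default_rng(1)

def privatize(update, clip=1.0, noise_scale=0.05):
    """Clip the update's norm, then add Gaussian noise before sharing it."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip / norm)
    return clipped + rng.normal(scale=noise_scale, size=update.shape)

# Nine honest UAVs report similar (privatized) updates...
honest = [privatize(np.array([0.5, -0.2]) + rng.normal(scale=0.02, size=2))
          for _ in range(9)]
# ...while one compromised UAV sends a wildly scaled, poisoned update.
poisoned = np.array([100.0, -100.0])
all_updates = honest + [poisoned]

robust = np.median(all_updates, axis=0)  # median shrugs off the outlier
naive = np.mean(all_updates, axis=0)     # mean is dragged toward the attacker
print(robust, naive)
```

The noisy-but-clipped updates still aggregate to something close to the honest consensus, while the naive mean is visibly skewed, which is the intuition behind pairing differential privacy with attack-resilient aggregation.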
“Combined, these defense capabilities enable UAV networks to work together safely even in hostile environments, ensuring reliable and secure federated learning across a variety of aviation systems,” Nguyen concluded. “These advances save bandwidth, reduce communication latency, and protect sensitive information. Overall, this system makes learning faster, more private, and more adaptable, making it ideal for missions with tight time, energy, and resource constraints.”
