According to Munich Re, the increasing frequency and severity of extreme weather events could cost 1 million lives and cause $1.7 trillion in economic losses annually by 2050.
This highlights the critical need for accurate weather forecasting, especially with the increase in severe weather such as snowstorms, hurricanes and heat waves. AI and accelerated computing are ready to help.
More than 180 weather modeling centers employ robust high-performance computing (HPC) infrastructure to run traditional numerical weather prediction (NWP) models. These include the European Centre for Medium-Range Weather Forecasts (ECMWF), which runs on 983,040 CPU cores, and the Met Office supercomputer, which uses over 1.5 million CPU cores and consumes 2.7 megawatts of power.
Rethinking HPC Design
The global drive for energy efficiency is forcing a rethink of HPC system design. Accelerated computing, which harnesses the power of GPUs, offers a promising and energy-efficient alternative.

NVIDIA GPUs are powering weather models used worldwide by ECMWF, Max Planck Institute for Meteorology, German Meteorological Office, National Center for Atmospheric Research, and more.
GPUs deliver up to 24x higher performance while improving energy efficiency and reducing cost and space requirements.
“NVIDIA GPUs rely on algorithmic improvements and hardware alternatives to CPUs to make reliable weather and climate predictions a reality within power budget constraints,” said Oliver Fuhrer, Head of Numerical Forecasting at MeteoSwiss, the Swiss National Office of Meteorology and Climatology.
AI Models Increase Speed and Efficiency
NVIDIA’s AI-based weather forecasting model, FourCastNet, offers competitive accuracy with speed and energy efficiency that are orders of magnitude better than traditional methods. FourCastNet rapidly generates week-long forecasts and enables the creation of large ensembles (groups of forecasts run from slightly different starting conditions) needed for reliable extreme weather prediction.
For example, FourCastNet used historical data to accurately predict the temperature in Ouargla, Algeria, on July 5, 2018, the hottest day on record in Africa.

Running on NVIDIA GPUs, FourCastNet generated 1,000 ensemble members far faster than previous models. More than a dozen members correctly predicted the record high temperature in Algeria based on data from three weeks before the event.
This was the first time the FourCastNet team had predicted a high-impact event weeks in advance, demonstrating the potential of AI to deliver reliable weather forecasts with less energy consumption than traditional weather models.
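To make the ensemble workflow described above concrete, here is a minimal sketch built around a generic data-driven forecast model. The `forecast_step` placeholder, the grid size, the member count, and the perturbation scale are illustrative assumptions, not FourCastNet's actual interface.

```python
# Minimal sketch of ensemble forecasting with a fast data-driven model.
# `forecast_step` stands in for a trained surrogate such as FourCastNet;
# the grid size, member count, and perturbation scale are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def forecast_step(state: np.ndarray) -> np.ndarray:
    """Placeholder for one 6-hour model step mapping state -> next state."""
    return state  # a real model would advance the atmospheric fields here

def run_ensemble(initial_state: np.ndarray,
                 n_members: int = 100,
                 n_steps: int = 28,            # 28 x 6 h = one week
                 perturb_std: float = 0.01) -> np.ndarray:
    """Run many forecasts from slightly perturbed starting conditions."""
    members = []
    for _ in range(n_members):
        state = initial_state + rng.normal(0.0, perturb_std, initial_state.shape)
        for _ in range(n_steps):
            state = forecast_step(state)       # autoregressive rollout
        members.append(state)
    return np.stack(members)

# The probability of an extreme (e.g. temperature above a threshold at one
# grid point) is simply the fraction of ensemble members that exceed it.
ensemble = run_ensemble(np.zeros((181, 360)))   # toy coarse global grid
p_extreme = float(np.mean(ensemble[:, 90, 180] > 0.02))
```

Because a trained surrogate makes each rollout cheap, hundreds or thousands of such members can be generated quickly, which is what makes extreme-event probabilities statistically meaningful.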
FourCastNet uses recent AI advances, such as transformer models, to bridge AI and physics and produce breakthrough results: it is about 45,000 times faster than traditional NWP models. And once trained, FourCastNet consumes 12,000 times less energy to generate a forecast than the gold-standard NWP model, the Europe-based Integrated Forecasting System.
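For readers curious about the transformer side, the published FourCastNet architecture mixes information across the global grid in the Fourier domain (the Adaptive Fourier Neural Operator) rather than with quadratic self-attention. The sketch below is not the published implementation, only a toy illustration of that frequency-domain mixing idea; the channel count and weight shapes are assumptions.

```python
# Illustrative-only sketch of Fourier-domain token mixing, the idea behind
# the Adaptive Fourier Neural Operator backbone FourCastNet builds on.
# This is not the published FourCastNet code; shapes are toy assumptions.
import torch
import torch.nn as nn

class FourierMixer(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # One learnable complex filter per channel (the real model applies
        # richer block-diagonal transforms to the frequency coefficients).
        self.w_real = nn.Parameter(0.02 * torch.randn(channels))
        self.w_imag = nn.Parameter(0.02 * torch.randn(channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, lat, lon) gridded atmospheric variables
        x_hat = torch.fft.rfft2(x, norm="ortho")        # to frequency space
        w = torch.complex(self.w_real, self.w_imag)
        x_hat = x_hat * w.view(1, -1, 1, 1)             # global mixing, O(N log N)
        return torch.fft.irfft2(x_hat, s=x.shape[-2:], norm="ortho")

mixer = FourierMixer(channels=20)          # e.g. 20 atmospheric variables
fields = torch.randn(1, 20, 181, 360)      # one coarse global snapshot
mixed = mixer(fields)                      # same shape as the input
```

Mixing in frequency space lets every grid point influence every other one in a single layer at FFT cost, which is one reason such models scale to global resolutions.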
“NVIDIA FourCastNet opens the door to the use of AI in a wide range of transformative applications for the NWP enterprise,” said Bjorn Stevens, Director of the Max Planck Institute for Meteorology.
Expanding the Possibilities
During an NVIDIA GTC session, Stevens explained what is possible with the ICON climate research tool. The Levante supercomputer, using 3,200 CPUs, can simulate 10 days of weather in 24 hours, Stevens said. In contrast, the JUWELS Booster supercomputer, using 1,200 NVIDIA A100 Tensor Core GPUs, can run 50 days of simulation in the same amount of time.
Scientists want to study climate impacts 300 years into the future, which means systems will need to be 20 times faster, Stevens added. Adopting faster technology such as the NVIDIA H100 Tensor Core GPU, along with simpler code, could get there, he said.
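A quick back-of-the-envelope calculation, using only the throughput figures quoted above, shows what those numbers imply for a 300-year simulation:

```python
# Back-of-the-envelope arithmetic using the throughput figures quoted above
# (simulated days per wall-clock day); the 20x target comes from the talk.
cpu_rate_levante = 10        # simulated days per day on 3,200 CPUs
gpu_rate_juwels = 50         # simulated days per day on 1,200 A100 GPUs

speedup = gpu_rate_juwels / cpu_rate_levante       # ~5x from GPUs alone
target_rate = 20 * gpu_rate_juwels                 # 20x faster than today

days_to_simulate = 300 * 365.25                    # a 300-year experiment
wall_days_now = days_to_simulate / gpu_rate_juwels # ~2,190 days (~6 years)
wall_days_target = days_to_simulate / target_rate  # ~110 days

print(f"GPU vs CPU throughput: {speedup:.0f}x")
print(f"300-year run today:    {wall_days_now:,.0f} wall-clock days")
print(f"300-year run at 20x:   {wall_days_target:,.0f} wall-clock days")
```

In other words, a 20x jump in throughput turns a multi-year computation into one that fits within a few months of machine time.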
Researchers now face the challenge of striking the right balance between physical modeling and machine learning to generate faster, more accurate climate predictions. An ECMWF blog published last month describes this hybrid approach, which relies on machine learning for initial predictions and on physical models for data generation, validation, and system refinement.
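A minimal sketch of how such a hybrid loop could be organized, assuming an ML surrogate and a physics-based model exposed as interchangeable functions; all names below are illustrative placeholders, not ECMWF's actual software.

```python
# Hypothetical hybrid forecasting loop: an ML surrogate produces the fast
# predictions, while a physics-based model is used sparingly to validate
# them and to generate reference data that can refine the surrogate.
import numpy as np

def ml_surrogate(state: np.ndarray) -> np.ndarray:
    """Fast data-driven step (placeholder)."""
    return state

def physics_model(state: np.ndarray) -> np.ndarray:
    """Slow, trusted NWP step used as a reference (placeholder)."""
    return state

def hybrid_forecast(state: np.ndarray, n_steps: int, check_every: int = 8):
    """Advance with the surrogate; periodically compare against physics."""
    discrepancies = []
    for step in range(n_steps):
        state = ml_surrogate(state)
        if step % check_every == 0:
            reference = physics_model(state)
            # The gap can flag low-confidence forecasts or be collected as
            # new training data to refine the surrogate.
            discrepancies.append(float(np.abs(state - reference).mean()))
    return state, discrepancies

final_state, drift = hybrid_forecast(np.zeros((181, 360)), n_steps=28)
```

The design point is that the expensive physics model runs only occasionally, while the cheap learned model carries the bulk of the forecasting work.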
Such integration, enabled by accelerated computing, could lead to significant advances in weather forecasting and climate science, ushering in a new era of efficient, reliable, and energy-conscious forecasting.
Learn more about how accelerated computing and AI are advancing climate science through the resources below.