Improving Earth System Predictability with Artificial Intelligence
The U.S. Department of Energy (DOE) recently organized a series of workshops to identify challenges and research opportunities for the use of artificial intelligence (AI) and machine learning (ML) within Earth system models (ESMs). In October 2021, the community-led, multi-lab Artificial Intelligence for Earth System Predictability (AI4ESP) initiative hosted a multitude of interactive sessions that addressed all aspects of ESMs, as well as ML, uncertainty quantification, and computational science. More than 450 attendees with diverse backgrounds—including climate scientists, mathematicians, statisticians, and computer scientists—discussed over 150 white papers during this time. Here we summarize our main impressions of the AI4ESP workshop; these reflections precede a forthcoming formal report from the workshop organizers and DOE’s Office of Science.
A common theme across the presentations was the smart use of AI and ML tools in ESMs as surrogates for localized, small-scale, complex physical processes in order to drive higher-fidelity predictions within the overall system. Two observations underlie this approach: (i) ESMs are multiscale models that couple complex phenomena at many scales, and (ii) AI/ML models, while likely incapable of replacing years of climate model research, can help integrate more small-scale phenomena into existing models. Here we briefly outline several examples of ways in which AI and ML are currently influencing ESMs.
Bridging Scales with AI
During the AI4ESP workshop, Tapio Schneider (California Institute of Technology) discussed a scale-bridging approach for AI-accelerated Earth system predictability (ESP). Complex climate models face multiscale challenges in which large-scale variables impact small-scale biophysical effects and vice versa. Cloud models—wherein humidity and temperature serve as the large-scale variables that affect cloud formation—are one such example. Microphysical effects and turbulent dynamics influence droplet growth in clouds, ultimately giving rise to large-scale results such as cloud albedo, cloud cover, and precipitation. Because the use of brute-force computation to resolve such small-scale phenomena on a global scale is impractical, new approaches are necessary (see Figure 1). Similar challenges also arise in biosphere and ocean models.
To meet these challenges, the Climate Modeling Alliance (CliMA) is building a new ESM that combines deep learning with reductionist science to overcome each approach's respective shortcomings. In the context of cloud modeling, this method involves coarse-graining the fluid equations via conditional averaging; doing so yields exact conservation laws with closure functions that can be learned from multiple data sources. CliMA's model is motivated by a three-pronged approach that advances theory to exploit parametric sparsity, harnesses diverse data (including data from physics-based Bayesian emulators), and leverages compute resources like graphics processing units for local high-resolution models.
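To make the role of a learned closure concrete, the sketch below uses plain NumPy, synthetic data, and simple spatial block averaging (rather than CliMA's actual conditional-averaging formulation) to coarse-grain a one-dimensional field and expose the unclosed subgrid term that such a closure function must supply.

```python
import numpy as np

# Minimal illustration (not CliMA's actual formulation): block-average a
# high-resolution 1D field and expose the unclosed subgrid term that a
# learned closure function must represent.

rng = np.random.default_rng(0)
N_fine, block = 4096, 64                           # fine grid points, coarse block size
x = np.linspace(0.0, 2.0 * np.pi, N_fine, endpoint=False)
u = np.sin(x) + 0.3 * rng.standard_normal(N_fine)  # synthetic high-res field

def coarse_grain(f, block):
    """Spatial average over non-overlapping blocks (a stand-in for conditional averaging)."""
    return f.reshape(-1, block).mean(axis=1)

u_bar = coarse_grain(u, block)                     # resolved (coarse) field
uu_bar = coarse_grain(u * u, block)                # coarse-grained nonlinear term

# The averaged equations contain uu_bar, but a coarse model only knows u_bar.
# The residual below is exactly what a closure function must supply, e.g., by
# regressing it onto u_bar (and its gradients) using high-resolution training data.
subgrid_term = uu_bar - u_bar**2
print("mean subgrid contribution:", subgrid_term.mean())
```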
AI for Subgrid Parameterization
Pierre Gentine (Columbia University) described the application of AI algorithms to learn parameterizations of subgrid-scale atmospheric processes, such as turbulence and convection, that are unresolved in conventional coarse-resolution climate models with roughly 100 kilometers of horizontal grid spacing. These parameterizations are learned from high-resolution models that resolve the relevant processes at a finer scale (e.g., cloud-resolving models). The new data-driven parameterizations replace the traditional empirical parameterizations in coarse-resolution models, reduce the biases in these models, and are computationally cheaper. However, enforcing instantaneous mass and energy conservation in these data-driven approaches remains difficult. Addressing this challenge will help ensure that climate change simulations remain generalizable under different scenarios and initial conditions.
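As a rough illustration of this setup, the following PyTorch sketch (synthetic data; hypothetical variable names and shapes) trains a small network to emit subgrid tendencies for a model column and adds a soft penalty that pushes the column-integrated moisture tendency toward zero, a simple stand-in for the conservation constraints mentioned above.

```python
import torch
import torch.nn as nn

# Schematic sketch: a small network maps a coarse-grid column state to subgrid
# tendencies, with a soft penalty that nudges the column-integrated moisture
# tendency toward zero as a proxy for conservation. All data are synthetic.

n_levels = 30                                   # vertical levels per column
net = nn.Sequential(
    nn.Linear(2 * n_levels, 128), nn.ReLU(),
    nn.Linear(128, 2 * n_levels),               # heating + moistening tendencies
)

def loss_fn(state, target, lam=1.0):
    pred = net(state)
    fit = nn.functional.mse_loss(pred, target)  # match high-resolution training data
    dq = pred[:, n_levels:]                     # predicted moistening profile
    conservation = (dq.sum(dim=1) ** 2).mean()  # column-integrated residual
    return fit + lam * conservation

# One optimization step on random stand-in data.
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
state = torch.randn(64, 2 * n_levels)           # e.g., temperature + humidity profiles
target = torch.randn(64, 2 * n_levels)          # tendencies diagnosed from a high-res model
opt.zero_grad()
loss_fn(state, target).backward()
opt.step()
```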
Process-based Models in ML
Chaopeng Shen (Pennsylvania State University) highlighted advances in the field of hydrology, wherein data-driven approaches like long short-term memory (LSTM) models outperform traditional dynamical models. These approaches have successfully achieved multi-year forecasting of quantities such as soil moisture, streamflow runoff, and tracer transport. However, challenges in interpreting ML models persist.
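The sketch below shows the basic shape of such a data-driven forecaster: a small PyTorch LSTM that maps sequences of meteorological forcings to streamflow. All data, dimensions, and names are illustrative placeholders rather than details of the studies presented at the workshop.

```python
import torch
import torch.nn as nn

# Minimal sketch of an LSTM rainfall-runoff model of the kind described above,
# using synthetic data; shapes and variable names are illustrative only.

class RunoffLSTM(nn.Module):
    def __init__(self, n_forcings=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_forcings, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)        # map hidden state to streamflow

    def forward(self, forcings):                # forcings: (batch, time, n_forcings)
        out, _ = self.lstm(forcings)
        return self.head(out[:, -1, :])         # predict flow at the final time step

model = RunoffLSTM()
forcings = torch.randn(32, 365, 3)              # e.g., daily precipitation, temperature, radiation
flow = torch.rand(32, 1)                        # observed streamflow (stand-in)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.functional.mse_loss(model(forcings), flow)
opt.zero_grad(); loss.backward(); opt.step()
```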
Scientists have recently begun to incorporate process-based models (PBMs) into the ML training framework. Doing so requires differentiable PBMs, which typically necessitates reimplementing the PBM or replacing it with a differentiable surrogate, in order to leverage existing gradient-based training techniques such as backpropagation. Data-driven approaches also show promise for the parameter calibration of hydrology models via neural networks, a practice that improves generalizability; they can even help extract information from big data to improve or modify process representations in physical models.
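The toy example below illustrates the idea of a differentiable PBM: a single linear-reservoir "bucket" model written in PyTorch whose recession parameter is calibrated by backpropagation. The model and its parameterization are a simplification of our own, not the specific frameworks discussed in the talk.

```python
import torch

# Toy differentiable process-based model: a single linear reservoir whose
# recession parameter k is calibrated by backpropagation. The model, data,
# and parameter values are illustrative only.

def bucket_model(precip, k, s0=0.0):
    """Linear reservoir: storage s gains precipitation and releases runoff at rate sigmoid(k)."""
    s, runoff = s0, []
    for p in precip:
        s = s + p - torch.sigmoid(k) * s        # sigmoid keeps the release rate in (0, 1)
        runoff.append(torch.sigmoid(k) * s)
    return torch.stack(runoff)

precip = torch.rand(200)                        # synthetic daily precipitation
k_true = torch.logit(torch.tensor(0.3))
obs = bucket_model(precip, k_true).detach()     # "observed" runoff generated with known k

k = torch.tensor(0.0, requires_grad=True)       # unknown parameter to calibrate
opt = torch.optim.Adam([k], lr=0.05)
for _ in range(300):
    opt.zero_grad()
    loss = ((bucket_model(precip, k) - obs) ** 2).mean()
    loss.backward()                             # gradients flow through the PBM itself
    opt.step()
print("recovered release rate:", torch.sigmoid(k).item())
```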
Learning Equations From Data
Ocean models and cloud models face similar multiscale challenges; traditional closure models that represent subgrid turbulence struggle to accurately predict climate effects. Laure Zanna (New York University) presented an approach that learns subgrid closure models from data, including detailed simulations of small regions, to predict subgrid forcing terms. This technique can also identify physical quantities and equations from a set of basis functions, yielding expressions that are more interpretable than neural networks and that better capture energy and momentum transfer between scales. The approach seeks to learn new physics from data, test the robustness of subgrid models, enable validation and verification, and accurately quantify uncertainties.
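A generic sparse-regression analogue of this idea appears below: given a library of candidate basis functions, sequential thresholded least squares recovers a short, readable expression for a synthetic subgrid forcing term. The basis, data, and algorithm are illustrative and differ from those in the presented work.

```python
import numpy as np

# Generic sparse-regression analogue of learning interpretable closure terms
# from a library of candidate basis functions; everything here is synthetic.

rng = np.random.default_rng(1)
n = 2000
u = rng.standard_normal(n)                      # stand-ins for coarse-grained
du = rng.standard_normal(n)                     # variables and their gradients

# Candidate library: the "true" subgrid forcing uses only two of these terms.
library = np.column_stack([u, du, u * du, u**2, du**2, u**3])
names = ["u", "du", "u*du", "u^2", "du^2", "u^3"]
target = 0.8 * (u * du) - 0.5 * du**2 + 0.01 * rng.standard_normal(n)

# Sequential thresholded least squares: small coefficients are pruned so the
# recovered model stays sparse and readable as an equation.
coef, _, _, _ = np.linalg.lstsq(library, target, rcond=None)
for _ in range(10):
    small = np.abs(coef) < 0.1
    coef[small] = 0.0
    active = ~small
    coef[active], _, _, _ = np.linalg.lstsq(library[:, active], target, rcond=None)

print({name: round(c, 3) for name, c in zip(names, coef) if c != 0.0})
```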
Applied Math Challenges
In addition to charting the progress of AI and ML within ESP, the AI4ESP workshop identified a number of applied math challenges in areas like knowledge-informed ML. This field integrates physical laws into the learning process and can even design specialized network architectures that automatically satisfy physical invariants [2]. More generally, researchers can include constraints through differentiable optimization, which adds a convex optimization layer to neural networks [1]. Remaining challenges include the learning of emergent constraints, the identification of erroneous constraints, and the inclusion of state-of-the-art optimization methods. Future research could also involve the development of explainable AI to provide physical insight or causal relationships for the system in question, as well as the identification and classification of rare events such as wildfires and heat waves.
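One simple instance of a constraint-satisfying architecture is sketched below: a final projection layer forces every prediction onto the hyperplane on which a linear conservation invariant holds exactly, while remaining differentiable for standard training. The cited references describe far more general constructions, including differentiable convex optimization layers; this example is only a minimal illustration.

```python
import torch
import torch.nn as nn

# Minimal constraint-satisfying architecture: the final layer projects the raw
# network output onto the hyperplane sum(y) = 0, so a linear "conservation"
# invariant holds exactly for every prediction.

class ConservingNet(nn.Module):
    def __init__(self, n_in, n_out, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_in, hidden), nn.Tanh(), nn.Linear(hidden, n_out)
        )

    def forward(self, x):
        y = self.body(x)
        # Orthogonal projection onto {y : sum(y) = 0}; this step is linear and
        # differentiable, so standard backpropagation still applies.
        return y - y.mean(dim=-1, keepdim=True)

net = ConservingNet(n_in=10, n_out=5)
y = net(torch.randn(4, 10))
print(y.sum(dim=-1))                            # ~0 up to floating-point error
```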
References
[1] Agrawal, A., Amos, B., Barratt, S., Boyd, S., Diamond, S., & Kolter, J.Z. (2019). Differentiable convex optimization layers. In Advances in neural information processing systems 32 (NeurIPS 2019). Vancouver, Canada.
[2] Karniadakis, G.E., Kevrekidis, I.G., Lu, L., Perdikaris, P., Wang, S., & Yang, L. (2021). Physics-informed machine learning. Nat. Rev. Phys., 3, 422-440.
[3] Schneider, T., Jeevanjee, N., & Socolow, R. (2021). Accelerating progress in climate science. Phys. Today, 74(6), 44.
[4] Schneider, T., Lan, S., Stuart, A., & Teixeira, J. (2017). Earth system modeling 2.0: A blueprint for models that learn from observations and targeted high-resolution simulations. Geophys. Res. Lett., 44(24), 12,396-12,417.
About the Authors
Sven Leyffer
Senior computational mathematician, Argonne National Laboratory
Sven Leyffer is a senior computational mathematician at Argonne National Laboratory and the current President of SIAM. He holds a Ph.D. from the University of Dundee in Scotland and works on nonlinear and mixed-integer optimization problems.
Salil Mahajan
Staff Scientist, Oak Ridge National Laboratory
Salil Mahajan is a staff scientist at Oak Ridge National Laboratory. His research interests lie in climate dynamics and variability, climate statistics and machine learning applications to climate modeling, and analysis and predictability.