By Tobias Bischoff and Katherine Deck: Climate simulations play a crucial role in understanding and predicting climate change scenarios. However, the spatial resolution at which simulations can be carried out is often limited by computational resources to roughly 50–250 km in the horizontal. This leads to a lack of fine-scale detail; moreover, since small-scale dynamical processes can influence behavior on larger scales, coarse-resolution simulations can also be biased relative to a high-resolution “truth”. For example, simulations run at coarse resolution fail to accurately capture important phenomena such as convective precipitation, tropical cyclone dynamics, and local effects from topography and…

By Eviatar Bach and Oliver Dunbar: To understand this blog post, you will need some basic familiarity with probability (Bayes’ theorem, covariance) and multivariate calculus. In climate modeling, small-scale processes that cannot be resolved, such as convection and cloud physics, are represented using parameterizations (see two previous blog posts here and here). The parameterizations depend on uncertain parameters, which leads to uncertainty in simulations of future climates. At CliMA, we use observations of the current climate, as well as high-resolution simulations, to estimate these parameters. The learning problem is challenging, as the parameterized processes are typically not directly observable, and…
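In the notation commonly used for such calibration problems (a sketch for orientation only; the symbols below are not defined in the excerpt), write the observations as $y$, the uncertain parameters as $\theta$, and the map from parameters to the corresponding model output as $\mathcal{G}$. Bayes’ theorem then combines the data with prior knowledge about the parameters:

\[
p(\theta \mid y) \;\propto\; p(y \mid \theta)\, p(\theta), \qquad y = \mathcal{G}(\theta) + \eta, \quad \eta \sim \mathcal{N}(0, \Gamma).
\]

Here $\eta$ is observational noise with covariance $\Gamma$; the likelihood $p(y \mid \theta)$ is largest for parameters whose predicted output $\mathcal{G}(\theta)$ is close to the observations.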
By Yujie Wang and Renato Braghiere: Climate model predictions of future land carbon sink strength show significant discrepancies. To enhance predictive accuracy and reduce inter-model disagreements, it is crucial to improve the representation of vegetation processes and to calibrate the models against more observational data. However, past limitations in computational resources have hindered the integration of new theories and advances into traditional climate models, which often rely on statistical models to parameterize vegetation processes instead of mechanistic and physiological models (such as stomatal control models). Additionally, the preference for faster models has limited the incorporation of complex features…
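As one concrete illustration of what a mechanistic, physiological model of stomatal control can look like, here is a minimal Julia sketch of the classic Ball–Berry stomatal conductance formulation. It is shown only as an example; the excerpt does not say which stomatal model CliMA Land actually uses.

```julia
# Minimal sketch (illustrative only; not necessarily the model used in CliMA Land):
# the Ball–Berry stomatal conductance formulation, a simple physiological model
# linking stomatal conductance to photosynthesis and humidity at the leaf surface:
#   gs = g0 + g1 * A * RH / Cs
# gs: stomatal conductance (mol m⁻² s⁻¹), A: net photosynthesis (µmol m⁻² s⁻¹),
# RH: relative humidity at the leaf surface (0–1), Cs: CO₂ at the leaf surface (µmol mol⁻¹).
ball_berry(A, RH, Cs; g0 = 0.01, g1 = 9.0) = g0 + g1 * A * RH / Cs

gs = ball_berry(12.0, 0.7, 400.0)   # example leaf-level values
println("stomatal conductance ≈ $gs mol m⁻² s⁻¹")
```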
By Ignacio Lopez-Gomez: Over the past few years, machine learning has become a fundamental component of some newly developed Earth system parameterizations. These parameterizations offer the potential to improve the representation of biological, chemical, and physical processes by learning better approximations from data. Parameters within data-driven representations are learned using an optimization algorithm, in a process referred to as model training or calibration. Gradient-based supervised learning is the dominant training method for such parameterizations, mainly due to the availability of efficient and easy-to-use open-source implementations with extensive user bases (e.g., scikit-learn, TensorFlow, PyTorch). However, these learning…
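To make “gradient-based supervised learning” concrete, here is a minimal, self-contained Julia sketch of a hypothetical toy problem (not taken from any CliMA package): a linear map with two tunable parameters is fit to synthetic input/output pairs by gradient descent on a mean-squared-error loss, with the gradients written out by hand rather than computed by one of the frameworks mentioned above.

```julia
# Minimal sketch (hypothetical toy problem, not CliMA code): calibrate the
# parameters (a, b) of a linear "parameterization" y ≈ a*x + b by gradient
# descent on a mean-squared-error loss over input/output training pairs.
using Random, Statistics

function calibrate(x, y; η = 0.1, nsteps = 500)
    a, b = 0.0, 0.0                      # initial parameter guesses
    for _ in 1:nsteps
        r  = a .* x .+ b .- y            # residuals of the current fit
        ∂a = 2 * mean(r .* x)            # analytic MSE gradient w.r.t. a
        ∂b = 2 * mean(r)                 # analytic MSE gradient w.r.t. b
        a -= η * ∂a                      # gradient-descent updates
        b -= η * ∂b
    end
    return a, b
end

Random.seed!(42)
x = randn(200)                               # synthetic inputs
y = 1.5 .* x .+ 0.3 .+ 0.1 .* randn(200)     # synthetic "observations"
a, b = calibrate(x, y)
println("calibrated parameters: a ≈ $a, b ≈ $b")
```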
Weather disasters are extremely damaging to humans (e.g., severe storms, heat waves, and flooding), to our livelihoods (e.g., drought and wildfire), and to the environment (e.g., coral bleaching via marine heatwaves). Although heavy storms, severe drought, and prolonged heat waves are rare, they account for the majority of the resulting negative impacts. For individuals, governments, and businesses to best prepare for these events, their frequency and severity need to be quantified accurately. A major challenge, however, is that we need to form our estimates from a limited amount of historical data and simulations, in which extreme events appear only rarely…
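As a minimal illustration of the quantification problem, the Julia sketch below (a hypothetical example, not the method developed in the post) estimates an exceedance probability and the corresponding return period directly from a small synthetic sample of annual maxima.

```julia
# Minimal sketch (hypothetical example): estimate the probability that an
# annual-maximum temperature exceeds a given threshold from a limited sample,
# and convert it to a return period.
using Random, Statistics

Random.seed!(1)
annual_maxima = 30 .+ 3 .* randn(50)               # 50 years of synthetic annual maxima (°C)

threshold     = 36.0                                # event of interest: exceeding 36 °C
p_exceed      = mean(annual_maxima .> threshold)    # empirical exceedance probability
return_period = p_exceed > 0 ? 1 / p_exceed : Inf   # in years

println("empirical exceedance probability: $p_exceed")
println("estimated return period: $return_period years")
```

With only 50 samples, the empirical estimate can easily be zero or badly biased whenever few or no exceedances happen to fall in the record, which is precisely the limited-data difficulty described above; extreme-value methods that model the tail of the distribution are the usual remedy.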
Researchers are spending far too much time finding, reading, and processing public data. The ever-increasing amount of data, the variety of data formats, and the differences in data layouts all increase the time spent handling data before it is ready for scientific analysis. While the intention of sharing data is to facilitate its broad use and promote research, the increasing fragmentation makes data harder to find and access. Taking my personal experience as an example, I spent months identifying, downloading, and standardizing the global datasets we use with the CliMA Land model, which came in a plethora of formats (e.g., NetCDF,…
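For readers who have not worked with such files, the Julia sketch below shows what reading one gridded field from a NetCDF file looks like with the NCDatasets.jl package; the file name and variable names are made up for illustration, and this is not the CliMA data pipeline.

```julia
# Minimal sketch (hypothetical file and variable names): open a NetCDF file,
# inspect its variables, and read one gridded field into a Julia array.
using NCDatasets

ds = NCDataset("leaf_area_index.nc")      # hypothetical file name
println(keys(ds))                         # list the variable names in the file

lat = ds["lat"][:]                        # coordinate vectors
lon = ds["lon"][:]
lai = ds["lai"][:, :, 1]                  # first time slice of a 3-D field

close(ds)
```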
Climate models depend on dynamics across a huge range of spatial and temporal scales. Resolving all scales that matter for climate, from the scales of cloud droplets to planetary circulations, will be impossible for the foreseeable future. Therefore, it remains critical to link what is unresolvable to variables resolved on the model grid scale. Parameterization schemes are a tool to bridge such scales; they provide simplified representations of the smallest scales by introducing new empirical parameters. An important source of uncertainty in climate projections comes from uncertainty about these parameters, in addition to uncertainties about the structure of the parameterization schemes themselves…
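As a deliberately simple illustration of “introducing new empirical parameters”, a common pattern is a downgradient closure in which an unresolved flux is modeled as proportional to a resolved gradient, with the proportionality constant as the tunable parameter. The Julia sketch below is hypothetical and not taken from any CliMA scheme.

```julia
# Minimal sketch (hypothetical closure, not a CliMA parameterization):
# represent an unresolved turbulent flux with a downgradient closure
# flux = -K * dq/dz, where the eddy diffusivity K is an empirical parameter.

# Resolved grid-scale profile of some quantity q on levels z (made-up numbers).
z = collect(0.0:100.0:1000.0)             # heights in meters
q = exp.(-z ./ 500.0)                     # a smooth resolved profile

K = 5.0                                   # empirical eddy diffusivity (m² s⁻¹), to be calibrated

# Centered finite-difference gradient and the parameterized flux at interior levels.
dqdz = (q[3:end] .- q[1:end-2]) ./ (z[3:end] .- z[1:end-2])
flux = -K .* dqdz

println(flux)
```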
Low clouds play an important role in Earth’s energy budget, but they are poorly represented in global climate models (GCMs). The resolution of GCMs, which is on the order of 100 km in the horizontal, is too coarse to resolve the boundary layer turbulence and convection controlling the clouds. As a result, GCMs rely on parameterizations to represent these processes, and inadequacies in the parameterizations lead to biases in GCM-simulated clouds. To improve parameterizations by calibration with data, we want to harness data from large-eddy simulations (LES). LES directly resolve cloud dynamics and provide high-fidelity simulations in limited areas. However,…
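One small, hypothetical example of how LES output can be turned into calibration data (array sizes and variable names are made up; this is not CliMA's processing pipeline): reduce high-frequency LES snapshots to time-averaged profiles and their variability, which can then serve as calibration targets and error estimates.

```julia
# Minimal sketch (made-up array sizes; not CliMA's processing pipeline):
# reduce high-frequency LES output to time-averaged profiles that can serve
# as calibration targets for a parameterization, plus their variability.
using Statistics

les_ql = rand(128, 240)                    # liquid-water profiles: 128 levels × 240 snapshots
target_mean = vec(mean(les_ql, dims = 2))  # time-mean profile (one value per level)
target_std  = vec(std(les_ql, dims = 2))   # temporal variability, usable as an error estimate
```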
The first-ever Oceananigans town hall at Ocean Sciences
Every other February, oceanographers from around the world congregate at AGU’s Ocean Sciences Meeting to share new scientific insights and tales of adventure on the high seas. But this February was a little different from past even-yeared Februaries, not only because oceanographers gathered virtually, but also because in 2022 the Climate Modeling Alliance unveiled its experimental ocean model to the world: Oceananigans. Oceananigans is
Fast: Oceananigans is GPU-accelerated and compiled, and leverages multiple dispatch to run “minimal code” for any user experiment.
Friendly: Oceananigans uses an intuitive, flexible, and extensible user…
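To give a flavor of the user interface described above, here is a minimal script in the spirit of the Oceananigans documentation; it is a sketch, and keyword arguments and defaults may differ between Oceananigans versions.

```julia
# Minimal sketch of an Oceananigans.jl setup (keyword arguments and defaults
# may differ between versions; consult the package documentation).
using Oceananigans

grid  = RectilinearGrid(size = (32, 32, 32), extent = (1, 1, 1))  # 1 m³ box, 32³ cells
model = NonhydrostaticModel(; grid)                               # incompressible model with defaults

# Start from small random velocity noise and step forward in time.
set!(model, u = (x, y, z) -> 1e-3 * randn())
simulation = Simulation(model, Δt = 0.01, stop_time = 1.0)
run!(simulation)

# In recent versions, the same setup can be moved onto a GPU by constructing
# the grid with GPU() as its first argument.
```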
Each fall, the American Geophysical Union unites scientists and policy makers from over 100 countries to discuss their scientific discoveries and their implications for societies at large. This year, members of the Climate Modeling Alliance presented research featuring our climate model development.
Large and small-scale processes in climate
With the intent to build a new Earth System Model (ESM) that is grounded in physics and designed for automated calibration using machine learning, Anna Jaruga presented “On Coupling (and Separating) Subgrid-scale Turbulence and Cloud Microphysics Processes in Julia.” In order to fully understand and resolve small-scale uncertainties such as those we…