New models link weather and climate
Heavy rain and flooding have characterized the weather in recent weeks. To predict such weather events more accurately and to better understand how they relate to global climate change, ETH Zurich and its partners are developing a new generation of high-resolution weather and climate models.

Heavy rain, hailstorms and flooding: The past few weeks in the Alpine region and in northwestern Europe have made it clear how extreme storms can affect us. But how exactly are extreme weather events linked to global warming? For researchers working on weather-climate interactions and their modeling, this is a key question.
Models are a key means of understanding these interactions: they represent the basic physical processes in order to calculate probable developments. With today's models and computing infrastructures, however, researchers are reaching the limits of how precisely they can describe the interplay of weather and climate. This is why ETH Zurich and its partners have launched the EXCLAIM research initiative. Its goal is to increase the spatial resolution of the models substantially, thereby improving their precision and making it possible to directly simulate the weather of a future, warmer world on a global scale.
Seamlessly map the weather in the climate model
"Due to their high resolution, the new, global models will depict important processes such as storms and weather systems in much greater detail than was previously the case. This will allow us to study in much greater detail how climate changes and weather events influence each other," says Nicolas Gruber, the head of EXCLAIM (see box below) and professor of environmental physics.
The researchers are aiming for a genuine leap in the spatial resolution of weather and climate models: to simulate global weather and climate with all their regional details, the models place a virtual, three-dimensional grid over the globe. Based on physical laws, the researchers then calculate the respective conditions at each grid point. In today's global climate models, these points are 50 to 100 kilometers apart. In EXCLAIM, the researchers are aiming for a resolution of just one kilometer in the long term.
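A back-of-the-envelope calculation gives a feel for the scale of this leap: the number of horizontal grid cells grows with the square of the refinement, and on top of that the time step must shrink as the grid gets finer, so the computational cost grows even faster than the cell count. The short Python sketch below is purely illustrative and assumes idealized uniform cells, whereas a real model such as ICON uses an icosahedral mesh.

```python
# Back-of-the-envelope estimate of how many horizontal grid cells a global
# model needs at a given grid spacing. Illustrative only: real models such
# as ICON use an icosahedral mesh, not uniform square cells.
import math

EARTH_SURFACE_KM2 = 4 * math.pi * 6371.0**2   # ~5.1e8 km^2

def approx_cell_count(spacing_km: float) -> float:
    """Approximate cell count if each cell covers spacing_km x spacing_km."""
    return EARTH_SURFACE_KM2 / spacing_km**2

for dx_km in (100, 50, 1):
    print(f"{dx_km:>4} km spacing: ~{approx_cell_count(dx_km):.1e} horizontal cells")

# Prints roughly:
#  100 km spacing: ~5.1e+04 horizontal cells
#   50 km spacing: ~2.0e+05 horizontal cells
#    1 km spacing: ~5.1e+08 horizontal cells
```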
Since the computing power of today's high-performance computers is limited, such fine-mesh simulations have so far been possible only on a regional basis and only over relatively short periods of time. In the new models, the researchers now want to achieve this fine-mesh resolution globally as well, in order to simulate weather events from a global climate perspective much more sharply than before. This is like equipping the global climate models with an additional zoom function for small-scale events.
"The new models can also be used to make 'weather forecasts' in the future climate and find answers to how extreme events like this summer's heavy precipitation might look in the future," says Christof Appenzeller, head of MeteoSwiss' Analysis and Forecasting Division.
High-performance infrastructure for climate simulations
In order for the new models to show their advantages, a tailor-made computer infrastructure is required. After all, weather and climate models are among the most computationally and data-intensive problems. In EXCLAIM, the models are therefore developed hand in hand with the hardware and software of high-performance computers: "The computing and data infrastructure is set up entirely according to the requirements of the weather and climate models," says Thomas Schulthess, the director of the Swiss National Supercomputing Centre (CSCS) in Lugano. The new "Alps" supercomputing system, for example, is built so that the high-resolution climate models can also reproduce convective systems such as thunderstorms well.
For weather and climate to actually be simulated globally and over decades with a mesh size of a few kilometers, the model must run about 100 times faster than is currently possible. The first way to achieve this goal is to use faster and larger computers. The transition from the current high-performance computer at CSCS to the "Alps" system will contribute to this.
One challenge here is the end of "Moore's Law", according to which processor performance doubled roughly every 20 months: "Since the serial performance of processors has not increased for about 15 years, the only way to increase the performance of supercomputers is to improve their parallel computing architecture," says Thomas Schulthess, adding: "Moreover, it is worthwhile to set up the architecture of a supercomputer so that it can optimally solve certain classes of research problems." A key contribution to computing performance comes from a mixed computer architecture in which the conventional main processors, the CPUs (Central Processing Units), which handle calculations and the exchange of data between memory and other components, are used together with GPUs (Graphics Processing Units).
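As a rough illustration of this division of labor, the sketch below shows a pattern commonly used on such mixed architectures; it assumes the CuPy library and a CUDA-capable GPU and is not taken from the ICON or EXCLAIM code. The CPU prepares and holds the model state, the data-parallel arithmetic runs on the GPU, and only the results needed on the host are copied back.

```python
# Minimal sketch of offloading a data-parallel grid update to a GPU.
# Assumes the CuPy library and a CUDA-capable device; purely illustrative.
import numpy as np
import cupy as cp

# The CPU (host) prepares and holds the model state ...
temperature_host = np.random.rand(2000, 2000).astype(np.float32)

# ... the data-parallel update is shipped to the GPU (device) ...
temperature_device = cp.asarray(temperature_host)        # host -> device copy
temperature_device += 0.1 * cp.sin(temperature_device)   # runs on the GPU

# ... and only the result needed on the host is copied back.
temperature_host = cp.asnumpy(temperature_device)        # device -> host copy
```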
The second option starts with the software and consists of optimizing the model code and adapting it better to the mixed computing architecture. Here, EXCLAIM follows a revolutionary approach in which the source code is split into two layers: a front end that serves as the interface for model developers and users, and an underlying software infrastructure in which the model's central algorithms are implemented highly efficiently for the respective hardware. CSCS, MeteoSwiss and C2SM are already successfully pursuing this approach in MeteoSwiss' current weather model, and it is now being applied to the ICON weather and climate model. "We were able to speed up the MeteoSwiss weather model by a factor of 10 using this approach, which allowed MeteoSwiss to improve the reliability of its forecasts," says Schulthess.
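The sketch below illustrates the general idea of such a split, using hypothetical names rather than the actual ICON or EXCLAIM interfaces: model developers write against a small, stable front end, while interchangeable backends provide hardware-specific implementations of the same operation.

```python
# Illustrative sketch of the "split" software design described above.
# All names are hypothetical, not the actual ICON/EXCLAIM API.
import numpy as np

def laplacian_numpy(field: np.ndarray) -> np.ndarray:
    """Reference CPU backend: 5-point Laplacian on the interior of a 2D field."""
    out = np.zeros_like(field)
    out[1:-1, 1:-1] = (field[:-2, 1:-1] + field[2:, 1:-1]
                       + field[1:-1, :-2] + field[1:-1, 2:]
                       - 4.0 * field[1:-1, 1:-1])
    return out

# A GPU backend (e.g. written with CuPy or generated by a domain-specific
# compiler) could be registered here without touching the model code itself.
BACKENDS = {"numpy": laplacian_numpy}

def laplacian(field, backend: str = "numpy"):
    """Front-end interface used by model developers; backend chosen at runtime."""
    return BACKENDS[backend](field)

pressure = np.random.rand(64, 64)
tendency = laplacian(pressure, backend="numpy")
```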
Dealing with the flood of data
Raw computational speed is not the only decisive factor: when the resolution of the models increases, the amount of data grows massively. In addition, weather and climate research requires and produces very different kinds of data. For effective throughput, it is equally crucial that the computers can read the data as quickly as possible and write the results back to storage media. Computing processes must be organized accordingly, maximizing storage bandwidth and avoiding costly data transfers. "For the new weather and climate models to produce useful results, we need to optimize the entire infrastructure. To do this, we are applying the experience gained from many years of collaboration with MeteoSwiss and the ETH Domain," says Schulthess.
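A rough estimate shows why data volume becomes a bottleneck at kilometer scale. The figures below are illustrative assumptions (roughly the 5 x 10^8 horizontal cells estimated earlier, 100 vertical levels, single precision), not official EXCLAIM numbers.

```python
# Back-of-the-envelope estimate of output volume at km-scale resolution.
# Assumed, illustrative figures; not official EXCLAIM numbers.
horizontal_cells = 5.1e8   # ~1 km spacing, from the earlier estimate
vertical_levels = 100      # assumed
bytes_per_value = 4        # single-precision float

snapshot_bytes = horizontal_cells * vertical_levels * bytes_per_value
print(f"One 3D field, one output time: ~{snapshot_bytes / 1e9:.0f} GB")
# -> roughly 200 GB for a single variable at a single output time, which is
#    why output strategy and storage bandwidth matter as much as raw compute.
```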
EXCLAIM is interdisciplinary
In addition to the climate researchers of the ETH Center for Climate Modeling (C2SM), EXCLAIM involves ETH computer scientists, the Swiss National Supercomputing Centre (CSCS), the Swiss Data Science Center (SDSC), the research institute Empa and the Federal Office of Meteorology and Climatology MeteoSwiss. The collaboration is intended not only to advance modeling in climate research, but also to improve MeteoSwiss' weather forecasts. The international project partners include the German Weather Service (DWD) and the Max Planck Institute for Meteorology (MPI-M), which jointly developed the ICON (Icosahedral Nonhydrostatic) model system that forms the basis for EXCLAIM, as well as the European Centre for Medium-Range Weather Forecasts (ECMWF), of which Switzerland is a full member.