International Supercomputing Conference 2009 (SoSe 09)
Climate research, especially climate simulations for different emission scenarios, is very likely to be a high-priority scientific and political topic in the coming decades. The same applies to "localized" climate change information, which is of crucial importance for decision makers.
Timely execution of such climate simulations depends on five critical ingredients: excellent science, skilled developers, efficient parallel algorithms, the availability of computing hardware of the highest performance class, and, last but not least, political support for excellent (inter-)national cooperation in this field of research.
The mode in which these simulations are carried out today is based upon two general assumptions:
Individual nations can provide these ingredients
These simulations will be completed in time to inform the necessary decisions
Lately these assumptions have been questioned and have become the subject of much debate. Assuming that an increase in computing performance of at least 10^9 is needed to provide the necessary answers, a performance increase following Moore's law would take more than 40 years: too late for relevant political decisions, which are demanded on a 10-year time frame. Assuming further that climate simulations need to start at an IPCC level of performance, single nations' resources are already stretched to the limit to carry out such simulations in a timely fashion. The question arises whether dedicated or general-purpose computing facilities are better suited to this kind of application. Furthermore, the development, execution and evaluation of such simulations require a unique combination of special skill sets, which is hard to find within any single nation.
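The 40-year figure follows from simple arithmetic. As a sketch, assuming (hypothetically) a Moore's-law doubling of performance every 18 months, which is not a figure stated in the session description:

```python
import math

# Back-of-the-envelope sketch: how long a 10^9 increase in computing
# performance would take under Moore's-law-style doubling.
# The 18-month doubling period is an assumption for illustration only.
target_factor = 1e9
doubling_period_years = 1.5  # assumed: one doubling every 18 months

doublings = math.log2(target_factor)        # ~29.9 doublings needed
years = doublings * doubling_period_years   # ~44.8 years

print(f"{doublings:.1f} doublings, roughly {years:.0f} years")
```

Even with a more optimistic two-year doubling period the result stays well above the 10-year window demanded for political decisions, which is the point the argument above rests on.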
Such considerations have led to a debate about whether it would be a good idea to muster all available support for a very serious effort (i.e. funding on scales comparable to HEP or space research) and aim at a single facility for world climate change research: the World Climate Computing Centre.
The session will be held in two parts:
The first part of the session, chaired by Reinhard Budich, will set the scene by looking at one current solution, the dedicated supercomputing centre: Prof. Jochem Marotzke, Director of the MPI for Meteorology and scientific director of the German Climate Computing Centre DKRZ, will highlight “Climate Computing at a Dedicated Computing Centre: The German Climate Computing Centre DKRZ”. Prof. Julia Slingo had to withdraw her presentation at short notice; her talk will also be presented by Prof. Jochem Marotzke. Julia is the Chief Scientist at the Met Office and a world-leading expert in the predictability of weather and climate. Her talk elaborates on the question “Why a Revolution in Climate Prediction Is Needed & How to Achieve It: Outcomes from the World Modeling Summit”. Myles Allen from the University of Oxford, well known for his project climateprediction.net, will discuss climate supercomputing on the grid, as one extreme end of the solution spectrum. Software represents the most important and expensive infrastructure of every group doing numerical climate research. The current situation in this area, as well as the challenges resulting from architectural changes and operational necessities, will be presented by Dr. Balaji, Head of the Modeling Systems Group at GFDL in Princeton, USA.
The second part will be chaired by Prof. Wolfgang Hiller and will discuss the future challenges and possible solutions for the climate computing problem. Dr. Sylvie Joussaume is a researcher within CNRS, an expert in climate modelling, affiliated with IPSL/LSCE (www.ipsl.jussieu.fr). She will discuss the “Infrastructure for the European Network for Earth System Modelling”, a project recently launched by the EC to “foster high-end simulations enabling to better understand and predict future climate change”. Stefan Heinzel, Head of the Computing Centre Garching of the Max Planck Society and technical director of the German Climate Computing Centre DKRZ, is one of the few persons worldwide who lead a general-purpose and a climate-specific computing centre at the same time. In his talk he will focus on “Earth System Sciences: Challenges for HPC Centres”. NCAR's Dr. Rich Loft will be the last speaker in the second part of the session with his talk on “Why We Need Multiple Breakthroughs to Tackle Cloud-Resolving Climate Simulations”.
A panel on the question “If Global Change Is the Grand Challenge Application, Do We Need a World Climate Computing Centre?” will conclude the session.
Climate codes are not ideally suited to accelerator techniques. They have an insatiable appetite for memory bandwidth and at the same time produce vast amounts of data. These properties differentiate them from other application areas. Such requirements led to the establishment of the Japanese Earth Simulator, which had a distinctive impact on the HPC market at the beginning of this decade. Could a World Climate Computing Centre with a rather specific, application-driven HPC architecture for extensive coupled multi-model/multi-ensemble climate experiments have a similar influence? These are interesting questions for a rather large group of conference attendees: decision makers and scientists, but also developers, computing centre experts and vendors.
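Why memory bandwidth, rather than peak floating-point rate, limits such codes can be illustrated with a simple roofline-style estimate. All numbers below are assumptions chosen for illustration, not measurements of any particular machine or model:

```python
# Roofline-style sketch with assumed numbers, illustrating why codes
# with low arithmetic intensity (typical of climate stencil kernels)
# are bound by memory bandwidth rather than by peak flops.
peak_flops = 100e9      # assumed: 100 GFLOP/s peak per node
mem_bandwidth = 10e9    # assumed: 10 GB/s sustained memory bandwidth

# A stencil update performs only a few operations per byte moved.
flops_per_byte = 0.25   # assumed arithmetic intensity (flop/byte)

# Achievable rate is capped by whichever resource saturates first.
achievable = min(peak_flops, flops_per_byte * mem_bandwidth)
fraction_of_peak = achievable / peak_flops

print(f"Achievable: {achievable / 1e9:.1f} GFLOP/s "
      f"({fraction_of_peak:.1%} of peak)")
```

Under these assumed numbers the kernel sustains only a few percent of peak, which is why bandwidth-rich, application-driven architectures such as the Earth Simulator suited climate codes so well.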