Travelling back to the UK from Sri Lanka in July, I experienced a 10-degree temperature rise, with the UK hitting over 40°C. Some may argue that such extreme temperatures in the UK are just a statistical anomaly. But climate scientists such as Tim Palmer, Royal Society research professor in climate physics at Oxford University, with whom I spoke at length on the subject, have no doubt that global mean temperatures are rising as a result of greenhouse gas emissions caused by human activities. 

Joseph Mariathasan

There is no choice but for the world to adapt to the impacts of climate change. But to do that, we first need to understand the implications. Palmer points out that the problem is that we just don’t have a good handle on what the impacts of climate change are at a regional level. Given the magnitude of regional impacts, this issue is critical. Palmer argues it can be solved through the creation of an international climate-computing centre. It is in the interests of European private sector companies and asset owners, particularly re-insurance companies, to work with governments to enable this to happen.

Granular detail

Despite attempts to limit maximum mean temperature rises by dramatically reducing man-made emissions of greenhouse gases, regional impacts in the form of floods and unusually high temperatures are already becoming more prevalent. Whether it is re-insurance companies calculating catastrophe re-insurance premiums or water companies deciding on required reservoir capacities, corporations need far more detailed information to model and predict future scenarios than is available from existing climate models. As Palmer explains: “In the UK, we know that winters are getting warmer and wetter and summers are getting hotter and drier, but the question is, averaged over the year, which is winning?” 

That matters for corporations making investment decisions, such as water companies investing in reservoir infrastructure to meet future water needs: it is extremely difficult to decide whether the annual change in precipitation will be positive or negative. 

“Whether it is re-insurance companies calculating catastrophe re-insurance premiums or water companies deciding on required reservoir capacities, corporations need far more detailed information to model and predict future scenarios than is available from existing climate models”

Determining climate change impacts and their frequencies requires much more granular and detailed modelling, at the 1km level, says Palmer, to provide the answers societies and corporations need to adapt to climate change. Reinsurance companies, for example, are good at calculating expected losses arising from hurricanes, but estimating the probability of hurricanes requires models created under the auspices of the Intergovernmental Panel on Climate Change (IPCC). These models are currently too coarse to predict hurricane strength accurately. “The winds in IPCC models of hurricanes are typically much too weak. Modellers have to utilise some statistical processing to get them to the right strength,” explains Palmer. 

Weather versus climate

It is important to distinguish between weather forecasting, which spans days to weeks, and climate modelling, which spans decades. Palmer points out that weather forecasting has improved so much because scientists have been able to increase the resolution of their models enormously – the BBC weather forecasts not only cover what the weather will be like over the next weekend, but sometimes also the following weekend. That would have been inconceivable 50 years ago, when forecasters would have been lucky to predict accurately two days ahead. 

Better models have reaped big dividends. It seems obvious to Palmer that the same should happen for modelling and forecasting climate change impacts. “We have just got to put the same degree of seriousness and effort that we put into weather forecasting models,” he says. 

Palmer envisages a climate equivalent of the famous Turing test for artificial intelligence. The Turing test, proposed by the British mathematician Alan Turing, holds that to determine whether a computer possesses intelligence, one need only converse with it: if its replies are indistinguishable from those likely to be given by a real human being, the computer can be regarded as intelligent. Palmer’s “climate Turing test” would require climate models to produce outputs that are indistinguishable from real satellite photos of the world, which is currently not possible.

“It would be absolutely obvious to anybody that’s got even the slightest training in meteorology, that they’re looking at a model world, not the real world,” he says. The modelled world would produce rainfall patterns and cloud coverage patterns that never occur in the real world.

Achieving such accuracy is possible, argues Palmer. However, it needs a multi-national initiative to create a climate-computing centre analogous to the high-energy physics research centre CERN, which was created in Geneva in 1954. Currently, there are many competing climate physics modelling groups, none of which has the computing power to achieve the 1km modelling resolution proposed by Palmer. Instead, climate scientists typically have to share national computing facilities with many other groups. “Having access to just a few percent of a super-computer’s time is really of not much value,” he says.

Super-computing

Palmer envisages taking maximum advantage of the latest developments in super-computer technology to address perhaps the most important problem mankind currently faces – understanding and responding to global mean temperature increases arising from human activity. That means using exascale supercomputers capable of performing a billion billion (10¹⁸) floating-point operations (flops) per second. That, he says, will enable models to resolve the largest and most vigorous types of thunderstorms. 

What would be the cost? CERN’s operating budget alone is about €1bn ($1bn) a year. Palmer estimates that a climate-computing facility of the type required to help societies adapt to global warming would need a fraction of that, perhaps around €100m a year, including the costs of climate physicists and computing staff. For European corporates, particularly re-insurance companies, partnering with governments to achieve that capability may be money well spent.

Joseph Mariathasan is a contributing editor to IPE and a director of GIST Advisory