While some people might still joke about the reliability of weather forecasts, meteorologists are likely to nominate weather prediction as one of the great success stories of modern science – a crowning achievement of collaboration across many scientific and technological fields.
And now Australian weather forecasts are about to become even better, thanks to a new satellite and supercomputer.
Most of us simply take it for granted that weather can be forecast with some accuracy several days ahead. As measured by maximum and minimum temperature predictions for the next day, over 95% of forecasts issued by the Australian Bureau of Meteorology are verified as accurate to within 3 degrees Celsius, reflecting a steady improvement in science, weather observations and computing power over the past 30 years.
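To see what that kind of verification means in practice, here is a small Python sketch. The forecast and observation values are made up for illustration; they are not Bureau data.

```python
# Sketch of a simple verification metric: the fraction of next-day
# maximum-temperature forecasts within 3 degrees Celsius of what was observed.
# These forecast/observation pairs are invented for illustration only.
forecasts = [31.0, 24.5, 19.0, 28.0, 22.0]
observed  = [30.2, 26.0, 18.1, 31.5, 22.4]

within_3c = sum(1 for f, o in zip(forecasts, observed) if abs(f - o) <= 3.0)
accuracy = 100.0 * within_3c / len(forecasts)
print(f"{accuracy:.0f}% of forecasts within 3 degrees")  # 80% for this toy data
```

Operational verification is far more elaborate, of course, but the underlying idea is the same: compare every forecast against what actually happened.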
But it’s not just about getting the maximum temperature right. Our ability to forecast important weather features has also improved dramatically over the past three decades.
For example, the recent snowfalls in New South Wales, which stretched into southeast Queensland, were highlighted nearly a week ahead by our weather models. And the wind change that was so influential on the fire behaviour on Black Saturday was predicted several days in advance. Such foresight would have been impossible over a decade ago.
A brief history of weather forecasting
In the pre-satellite era, the forecaster’s ability to analyse weather systems was limited by the availability of surface observations from weather stations. There were huge data gaps, such as over the Southern Ocean. Weather charts were hand-drawn, including the position of high and low pressure systems and cold fronts. High-impact weather events could catch communities by surprise.
In the early 1970s, the first weather satellites dramatically changed this, sending vital new information back to Earth that improved our understanding of the Southern Hemisphere's weather patterns.
At the same time, as supercomputing became cheaper and more powerful, numerical models began to replace the entirely manual analysis of early forecasters.
A forecast model solves fundamental equations of fluid dynamics and heat transfer to compute the evolution of the atmosphere with time (or “the weather”, in other words). While the basic formulae for doing this, based on Newtonian physics, have been known for almost a century, we had to wait for the growth in computing power to apply this knowledge to weather prediction.
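As a flavour of what "computing the evolution of the atmosphere" means, here is a deliberately tiny Python sketch: one-dimensional advection (a blob carried along by a constant wind), solved with a first-order upwind scheme. Real forecast models solve far richer coupled equations on a three-dimensional global grid, but the step-by-step time-marching idea is the same.

```python
# Toy numerical model: 1-D linear advection (du/dt + c*du/dx = 0),
# stepped forward in time with a first-order upwind finite-difference scheme.
c, dx, dt = 1.0, 1.0, 0.5   # wind speed, grid spacing, time step (CFL = 0.5)
u = [0.0] * 20
u[5] = 1.0                  # a single "blob" of, say, moisture at cell 5

for step in range(10):
    # each cell is updated from its own value and its upwind neighbour
    u = [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(len(u))]

peak = max(range(len(u)), key=lambda i: u[i])
print(peak)  # 10 - the blob has drifted downwind from cell 5
```

The scheme smears the blob out a little as it moves (numerical diffusion), which hints at why model design is a science in itself.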
The role of the forecaster continues to evolve as numerical prediction skill improves further and extra, more frequent, observations become available.
Ironically, rather than being challenged by limited information, modern forecasting techniques grapple with how best to assimilate the terabytes of data that flood in from all manner of observation sources and models.
Our region’s next-generation satellite is now in orbit
Modern meteorology is underpinned by satellites, providing real-time situational awareness, such as the position of a tropical cyclone, and the major initial input into weather models.
The main source of data is the polar orbiting satellites. These operate about 700 km above the Earth, roughly twice the altitude of the International Space Station. From this vantage point, instruments aboard the satellites can retrieve a vertical profile of the atmosphere, revealing such things as moisture, winds and temperatures.
Polar orbiting satellites scan the same area of the planet only twice a day – not frequent enough if you are concerned about developing severe weather. For more frequent updates, forecasters rely on satellites in a geostationary orbit.
At an altitude of 35,786 km above the Equator, a geostationary satellite has the same orbital period as the rotation of the Earth, and there is effectively no relative motion between the satellite and the ground.
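That altitude is not arbitrary: it falls out of Newtonian gravity once you ask for a circular orbit whose period matches one sidereal day. A short Python check, using standard physical constants:

```python
import math

# Why 35,786 km: find the circular orbit whose period equals one sidereal day.
# From Newton's law of gravitation: r^3 = G * M * T^2 / (4 * pi^2).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24         # mass of the Earth, kg
T = 86164.1          # one sidereal day, in seconds
R_equator = 6378e3   # Earth's equatorial radius, m

r = (G * M * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - R_equator) / 1000
print(round(altitude_km))  # close to 35,786 km
```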
For weather applications, this allows continuous monitoring of the area visible to the satellite, equating to roughly 40% of Earth’s surface. Visible and infra-red images of cloud cover from these satellites are familiar to most people, being routinely shown on television weather bulletins and on the Bureau of Meteorology’s website.
In October last year, the Japanese Meteorological Agency launched the 3.5-tonne Himawari-8 satellite into geostationary orbit above the western Pacific region, the first of a new generation of advanced meteorological satellites.
It provides a significant increase in the spatial and temporal resolution of satellite images – to 500 m spatially, with a new image every ten minutes – giving forecasters rapid updates on developing meteorological conditions, particularly in areas without radar coverage.
A key benefit will be the ability to observe thunderstorm formation. Other benefits will be seen in the detection of tropical cyclone genesis, detection and tracking of bushfire movements using hotspot algorithms, improved observation of fog, and faster detection and analysis of volcanic eruptions.
From September 2015 the imagery from Himawari-8 will be available on the Bureau’s website.
Improvements in Numerical Weather Prediction
One of the biggest contributors to improved weather forecasts is the increase in supercomputing power. The Bureau's new supercomputer – to be built by Cray, costing A$77 million and funded by the Federal Government – will be the fastest in Australia when it becomes operational in mid-2016.
But it’s not a case of simply upgrading to new hardware – improved forecasts are dependent on taking advantage of the increased computing power. Over the next few years, the Bureau will use the supercomputer to implement a next-generation high-resolution weather forecasting model.
Weather forecasts start in the real world, with data about what's actually happening at the start of the forecast period. The data – including temperature, humidity, surface pressure and wind, collected from a variety of sources – are fed into models in a process known as data assimilation.
As the models improve, and more data becomes available, techniques for data assimilation must also be updated.
In the Southern Hemisphere, satellite data can make up more than 95% of the observational data fed into forecasting models.
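At its very simplest, data assimilation blends the model's prior guess with a new observation, weighting each by how much it is trusted. The single-variable Python sketch below illustrates the idea; operational systems apply the same principle, vastly scaled up, to millions of variables at once.

```python
# Toy data assimilation for one variable (e.g. temperature at one grid point).
# The "analysis" is a variance-weighted blend of the model's background guess
# and the observation - low variance means high trust.
def assimilate(background, bg_var, obs, obs_var):
    gain = bg_var / (bg_var + obs_var)        # how much to trust the observation
    analysis = background + gain * (obs - background)
    analysis_var = (1 - gain) * bg_var        # the blend is more certain than either input
    return analysis, analysis_var

temp, var = assimilate(background=21.0, bg_var=4.0, obs=23.0, obs_var=1.0)
print(round(temp, 1))  # 22.6 - pulled most of the way toward the trusted observation
```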
Recently the Bureau tested a prototype forecast model in New South Wales with a resolution of 1.5 km and hourly updates. This type of high-resolution model can assimilate 10-minute data from Himawari-8, and allows us to capture thunderstorms and sea breezes that are too fine in scale for current forecast systems.
The forecast model takes all available observations and essentially evolves the simulated atmosphere forward in time to create the actual weather forecast.
The models do this by breaking the atmosphere up into small grid boxes, or cells. The current regional model has cells that are 12 km wide – too large to represent individual clouds, which are typically hundreds of metres across. The model therefore estimates these "sub-grid-scale" processes using simplified physical approximations known as parameterisations.
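A rough back-of-the-envelope calculation, sketched in Python, shows why finer grids demand a supercomputer: shrinking the cells multiplies the number of grid boxes, and the time step must shrink in proportion too.

```python
# Rough scaling argument for the cost of higher resolution.
# Going from 12 km cells to 1.5 km cells (the spacings mentioned in the text):
coarse, fine = 12.0, 1.5
factor = coarse / fine      # 8x finer grid spacing

cells = factor ** 2         # 64x more grid boxes over the same horizontal area
steps = factor              # ~8x more time steps for the same forecast period
print(int(cells * steps))   # ~512x more computation, before any extra physics
```

This is a simplification (it ignores vertical levels and the extra work of richer physics), but it conveys the scale of the challenge.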
As the resolution of the models increases, the sub-grid-scale physics has to evolve. This is both a boon and a challenge for forecasting.
Get ready for new weather services
Taken together, the new satellite, the new supercomputer and advancing science mean the public can expect a step change in weather forecasting services over the next decade. Improvements can be expected in near-real-time information for unfolding weather events, and in lead times for forecasts that assist warning, response and recovery efforts for severe weather.
As with all advances in technology, it is impossible to predict what some of the new service opportunities will be, as they connect with advances in communication and technology, but we know they’ll continue to evolve and excite, for both meteorologists and the public.
The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations.
Authors: The Conversation