Philip L. Woodworth, 4 August 2016.
One of the main objectives of the research at Bidston Observatory was to understand more about the dynamics of the ocean tides, that is to say, the physical reasons why the tide propagates through the ocean as it is observed to do. Before the advent of digital computers, the only way to approach these questions was from basic mathematical perspectives, in which eminent scientists such as Pierre-Simon Laplace in France excelled in the 18th and 19th centuries, and in which Joseph Proudman at Bidston was an acknowledged expert in the 20th century.
Similarly, there has always been considerable interest in the reasons for large non-tidal changes in sea level, in particular those due to the ‘storm surges’ generated by strong winds and low air pressures in winter. For example, following the Thames floods of January 1928, Arthur Doodson at Bidston chaired a committee for London County Council which undertook a detailed study of the causes of the storm surge responsible for the flooding, and made recommendations for protecting the city in the future.
These areas of research were revolutionised in the mid-20th century, stimulated by public concern following the major floods and loss of life in East Anglia in 1953 (Figure 1c,d), and finally made possible by the availability of modern computers in the 1960s. A key figure in applying computers to this work at Bidston was Norman Heaps, who joined the staff in 1962 and around whom a group of ‘modellers’ and ‘student modellers’ eventually formed, including Roger Flather, Judith Wolf, Eric Jones, David Prandle and Roger Proctor.
(As a digression, we may also mention the attempted simulation of storm surges in this period using electronic circuits, in effect analogue computers, by Shizuo Ishiguro, the father of the novelist Kazuo Ishiguro, at the Institute of Oceanographic Sciences at Wormley in Surrey. These devices were made redundant by digital computers. Ishiguro’s equipment can be seen at the Science Museum in London.)
Computer modelling of the tides has many similarities to the modelling of storm surges. In both cases there are external forces involved: gravitational forces due to the Moon and Sun in the case of the tides, and meteorological forces (winds and air pressure changes) in the case of storm surges. These forces act on the ocean, inducing currents and redistributing volumes of water.
So the first thing a modeller has to know is how large the forces are. These are provided by astronomy in the case of the tides, and by meteorology for storm surges (e.g. information from the Met Office). In the case of the 1953 storm surge, the effect of the wind can be appreciated from Figure 1(a), which shows a deep depression crossing from west to east and strong winds from the north pushing water into the southern part of the North Sea. The winds are especially important in this case: their force is determined by the ‘wind stress’, which is proportional to the square of the wind speed, and the dynamics are such that the largest surges occur where the wind stress divided by the water depth is greatest. In other words, bigger surges occur in shallower waters, such as those of the southern North Sea or the German Bight.
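To make that scaling concrete, here is a minimal sketch in Python of the standard bulk formula for wind stress and of the stress-over-depth ratio discussed above; the drag coefficient and the example wind speed and depths are illustrative assumptions, not values from any operational model.

```python
# Minimal sketch: bulk formula for wind stress, tau = rho_air * C_D * U^2,
# and the tau/depth ratio that controls the size of the surge response.
# The constants below are illustrative assumptions.

RHO_AIR = 1.25    # air density (kg/m^3)
C_D = 1.5e-3      # dimensionless drag coefficient (assumed constant)

def wind_stress(wind_speed_ms):
    """Wind stress (N/m^2), proportional to the square of the wind speed."""
    return RHO_AIR * C_D * wind_speed_ms ** 2

# The same gale produces a far larger surge-driving term in shallow water:
tau = wind_stress(25.0)                # a 25 m/s northerly gale
for depth in (200.0, 50.0, 20.0):      # open shelf -> southern North Sea (m)
    print(f"depth {depth:5.0f} m: tau/h = {tau / depth:.2e} N/m^3")
```

Doubling the wind speed quadruples the stress, and halving the depth doubles the driving term, which is why the shallow southern North Sea responds so strongly.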
The next problem is to determine what the impact of these forces is, and for that the computer solves sets of mathematical equations at each point of a grid distributed across the ocean (e.g. Figure 2); these equations are in fact the same ones that Proudman and others used, but which could not be applied in this way at the time. The output of the models consists of long records of sea level changes and currents at every point of the grid: as an example, Figure 1(b) maps the maximum surge during the 1953 event. Layers of ‘nested models’ enable very detailed information to be provided to coastal users in particular localities.
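As a flavour of what ‘solving the equations on a grid’ involves, the following heavily simplified sketch steps the one-dimensional, linearised shallow-water equations forward in time on a staggered grid. It omits rotation, friction and any tidal or wind forcing, and all of its numbers are illustrative assumptions; the operational models solve the full two-dimensional equations.

```python
import math

# Heavily simplified sketch of 'equations on a grid': the 1-D linearised
# shallow-water equations, stepped forward with a forward-backward scheme.
# Rotation, friction and forcing are omitted; all values are illustrative.

G = 9.81           # gravity (m/s^2)
H = 50.0           # uniform water depth (m)
DX = 10_000.0      # grid spacing (m)
N = 100            # number of grid cells (a 1000 km channel)
DT = 0.5 * DX / math.sqrt(G * H)   # time step within the CFL stability limit

eta = [0.0] * N        # sea-surface elevation at cell centres (m)
u = [0.0] * (N + 1)    # depth-averaged current at cell faces (m/s)

eta[N // 2] = 1.0      # start from a 1 m hump of water in mid-channel

for step in range(500):
    # Momentum equation: du/dt = -g * d(eta)/dx  (interior faces only,
    # so the channel ends act as closed, reflecting walls).
    for i in range(1, N):
        u[i] -= DT * G * (eta[i] - eta[i - 1]) / DX
    # Continuity equation: d(eta)/dt = -H * du/dx
    for i in range(N):
        eta[i] -= DT * H * (u[i + 1] - u[i]) / DX

print(f"after {500 * DT / 3600:.1f} hours, max elevation = {max(eta):.2f} m")
```

A real surge model adds the Coriolis force, bottom friction, wind stress and air pressure terms, and solves the analogous equations over a two-dimensional grid like that of Figure 2.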
As Figure 1 demonstrates, surge modelling is particularly important to people who live at the coast. The Met Office can provide forecasts of winds and air pressures up to five days ahead, which can be used to force the computer models. And, because the models thankfully run faster than ‘real time’, they can forecast the likely magnitude of storm surges several days ahead, enabling flood warnings to be issued. In the case of London, the operational warnings can be used to decide whether or not to close the Thames Barrier (Figure 1d).
These forecast techniques, developed at Bidston by Norman Heaps, Roger Flather and others, were first used operationally at the Met Office in 1978, and successor models, conceptually the same, are still used there to provide warnings to the Environment Agency. Similar schemes have been adopted by other agencies around the world. Storm surge models developed at Bidston have also been applied to areas such as the Bay of Bengal, where surges can be considerably larger than around the UK and where there has been great loss of life on many occasions.
Modelling at Bidston later developed into studying the 3-dimensional changes in the ocean that result in the transport of sediments or pollutants (‘water quality modelling’) or that have impacts on ecosystems. Modelling has also been applied to topics such as the safety of offshore structures and renewable energy. The same sort of computer modelling is now used throughout environmental science. For example, the models that the Met Office uses for weather forecasting, or that the Hadley Centre uses to predict future climate, employ the same principle of solving physical equations on a grid.
But every modeller knows that their model provides only an approximate representation of the real world, and to help the model along there is sometimes a need to incorporate real measurements into the model scheme, in order to constrain the mathematical solutions on the grid. Such models are called ‘assimilation models’, of which weather forecast models are the most obvious examples.
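The simplest illustration of the idea is ‘nudging’, in which the model state is relaxed toward an observation wherever one exists. The sketch below is only a cartoon of that principle, with made-up numbers; real assimilation schemes are considerably more sophisticated.

```python
# Cartoon of 'nudging' assimilation: after each model step, relax the
# modelled value toward an observation where one is available. The
# relaxation weight and the sample values are made-up illustrations.

def nudge(model_value, observed_value, weight=0.2):
    """Blend a model estimate toward an observation.

    weight = 0 keeps the pure model value; weight = 1 replaces it
    entirely with the observation.
    """
    return model_value + weight * (observed_value - model_value)

# e.g. the model says 0.80 m of sea level where a tide gauge reads 0.95 m:
print(f"analysed sea level: {nudge(0.80, 0.95):.3f} m")   # -> 0.830 m
```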
This enables us to return to tide modelling. Scientists at Bidston developed many regional models of the ocean tide for engineering applications as well as scientific research. These models tended to have ‘open boundaries’ where the region of the model grid meets the wider ocean. In these cases, it is normal to prescribe ‘boundary conditions’ which specify the tide at the boundary, and which are in effect a form of data assimilation. However, when tide models were made for large regions, or for the whole ocean with no open boundaries, problems were found in obtaining acceptable results, because the assumptions that went into the computer codes were not universally applicable or missed some aspects of the tidal dynamics. Assimilation of sea level measurements from tide gauges, and from radar satellites in space, provided a solution to these problems.
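To illustrate what prescribing the tide at an open boundary means in practice, here is a minimal sketch in which the boundary elevation is built up as a sum of harmonic constituents; the amplitudes and phase lags shown are hypothetical example values, not those of any Bidston model, although the angular speeds of M2 and S2 are the standard ones.

```python
import math

# Minimal sketch of a tidal open-boundary condition: the elevation along
# the boundary is prescribed as a sum of harmonic constituents,
#   eta(t) = sum_i H_i * cos(omega_i * t - g_i).
# The amplitudes and phase lags below are hypothetical example values;
# the angular speeds of M2 and S2 are the standard ones.

CONSTITUENTS = [
    # (name, amplitude H in m, speed in degrees/hour, phase lag g in degrees)
    ("M2", 1.50, 28.984, 120.0),   # principal lunar semidiurnal tide
    ("S2", 0.50, 30.000, 160.0),   # principal solar semidiurnal tide
]

def boundary_elevation(t_hours):
    """Prescribed tidal elevation (m) at the open boundary at time t (hours)."""
    return sum(
        amp * math.cos(math.radians(speed * t_hours - phase))
        for _name, amp, speed, phase in CONSTITUENTS
    )

# Sample the prescribed boundary tide across one semidiurnal cycle:
for t in range(0, 13, 3):
    print(f"t = {t:2d} h: eta = {boundary_elevation(t):+.2f} m")
```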
In the last decade, a number of excellent parameterisations of the global ocean tide have become available. Some are based purely on measurements from space (e.g. Figure 3), others on computer tide models that use only the known dynamics, and others are hybrid models that employ data assimilation. The latter two schemes provide information on tidal currents as well as tidal elevations. All three techniques agree to within 1–2 cm, which is a superb achievement. Proudman could never have dreamed of knowing the tide around the world so well, and it is thanks to him and others at Bidston leading the way that we now understand why the tide is so complicated.
The tide and surge models described above are usually operated in 2-dimensional mode (i.e. with the currents at each point of the grid taken as averages through the water column), and such model codes are relatively straightforward to construct and fast to run. A big change since the 1960s, when the first such models were constructed, is that modellers nowadays tend not to write their own codes, but instead adapt sophisticated modelling packages written by others. This enables them to construct the 3-dimensional models of much greater complexity that are now used in research.
Numerical modellers now make up one of the largest groups of scientists in oceanographic laboratories such as the National Oceanography Centre in Liverpool (the successor of Bidston Observatory). Their models provide a way to make maximum use of oceanographic measurements from ships, satellites and robotic instruments (the ocean is a big place, and there are never enough measurements), and a way to forecast how conditions in the ocean might evolve. It is inevitable that oceanography, and many other areas of science, will rely on modelling even more in the future.
Some References for More Information
- Cartwright, D.E. 1999. Tides: a scientific history. Cambridge: Cambridge University Press. 292pp.
- Heaps, N.S. 1967. Storm surges. In: Oceanography and Marine Biology: an Annual Review, Volume 5, edited by H. Barnes. London: Allen & Unwin, pp. 11–47.
- Murty, T.S., Flather, R.A. and Henry, R.F. 1986. The storm surge problem in the Bay of Bengal. Progress in Oceanography, 16, 195–233, doi:10.1016/0079-6611(86)90039-X.
- Pugh, D.T. and Woodworth, P.L. 2014. Sea-level science: Understanding tides, surges, tsunamis and mean sea-level changes. Cambridge: Cambridge University Press. ISBN 9781107028197. 408pp.
- Stammer, D. and 26 others. 2014. Accuracy assessment of global barotropic ocean tide models. Reviews of Geophysics, 52, 243–282, doi:10.1002/2014RG000450.
- Wolf, J. and Flather, R.A. 2005. Modelling waves and surges during the 1953 storm. Philosophical Transactions of the Royal Society A, 363, 1359–1375, doi:10.1098/rsta.2005.1572.