Nº 9, Autumn 2018



The need and the opportunity to study the economic impact of tsunamis on Spanish coasts


Dr Miguel Llorente Isidro - Geological Risks Area. Geological Survey of Spain (IGME). (m.llorente@igme.es)
Dr Jorge Macías Sánchez - Group for Research into Differential Equations, Numerical Analysis and Applications, University of Malaga. (jmacias@uma.es)

1. Introduction

In the last 20 years tsunamis have been responsible for hundreds of thousands of direct mortal victims and over 11 million displaced persons (1), which makes this type of phenomenon the deadliest of natural hazards in the history of humanity. Not all zones in the world are prone to suffering tsunamis: only regions near a substantial body of water, or connected to one via some geological feature, are exposed, and among the zones where a tsunami might be triggered the chances of one actually occurring vary in each case. More than half of recorded tsunamis (2) trace their origin to the Pacific Ocean (Figure 1); more precisely, roughly 60% of all reported tsunamis originate from what is known as the Ring of Fire, the zone close to the coasts of Asia and America where oceanic tectonic plates collide with continental ones and produce powerful seismic and volcanic activity. What is striking is that, while 10% of events are concentrated in the Atlantic Ocean (including the Caribbean Sea), in the Mediterranean the figure is approximately 13%. To put this into perspective, the Mediterranean is some 70 times smaller than the Pacific Ocean, meaning that, per unit of area, the Mediterranean Sea is more tsunamigenic than the Ring of Fire. One could put forward the hypothesis that records of tsunamis are more complete in the Mediterranean than in other parts of the world for reasons associated with human settlements, as well as the zone's level of development and capacity to record them. The hypothesis can also be advanced that even the logging of events in the Mediterranean is not complete, given that it is plausible that there have been tsunamis that went unnoticed or whose geological footprint still lies undiscovered. Despite these two caveats concerning the incompleteness of records, some very interesting initial conclusions can be drawn.
A case in point: for tsunamis affecting the coasts of the Iberian Peninsula, the annual likelihood of occurrence is 14% in the Mediterranean and 5% in the Atlantic, and the phenomenon may be even more frequent if the propositions about incompleteness were to be confirmed. UNESCO's IOC (3) is, if anything, more forceful in how it expresses this probability: it states that, over an average 30-year interval, it is almost certain that a major tsunami will happen in the Mediterranean.

Figure 1. Database on tsunamis of the US National Oceanic and Atmospheric Administration (NOAA)(2) (https://maps.ngdc.noaa.gov/). The colours show the degree of impact on human lives and the shape of the symbol indicates their cause (earthquakes: a circle, volcanoes: a triangle, landslides: a square).

It is estimated that over 40% of the world's population live in coastal zones (4), although, as recent tsunami events have shown (Sumatra, 2004; Japan, 2011; Indonesia, 2018), those affected are not just the inhabitants of the zone but also tourists, since tourist activity is largely associated with the coast. So much is this the case that the third World Tsunami Awareness Day focused partly on this particular feature of tsunami risk.

In Spain (5) it was estimated that overnight stays in 2017 peaked at 40,000 people a day, concentrated in beach areas. If we take the Lisbon tsunami of 1755 as a benchmark, we can make a rough calculation of the number of potential Spanish victims of an equivalent tsunami happening today. In 1755, some 1,214 people perished in Spain (6). At that time the population was 9 to 10 million (7), and given that the population has since grown roughly fivefold (or even twentyfold in coastal zones), an educated guess would put the figure in the 5,000 to 24,000 person range. In other words, in terms of mortal victims, the yardstick magnitude for an event on a par with that of 1755 could in theory reach some 60,000 people once both tourists and locals are taken into account. In terms of those affected, the figure would be far higher still. This rough guess should be revised and refined so as to offer a figure based on more advanced methods and on the most and best-quality data available.
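The back-of-the-envelope scaling above can be written out explicitly. This is only the article's own arithmetic (the 1755 death toll times approximate population growth factors), not a risk model; the growth factors are the approximations quoted in the text.

```python
# Back-of-the-envelope scaling of the 1755 death toll to today's
# population, following the article's own reasoning. The growth factors
# (roughly x5 nationally, x20 in coastal zones) are the approximations
# quoted in the text; this is simple arithmetic, not a risk model.

DEATHS_1755 = 1214        # fatalities in Spain, 1755 Lisbon tsunami (ref. 6)
GROWTH_NATIONAL = 5       # approximate population growth since 1755
GROWTH_COASTAL = 20       # approximate growth in coastal zones

low = DEATHS_1755 * GROWTH_NATIONAL
high = DEATHS_1755 * GROWTH_COASTAL
print(f"Equivalent event today: roughly {low:,} to {high:,} mortal victims")
```

The result, roughly 6,000 to 24,000, lands in the same range quoted in the text before tourists are added in.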

2. What is a tsunami?

To understand what a tsunami is, we should start with a simpler concept: that of a wave. If you cast a stone into a deep pool of calm water, waves or ripples form. In this example the distance from one ripple to the next is minimal, a few centimetres or so, and the height of the ripples even less. We term the distance between successive crests the wavelength, the time two crests take to pass through the same point the period, and half the height between the peak and the trough the amplitude (Figure 2). If it were possible to track the water molecules in these ripples with a marker pen, a circle would be drawn close to the surface and, the deeper we go, circles of increasingly smaller radius would be traced out. There is a depth, equal to half the wavelength, beyond which the water in the pool no longer moves. The theoretical exercise with the marker pen and the water molecules also serves to observe something else: the movement of an ideal wave does not transport water. This is why we call such phenomena oscillatory disturbances. If two people hold a rope, one at each end, and shake it to produce waves, it can be seen that there is indeed no transport of mass, since neither person gains or loses rope; only energy is carried. The energy in these waves is better described in terms of their wavelength than their amplitude, because the greater the wavelength, the greater the volume disturbed.
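The "half the wavelength" rule of thumb follows from the exponential decay of orbital motion with depth in deep-water wave theory. A minimal sketch, with illustrative numbers:

```python
import math

# Orbital motion in a deep-water wave decays exponentially with depth:
# r(z) = a * exp(-2*pi*z / wavelength). At a depth of half the wavelength
# the orbital radius is down to exp(-pi), about 4% of its surface value,
# which is why that depth is taken as the practical limit of motion.

def orbital_radius(surface_amplitude, depth, wavelength):
    k = 2 * math.pi / wavelength          # wavenumber
    return surface_amplitude * math.exp(-k * depth)

a, lam = 1.0, 90.0    # illustrative: 1 m amplitude, 90 m wavelength
print(f"radius at half the wavelength: {orbital_radius(a, lam / 2, lam):.3f} m")
```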

Figure 2. Simplified diagram of a normal wave approaching the coast and the different zones of the beach according to wave shape. In red, the movement of a particle in the water.

In the sea, waves are mainly caused by the wind and by the gravitational pull of the bodies in the sun-earth-moon system. In round figures, waves in the Atlantic Ocean have a wavelength of 90 metres and those in the Pacific of 300 metres. A tsunami, by contrast, has a wavelength of the order of tens of kilometres. Whereas a normal wave has a period of around 15 seconds, a tsunami typically has periods in the range of 10 to 30 minutes (8), or even more. Given that the average depth of the oceans is about 3,000 metres, a tsunami affects or moves the ocean's entire water column, or almost all of it.

As a wave approaches the coast it rears up on itself, breaks and then advances inland like a sheet of water (Figure 2), although this all depends, besides the characteristics of the wave, on the geometry of the coast, in both its emerged and submerged parts. The waves lose their shape and break because the water cannot maintain its circular pattern when the depth is less than half the wavelength, to the point where the oscillation is reduced to a simple back-and-forth movement parallel to the bottom. This is why a tsunami is more like a swash or spreading-out zone than a proper wave, and its march inland keeps to an oscillatory pattern more or less parallel to the terrain (if we ignore turbulence). When the water's movement inland dominates, we say the tsunami is in its run-up; when its movement out to sea takes over, we say it is in its backwash. Whereas the run-up is decelerating (working against gravity and over a non-flooded surface, and thus one with a lot of friction), the backwash is accelerating (moving in the same direction as gravity and over an already flooded area), so a tsunami in its backwash phase is as hazardous as in run-up mode, if not more so.

Out at sea and in deep areas, a tsunami has very little height (only a few centimetres) and moves very swiftly, at a speed comparable to a commercial flight (8) (some 900 km/h). As it draws closer to the coast, however, because the water can no longer sustain its oscillatory movement, the tsunami increasingly gains height. Seen from land, a tsunami is akin to a sudden rise in sea-level which rushes landwards at the pace of a galloping horse (8) (almost 40 km/h). In short, a tsunami can unleash coastal floods whose destructive power stems from the speed at which the water travels, its volume, and the number of times it repeats, since it is not in fact a single "wave" but rather a "train of waves", several in succession. To this we should add that as the flow advances it amasses sediment (rocks, sand) and all manner of objects (cars, waste, tree-trunks and plant detritus), which only add to its capacity to wreak havoc.
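Both speed figures follow from the long-wave (shallow-water) approximation, in which the phase speed depends only on depth: c = sqrt(g·h). The depths below are illustrative choices, not figures from the text:

```python
import math

# Long-wave (shallow-water) phase speed: c = sqrt(g * h). The same formula
# reproduces both comparisons in the text: roughly a galloping horse near
# the shore and a commercial flight over very deep water.

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed_kmh(depth_m):
    return math.sqrt(G * depth_m) * 3.6   # m/s -> km/h

for h in (10, 100, 3000, 6000):
    print(f"depth {h:>5} m -> {tsunami_speed_kmh(h):6.0f} km/h")
# 10 m of water gives ~36 km/h; 6,000 m gives ~870 km/h
```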

3. What triggers tsunamis?

At this stage the question arises of what causes tsunamis. The NOAA database (2) already points to some of the processes that most regularly give rise to tsunamis, which, in order of frequency, are earthquakes, landslides and volcanic eruptions. The fact is, however, that any phenomenon capable of shifting a large volume of water in only a short time will be able to prompt a tsunami.

3.1. Tsunamis caused by earthquakes

Earthquakes are in fact occurring constantly, but we only register those that fall within the measuring range of our instruments. On average, more than one million earthquakes a year with a magnitude (Mw) (9) of over 2 are recorded. As the magnitude increases, the frequency falls off swiftly: every year, on average, 20 earthquakes of Mw 7 or greater are recorded (Figure 3).

Figure 3. Relationship between the number of earthquakes and their magnitude, with an equivalent in energy released (in kg of trinitrotoluene, or TNT) (10).
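The steep drop-off with magnitude is usually described by the Gutenberg-Richter law, log10 N(≥M) = a − b·M. Taking the commonly quoted b ≈ 1 and calibrating a so that N(≥2) matches the roughly one million events a year cited above gives the right order of magnitude for the ~20 annual events of Mw ≥ 7 (the calibration here is an assumption for illustration):

```python
# Gutenberg-Richter frequency-magnitude law: log10 N(>=M) = a - b*M.
# With b ~ 1 (a commonly quoted value) and a = 8, chosen so that
# N(>=2) = 1e6 events/year, each extra unit of magnitude divides the
# annual count by ~10.

def annual_count(magnitude, a=8.0, b=1.0):
    """Expected number of earthquakes per year of at least `magnitude`."""
    return 10 ** (a - b * magnitude)

for m in (2, 5, 7, 9):
    print(f"Mw >= {m}: ~{annual_count(m):,.1f} events/year")
```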

For an earthquake to provoke a tsunami there has to be vertical deformation at the surface of the floor of the body of water (Figure 4). Earthquakes in zones of convergent or divergent faulting (a reverse or normal fault focal mechanism) are more prone to triggering tsunamis than those produced in zones of transform (or transcurrent) faulting, since a reverse or normal fault implies that most of the deformation is vertical, whereas in transform faults the deformation is mostly along a horizontal plane. Furthermore, since the deformation of rocks by an earthquake is finite, deep earthquakes (with a hypocentre more than 100 km down) cannot cause tsunamis even if they are of great magnitude, because the deformation does not become significant at the floor of the body of water.

Given that depth is perhaps an even more critical aspect than the focal mechanism, a map of earthquakes by depth, which delineates the contours of the tectonic plates quite well, offers an initial insight into the zones most likely to produce earthquake-triggered tsunamis. According to the International Seismological Centre's catalogue, the vast majority of earthquakes happen at a depth of under 50 km (Figure 5), and these are indeed capable of generating tsunamis if they are of substantial size. Nonetheless, most earthquakes close to the surface are predominantly of the transform-mechanism kind. This is why tsunamis of seismic origin are not as frequent as shallow earthquakes.

Figure 4. The focal mechanisms of an earthquake (circles with black and white zones) provide a graphic description of the numerical matrix that characterises the relative shifting of two blocks separated by a fault (11), highlighted by a fine red line for didactic purposes. The indication of the tsunami's "size" is merely a guide, other things being equal and without taking into account an earthquake's secondary effects (such as underwater or coastal slides).

Figure 5. Distribution of the world’s earthquakes where the colour represents the depth of the hypocentre. Modified from the International Seismological Centre (12).

In principle, the idea that the crust needs to deform vertically for a tsunami to be generated sounds fairly obvious. A rudimentary experiment at home shows that a sudden vertical variation at the bottom of a bucket of water produces ripples on the surface. Yet an earthquake can deform the crust not only by a vertical slip: a transform fault can also move the ocean floor if the terrain is complex and rugged enough for a vertical face to be displaced. This is the first of the two hypotheses now being considered as the possible cause of the recent tsunami in Indonesia on 28 September 2018. The second hypothesis advanced for this tsunami (which left 2,000 victims and some 5,000 people unaccounted for) is that the earthquake could have triggered undersea slides, which in turn caused the tsunami (13). Two clear cases of crustal deformation involving vertical slip are the Sumatra tsunami of 26 December 2004 (which claimed several hundred thousand victims) and the earthquake in Japan of 11 March 2011 (which claimed tens of thousands); both have a reverse-fault focal mechanism. This type of rupture, the reverse or compressive kind, is the one that sparks tsunamis most easily, since the energy that rocks can build up under compression is far greater than what they can accumulate under traction, so, other factors being equal, the deformation zone tends to spread much more extensively than with normal or transform mechanisms.

At the international meeting of experts on tsunamigenic sources held at the University of Malaga on 6 and 7 November 2017 (14), three zones capable of generating tsunamis with a significant potential impact on Spanish coasts were identified. These are the area of the Gulf of Cadiz in a broad sense (where the 1755 Lisbon earthquake and tsunami originated), the Alboran Sea (where seismic activity is abundant) and the northern fringe of Africa (the origin, on 21 May 2003, of the first recorded seismogenic tsunami involving economic loss, mainly in the Balearic Islands, for which the Consorcio de Compensación de Seguros paid out some 350,000 euros) (15).

3.2. Tsunamis caused by landslides

Landslides occur as a corollary to a whole sequence of geological processes. The force which ultimately acts is gravity, causing part of a slope to move downhill to a new equilibrium position. On emerged slopes, a mere change in humidity can sometimes be enough, or the passage of an animal, the wind, the rain or an earthquake. The correlation between the magnitude of earthquakes and the landslide-affected area (LAA) has been known for decades (16), and advances are being made towards more refined concepts such as the correlation between peak ground acceleration and the LAA (17). Be this as it may, a rocky mass falling into a body of water can provoke a large wave. In fact, the largest tsunami ever recorded in terms of height attained (9 July 1958) was caused by a landslide into Lituya Bay, in Alaska, triggering a wave whose run-up was over 520 metres (18). Simulating this kind of mega-tsunami represents a challenge for mathematical modelling, and the only numerical simulation conducted with any success on the actual geometry of Lituya Bay was by the Group for Research into Differential Equations, Numerical Analysis and Applications at the University of Malaga (the EDANYA group (19)). Another example that no introduction to tsunamis can leave out is the Vajont Dam in Italy on 9 October 1963. The slope that slid down next to the dam entered the reservoir at 100 km/h, prompting a wave which overtopped the dam (262 m high) and released 30 million cubic metres of water downstream (20). This claimed 2,000 victims.

Unlike tsunamis caused by earthquakes, those triggered by landslides affect a smaller area and are thus held to be local tsunamis. This criterion can be used to find out (at least in part) the source of a tsunami when this lies under the sea. Nevertheless, given that we know there is a correlation between earthquakes and landslides in emerged areas, it appears reasonable to think that there might also be a correlation between undersea earthquakes and landslides, so the genesis of many tsunamis could entail an intricate combination of both effects. Other coseismic phenomena, such as soil liquefaction, may also represent triggers of landslides, including those beneath the sea.

It was not until the event in Papua New Guinea on 17 July 1998 (an earthquake and a landslide under the sea) that serious thought began to be given to subsea landslides as a significant cause of tsunamis (21). This tsunami claimed in excess of 2,000 victims. As a result, bathymetric research campaigns were undertaken to describe the sea floor and landslide deposits better, thereby improving the ability to recognise and map other, similar events since then. The hypothesis of a major landslide on the volcanic flank of the island of La Palma in the Canary Islands, where there are records of a multitude of mega-landslides (22), has in fact received intense media coverage. Yet study of such landslides suggests that their frequency is of the order of tens of thousands of years and that they are associated with the volcanic processes that build these volcanic structures.

In Spain (and in other nearby volcanic regions) the chances of a mega-landslide happening are very slim, so the interest here is more academic than a matter of planning. At least this is what was concluded at the international meeting of experts in 2017 held at the University of Malaga (14). The meeting also concluded that smaller landslides close to population centres might actually give rise to locally significant impacts. Indeed, the slopes of the Italian volcanoes Etna and Stromboli have been monitored for decades now, for the dual purpose of volcanic surveillance and watching for mega-landslides, which are likewise linked to tsunamis (23).

3.3. Tsunamis caused by volcanic eruptions

Volcanic processes embrace many different types of phenomena, and most of them involve, or can involve, the movement of a substantial volume of matter: landslides, explosions and pyroclastic flows, among others. The best-known volcanic incident and subsequent tsunami is perhaps that of the island of Krakatoa on 27 August 1883, where the eruption culminated in a wholesale explosion of the volcanic structure. A large portion of the island vanished as a consequence; 296 towns were destroyed and 36,000 people perished, over 90% of them as a result of the ensuing tsunami (24). It is calculated that the energy released by the explosion was comparable to 10,000 atomic bombs like the one dropped on Hiroshima (10). Tsunamis provoked by volcanoes account for about 5% of those currently on record (2), far fewer than those triggered by earthquakes. Here again we can point to a lack of complete records, given that this type of phenomenon is extremely difficult to observe. There are 60 records of volcano-triggered tsunamis (2) in the 20th century. Some authors (24) identify up to eight mechanisms by which a volcano can trigger a tsunami, namely: submarine explosions; pyroclastic flows; lahars; volcanic earthquakes; landslides; the collapse of lava-flow fronts in coastal areas; the collapse of volcanic calderas due to subsidence; and shock waves from explosions in emerged areas. Of these, the most common type appears to be that caused by subsea explosions.

In Spain, the Canary Islands are the only volcanically active region, yet to date no study exists which establishes any credible relationship (or one at all useful for planning purposes) between the probability of an eruption and the likelihood of it provoking a tsunami, at least beyond the flank collapse referred to in the section on landslides (in the case of the island of La Palma). On the other hand, other volcanoes capable of causing tsunamis (albeit where it is debatable whether they could come to affect the Spanish coastline) are Etna, those of the Azores, Stromboli (which triggered an event of this kind on 30 December 2002 (23)) and others even further away.

3.4. Tsunamis caused by changes of atmospheric pressure

In the Balearic Islands coastal flooding caused by atmospheric conditions is relatively commonplace, to the point that it is described by a name applied solely to it: rissagas. In the press and in the scientific literature these have been renamed meteotsunamis, a term that is becoming increasingly common. The most recent rissaga to hit the Spanish coasts occurred on 15 July 2018 and mainly affected the port of Alcudia (Majorca). Up until the mid-1930s it was believed that this kind of extraordinary oscillation (ranked more serious than mere swells) was influenced by astronomical factors in much the same way as tides. A paper was then published (25) which correlated changes in atmospheric pressure with such rises and falls in sea-level, and this explanation has since been extensively refined in the scientific literature. The phenomenon is now described as an amplified, resonant response between oscillations in the sea and in the atmosphere (26), giving rise to disturbances which, in time and space, resemble those of a tsunami more than those of any other kind of coastal phenomenon with damage potential (such as storms, or the standing waves known as seiches (27)). In other words, a strong pressure gradient in the atmosphere couples with the waves on the surface of the sea caused by gravity or strong winds.
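One way to see why only certain atmospheric disturbances matter is the resonance condition usually invoked for meteotsunamis (Proudman resonance): the sea surface responds most strongly when the pressure disturbance travels at close to the local long-wave speed sqrt(g·h). A small sketch, where the squall speed is an illustrative assumption:

```python
# Proudman resonance, the mechanism usually invoked for meteotsunamis:
# the sea surface responds most strongly when an atmospheric pressure
# disturbance travels at close to the local long-wave speed sqrt(g*h).
# The squall speed below is an illustrative assumption.

G = 9.81  # m/s^2

def resonant_depth(disturbance_speed_ms):
    """Water depth (m) at which sqrt(g*h) equals the disturbance speed."""
    return disturbance_speed_ms ** 2 / G

squall = 22.0  # m/s, ~80 km/h pressure-jump propagation speed (assumed)
print(f"resonant depth: {resonant_depth(squall):.0f} m")  # shelf-like depths
```

For typical squall speeds the resonant depth comes out at a few tens of metres, i.e. continental-shelf depths, which is consistent with the phenomenon concentrating in shallow shelf areas and harbours.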

3.5. Tsunamis caused by meteorites

It is not possible to examine tsunamis without at least touching on this particular trigger. Firstly, were a meteorite impact to occur, it would most likely happen in one of our oceans, given that they take up 70% of the Earth's surface area. As with a rockslide, the impact of a meteorite on a body of water can cause a tsunami, with the not inconsiderable difference of the hypervelocity at which meteorites hit the Earth. If tsunamis of volcanic origin are hard to observe, this type of event is even more difficult to describe or characterise. There is in fact no irrefutable historical record of a meteorite/tsunami link, although we assume that the impact event to which the mass extinction of 65 million years ago is attributed could have brought about a major tsunami. Simulating this type of event is usually done in a similar way to a one-off explosion (28), which enables a better rough calculation of the shock waves that a meteorite arriving at hypervelocity would provoke. As things stand, exercises of this sort are interesting from a scientific standpoint, but we would need to enhance our ability to detect celestial bodies before we could offer a credible figure for the probability of such impacts.

4. Simulating tsunamis

All studies of natural hazards are undertaken with a single objective in mind: to minimise the undesirable effects when such events happen (Figures 6 and 7). Given that many natural events cannot be predicted, we resort to various techniques or strategies to evaluate them. These can include using historical or geological data extrapolated to the present situation, either in raw form or by means of tools that generate scenarios with an estimated probability of occurrence, which necessarily implies employing simulation techniques.

When assessing the impact of a natural hazard, we have to set up certain working hypotheses, and here it is particularly useful to link the probability of the trigger factor to the likelihood that the natural hazard will occur. This means disregarding the variability of results due to changes in other variables, so studies of this kind need to be updated on a continual basis. Given that the number of different triggers of a natural process, and particularly of tsunamis, can vary tremendously, any estimate that focusses on just a single one underestimates the real danger of the phenomenon happening. Despite this, with the data available nowadays we can estimate quite a lot if we focus the case study on the most likely (or best verified) cause, which, for tsunamis, is earthquakes. Another way to approach a natural hazard is to simulate a known prior event on the premise that "if it has happened in the past, it could do so again in the future". Strictly speaking, this hypothesis is sound if the processes are cyclical (such as water evaporating and raining down), but for non-cyclical processes (such as landslides or earthquakes) it loses validity in favour of a more probabilistic approach. Even in cases of flooding due to extreme rainfall, although the trigger is a cyclical process, the result (i.e. the flooding) is not, since morphological changes to the watercourse and transformation at basin level mean that identical rainfalls at different times produce different floods. This is why the validity of this kind of study tends to be limited to the near future, which for river flooding is a six-year time-frame according to the European Floods Directive (29).

The difficulty in working with earthquakes as triggers of tsunamis, i.e. with the epicentre in a body of water, lies in the fact that their characteristics cannot usually be estimated with as little uncertainty as those of continental earthquakes. This is on account of the distribution of seismographs, the vast majority of which are located onshore for obvious reasons of maintenance cost, and the fact that an earthquake with its epicentre on emerged terrain can be better characterised using supplementary studies (geological and even historical ones close to population centres). The lack of marine geology studies that focus on characterising tsunamigenic seismic sources is thus a challenge that must be faced to reduce uncertainty. Meanwhile, many sources are characterised according to the current state of our expertise, which relies mostly on instrumental seismic readings and the information gleaned from oceanographic campaigns, which yield bathymetric and geophysical data for such purposes. This information is used to delineate and characterise the properties of sources, and the probability of their rupture is calculated in association with their magnitude and the depth of the epicentre.

Figure 7. Effects of the Sulawesi earthquake and tsunami (Indonesia, 2018). Source: AP.

4.1. Crustal deformation as a tsunami source

Given that the greatest uncertainty in studying tsunamis lies in their source, and because studying marine seismo-tsunamigenic sources is very expensive (which involves intensive oceanographical efforts), reducing those uncertainties that can actually be monitored takes on particular importance. It is for this reason that there are worldwide benchmarking initiatives, which involve a sort of competition to validate case studies (both real and in the laboratory) to fine-tune calculation models. This testing is led and directed by the NOAA under its National Tsunami Hazard Mitigation Program (NTHMP), which establishes the requirements or standards to be met.

Where the source is of seismic origin, it is usually approximated by following the Okada model (30), which draws on elasticity theory to describe the crustal deformation that initially brings about the tsunami.

This model is used to derive the crustal deformation which produces the “initial condition”, i.e. the situation where there is deformation of the free surface of the sea that triggers the tsunami. What remains from this point on is to simulate the evolution of the initial wave, firstly out at sea, and then ultimately to recreate its impact on the coast.
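Okada's closed-form expressions are lengthy, so the sketch below only illustrates the role the model plays in the pipeline: turning fault parameters into an initial free-surface displacement field. The Gaussian-shaped uplift used here is a crude stand-in, not Okada's actual formulas, and every parameter is illustrative.

```python
import math

# Role of the source model in the simulation pipeline: map fault
# parameters to an initial free-surface displacement field (the "initial
# condition"). NOTE: the Gaussian-shaped uplift below is a crude stand-in,
# NOT Okada's elastic half-space formulas; all parameters are illustrative.

def toy_uplift(x_km, y_km, slip_m=2.0, length_km=50.0, width_km=20.0):
    """Placeholder seafloor uplift (m) at (x, y) km from the fault centre."""
    return slip_m * math.exp(-((x_km / length_km) ** 2 + (y_km / width_km) ** 2))

# Sample on a coarse grid: a field like this is what the propagation
# stage takes as its starting point.
eta0 = [[toy_uplift(x, y) for x in range(-100, 101, 25)]
        for y in range(-40, 41, 20)]
print(f"peak initial uplift: {max(max(row) for row in eta0):.2f} m")
```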

4.2. Simulating the flow of the tsunami out at sea

Simulating flows is possibly one of the toughest challenges in modern science, even though the equations that describe fluid motion date from the early 19th century. These are known as the "Navier-Stokes Equations" (NSEs) in honour of their creators, the Frenchman Claude-Louis Henri Navier and the Irishman George Gabriel Stokes. They derive from three of the most important pillars of science: the Law of Conservation of Mass (the Lomonosov-Lavoisier Law), the Law of Conservation of Momentum (Newton's Second Law) and the Law of Conservation of Energy (the First Law of Thermodynamics). The NSEs have no known general analytical solution (except in particular, straightforward cases), and a whole branch of science, known as CFD (Computational Fluid Dynamics) (31), is given over to approximating them via computer-assisted numerical methods. One of the most useful approximations (simplifications) for simulating water (and other Newtonian fluids) is to apply the "Non-Linear Shallow Water Equations" (NLSWEs), or Saint-Venant Equations (in honour of their developer, the Frenchman Adhémar Jean Claude Barré de Saint-Venant).

For channelled water (such as that in a canal), the movement can be simplified by assuming that, in a given section of the watercourse, the flow follows a single direction (one-dimensional simulation). For tsunamis, the approximation can likewise be one-dimensional (along profiles), yet this disregards the effects of waves interacting as they approach the coast and of the coastal geometry, as well as the nested effects of reflection and diffraction. It thereby rules out evaluating coastal amplification effects, which are critical to explaining many circumstances of a tsunami. Despite this, the 1D approach is well-suited to generalised case studies (or to those where no other approach is feasible) as long as it is carried out by first-rate specialists producing reliable results (32), which can be checked against scaled-down physical models. Two-dimensional approaches assume that at any point in the calculation domain the water can be displaced in any direction on a plane. The difference between a 1D and a 2D model in computational terms is of several orders of magnitude, both in data demand and in calculation time: on a conventional computer a simulation can take weeks to complete, depending above all on the size and resolution of the calculation domain. A 3D model in turn increases the complexity still further and at present is only used for highly detailed cases (never on a general or basin scale) and, above all, in an industrial context (turbines, engines, bridge pylons, dams) at an extremely high computational cost.
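To give a flavour of what a 1D shallow-water simulation involves, here is a deliberately minimal solver over a flat bottom using the first-order Lax-Friedrichs scheme. All values are illustrative assumptions; operational codes such as Tsunami-HySEA use far more accurate finite-volume schemes, real bathymetry and wet/dry front treatment.

```python
import math

# Minimal 1D non-linear shallow-water solver: flat bottom, first-order
# Lax-Friedrichs scheme, fixed still-water boundaries. Illustrative only.

G = 9.81
N, DX = 200, 500.0              # 200 cells of 500 m: a 100 km profile
DEPTH = 3000.0                  # still-water depth (m)

# Initial condition: 2 m Gaussian hump on water at rest
h = [DEPTH + 2.0 * math.exp(-(((i - N // 2) * DX) / 5e3) ** 2) for i in range(N)]
q = [0.0] * N                   # momentum h*u

def flux(hi, qi):
    """Physical flux of the 1D shallow-water system (mass, momentum)."""
    u = qi / hi
    return qi, qi * u + 0.5 * G * hi * hi

def step(h, q, dt):
    f = [flux(h[i], q[i]) for i in range(N)]
    hn, qn = h[:], q[:]         # boundary cells kept fixed (still water)
    for i in range(1, N - 1):   # Lax-Friedrichs update on interior cells
        hn[i] = 0.5 * (h[i-1] + h[i+1]) - dt / (2 * DX) * (f[i+1][0] - f[i-1][0])
        qn[i] = 0.5 * (q[i-1] + q[i+1]) - dt / (2 * DX) * (f[i+1][1] - f[i-1][1])
    return hn, qn

dt = 0.5 * DX / math.sqrt(G * DEPTH)   # CFL-limited time step
for _ in range(100):
    h, q = step(h, q, dt)
print(f"max elevation after {100 * dt:.0f} s: {max(h) - DEPTH:+.2f} m")
```

Running it shows the hump splitting into two waves travelling at roughly sqrt(g·h) in opposite directions; the first-order scheme also smears them out, which is precisely why production codes invest in higher-order methods.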

The scientific literature is crammed with tsunami simulation tests, and only recently has computation begun, slowly but steadily, to move from the Central Processing Unit (CPU) to the Graphics Processing Unit (GPU), thereby attaining calculation speeds in the region of 100-200 times faster (GPU architecture enables thousands of simple operations to be performed simultaneously, which allows a high degree of parallelisation of certain calculation algorithms).

One of the most advanced simulation models, in terms of accurate and computationally efficient algorithms, has been developed in Spain by the EDANYA group (33) at the University of Malaga. This team of mathematicians has studied and implemented dozens of numerical schemes, comparing their results and calculation times, until selecting the optimal code for numerical simulation of tsunamis in the context of Tsunami Early Warning Systems. The result is the Tsunami-HySEA simulation model, part of the HySEA (34) (Hyperbolic Systems and Efficient Algorithms) family of codes developed in the CUDA language for NVIDIA GPUs. Its efficiency in resolving propagation has cut calculation times from the days (or weeks) required on conventional hardware to only minutes, and even (at present) to seconds when multiple latest-generation NVIDIA cards (Tesla V100) are used. Indeed, this code has been incorporated as a calculation module into the Early Warning Systems of several countries, including Spain, although, for several reasons, these systems still rely on pre-calculated decision matrices and databases. The model, developed at the University of Malaga, last March received the prestigious NVIDIA Global Impact Award, granted for the first time in its history to a non-US research body, and it has also passed the standards required by the NTHMP (35).

This part of tsunami studies (propagation) only takes account of the deformation of the sea floor and its effect on the open sea. Once the wave makes landfall, the phase of calculating flooding begins.

4.3. Tsunami flooding

Although simulating onshore flooding with two-dimensional models means using the same equations as for simulation out at sea, the computational demand is far greater. This is due to several factors: the divergence and convergence of flows owing to obstacles on the terrain; the fact that a single point can be “dried out” and “soaked” several times; and, above all, the fact that the resolution needed on the coast is far greater than out at sea. For example, going from a resolution of 400x400 metres to one of 50x50 means an eight-fold refinement in each direction plus an eight-fold smaller time step, i.e. roughly 8³ = 512 times longer: five minutes become 2,560 (nearly two days). 
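The arithmetic behind that example can be sketched in a few lines: in two dimensions, refining the grid multiplies the cell count by the square of the refinement ratio, and the stable (CFL-limited) time step shrinks linearly with it, giving a cubic overall growth in cost.

```python
def refinement_cost_factor(dx_coarse, dx_fine):
    """Cost multiplier for refining a 2D grid: cells grow with the square
    of the refinement ratio, the time step shrinks linearly with it."""
    ratio = dx_coarse / dx_fine
    return ratio ** 3

factor = refinement_cost_factor(400.0, 50.0)  # 8**3 = 512
minutes = 5 * factor                          # 2560 minutes, nearly two days
```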

When we talk about a flooding simulation, a single “simulation” is rarely what is actually produced. The normal situation is that multiple simulations are carried out, such that the errors detected in one run are ironed out in the next, and so on until the uncertainties cease to be significant for the purposes of the calculation. It is not uncommon for the process to involve dozens of preliminary tests. On top of that, various scenarios are generally simulated, and all the “final” simulations are repeated with variations applied to certain parameters in order to check the stability of the result (sensitivity analysis).
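The sensitivity analysis described above can be caricatured in a few lines of Python. Here `run_model` is a toy stand-in for a full tsunami simulation, and the parameter, its base value and the ±10% perturbations are purely illustrative.

```python
def run_model(slip_m):
    """Toy proxy for a full simulation: peak coastal amplitude
    grows (here, linearly) with fault slip."""
    return 0.8 * slip_m

base = run_model(2.0)
# Re-run with the slip perturbed by -10%, 0% and +10%
results = {pct: run_model(2.0 * (1 + pct / 100)) for pct in (-10, 0, 10)}
spread = max(results.values()) - min(results.values())  # stability of the output
```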

The Tsunami-HySEA model uses a strategy of nested structured grids within the same calculation scheme: the lower-resolution grid (calculation nodes further apart, or larger pixels) computes faster and supplies the boundary conditions for the next, higher-resolution grid, and so on as appropriate. The same numerical code performs both phases of the simulation, propagation on the open sea and coastal flooding, by adapting the numerical algorithms implemented to each case. This is a big advantage in practice, since it avoids intermediate data-processing steps to adapt the propagation model to its flooding counterpart.
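The hand-over between nested grids can be pictured with a minimal sketch: the coarse grid's sea-surface heights are interpolated onto the boundary nodes of the finer grid. The grids and values below are invented for illustration and do not reflect Tsunami-HySEA's actual internals.

```python
import numpy as np

# One-way nesting sketch: a coarse propagation grid supplies boundary
# values for a finer coastal grid by linear interpolation.
coarse_x = np.arange(0.0, 10.0, 2.0)      # coarse nodes, 2 km apart (toy 1D line)
coarse_eta = np.sin(coarse_x)             # sea-surface height on the coarse grid

fine_x = np.arange(2.0, 8.0, 0.5)         # finer grid nested inside the coarse one
fine_boundary_eta = np.interp(fine_x, coarse_x, coarse_eta)  # boundary condition
```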

4.4. Data required to simulate tsunamis of seismic origin

To construct a model that simulates tsunamis of seismic origin, a good deal of data is needed. On the one hand, we need a catalogue of seismic events and a catalogue of active faults, together with the parameters that define the seismic source, the fault, which is simplified to a plane (Figure 8). On the other hand, we need information on the emerged terrain that might be flooded (topography) and on the submerged terrain (bathymetry), as well as a roughness map (the furrows, ridges and other features that affect the flow in the emerged portion).

For the Spanish case study, the catalogue of seismic events and the topography are freely provided to the general public by the National Geographic Institute via a website. As regards active faults, the Geological Survey of Spain (IGME) provides open access to the QAFI (“Quaternary Active Faults of Iberia”) database, as well as to other sources of information needed to understand the regional tectonic context (the GEODE map — the continuous digital geology of Spain — and spin-off or associated products). As for the tsunamigenic seismic sources, work is still ongoing to agree on their parameters. Regarding bathymetry, the Geological Survey of Spain belongs to EMODnet (the European Marine Observation and Data Network (36)), which generates continuous bathymetric coverage that is as uniform and consistent as possible from data gathered in an assortment of oceanographic campaigns; the latest available version dates from September 2018 and has a pixel of 250x250 metres. Higher-resolution bathymetries of the zone close to the coast are also needed, and these can be obtained from several quarters, such as the Navy Hydrographic Institute, the Ministry for Ecological Transition and oceanographic research projects (such as FAUCES (37), whose partners include the Geological Survey of Spain, the Institute of Marine Sciences [Spanish National Research Council], the Spanish Oceanographic Institute, the University of Salamanca and the Commission for Marine Geology of the Spanish Geological Society).

Figure 8. Data required to characterise the seismo-tsunamigenic source: epicentre coordinates (latitude, longitude); hypocentre depth (km); net slip in metres; strike direction and dip direction measured in degrees from the north, eastwards; plunge direction; rake; and lastly the length and width of the fault plane (km).
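As a way of organising the parameters listed in Figure 8, one might gather them into a small structure such as the following Python sketch. The field names and the sample values are hypothetical, not the input format of any particular simulation code.

```python
from dataclasses import dataclass

@dataclass
class SeismicSource:
    """Parameters of a simplified (planar) fault, after Figure 8.
    Field names are illustrative, not any code's actual API."""
    lon_deg: float     # epicentre longitude
    lat_deg: float     # epicentre latitude
    depth_km: float    # hypocentre depth
    slip_m: float      # net slip on the fault plane
    strike_deg: float  # measured in degrees from north, eastwards
    dip_deg: float     # inclination of the fault plane
    rake_deg: float    # direction of slip within the plane
    length_km: float   # fault plane length
    width_km: float    # fault plane width

# Hypothetical example values, for illustration only
src = SeismicSource(-7.5, 36.0, 10.0, 2.0, 230.0, 40.0, 90.0, 80.0, 30.0)
```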

4.5. Results of tsunami simulations

One of the bottlenecks in any simulation is the careful selection of the output variables. The more variables the model is asked to produce, the slower the calculation, since writing sizeable volumes of data can take a great deal of time. Considerable further time is then needed to process the output and present it in a way that is easy to interpret, even for the most expert of audiences. In any case, the following three variables are typically of interest at the propagation stage (Figures 9 and 10): 
  • wave heights in the vicinity of the coast, since these provide a key benchmark for estimating the “intensity” we can expect from a tsunami along a coast; 
  • the arrival time of the first wave (positive or negative) or of the most significant wave (this represents the maximum window available for any self-protection action and early warning);
  • virtual tide gauge records at particular points (where we can observe the full oscillation of the tsunami over time). 
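A virtual tide gauge of the kind just mentioned amounts to sampling the simulated free surface at a fixed point at every output step. A toy illustration, with an invented single-wave signal in place of real model output:

```python
import numpy as np

# Sketch of a "virtual tide gauge": the free-surface height sampled at one
# fixed point, once per output step, builds up the tsunami's oscillation.
t = np.linspace(0.0, 900.0, 901)                    # 15 minutes, 1 s output steps
eta_at_gauge = 0.5 * np.sin(2 * np.pi * t / 900.0)  # invented single wave (m)

crest_time_s = t[np.argmax(eta_at_gauge)]           # when the crest arrives
trough_time_s = t[np.argmin(eta_at_gauge)]          # when the trough arrives
```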

With respect to the flooding part, the following variables are typically used (Figure 11):
  • the maximum run-up height of the tsunami (which provides information on the limit affected inland in terms of the height above sea level which the tsunami reaches along the coast);
  • the maximum water depth at each flooded point (which implicitly includes the maximum extension or reach);
  • the top speed of the water at each flooded point (although this is actually less common than the depth, because processing is very demanding: to start with, it implies two values per pixel, an east-west component and a north-south component, and its spatial variation is very large);
  • the specific water flow at each point (which includes the classic concept of the volume of water moved per unit of time at each point, with or without indication of its direction). 
All these parameters can also be obtained as a function of time, for example to produce videos or more detailed material, although, given that the volume of data expressed in this way is very large, processing it becomes very slow.
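The post-processing of the velocity components mentioned above reduces, per pixel, to combining the east-west and north-south values into a speed, and multiplying by the depth to obtain the specific flow. A minimal sketch with invented values:

```python
import numpy as np

# Per-pixel post-processing: speed from the two velocity components,
# and specific flow q = depth * speed (volume moved per unit time and width).
h = np.array([[1.2, 0.4], [0.0, 2.5]])  # water depth per pixel (m); 0 = dry
u = np.array([[2.0, 0.5], [0.0, 1.0]])  # east-west velocity component (m/s)
v = np.array([[1.5, 0.0], [0.0, 2.0]])  # north-south velocity component (m/s)

speed = np.hypot(u, v)  # magnitude of the velocity vector per pixel
q = h * speed           # specific flow (m^2/s); zero wherever the pixel is dry
```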

Figure 9. Example of a map of maximum wave amplitudes (above) and arrival times (below) (38).

Figure 10. Example of virtual tide gauge readings at two points located before and after a harbour breakwater (39).

Figure 11. Example of three parameters (depth, speed and flow) obtained using the Tsunami-HySEA model and post-processing in a Geographic Information System (39).

5. Assessing the damage caused by a tsunami

If we start from the idea that a tsunami is like a flood, we could take as valid the classic scheme that correlates flow velocity and depth (Figure 12) (40) with the probability of damage. Yet tsunamis pose several additional challenges. If there were a single flow, as in a flood, the scheme would be perfectly applicable; but, as noted in previous sections, a tsunami has at least two flows, advancing and receding. If an advancing tsunami causes damage, the damage it causes as it flows back comes on top of it. This would even hold over the spatial interval of maximum reach of the most significant wave (which need not be the first); in the intervals also reached by the second most significant wave, there are four or more flows. Moreover, this simplification would only be valid for tsunamis with a distant origin: for those provoked by the most common cause, earthquakes, and where the earthquake is close to the coast, we can expect a certain level of damage from the ground acceleration of the earthquake itself, before the tsunami arrives. As the recent 2018 tsunami in Indonesia showed, tsunami damage can come on top of a whole chain of direct and indirect effects of the earthquake: partial or total infrastructure collapse (structures unable to withstand the additional loads from ground oscillation), soil liquefaction, landslides (sliding and breaking away), rifts in the ground, and no end of cascading consequences (fires, pollution). 
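In practice, the classic depth-velocity scheme is often reduced to thresholds on the product of the two quantities. The following toy classifier illustrates the idea; the threshold value is purely illustrative and is not taken from Figure 12 or from any official scheme.

```python
def hazard_level(depth_m, speed_ms, hv_threshold=0.5):
    """Toy classification in the spirit of the classic depth-velocity
    scheme: the product h*v is a common proxy for the danger a flow
    poses to people. The threshold here is illustrative only."""
    if depth_m <= 0.0:
        return "dry"
    hv = depth_m * speed_ms  # m^2/s
    return "hazardous" if hv > hv_threshold else "low"

level = hazard_level(1.0, 2.0)  # h*v = 2.0 m^2/s, above the toy threshold
```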

Figure 12. Classic scheme of hazardous zones for people, vehicles and infrastructure (40).

The most widely used qualitative approach to assessing tsunami risk (41), which could be verified (42) following the Sumatra earthquake of 2004, has spawned a whole host of variants applied in different case studies. This is the PTVA-3 (Papathoma Tsunami Vulnerability Assessment) model, expressed in the form of an RVI (Relative Vulnerability Index). Although the original author speaks of vulnerability, the model builds in hazard parameters (depth, orientation of buildings with respect to the flow), so at best it measures a specific vulnerability, or even relative risk, within the framework of the European concepts of Natural Risk Zones (43) (44), which are based on the concepts of the UNISDR (United Nations International Strategy for Disaster Reduction) (45).
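The spirit of an RVI-style index, a weighted sum of normalised attributes, can be sketched as follows. The attributes and weights below are invented for illustration and are not the actual PTVA-3 attributes or weights.

```python
# Illustrative relative-vulnerability index: a weighted sum of
# normalised attributes in [0, 1]. Weights are invented, NOT PTVA-3's.
weights = {"water_depth": 0.4, "building_material": 0.3, "orientation": 0.3}

def relative_vulnerability(scores):
    """scores: attribute -> normalised value in [0, 1];
    returns a single RVI-style number (higher = more vulnerable)."""
    return sum(weights[k] * scores[k] for k in weights)

rvi = relative_vulnerability({"water_depth": 1.0,
                              "building_material": 0.5,
                              "orientation": 0.2})
```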

It should be pointed out that, for the Spanish case study, the validity of this model should be reviewed, in particular as regards its incorporation into the general approach developed by the IGME for the Consorcio de Compensación de Seguros (46) (CCS) for the quantitative gauging of economic loss. The quantitative interpretation of other, similar publications might be especially useful in this regard, such as the Guide to reducing the vulnerability of buildings against flooding (“Guía para la reducción de la vulnerabilidad de los edificios frente a las inundaciones”) produced by the CCS (47).

6. Discussion

In the early 2000s the Geological Risks Section at the Geological Survey of Spain undertook an initiative, the PRIGEO plan, to carry out studies of geological risks, which was put on hold due to underfunding. Backing and openings were sought, and technical and scientific initiatives promoted, to develop a tool that we experts and scientists hold to be “vital” for a modern country: the mapping of natural hazards. The main hazards were covered (flooding, earthquakes, landslides and volcanoes), and several efforts were made to include other hazards that were less well known outside specialist circles at the time, such as tsunamis or expansive clays.

The first hazard that Europe managed to quantify and visualise using a mapping tool was flooding, and this was only thanks to the European Floods Directive (29), which came into force in 2007, and its subsequent implementation under national regulations. Less than a year afterwards, the IGME published a technical handbook on drawing up flood mapping (48), and four years later the then Ministry of Agriculture, Fisheries, Food and the Environment (MAPAMA) published a guide on setting up the National Mapping System for Floodable Zones (49), (50).

All of this was the fruit of a major prior effort by various working groups, in which the IGME and the Directorate General for Civil Protection and Emergencies (DGPCYE) insistently asked for tsunamis to be included in the research into coastal flooding risks. Even so, tsunamis were explicitly left out of the European Floods Directive, as well as of its national transposition, and were completely side-lined. The Sumatra tsunami of 2004 and the Japan tsunami of 2011 stoked worldwide interest in events of this kind, and concern within the Spanish government about tackling the issue grew.

In 2015 a major step was taken towards studying tsunamis in Spain: the DGPCYE published the relevant Basic Guidelines (51), and two years later, in 2017, a qualitative study of areas of interest was presented (52), which singles out the Peninsula's Mediterranean coasts, the Gulf of Cadiz, the Canary Islands and the Balearic Islands as priority areas, ahead of the Galician and Cantabrian coasts, for future efforts.

7. Conclusions

There is no doubt about it: a tsunami will one day hit the Spanish coasts. This is as much a truism as saying that the sun will rise tomorrow. Nonetheless, we cannot predict when, or how, or which section of the Spanish coastline the next tsunami will strike. 

Maps and studies of risks are not aimed at prediction; in fact, the last thing intended is for even one of the premises on which they are based to come true. The intention is for risk maps, whose consequences are calculated with the best science available at any given time, to serve as an alert about potential undesirable consequences, so that, from the day of their publication, or even before, through dissemination efforts such as this very article, a start is made on the remedies that can mitigate those effects.

Unlike early warning systems, whose aim is to notify those concerned, using the best possible information, as accurately as can be of what is about to strike, maps and studies of natural hazards make it possible to become aware of a reality that will not disappear simply because no notice is taken of it. They enable planning thresholds to be established that are consistent with technical and budgetary circumstances, and suitable measures to be conceived, following Japan's example, where the (today somewhat utopian) objective is “zero victims from natural disasters” (1).

The perfect setting is in place today to spearhead an initiative to study tsunamis in Spain. It will not be a perfect study (these do not exist), since, among other things, more detailed geological information on the oceans is still needed; but while that information is on its way, we cannot miss this particular wave. The know-how is here in Spain, and at the highest international level. If we fail to make a start now, tomorrow might prove too late.

The authors would like to thank the colleagues, from the Geological Survey of Spain, the University of Malaga and the Environmental Hydraulics Institute of the University of Cantabria, and others close to us, who have offered to revise or polish this article, taking the trouble to help out for no other recompense than this modest paragraph.

Effects of the tsunami in Japan (2011).
Source: AFP/GETTY.
