
Saturday, July 21, 2012

More Mash-ups: Mapping A Century of Earthquakes

A recent posting on the AGU LinkedIn site drew my attention to a map that plotted all magnitude 4 and above earthquakes that have occurred since 1898. The map in the Herald Sun clearly shows the distribution and the ‘hotspots’ you might expect around the Pacific ‘ring of fire’, as well as some intra-plate bursts of colour that suggest even the interiors of continents are not immune from these hazards.
Although a nice image, the map also represents a key trend that I mentioned in an earlier blog – mash-ups. The map was produced by John Nelson of IDV Solutions, a US software company specialising in visualising data. It combines data from the US Advanced National Seismic System and the United States Geological Survey to produce a map that spatially locates each piece of data. IDV Solutions understand the importance and power of such mash-ups, and Deborah Davis published an article in Directions Magazine (25th February 2010) on the importance of mash-ups for security. Although directed at security, the observations in that article are just as useful for trying to understand and manage hazards and the risks associated with them.
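To give a flavour of the kind of data merge that sits behind such a map, here is a minimal sketch in Python. The field names and records are invented for illustration – this is not Nelson's actual pipeline, nor the real ANSS/USGS schemas – but it shows the two essential steps: combining catalogues and filtering to magnitude 4 and above.

```python
# Hedged sketch: merge two hypothetical earthquake catalogues into one
# plottable event list, dropping duplicates reported by both networks
# and keeping only magnitude >= 4 events. Field names are illustrative.

def merge_catalogues(cat_a, cat_b, min_magnitude=4.0):
    """Combine two event lists, de-duplicate on (time, lat, lon),
    and keep events at or above the magnitude threshold."""
    seen = set()
    merged = []
    for event in cat_a + cat_b:
        key = (event["time"], event["lat"], event["lon"])
        if key in seen:
            continue  # same event reported by both networks
        seen.add(key)
        if event["mag"] >= min_magnitude:
            merged.append(event)
    return merged

# Toy records standing in for catalogue rows
anss = [
    {"time": "1906-04-18T13:12", "lat": 37.75, "lon": -122.55, "mag": 7.9},
    {"time": "2010-01-12T21:53", "lat": 18.44, "lon": -72.57, "mag": 7.0},
]
usgs = [
    {"time": "1906-04-18T13:12", "lat": 37.75, "lon": -122.55, "mag": 7.9},  # duplicate report
    {"time": "2011-06-07T10:00", "lat": 40.30, "lon": -3.70, "mag": 3.2},    # below threshold
]

events = merge_catalogues(anss, usgs)
```

Each surviving event record can then be handed to whatever mapping layer you like; the interesting work, as the Directions Magazine article notes, is in the consolidation, not the plotting.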

Mash-ups provide a means of consolidating data from diverse sources into a single, comprehensible map, in a visual context that has some meaning for the observer. The map can be made relevant to the customer or user by ensuring that it contains additional information relevant to their interpretation. A map of landslides combined with topographic data provides a context for helping to understand why the landslides might have occurred. Adding surface geology as another layer improves the context of interpretation for a landslide specialist; adding the road network improves it for a hazard manager. Once data has a context, it is easier to spot relationships between phenomena. With this single, common map available to all parties there is a common basis for discussion and for decision-making. Having a common source of reference may even encourage discussion and debate. In addition, it may be easy to see where data is lacking and what other data these parties may require to aid their decision-making. The cost-effectiveness of such mapping should not be neglected either: producing a new product from existing data is very cost-efficient.
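The layering idea above can be sketched in a few lines of Python. The layer names, attribute values and lookup functions here are hypothetical – stand-ins for real topographic, geological and road-network datasets – but they show how each added layer enriches the same base record for a different audience.

```python
# Hedged sketch: enrich a base hazard layer (landslide points) with
# context layers, so each audience sees the attributes relevant to
# them. All layer contents are invented for illustration.

def mash_up(base_events, layers):
    """Attach each context layer's attributes to every base event.
    `layers` maps a layer name to a function: event -> attribute value."""
    enriched = []
    for event in base_events:
        record = dict(event)  # copy, so the base layer stays untouched
        for name, lookup in layers.items():
            record[name] = lookup(event)
        enriched.append(record)
    return enriched

# Base layer: one landslide point (coordinates are illustrative)
landslides = [{"id": 1, "lat": 51.5, "lon": -3.2}]

# Invented context layers: slope and geology for the specialist,
# road distance for the hazard manager
layers = {
    "slope_deg": lambda e: 32,          # from a topographic layer
    "geology": lambda e: "mudstone",    # from a surface-geology layer
    "road_dist_km": lambda e: 0.4,      # from a road-network layer
}

enriched = mash_up(landslides, layers)
```

In a real mash-up the lambdas would be spatial queries against each dataset, but the principle is the same: one shared base map, progressively contextualised.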



Tuesday, June 22, 2010

Environmental Disasters

We remember disasters. Pictures stream across our television screen, graphic images of the misery of death and the chaos of destruction. Hundreds, even thousands of people die, their agony captured and rerun in digital formats across the Web. Scientists tell us what happened and why, politicians bemoan the lack of warning and the poor die. Boxing Day 2004, Katrina 2005, Haiti 2010 – just dates and places but there is an immediate recognition of what they refer to.

How can we study these events and, importantly, how can we understand and prevent them? These questions have long been a staple of academic study, and they have a distinctly geographical dimension. There are three main approaches to trying to understand hazards and disasters: the dominant, the developmental and the complex. They could be seen chronologically, with the last being more sophisticated than the first, or they could just be seen as different ways of looking at the same thing. A key thing to bear in mind is that there is usually a distinction between a hazard and a disaster. A hazard is the potential or possibility for damage or harm, the vulnerability to loss. A disaster, in contrast, is the realisation of that potential.

The Boxing Day tsunami of 2004 and the Haitian earthquake of 2010 were destructive by any measure you care to use. Death tolls of over 200,000 almost outstrip comprehension. Whole cities levelled, communities ripped apart, national economies shattered. How could we study such events? The dominant approach (or behavioural paradigm, according to Smith and Petley, 2009) takes what many would call a very rational, scientific view of a disaster. A disaster is the result of an extreme geophysical event. The geophysical event causes the problem, and it is often seen as a matter of luck whether people are harmed. Studying how people behave before and during a disaster, understanding their rational decision-making and where they are irrational, is the basis for developing management tools for organising populations. Civil authorities tend to view such events as distinct and discrete breaks with normality: times of crisis for which special plans, actions and laws (or lack of liberties) are needed.

In addition, detailed scientific analysis of the geophysical processes that produce extreme geophysical events provides the information for attempting to predict the location and timing of such events. Of course, such detailed study requires extensive and often expensive monitoring systems, as well as well-integrated early warning systems and a civil authority able to plan rationally for what to do with a population once the sirens sound. Such an approach could be called a ‘technological fix’ approach to disasters. Leave it to the experts and everyone might be saved.

How does this view play out in reality? Let’s take FEMA’s webpages on the National Earthquake Hazards Reduction Program (NEHRP) as an example (www.fema.gov/plan/prevent/earthquake/nehrp.shtm, or www.nehrp.gov for the NEHRP webpages). The opening statement highlights the dominance of this view of disasters.

‘The National Earthquake Hazards Reduction Program (NEHRP) seeks to mitigate earthquake losses in the United States through both basic and directed research and implementation activities in the fields of earthquake science and engineering.’
Reading the Strategic Plan for the National Earthquake Hazards Reduction Program (Fiscal Years 2009-2013) (http://www.nehrp.gov/pdf/strategic_plan_2008.pdf), there are three key goals:

A: Improve understanding of earthquake processes and impacts;
B: Develop cost-effective measures to reduce earthquake impacts on individuals, the built environment, and society-at-large;
C: Improve the earthquake resilience of communities nationwide.

Within these goals there are a number of objectives, but each can be seen to be using language associated with the dominant paradigm. For goal A, for example, objective 1 is ‘advance understanding of earthquake phenomena and generation processes’ – basically, do fundamental science into the geophysical phenomenon. Even when people and society are considered, as in objective 3, their actions are viewed as amenable to study using the same rational methods as those used for the geophysical phenomenon. Likewise for goal B, the objectives talk about developing tools for assessing loss and risk, implying that everything can be allocated a number, a quantity for comparison. Even goal C is couched in these terms. Objectives 11 and 12, for example, seek to promote or support the implementation of public and private standards in building codes and policies, whilst objective 10 focuses on developing more comprehensive risk scenarios for planning actions, presumably at an appropriate organisational level.

Seeing disasters as driven by the geophysical event, as crises that need crisis-management responses, and as understandable via scientific analysis, usually via quantification of impacts, is not necessarily wrong. It is vital to know what the geophysical event is and how it varies. It is vital to understand how buildings behave in earthquakes and build accordingly. But are people simply rational entities that will do as they are told by authorities? Is it simply a case of knowing more and telling everyone, and will this reduce disasters?

In my next blog I will explain how people’s behaviour has been studied and how rational decision-making is incorporated into this dominant paradigm.