Friday, July 27, 2012

Public Risk Communication

Communicating risk to the general public is a vital part of managing risk. The UK government has produced a leaflet, a ‘Practical Guide to Public Risk Communication’, which offers a very short guide to practice. The three key aims of risk communication are to reduce anxiety around risks, to manage risk awareness and to raise awareness of certain risks. The leaflet identifies five key elements of public risk communication: assembling the evidence, acknowledgement of public perspectives, analysis of options, authority in charge and interacting with your audience.


Each of these elements is expanded and discussed via a set of questions that organisations should consider. The first element is concerned with establishing the nature and magnitude of the risk and demonstrating a credible basis for the position taken by the organisation. Evidence is paramount in this element but so too, implicitly, is the question of trust or belief in the evidence. Who provides the evidence, and the basis of that evidence, are as important as clearly articulating the risk. As part of this question of trust, the issue of ambiguity and uncertainty is bound to arise. Despite what politicians may wish, science is inexact and its findings are often laden with uncertainty. How organisations deal honestly with uncertainty can have a huge bearing on the trust they command and retain between hazardous events.

Understanding how the public understand risk is essential to getting the message across. Lumping everyone in as ‘the public’ may not be that helpful though, as the leaflet notes. Assuming everyone perceives a hazard in the same way and in a consistent manner may be hoping for too much. The leaflet uses the term ‘risk actors’ as a coverall term for individuals or groups who engage with a risk or who influence others’ approaches to and understanding of risk. Perception, and so the message about risk, should be differentiated, but that differentiation may depend upon the exact mix of risk and risk actors. An issue I would like to raise here is whether, once you are a risk actor, you are no longer a member of the public. A home-owner who has recently experienced a flood may be much more active in their local community in taking steps to reduce the flood risk – does that make them a risk actor, a well-informed member of the public or simply a member of the public, and does this matter for how risk is communicated to them? Risk actors could be viewed as being biased, as having their own agendas, and so not really reflective of the views of the general public.

Analysing options suggests that organisations have rationally weighed the risks and benefits of managing public risk as well as the options for action available to them. The last sentence is interesting: ‘the technical and societal interests will need to be reconciled if the solution is to be generally accepted.’ This sentence is made in relation to technical solutions that may not have public sympathy initially. The implication that through debate these solutions can be accepted implies a very hierarchical and technocratic view of hazards and risk management – the public have to be educated and brought to accept the solution. I may be reading this aspect too cynically, but maybe not.

The authority in charge section, however, adds weight to the idea that this is a technocratic and hierarchical view of hazards (maybe not a surprise in a government publication!) The need for an organisation to determine if it is appropriate for it to step in to manage risk, and the clear limits of responsibility for risk management, imply a very structured and ordered view of how to manage the world and its associated risks. A telling point is made that exercising authority now is as much about building and maintaining trust as it is about lines of formal authority. Trust, or rather the perception of trust, will dramatically affect the ability of an organisation to manage risk in a sceptical society. The last sentence states ‘Organisations that are not highly trusted will increase the chances of success by enlisting the help of other organisations – such as independent scientific organisations – who have the confidence of society’. A call to collaboration or a call to objectivity?

So is it all ‘smoke and mirrors’ or does this leaflet help to further risk management? Without a doubt, communicating risk and its management effectively to different audiences is essential, and the leaflet does provide some very good guidance on this. The ideas should not, however, be used uncritically as they are designed with a very technocratic and hierarchical view of risk management in mind. Examples of public risk communication are also provided, and for flooding the conclusions reflect this bias (starting on page 22 of the report). Risk quantification is sought for flood hazard, as are software and tools for aiding local risk planning and for managing the possible clash between expectations of more flood defence infrastructure and the new risk-based approach (risk is the focus rather than the hazard itself). Communication about risk and its management is viewed as coming from the Environment Agency, insurance companies and DEFRA – not much about local ideas or communication from the ground up! The link with these organisations is, however, viewed as potentially corrosive to public trust. This hits at the nub of the issue: the government wants the trust of people to be able to act in a manner it views as appropriate. Actions, of necessity in a complex society, require working with organisations that have their own agendas, and this creates suspicion. Can risk ever be managed without trust, and can you manage anything without eroding some degree of trust somewhere?



Urban Air Pollution Around the World

Two interesting blogs and a website for all issues concerning atmospheric pollution can be found at urbanemission.blogspot.co.uk/ , aipollutionalerts.blogspot.co.uk/ and at urbanemissions.info/ . All of these sites are run by Sarath Guttikunda from New Delhi. An important aspect of these sites is the reports from all over the globe concerning atmospheric pollution in urban areas. The reports highlight that atmospheric pollution is a global issue, a world-wide problem that needs action. Taking action, however, requires information that can inform decision-makers about the extent of the problem. These sites also provide this by linking through to monitoring information from urban areas around the world. They also highlight that local populations, communities and neighbourhoods are not just passively sitting there waiting for decision-makers to make decisions. The volume of reports and community awareness show that the concern and impetus for changes driven by the local level certainly exist. Enabling those changes is another issue, one dependent on local conditions and their political, economic and social context. The information and data provided by these websites, however, do permit individuals and communities from all over the world to compare their conditions with others in similar circumstances and to exchange ideas and plans for pressuring decision-makers for change.

The Urban Emissions website is worth a look as well for the modelling tools that are available for download. An interesting one, given my last blog, is the Air Quality Indicator download. This simple calculator helps you work out the air quality for an urban area based on daily observations or modelled values. It does, of course, assume that the data will be available in the first place!

Understanding Daily Air Quality

Atmospheric pollution is a continuing environmental problem across the globe. Within the UK, data on historic as well as current pollution levels can be found at the DEFRA Air Quality site. A great store of information and one that you can download data from.

Wonderful as this source is for research, atmospheric pollution is not a problem that has passed or is under tight control on a global scale. The locations of the UK data reflect the monitoring networks set up in the 1960s by Warren Springs Laboratory, largely in response to the Clean Air Act (1956) and the need to monitor levels of pollutants to ensure that standards were being met. The early monitors tended to be instruments such as the sulphur dioxide bubbler (so old I couldn't find a photo of one on the Web!). Air was pumped into the machine at a known rate and reacted with the liquid as it bubbled through. After a day the flask of liquid was removed and replaced with another. The liquid from the previous day was analysed by titration (reacting the sulphur dioxide with another chemical to get a colour change and a reaction product that could be accurately measured) to determine the levels of sulphur dioxide (once the various calibrations and calculations had been done). I know this because I used an old bubbler in my thesis to monitor sulphur dioxide levels on the roof of the Geography Department at UCL, London. It was educational, but a pain to have to process daily, particularly as I was self-taught and undertook the titration much to the amusement of colleagues in the lab. Passive monitors such as nitration tubes (they just sit there and the pollutants react with them) were also used, but still needed chemical post-processing to obtain a result.

By the time I finished my thesis in 1989, real-time monitoring of pollutants, or at least hourly averaged and then 15-minute averaged values, was becoming more usual and replaced the daily averaged data. This is great for monitoring levels virtually continuously and for identifying specific pollution episodes, but how much information is there and how can you interpret it? Air quality standards set different limit values for different pollutants, and even the same pollutant can have different exceedance values depending on the averaging period. Sulphur dioxide levels in the UK, for example, should not exceed 266 micrograms/m3 more than 35 times per year if measured as averaged 15-minute concentrations. If measured as 1-hourly means, then 350 micrograms/m3 should not be exceeded more than 24 times per year. If measured as a 24-hour average, then 125 micrograms/m3 should not be exceeded more than 3 times a year. So the limits change with the monitoring period and the type of equipment being used to monitor pollution levels. This variation may begin to get confusing if you try to communicate it to too many different end-users.
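These exceedance rules reduce to a simple counting exercise. Below is a minimal sketch of that logic in Python; the limit values are the sulphur dioxide figures quoted above, but the readings are invented placeholder data.

```python
# Minimal sketch of exceedance counting against the UK SO2 limits quoted
# in the text. The readings are invented for illustration.

SO2_LIMITS = {
    # averaging period: (threshold in micrograms/m3, permitted exceedances/year)
    "15-minute": (266, 35),
    "1-hour": (350, 24),
    "24-hour": (125, 3),
}

def check_compliance(readings, period):
    """Count readings above the threshold for this averaging period and
    compare against the permitted number of exceedances per year."""
    threshold, allowed = SO2_LIMITS[period]
    exceedances = sum(1 for value in readings if value > threshold)
    return exceedances, exceedances <= allowed

# a hypothetical run of 15-minute mean concentrations (micrograms/m3)
readings = [120, 280, 310, 90, 270] * 10
count, compliant = check_compliance(readings, "15-minute")
print(f"{count} exceedances; within annual limit: {compliant}")
```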


A simplified version, the Daily Air Quality Index, recommended by the Committee on the Medical Effects of Air Pollutants (COMEAP), uses an index and banding system with bands numbered 1-10 and colour coded for the severity of atmospheric pollution. The scale mirrors established ones for pollen and sunburn, so it draws on existing public understanding of colour and levels. Bands 1-3 are green and represent low atmospheric pollution levels, 4-6 are shades of orange and represent moderate levels, 7-9 are darkening shades of red ending up at brown and represent high levels, whilst the oddly coloured purple band of 10 represents very high levels of atmospheric pollution. The index itself combines the highest concentrations for a site or region of five key pollutants: nitrogen dioxide, sulphur dioxide, ozone, PM2.5 and PM10.

The DAQI may be a useful tool for communicating information about the general level of pollution as it relates to human health, but does its simplicity mask complexity that disaggregated data would not? The relative contribution of the five pollutants to the index can be gauged from the information on each at the DEFRA website. PM2.5 and PM10 use 24-hour running mean concentrations and have specific threshold levels for each band, whilst sulphur dioxide is measured as 15-minute averaged concentrations and, again, has threshold values for each band. The index itself, though, hides whether all, or just one or two, of the pollutants push the DAQI into a band. The index also misses other pollutants that could impact upon human health, such as benzene, even though these may be monitored. The cocktail of pollutants used to create the index reflects a specific context, the UK; would the cocktail of significant pollutants vary in other contexts? The cocktail and the monitoring intervals are not necessarily ‘natural’ ones – they have been developed from monitoring set up for other purposes such as regulatory requirements. The index is squeezed out of what exists.
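The banding logic is essentially a ‘worst pollutant wins’ rule: each pollutant is mapped to a band from its own threshold table and the overall index is the highest band. A sketch of that rule follows; the threshold values are illustrative placeholders rather than the official DEFRA boundaries, and only three of the five pollutants are included. Note how the final index says nothing about which pollutant, or how many, produced it.

```python
# Sketch of the DAQI logic described above: each pollutant is mapped to a
# band 1-10 from its own threshold table, and the overall index is the
# worst (highest) band. Thresholds here are illustrative placeholders,
# not the official DEFRA boundaries.

ILLUSTRATIVE_THRESHOLDS = {
    # pollutant: upper bounds (micrograms/m3) for bands 1..9;
    # anything above the last bound is band 10
    "pm2.5": [11, 23, 35, 41, 47, 53, 58, 64, 70],
    "pm10": [16, 33, 50, 58, 66, 75, 83, 91, 100],
    "so2": [88, 177, 266, 354, 443, 532, 710, 887, 1064],
}

def pollutant_band(pollutant, concentration):
    for band, upper in enumerate(ILLUSTRATIVE_THRESHOLDS[pollutant], start=1):
        if concentration <= upper:
            return band
    return 10

def daqi(concentrations):
    """Overall index: the highest band across all reported pollutants."""
    return max(pollutant_band(p, c) for p, c in concentrations.items())

print(daqi({"pm2.5": 30, "pm10": 40, "so2": 100}))  # -> 3
```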


The DAQI is a very, very useful tool, but it reflects an attempt to communicate complex and huge volumes of information in a simplified manner that, its makers believe, will be of use to specific end-users. Once data is compressed and simplified you are bound to lose some of the information contained in its variations and detail. The index you develop for targeted end-users will, of necessity, exclude a lot of the information you have collected, and it is useful, for the end-users in particular, to be aware of this.





Wednesday, July 25, 2012

Beijing Air Quality – Citizen-Science Approach to Mapping Levels?

A recent article in Environmental Technology Online reports on a community-based science project called ‘Float’ that is actually part-science and part-art project. The idea is that pollution-sensitive kites will be flown over Beijing. These kites contain Arduino pollution-sensing modules and LED lights and will indicate levels of volatile organic compounds, carbon monoxide and particulate matter by changing colour to green, yellow or red depending on the pollutant levels. The kites also carry GPS loggers and feed their data to the real-time data website Cosm.
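The colour banding the article describes amounts to mapping each sensor reading onto two thresholds. A rough sketch of that logic is below; the pollutant names, units and threshold values are all assumptions for illustration, since the article does not publish the kites' firmware.

```python
# Rough sketch of the green/yellow/red banding the article describes.
# Pollutant names, units and thresholds are assumed for illustration;
# this is not the Float project's actual firmware.

THRESHOLDS = {
    # pollutant: (upper bound of 'green', upper bound of 'yellow'),
    # in arbitrary sensor units
    "co": (50, 150),
    "voc": (100, 300),
    "pm": (35, 75),
}

def led_colour(pollutant, reading):
    green_max, yellow_max = THRESHOLDS[pollutant]
    if reading <= green_max:
        return "green"
    if reading <= yellow_max:
        return "yellow"
    return "red"

print(led_colour("pm", 60))  # -> 'yellow'
```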

The project was designed by students Xiaowei Wang from Harvard’s Graduate School of Design and Deren Guler from Carnegie Mellon and aims to involve local residents in data collection. It relies on public funding and is still raising funds through Kickstarter, a website devoted to creative projects and to obtaining funding for such projects (the Float project on Kickstarter). The project also has funding from the Black Rock Arts Foundation and the Awesome Foundation.

The project has generated a lot of interest on the Web:

Fighting China’s Pollution Propaganda, with Glowing Robot Kites For the People

Pollution-detecting kites to monitor Beijing's air quality
Glowing Pollution Sensor Equipped Kites Replace Beijing's Stars
Kickstarter Project Plans to Measure Beijing Pollution Using Kite Sensors

Only a couple of comments and an expression of interest in the results really.


The project is undoubtedly part of the growing and, in my view, superb trend towards more inclusive community or participatory science (choose whichever term you prefer; Guler uses citizen-science). The ideal of getting local communities involved in the data collection, as well as involving them in all aspects of the research process, is an excellent way to raise awareness of an issue as well as to educate people about the scientific approach and its problems and potentials. The Float project has involved local communities, young and old, from the start with workshops in Beijing as well as in the design of the kites. In terms of how to organise a community-based, participatory science project it is one that I will advise my students to look at. It is just a shame that the descriptions of the project veer from highlighting the science to highlighting the arts aspects as if the two are, or need to be, distinct. It should also be remembered that this project, like any project involved in monitoring pollution, is entering the political as well as the scientific arena. Involving local populations is a political act (as is their agreement to involvement), as much as the monitoring of pollution by the American Embassy or the siting of monitoring sites by the Chinese. The local is as political as the national or the international, but the nature of the act does not necessarily mean the data is politically biased, only that data collection is for a purpose.

As with most community-based projects, however, there is the issue of belief, trust or confidence in the data collected. These projects do tend to illustrate quite nicely the continuing divide between the ‘specialist’ or ‘expert’ and the ‘public’ (I would say amateur, but much of British science in the nineteenth and early twentieth century only developed because of amateurs!) The expert has been trained and accepts certain methods as being appropriate for data collection. Control and standardization are essential in ensuring what is termed ‘intersubjective communication’ between researchers – basically it means I know what you did because that is how I was trained to do it, so I trust your data as being real. Guler seems to downgrade the status of the data collected even before the project really begins by stating:

‘We’re trying to interact with people on the street and see what they’re tying to do with the information they see. I don’t plan to argue that this is the most accurate data because there are many potential reasons for differences in air quality reports. We want to just keep it up, upload the data, and focus on that more after we come back’.

My impression is that this statement is a great get-out clause for ‘official’ monitoring, be it by the Chinese or atop the American Embassy. I wouldn’t be so pessimistic. The aims of the project in terms of improving public understanding of air pollution, its impact on health and the visualization of pollution through the kites are all excellent and likely to be successful. The data collected is also of value. The ‘official’ pollution monitoring sites probably conform to national or international standards for static sites in terms of equipment and monitoring periods. The kite data does not necessarily provide comparable data to these sites. The kites are mobile and collect data on levels that can be spatially referenced (I assume in 4 dimensions). They provide a different perspective on atmospheric pollution, as a spatially varying phenomenon, something the official monitoring sites cannot provide. It could even be argued that the kite data provides information on pollution as experienced by the population (although the population is unlikely to move across the sky at the height of the kites!) The important thing to remember is that there is not one, single correct measure of atmospheric pollution; there are merely different representations of it. The official static sites have the advantage of clearly defined protocols that ensure the data or information they collect is immediately comparable with that collected at similar monitoring sites globally. The Float project is generating a different and novel set of data or information. This may require a different approach to thinking about the information and its interpretation (Guler seems to suggest this with some hints at triangulation of trends) and to how confidence or belief in the information is assessed, either qualitatively or quantitatively. I will be very interested to see what form the results and interpretation take. Good luck with the project!

Institute of Hazard, Risk and Resilience at the University of Durham

An extremely useful website is that of the Institute of Hazard, Risk and Resilience at the University of Durham. They have just published their first online magazine, Hazard Risk Resilience, which outlines some key aspects of their research and is well worth a look (as is their blog, now linked at the side of my blog). In addition the site contains podcasts on aspects of hazards.


An important research project for the Institute is the Leverhulme-funded project on ‘Tipping Points’. Put simply, ‘tipping points’ refer to a critical point, usually in time, when everything changes at the same time. This idea has been used in describing and trying to explain things as diverse as the collapse of financial markets and switches in climate. The term ‘tipping point’ (actually ‘tip point’ in the study) was first used in sociology in 1957 by Morton Grodzins to describe the ‘white-flight’ of white populations from neighbourhoods in Chicago after a threshold number of black people moved into the neighbourhood. Up to a certain number nothing happened, then suddenly it was as if a large portion of the white population decided to act in unison and they moved. That this action was not the result of co-ordinated action on the part of the white population suggested that some interesting sociological processes were at work. (Interestingly, I don’t know if the reverse happens or if research has been conducted into the behaviour of non-white populations and their response to changing neighbourhood dynamics.) Since about 2000 the use of the term tipping point has grown rapidly in the academic literature, a lot of the use being put down to the publication in 2000 of ‘The Tipping Point: How Little Things Can Make a Big Difference’ by the journalist Malcolm Gladwell (who says academics don’t read populist books!)

Research suggests that the metaphor of a ‘tipping point’ is a useful one for getting across a lot of the complex and complicated processes and changes that occur in socio-economic, political and physical systems. One focus of research in the project is on trying to assess whether this metaphor does actually describe, quantitatively or qualitatively or both, real properties of systems. Another focus is concerned with exploring how the metaphor becomes an important aspect of the phenomena being researched, even taking on the character of an agent in the phenomena itself. Importantly, the project also considers what it means to live in a world where ‘tipping points’ abound and how important anticipatory understanding is for coping with that world.

Tipping Point by Malcolm Gladwell





Virtual Water: Accounting for Water Use

Following on from my last blog on the discovery of a massive aquifer under part of Namibia, I thought it might be useful to consider a key accounting concept for water resources: virtual water. The term ‘virtual water’ was coined by Tony Allan of SOAS and refers to the invisible water, the water that it takes to produce the food and goods we consume, or as Virtual Water puts it:


‘Virtual water is the amount of water that is embedded in food or other products needed for its production.’

(Other websites on virtual water include: Virtual Water)
Some of the figures involved are quite amazing. A kg of wheat takes 1,000 litres of water to produce, a cup of coffee takes 140 litres, 1 kg of rice takes 3,400 litres and 1 car takes 50,000 litres of water to produce. You can even work out your own water footprint using the calculator on the site. There is even an app for your mobile! Additionally there is information on national water footprints and, importantly, the idea that virtual water is traded between nations.
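Footprint calculators of this kind are, at heart, a lookup table of embedded-water figures multiplied by consumption. A minimal sketch using the per-item figures quoted above; the dictionary and the example basket are illustrative, not the site's actual method.

```python
# Minimal footprint calculator using the per-item figures quoted above
# (from the Virtual Water site); any other items would need their own
# embedded-water figures.

VIRTUAL_WATER_LITRES = {
    "kg_wheat": 1_000,
    "cup_coffee": 140,
    "kg_rice": 3_400,
    "car": 50_000,
}

def footprint(consumption):
    """Total embedded water (litres) for a basket of items."""
    return sum(VIRTUAL_WATER_LITRES[item] * amount
               for item, amount in consumption.items())

# e.g. a week of two coffees a day plus 1 kg of rice
print(footprint({"cup_coffee": 14, "kg_rice": 1}))  # -> 5360 litres
```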

Mekonnen and Hoekstra (2011) at the University of Twente published a UNESCO report on the issue of virtual water over the period 1996-2005. They divided virtual water into three colours: green, blue and grey. Green water is the water associated with agricultural production, blue water is that associated with industrial production, whilst grey water is the water associated with domestic use. From their analysis they calculated that the global average water footprint for consumption was 1385 m3 per year per capita, with industrialized countries having a footprint of 1250-2850 m3 per year per capita, whilst developing countries had footprints in the range 550-3800 m3 per year per capita. The low values represented low consumption volumes in these countries, whilst the large values represented big water footprints per unit of consumption.

Their key conclusions were that about 20% of the global water footprint for that period was related to production for export. This means that there are large international flows of virtual water, with some countries importing virtual water (2320 billion m3 in the time period). In countries where water is scarce there is a tendency to import large amounts of virtual water in food and agricultural products, saving national water resources for key uses that add greater value, such as industrial production. Tony Allan argued in 1997 that the Middle East was one of the first regions to develop this adaptation to resource scarcity. The relatively large volume of international flows of virtual water generates water dependencies that, they suggest, strengthen the argument that issues of local water scarcity need to be considered within a global context.


The significance of this concept for the discovery of the aquifer and its use is that the Namibian reserve has to be viewed within a global context. The development of agriculture and the technical development of the resource are likely to be political decisions and increasingly likely to be geopolitical decisions that have to take into account the regional position of Namibia, the likely trade partners for the virtual water, the geopolitical power of potential partners and the future frictions that could arise as environmental change affects the current international demands and flows of virtual water.

Tuesday, July 24, 2012

Namibian Aquifer: Who Benefits?

A recent BBC article reported on the discovery of a major aquifer in Namibia. The new aquifer is called Ohangwena II and it covers an area of about 70x40 km (43x25 miles). The project manager, Martin Quinger, from the German federal institute for geoscience and natural resources (BGR), estimates that the 10,000-year-old water could supply the current water consumption needs of the region for about 400 years.


The find could dramatically change the lives of the 800,000 people in the region who currently rely on a 40-year-old canal for their water supply from neighbouring Angola. Martin Quinger states that sustainable use is the goal, with extraction ideally matching recharge. The easy (and cheap) extraction of the water under natural pressure is complicated by the presence of a small salty aquifer that sits on top of the newly discovered aquifer. Quinger states that if people undertake unauthorised drilling without following the project's technical recommendations then a hydraulic short-cut could be formed between the two aquifers, contaminating the fresh water.

In terms of the use of the water he comments that:
‘For the rural water supply the water will be well suited for irrigation and stock watering, the possibilities that we open with this alternative resource are quite massive’.
The EU-funded project also aims to help young Namibians manage this new water supply before its funding runs out.
The discovery is a great, potentially life-changing resource for the region, but the question that arises in my mind is: who is going to benefit from this discovery? The current socio-ecological system in the region is attuned to the amount of water available. The availability of more water could change this, but will it be for the benefit of the current population? A key aspect is the last point made about the EU-funded project – the management of the resource by those in the region. The skills required to manage a large water resource are context dependent. They depend on the uses to which that resource will be put. They require technical and resource-allocation skills that presume a level of education embedded within a culture and location. Acquiring these skills takes a while, as people go through the appropriate training and gain the experience that helps them manage the resource. If this expertise does not exist now within the region then the implication is that external support will be needed and, by implication, paid for.

Another issue is the assumption that the water will be used for improving agricultural production, which I assume (maybe wrongly) means more intensive agriculture. The key questions are then what type of agriculture, and what additional resources are required to ensure that it works? Thinking of the whole agricultural system as a complex network of relations, the question really is what network of relations will be overlaid onto the existing agricultural network to ensure the success of the new type of agricultural production. A more intensive agriculture implies fertilizers, investment and technical know-how, as well as access to markets, regional, national and international, so that funds can be extracted from the new produce. Again, is it likely to be the regional population that is able to conjure up the finance, technical knowledge and all the other bits of the network required to develop this new agriculture? In time, the answer might be yes, but will external agencies, such as the government and investors, permit this time before developing the valuable resource?

This problem with development as seemingly envisaged by the project is illustrated by the comment concerning extraction. The implication is that only people with a specific level of technical ability can extract the water. This implies that a system of permits is likely to be implemented, and so access to the resource will be controlled and restricted. It also implies that the permits will be allocated to operators able to meet the technical requirements outlined by the project, and if this expertise does not exist within the region then the operators will have to be external contractors. This system is likely to require financing, so value will have to be extracted from the supply of water. To whom will the funding flow and who will pay for it? How will the regional population receive the water and what will the price of the water be? I would like to hope that the EU-funded project will enable the management of the resource by the regional population, for the benefit of that population, in the manner that that population sees fit for their own development.

Sunday, July 22, 2012

Disaster Database

EM-DAT is an international disaster database run by the Centre for Research on the Epidemiology of Disasters (CRED). This Emergency Events Database provides data on the location, size and impact of different types of disaster from 1900 to the present (about 18,000 disasters in total and counting). For educational and investigative purposes a useful feature is the ability to create your own dataset based on your own criteria. You can select regions or countries, specific time periods and specific types of hazards to develop your customized set of data. An interesting aspect of the database is that ‘technological’ hazards are included as well as ‘natural’ hazards, so an impression of different types of hazard can be built up. Hazards are divided or defined hierarchically by generic disaster group (natural or technological), subgroup, main type, then sub-type and a sub-sub-type! It is not as complicated as it sounds! The impact of each hazard is defined in terms of the number of people killed and injured, the number of people made homeless and those requiring immediate assistance after the disaster event, as well as the estimated damage caused by the disaster event.
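Once such a custom dataset has been exported, filtering it to your own criteria is straightforward. Below is a sketch of the kind of query the site's tool performs, assuming the records were exported to a CSV; the column names and filter values here are invented for illustration, not EM-DAT's actual schema.

```python
# Sketch of building a custom subset in the spirit of the EM-DAT query
# tool. Assumes an export to CSV with columns such as year, country,
# group ('natural'/'technological'), type and deaths; these column
# names are assumptions, not the database's actual schema.
import pandas as pd

disasters = pd.read_csv("emdat_export.csv")

# e.g. natural disasters in Bangladesh since 1980 with at least 100 deaths
subset = disasters[
    (disasters["group"] == "natural")
    & (disasters["country"] == "Bangladesh")
    & (disasters["year"] >= 1980)
    & (disasters["deaths"] >= 100)
]
print(subset.groupby("type")["deaths"].sum())
```
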
There are also a number of pre-prepared maps and graphs of the location of disasters on a global scale and of trends in disasters. The map below shows flood disasters between 1974 and 2003, whilst the graph illustrates the trend in the number of technological hazards since 1900. This database could be a very useful tool for exploring trends and patterns of disasters through time at different scales. The reasons for these trends may require some thinking – increased reporting of disasters, increasing population growth and the spread of population into hazardous areas, and the increasing growth of vulnerable populations, to name but a few. Without the initial data, though, even identifying these patterns to explain would be difficult.

Flood disasters 1974-2003
Trend in the number of technological disasters since 1900


Saturday, July 21, 2012

More Mash-ups: Mapping A Century of Earthquakes

A recent posting on the AGU LinkedIn site drew my attention to a map that plotted all magnitude 4 and above earthquakes that have occurred since 1898. The map in the Herald Sun clearly shows the distribution and the ‘hotspots’ you might expect around the Pacific ‘ring of fire’, as well as some intra-plate bursts of colour that suggest even the interiors of continents are not immune from these hazards.
Although a nice image, the map represents a key trend that I mentioned in an earlier blog – mash-ups. The map was produced by John Nelson of IDV Solutions, a US software company specialising in visualising data. It combines data from the US Advanced National Seismic System and the United States Geological Survey to produce a map that spatially locates each piece of data. IDV Solutions understand the importance and power of such mash-ups, and Deborah Davis published an article in Directions magazine (25th February 2010) on the importance of mash-ups for security. Although directed at security, the observations in the article are just as useful for trying to understand and manage hazards and the risks associated with them.

Mash-ups provide a means of consolidating data from diverse sources into a single, comprehensible map, in a visual context that has some meaning for the observer. The map produced can be made relevant to the customer or user by ensuring that it contains additional information relevant to their interpretation. A map of landslides combined with topographic data provides a context for helping to understand why the landslides might have occurred. Adding surface geology as another layer improves the context of interpretation for a landslide specialist; adding the road network improves it for a hazard manager. Once data has a context it is easier to spot relationships between phenomena. With this single, common map available to all parties there is a common basis for discussion and for decision-making. Having a common source of reference may even encourage discussion and debate. In addition, it may be easy to see where data is lacking and what other data these parties may require to aid their decision-making. The cost-effectiveness of such mapping should not be neglected either: using existing data to produce a new product is very cost-efficient.
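For what it is worth, building this kind of layered map is now only a few lines of code. Here is a toy sketch using the folium library, overlaying hypothetical landslide records on a base map; the coordinates and attributes are invented, and nothing here reflects how IDV Solutions actually built their map.

```python
# Toy mash-up: overlay point data (hypothetical landslide records) on a
# base map so each record gains a visual context. Coordinates and notes
# are invented for illustration.
import folium

landslides = [
    {"lat": 54.45, "lon": -3.10, "note": "road closed"},
    {"lat": 54.52, "lon": -3.02, "note": "minor slip"},
]

m = folium.Map(location=[54.5, -3.05], zoom_start=11)  # base layer
for event in landslides:
    folium.Marker([event["lat"], event["lon"]],
                  popup=event["note"]).add_to(m)
m.save("landslide_mashup.html")  # one shared map for all parties
```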



Media and Hazards: Orphaned Disasters

An interesting blog published in 2010, ‘Orphaned Disasters: On Utilising the Media to Understand the Social and Physical Impact of Disasters’, posted by KJ Garbutt, looks at how the media views, prioritises and ignores disasters, producing what the author calls ‘orphaned disasters’. It has some interesting points to make. The blog has a link to Masters research undertaken at the University of Durham on which the blog is based.




Sunday, July 15, 2012

Surface Water Flows and Flooding

The recent ASC report on flooding and water scarcity makes some interesting points about surface water flows, particularly those associated with flood hazard in urban areas. The report states that ‘Every millimetre of rainfall deposits a litre of water on a square metre of land’. Water falling onto a paved, impermeable surface will not infiltrate into the ground, so the volume has to move somewhere. The amount of paved surface is increasing: the report notes that green spaces in urban areas have been paved over, so surface water flows in urban areas are increasing even before the more intense rainfall associated with climate change is considered. The figures cited are that the proportion of paved gardens has increased from 20% in 2001 to 48% in 2011 of the total garden area of 340,000 hectares.
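The report's rule of thumb makes the scale of the problem easy to check. A quick back-of-the-envelope calculation of what a modest 10 mm shower implies for the paved garden area quoted above (the shower depth is my own assumption):

```python
# The report's rule of thumb: 1 mm of rain on 1 m2 is 1 litre. A quick
# check of what that implies for the paved garden figures quoted above
# (48% of 340,000 hectares) under an assumed 10 mm shower.

paved_m2 = 0.48 * 340_000 * 10_000   # hectares to square metres
rain_mm = 10
litres = paved_m2 * rain_mm           # 1 mm on 1 m2 = 1 litre
print(f"{litres / 1e9:.1f} billion litres of runoff")  # ~16.3 billion
```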

To combat this increase in paved area contributing to runoff, the report suggests that urban creep should be minimized, that sustainable urban drainage systems (SUDS) should be improved to slow down water flows and store water above ground, and that conventional sewers should be maintained or upgraded: all good ideas. Recent floods in urban areas have highlighted the importance of such measures. Paved surfaces permit no storage of water, runoff is almost immediate and, with intense rainfall, the volumes of runoff involved can be huge within a short period of time. Overwhelmed urban drainage systems mean that the water moves rapidly across impermeable surfaces and flows through streets and roadways, using them like predefined river channels. Similarly, when a river bursts its banks the water tends to use the paved, impermeable surface as a routeway for movement. The urban road network provides a convenient substitute for natural channels, providing water with a rapid means of moving across an urban area.
A great deal of the potential damage from a flood, and even from flash floods, could be mapped using a detailed digital elevation model (DEM) and a knowledge of past events in an urban area. This would help map out the routeways that surface flows have previously used. Future events may be harder to predict: as the urban infrastructure changes and planners take precautions to block or re-route surface flow, the microtopography of the urban area may remain a guide to patterns of surface flow, but other factors will also affect the detailed routes the water takes. The local detail is a bugger for modelling flow patterns.

It will be interesting to see what, if any, use is made of the information about flood damage from the recent floods. There is a great deal of information online from Twitter, as well as local blogs and newspaper accounts, that could reveal how surface water moved through urban areas. The potential for ‘citizen science’, for ordinary people (a horrible term that seems to imply scientists and planners are extraordinary) to contribute to the scientific investigation of flooding, is immense. The task of co-ordinating this type of information, of collecting and collating it and judging its quality and usefulness for modelling and understanding urban surface flow, is equally large. Time, expertise and, potentially, funds are needed for these activities, but by whom is unclear. Once the aftermath of the floods disappears from public view, the chances of funding such work drop dramatically. The need for people, the public (rather than the ordinary – anyone got a better term that isn’t condescending?), to be involved is important, however, if some of the recommendations of the ASC report are put into practice. In particular, the emphasis on households undertaking property-level flood protection measures might be enhanced if they were also actively involved in monitoring and in the feedback loop from modelling studies of their local areas. This would not only mean they were better informed about the risks of flooding but also more likely to act in the manner hoped for by planners if they felt they were an active part of preventing flood damage rather than passive victims of urban flooding.
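To make the DEM point concrete, here is a very reduced sketch of the steepest-descent (‘D8’) rule that many flow-routing models start from: each cell passes water to its lowest neighbour. The elevations are invented, and real urban flood models are far more involved than this.

```python
# Very reduced sketch of the 'D8' steepest-descent rule: water in each
# cell is routed to the lowest of its neighbours. A detailed elevation
# grid therefore lets you trace likely surface-flow routeways.
import numpy as np

dem = np.array([[5.0, 4.8, 4.6],
                [4.9, 4.5, 4.2],
                [4.7, 4.3, 3.9]])   # invented elevations (m)

def downhill_neighbour(dem, r, c):
    """Return the coordinates of the lowest neighbouring cell."""
    best = (r, c)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                if dem[rr, cc] < dem[best]:
                    best = (rr, cc)
    return best

print(downhill_neighbour(dem, 0, 0))  # water at the corner heads to (1, 1)
```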





Saturday, July 14, 2012

Flooding and Development in the UK: Some thoughts on ASC Report

The recent ASC Report provides some interesting insights into current planning practices in the UK in relation to flooding. The ASC report (chapter 2) suggests that climate change is not the only factor increasing flood risk (page 27). It points out that risk changes if the probability of an event occurring changes OR if the consequences of an event alter. The first aspect is meant to cover the purely physical aspects of a hazard, or rather of climate change in affecting flooding, whilst the second aspect is meant to relate to the socio-economic aspects of a hazard. It could be argued that a hazard isn’t really a hazard unless people are involved, and an ‘event’ isn’t really a hazardous event unless there is a vulnerable population, so the two aspects may be trickier to separate than it appears. The two may go hand in hand.


Leaving this aside, the report does highlight the importance of planning that bears these aspects in mind. The tables on pages 28 and 29 also provide some estimates of the amount of stock at risk from different types of flooding, with 1.2 million properties at risk from river flooding, 230,000 of these at significant risk (a greater than 1 in 75 chance in any given year). The report points out that most floodplain development is within built-up areas that already have flood defences. Continuing development behind these existing defences increases the total value of assets that are being protected. In any cost-benefit analysis this increased value will make any future investment decisions easy – keep investing in defences as the value of assets keeps increasing. This means that current flood defences lock in long-term investment, meaning that higher and stronger defences are continually required. The report recognises that this has been known for a while as the ‘escalator effect’.

The report also identifies that only a small number of planning applications have been approved when there was a sustained objection from the Environment Agency (EA). This implies that most planning applications meet the requirements of the EA for taking into account flood risk – a comforting point. However, the report also notes that, in general, local authorities are implementing national planning policy by continuing to build, with protection, in floodplains (page 36 of the report). In addition, most flood risk management policies in plans focus on making development safe once the strategic decision to build in floodplains has been taken (page 37 of the report).

Combined, this implies that floodplain development will continue and will produce an investment strategy for flood defences that encourages further development in already protected areas, forcing further and more extensive protection of these areas. The report states that the EA has taken a strategic approach to funding structural flood defences, ‘targeting investment towards communities at greater flood risk and with the highest social vulnerability’ (page 40). The justification for this approach, however, is then expressed in terms of an average cost-benefit ratio of 8:1, i.e. for every £1 spent on flood defences there is an expected reduction in the long-term cost of flood damage of £8. This implies that social vulnerability is defined in terms of money, as are any other benefits. This would suggest that assets that are relatively easy to attach a monetary value to will weigh heavily in these calculations. Again, this would encourage development in flood-protected areas, as any additional value from the buildings and infrastructure will increase the cost-benefit ratio and so ensure the continuation of more, stronger and higher flood defences. Whilst there is nothing inherently wrong with this, it does mean that if or when the defences are breached the cost of flood damage will be huge. How is this potentially high-magnitude cost worked into the cost-benefit equations? Is the potential loss or cost discounted, and over what time scale? How does the probability of such a high-magnitude loss change as both aspects of flood risk mentioned above, the physical and the socio-economic, change into the future? Is the potential cost so huge that the defences must always be enhanced into the future no matter what the rate of climate change or the cost? Is the current strategy committing the future to locational inertia of housing, business and infrastructural investment?
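The discounting question can be made concrete with a toy net-present-value calculation of the kind that underlies such cost-benefit ratios. All the figures below are invented; the 3.5% rate is simply a Treasury Green Book-style discount rate, not the one the EA actually uses.

```python
# Toy net-present-value calculation: expected annual flood damage
# avoided, discounted over the life of a defence. All figures invented.

def npv_of_avoided_damage(annual_benefit, discount_rate, years):
    """Present value of a constant annual benefit stream."""
    return sum(annual_benefit / (1 + discount_rate) ** t
               for t in range(1, years + 1))

benefit = npv_of_avoided_damage(annual_benefit=2e6,   # £2m a year avoided
                                discount_rate=0.035,  # assumed rate
                                years=50)
cost = 10e6                                           # £10m defence
print(f"benefit-cost ratio: {benefit / cost:.1f}:1")  # ~4.7:1
```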

Wednesday, July 11, 2012

Future Flood Risk in England: Committee on Climate Change Report

The recently published report ‘Climate change – is the UK preparing for flooding and water scarcity?’ an Adaptation Sub-Committee (ASC) Progress Report of the Committee on Climate Change, could not have asked for better timing really. The ongoing unseasonal rain (and by rain I mean torrential downpours rather than light summer showers) has brought the issue of flooding to the forefront of many people’s minds, not least those inundated with dirty brown flood waters. The headline grabbing pieces of the report are:


• floodplains have been developed faster than anywhere else in England over the past decade; one in five properties in floodplains were in areas of significant flood risk

• the current ‘build and protect’ policy will leave a legacy of rising protection costs; current levels of investment in flood defences will not keep pace with increasing risks, with the number of properties at significant risk of flooding doubling by 2035

• increasing investment in flood defences and property protection measures could halve the number of properties at risk by 2035 (relative to current levels)

The recommendations of the report in relation to flooding (in the executive summary and helpfully laid out in a box on page 17) sound reasonable, but they also need to be looked at in the light of the government’s recent modernization of planning policy (the National Planning Policy Framework), which highlights the need to undertake major and sustainable development in the UK in the near future. The ASC report advises that, for flooding, robust and transparent implementation of planning policy in flood risk areas is required and that local authorities should consistently and explicitly take into account the long-term risk of flooding when deciding the location of new developments (page 17 of the report). In addition, there should be support for sustained and increased investment in flood defences by both private and public sources, as current spending will not keep pace with the increasing risk. Failing increased expenditure, ways to manage the social and economic consequences of increased flood frequency should be identified. Lastly, there should be increased enabling of the uptake of property-level measures to protect against floods and encouragement of the increased use of sustainable drainage systems to manage surface water.

The National Planning Policy Framework highlights the importance of sustainable development, particularly of housing, and the importance of community involvement. The ASC report even recognises this on page 12 when it states that ‘Development in the floodplain may be rational decision in cases where the wider social and economic benefits outweigh the flood risk, even when accounting for climate change’ but then adds that in a review of 42 recent local development plans there was mixed evidence of transparency from local authorities in terms of locating development on areas other than floodplains and in including the long-term costs of flooding associated with climate change. So no consistent approach to the issue yet.

So how do you decide (and who decides) if development on the floodplain is worth it or not? The decision seems to be located at the local authority level, where contradictory messages are being delivered – develop sustainably (whatever that means) and don’t develop there unless you absolutely have to. There is also an assumption that the decision can be rationalised, presumably using cost-benefit analysis. This puts the valuation of property, the environment, business and infrastructure at the centre of any argument. How does this square with the ‘sustainability’ issues raised in the National Planning Policy Framework? Even worse, it is not necessarily the current valuation that will be important but the future valuation of both the land and its use, as well as the costs of clearing up flood damage. How do you work this out in a consistent and mutually agreed manner for the whole country for every land-use or potential land-use? Even if the local authorities made their valuations explicit, would every stakeholder agree? What of localism as well? How much input will communities have into the decision-making process if valuation is the central pillar of decision-making for housing? Can they, the communities, compete in any discussion with the complicated economic modelling available to local authorities?

An interesting aspect of the report is the focus on ‘property-level protection measures’, by which it means things such as door guards and airbrick covers, which the report points out require a take-up rate increase of 25-30 times by 2037 to reach all 200,000 to 330,000 properties that could benefit from their use. Although the report discusses these measures in combination with government investment as a means of reducing flood risk, I do wonder if this is more evidence of the ‘moral hazard’ argument coming into the discussion. In economic theory this refers to the tendency to take undue risks when the costs of those risks are not borne by the individual (or entity) taking them (a nice discussion of this idea can be found on Wikipedia – I am not averse to using this source if it is well done, but wouldn’t advise it for any of my students reading this!) Property-level protection measures seem to throw responsibility for tackling the risk (and the increased risk of flooding) onto the property owner. Although not put into these words, does that mean it is their fault? They bought the house there, so the risks are theirs to bear? They have to implement property-level protection measures or else why should anyone else help them, such as insurers or local authorities? Odd, when developers are allowed to develop on floodplains and then sell the houses on – do home buyers have a choice in where to buy if housing development is concentrated in floodplains, or if those are the only locations where they can find houses cheap enough (or in the right price range) for them to buy? Where exactly does responsibility for living in a floodplain lie?



Monday, July 9, 2012

Road Traffic Pollution and Death: Interpreting the Data

A recent report suggests that road traffic pollution causes 5,000 premature deaths a year in the UK, whilst exhaust from planes adds another 2,000 (see http://www.bbc.co.uk/news/science-environment-17704116 for a summary; the actual report is a paper in Environmental Science and Technology, a journal for which you need a subscription). The numbers are comparable with those produced by COMEAP (Committee On the Medical Effects of Air Pollution) in "The Mortality Effects of Long-Term Exposure to Particulate Air Pollution in the United Kingdom", which estimates air pollution was responsible for 28,000 deaths in the UK in 2008 (the more recent study estimates 19,000 deaths in that year). One interesting statistic from the more recent report is that road traffic accidents caused only 1,850 deaths in 2010, meaning that traffic pollution is a more potent killer.

So what can we make of these figures? The exact number of deaths depends on how you calculate 'premature' deaths. This means you need to extract from the total number of deaths those that would not have happened had it not been for the pollution, which means using life-table analysis to predict survival rates of different age groups. If air pollution improves, for example, you might expect everyone to have an improved survival chance, but that this would be greater for young children than for people in their 80s. The children who benefited from the reduction in pollution have to die sometime, so the benefit is not sustained indefinitely. This means that you have a dynamic, continually changing death rate based on a reduction in pollution levels. The COMEAP report suggests that any benefits from reductions in air pollution should be expressed in terms of improved life expectancy or the number of life-years gained, but accepts that the 'number of attributable deaths' is a much catchier way of expressing the information.

An interesting read for interpreting the 'deaths' is the appendix Technical Aspects of Life Table Analysis by Miller and Hurley. This short report goes through the technical aspects and assumptions involved in this sort of analysis. Be aware, though, that it does get into the mathematics fairly quickly. Importantly, starting with 2008 as a baseline, you construct age-specific all-cause mortality hazard rates, h_i, that act upon age-specific populations, e_i. Additionally, the number of viable births into the future is taken to be the same as the 2008 baseline. Changing policies alters the 'impact factors', which differ by age group and time. By altering an impact factor you change the hazard rates and so alter the mortality pattern.
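A stripped-down sketch of that logic may help. Annual hazard rates give survival probabilities, and scaling the hazards by an 'impact factor' changes expected life-years; the rates and the 2% factor below are invented for illustration, and Miller and Hurley's appendix gives the actual formulation.

```python
# Stripped-down life-table sketch: age-specific hazard rates h_i give
# survival probabilities, and scaling them by an 'impact factor' changes
# expected life-years. All rates are invented for illustration.

def expected_life_years(hazards):
    """Expected further years lived, given annual death hazards h_i < 1."""
    alive, total = 1.0, 0.0
    for h in hazards:
        alive *= (1.0 - h)   # probability of surviving this year
        total += alive       # accumulate expected person-years
    return total

baseline = [0.001 * (1.07 ** age) for age in range(100)]  # rising with age
cleaner_air = [h * 0.98 for h in baseline]                # 2% lower hazards

gain = expected_life_years(cleaner_air) - expected_life_years(baseline)
print(f"life-years gained per person: {gain:.2f}")
```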


Understanding how 'deaths' are calculated and the assumptions involved is vital to interpreting the information provided. This tends to be particularly important when, as in this report, the 'deaths' are the end result of mathematical modelling of a data set and a series of key assumptions about the impact of different scenarios. I am not suggesting that the mathematics is wrong; the use of life-table analysis has a long and profitable history in the insurance industry, so the modelling is on a very sound base. The COMEAP report recognises this problem of interpretation (starting page 13) and knows that there is a trade-off between full accuracy and accessibility. It is also acutely aware that the numbers are open to misunderstanding if the basis of their calculation is not understood. On page 14 of the report, for example, they state for the term 'number of attributable deaths' that:

To emphasize that the number of deaths derived are not a number of deaths for which the sole cause is air pollution, we prefer an expression of the results as “an effect equivalent to a specific number of deaths at typical ages”. It is incomplete without reference also to associated loss of life. The Committee considered it inadvisable to use annual numbers of deaths for assessing the impacts of pollution reduction, because these vary year by year in response to population dynamics resulting from reduced death rates.

In interpreting this type of data it is important to know how it was derived, to know if it was modelled, and if so how, and, as importantly, the exact technical definitions used for terms. The alternative is relying on others to interpret the data for you with all the attendant agendas potentially coming into play as they draw their conclusions.

UK Real-time Flood Alerts Online - Using Information in Novel Ways

A BBC report on 6th July (http://www.bbc.co.uk/news/technology-18740402) informs its readers of the online launch of a real-time flood alerts map developed by Shoothill, a Shrewsbury-based company, which uses data from the Environment Agency's network of monitoring sites. Users can zoom into the map and see flood alerts and warnings as issued by the Environment Agency within the previous 15 minutes.


The site is worth a visit, but it does beg the question, particularly as the unseasonable weather continues in Britain and elsewhere – what does this company add to the existing EA site that makes it more useful? The EA flood warning front page (http://www.environment-agency.gov.uk/homeandleisure/floods/31618.aspx) shows a map of Britain that you can click on by region, which then provides text information on flood warnings, including locations. Clicking further through the individual warning locations provides more detailed information. The Shoothill site provides the same information if you click on the symbol on the map.

The answer seems to be that the Shoothill site provides the information visually, linked to a map. Is this such an advance? It seems to be, and it indicates a key component of using the Web – the concept of mash-ups. Amazon and Google take a similar view of the flexibility of information in their Associates programmes – increasing revenues by allowing specialists to access their databases and the facilities to purchase goods through links to Amazon and Google sites.

For Shoothill, the data is provided by the EA, but the use to which it is put, and the value added by that novel use, is provided by Shoothill. Locating the flood warnings on a map may seem obvious, but it takes specialist skills and time to do this, particularly in being able to update the information in real-time. Shoothill uses the existing information in an innovative way, adding value to the data in terms of how people can use and interpret it. Such innovation would not be possible without access to that information. This may seem like an odd view of data and information, but within the Web environment the value of information does not necessarily lie in keeping it the private and exclusive property of one company or organization. The value of information can be released or expanded by allowing others to access it and to use it in a manner that may not have been envisaged by the information generators. Both parties can gain, as Amazon and Google have already figured out!

What Can I Do With Geography 2?

In Geographical, September 2011 (see http://www.geographical.co.uk/ for the magazine website plus their blog) there is a very good set of cameos from geographers who have found their training useful for their current employment. The foreword from Dr Rita Gardner, the Director of the RGS (with IBG), provides a bit of context, but it is the sketches of the geographers that are most informative about why geography is so useful. The individuals range from an air traffic control specialist, through the seemingly more exciting jobs of adventurer and helicopter pilot, to the socially aware jobs of activism coordinator at Oxfam and founder of the Green Economic Institute. Additionally, renewable energy, water resources engineering, environmental consultancy and transport logistics management all get a look in, as well as the maybe more expected and traditional trainee teacher. The important point that all of these individuals make is that geography has proven to be extremely useful to their chosen career path, even if they didn’t believe it at first. Well worth a read if you are trying to justify taking the subject up to some non-believers.