Friday, July 27, 2012

Public Risk Communication

Communicating risk to the general public is a vital part of managing risk. The UK government has produced a leaflet, a ‘Practical Guide to Public Risk Communication’, which offers a very short guide to practice. The three key aims of risk communication are to reduce anxiety around risks, to manage risk awareness and to raise awareness of certain risks. The leaflet identifies five key elements of public risk communication: assembling the evidence, acknowledging public perspectives, analysing the options, establishing the authority in charge and interacting with your audience.


Each of these elements is expanded and discussed via a set of questions that organisations should consider. The first element is concerned with establishing the nature and magnitude of the risk and demonstrating a credible basis for the position taken by the organisation. Evidence is paramount in this element, but so too, implicitly, is the question of trust or belief in that evidence. Who provides the evidence, and on what basis, is as important as clearly articulating the risk. As part of this aspect of trust, the question of ambiguity and uncertainty is bound to arise. Despite what politicians may wish, science is inexact and is often laden with uncertainty. How organisations deal honestly with uncertainty can have a huge bearing on the trust they command and retain between hazardous events.

Understanding how the public understand risk is essential to getting the message across. Lumping everyone together as ‘the public’ may not be that helpful, though, as the leaflet notes. Assuming everyone perceives a hazard in the same way and in a consistent manner may be hoping for too much. The leaflet uses the term ‘risk actors’ as a catch-all term for individuals or groups who engage with a risk or who influence others’ approaches to and understanding of risk. Perception, and so the message about risk, should be differentiated, but that differentiation may depend upon the exact mix of risk and risk actors. An issue I would like to raise here is whether, once you are a risk actor, you are no longer a member of the public. A home-owner who has recently experienced a flood may be much more active in their local community in taking steps to reduce the flood risk – does that make them a risk actor, a well-informed member of the public or simply a member of the public, and does this matter for how risk is communicated to them? Risk actors could be viewed as being biased, as having their own agendas, and so as not really reflective of the views of the general public.

Analysing options suggests that organisations have rationally weighed the risks and benefits of managing public risk as well as the options for action available to them. The last sentence is interesting: ‘the technical and societal interests will need to be reconciled if the solution is to be generally accepted.’ This sentence is made in relation to technical solutions that may not have public sympathy initially. The implication that through debate these solutions can come to be accepted suggests a very hierarchical and technocratic view of hazards and risk management – the public have to be educated and brought to accept the solution. I may be reading this aspect too cynically, but maybe not.

The authority in charge section, however, adds weight to the idea that this is a technocratic and hierarchical view of hazards (maybe not a surprise in a government publication!). The need for an organisation to determine whether it is appropriate for it to step in to manage risk, and the clear limits of responsibility for risk management, imply a very structured and ordered view of how to manage the world and its associated risks. A telling point is made that exercising authority now is as much about building and maintaining trust as it is about lines of formal authority. Trust, or rather the perception of trust, will dramatically affect the ability of an organisation to manage risk in a sceptical society. The last sentence states: ‘Organisations that are not highly trusted will increase the chances of success by enlisting the help of other organisations – such as independent scientific organisations – who have the confidence of society’. A call to collaboration or a call to objectivity?

So is it all ‘smoke and mirrors’ or does this leaflet help to further risk management? Without a doubt, communicating risk and its management effectively to different audiences is essential, and the leaflet does provide some very good guidance on this. The ideas should not, however, be used uncritically, as they are designed with a very technocratic and hierarchical view of risk management in mind. Examples of public risk communication are also provided, and for flooding the conclusions reflect this bias (starting on page 22 of the report). Risk quantification is sought for flood hazard, as are software and tools for aiding local risk planning and for managing the possible clash between expectations of more flood defence infrastructure and the new risk-based approach (risk is the focus rather than the hazard itself). Communication about risk and its management is viewed as coming from the Environment Agency, insurance companies and DEFRA – not much about local ideas or communication from the ground up! This link is, however, viewed as potentially corrosive to public trust. This hits at the nub of the issue: the government wants the trust of people to be able to act in a manner it views as appropriate. Actions, of necessity in a complex society, require working with organisations that have their own agendas, and this creates suspicion. Can risk ever be managed without trust, and can you manage anything without eroding some degree of trust somewhere?



Urban Air Pollution Around the World

Two interesting blogs and a website covering all issues concerning atmospheric pollution can be found at urbanemission.blogspot.co.uk/, aipollutionalerts.blogspot.co.uk/ and urbanemissions.info/. All of these sites are run by Sarath Guttikunda from New Delhi. An important aspect of these sites is their reports from all over the globe concerning atmospheric pollution in urban areas. The reports highlight that atmospheric pollution is a global issue, a world-wide problem that needs action. Taking action, however, requires information that can inform decision-makers about the extent of the problem. These sites also provide this by linking through to monitoring information from urban areas around the world. The sites also highlight that local populations, communities and neighbourhoods are not just passively sitting there waiting for decision-makers to make decisions. The volume of reports and community awareness show that the concern and impetus for change driven by the local level certainly exist. Enabling those changes is another issue, dependent on local conditions and their political, economic and social context. The information and data provided by these websites do, however, permit individuals and communities from all over the world to compare their conditions with others in similar circumstances and to exchange ideas and plans for pressuring decision-makers for change.

The Urban Emissions website is worth a look as well for the modelling tools that are available for download. An interesting one, given my last blog, is the Air Quality Indicator download. This simple calculator helps you work out the air quality for an urban area based on daily observations or modelled values. It does, of course, assume that the data will be available in the first place!

Understanding Daily Air Quality

Atmospheric pollution is a continuing environmental problem across the globe. Within the UK, data on historic as well as current pollution levels can be found at the DEFRA Air Quality site – a great store of information, and one from which you can download data.

Wonderful as this source is for research, atmospheric pollution is not a problem that has passed or that is under tight control on a global scale. The locations of UK data reflect the monitoring networks set up in the 1960s by Warren Spring Laboratory, largely in response to the Clean Air Act (1956) and the need to monitor pollutant levels to ensure that standards were being met. The early monitors tended to be instruments such as the sulphur dioxide bubbler (so old I couldn't find a photo of one on the Web!). Air was pumped into the machine at a known rate and reacted with the liquid as it bubbled through. After a day, the flask of liquid was removed and replaced with a fresh one. The liquid from the previous day was analysed using titration techniques (reacting the sulphur dioxide with another chemical to get a colour change and a reaction product that could be accurately measured) to determine the levels of sulphur dioxide (once the various calibrations and calculations had been done). I know this because I used an old bubbler in my thesis to monitor sulphur dioxide levels on the roof of the Geography Department at UCL, London. It was educational, but it was a pain to have to process daily, particularly as I was self-taught in undertaking the titration, much to the amusement of colleagues in the lab. Passive monitors such as diffusion tubes (they just sit there and the pollutants react with them) were also used, but these still needed chemical post-processing to obtain a result.

By the time I finished my thesis in 1989, real-time monitoring of pollutants, or at least hourly averaged and then 15-minute averaged values, was becoming more usual and replacing daily averaged data. This is great for monitoring levels virtually continuously and for identifying specific pollution episodes, but how much information is there and how can you interpret it? Air quality standards have varying monitoring levels for different pollutants, and even the same pollutant can have different exceedance values. Sulphur dioxide levels in the UK, for example, should not exceed 266 micrograms/m3 more than 35 times per year if measured as averaged 15-minute concentrations. If measured as 1-hour means, then 350 micrograms/m3 should not be exceeded more than 24 times per year. If measured as a 24-hour average, then 125 micrograms/m3 should not be exceeded more than 3 times a year. So the limits change with the monitoring period and the type of equipment being used to monitor pollution levels. This variation may begin to get confusing if you try to communicate it to too many different end-users.
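The bookkeeping behind these objectives is simple enough to sketch. The snippet below is a minimal illustration, not an official DEFRA tool: the threshold and allowed-exceedance pairs come from the objectives quoted above, while the data and function names are hypothetical.

```python
# Illustrative exceedance bookkeeping for the UK sulphur dioxide
# objectives quoted above. Concentrations are in micrograms/m3.
# The (threshold, allowed exceedances per year) pairs come from the
# text; the sample data and names are hypothetical.

SO2_OBJECTIVES = {
    "15-minute mean": (266, 35),
    "1-hour mean": (350, 24),
    "24-hour mean": (125, 3),
}

def count_exceedances(concentrations, threshold):
    """Count how many averaged values exceed the threshold."""
    return sum(1 for c in concentrations if c > threshold)

def objective_met(concentrations, period):
    """True if the yearly exceedance count stays within the allowance."""
    threshold, allowed = SO2_OBJECTIVES[period]
    return count_exceedances(concentrations, threshold) <= allowed

# A toy run of 24-hour means: three values over 125 is still within
# the allowance of three exceedances per year; a fourth would breach it.
daily_means = [40, 130, 60, 140, 126, 90]
print(count_exceedances(daily_means, 125))            # 3
print(objective_met(daily_means, "24-hour mean"))     # True
```

The point the code makes explicit is that compliance depends on both the threshold and the averaging period: the same year of raw measurements can pass one objective and fail another once it is averaged differently.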


A simplified version, the Daily Air Quality Index (DAQI), recommended by the Committee on the Medical Effects of Air Pollutants (COMEAP), uses an index and banding system with bands numbered 1-10 and colour coded for the severity of atmospheric pollution. The scale mirrors established ones for pollen and sunburn, so it draws on existing public understanding of colours and levels. Bands 1-3 are green and represent low atmospheric pollution levels, 4-6 are shades of orange and represent moderate levels, 7-9 are darkening shades of red ending up at brown and represent high levels, whilst the oddly coloured purple band of 10 represents very high levels of atmospheric pollution. The index itself combines the highest concentrations for a site or region of five key pollutants: nitrogen dioxide, sulphur dioxide, ozone, PM2.5 and PM10.

The DAQI may be a useful tool for communicating information about the general level of pollution as it relates to human health, but does its simplicity mask complexity that disaggregated data would not? The relative contribution of the five pollutants to the index can be gauged from the information on each at the DEFRA website. PM2.5 and PM10 use 24-hour running mean concentrations and have specific threshold levels for each band, whilst sulphur dioxide is measured as 15-minute averaged concentrations and, again, has threshold values for each band. The index itself, though, hides whether all, or just one or two, of the pollutants push the DAQI into a band. The index also misses other pollutants that could impact upon human health, such as benzene, even though these may be monitored. The cocktail of pollutants used to create the index reflects a specific context, the UK; would the cocktail of significant pollutants vary in other contexts? The cocktail and the monitoring intervals are not necessarily ‘natural’ ones – they have been developed from monitoring set up for other purposes, such as regulatory requirements. The index is squeezed out of what exists.
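The aggregation step can be sketched as follows. The band breakpoints below are illustrative placeholders, not the official DEFRA values; the point is only the structure of the calculation, which is why the published number cannot tell you whether one pollutant or several were responsible for a given band.

```python
# Sketch of the DAQI aggregation logic: each pollutant concentration
# is mapped to a band 1-10, and the index for a site is the worst
# (highest) band across all pollutants. The breakpoints here are
# ILLUSTRATIVE ONLY, not the official DEFRA values.

ILLUSTRATIVE_BREAKPOINTS = {
    # pollutant: upper bounds for bands 1-9, in the same units as the
    # measurement; anything above the last bound falls in band 10.
    "pm2.5": [11, 23, 35, 41, 47, 53, 58, 64, 70],
    "so2": [88, 177, 266, 354, 443, 532, 710, 887, 1064],
}

def band(pollutant, concentration):
    """Return the band 1-10 for a single pollutant."""
    for i, upper in enumerate(ILLUSTRATIVE_BREAKPOINTS[pollutant], start=1):
        if concentration <= upper:
            return i
    return 10

def level(b):
    """Descriptive level for a band, following the DAQI colour groups."""
    if b <= 3:
        return "Low"
    if b <= 6:
        return "Moderate"
    if b <= 9:
        return "High"
    return "Very High"

def daqi(concentrations):
    """Overall index: the highest band across all pollutants."""
    return max(band(p, c) for p, c in concentrations.items())

# One pollutant in band 7 puts the whole index at 7 ('High'), even
# though the other pollutant on its own would only be band 2 ('Low').
print(daqi({"pm2.5": 56, "so2": 100}))  # 7
```

The `max()` in `daqi` is exactly the masking the paragraph above describes: once the result is published as a single band, the contribution of the other four pollutants is invisible.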


The DAQI is a very, very useful tool, but it reflects an attempt to communicate complex and huge volumes of information in a simplified manner that, its makers believe, will be of use to specific end-users. Once data is compressed and simplified, you are bound to lose some of the information contained in its variations and detail. The index you develop for targeted end-users will, of necessity, exclude a lot of the information you have collected, and it is useful, for the end-users in particular, to be aware of this.





Wednesday, July 25, 2012

Beijing Air Quality – Citizen-Science Approach to Mapping Levels?

A recent article in Environmental Technology Online reports on a community-based science project called ‘Float’ that is part-science and part-art. The idea is that pollution-sensitive kites will be flown over Beijing. These kites carry Arduino pollution-sensing modules and LED lights, and will indicate levels of volatile organic compounds, carbon monoxide and particulate matter by changing colour to green, yellow or red depending on the pollutant levels. The kites are attached to GPS loggers and linked to the real-time data website Cosm.
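The kites' three-colour display is, in essence, a pair of thresholds per pollutant. The sketch below illustrates that logic in Python rather than the Arduino code the modules actually run; every name and threshold value here is invented for illustration.

```python
# Toy sketch of the threshold-to-colour logic the Float kites are
# described as using: a reading below the first threshold lights the
# LED green, below the second yellow, otherwise red. All names and
# threshold values are hypothetical, not from the project.

ILLUSTRATIVE_THRESHOLDS = {
    # pollutant: (upper bound for green, upper bound for yellow)
    "voc": (200, 500),
    "co": (4, 9),
    "pm": (35, 75),
}

def led_colour(pollutant, reading):
    """Map a sensor reading to the kite's LED colour."""
    green_max, yellow_max = ILLUSTRATIVE_THRESHOLDS[pollutant]
    if reading <= green_max:
        return "green"
    if reading <= yellow_max:
        return "yellow"
    return "red"

print(led_colour("pm", 20))   # green
print(led_colour("co", 12))   # red
```

Crude as it is, this kind of banding is what makes the data legible to onlookers on the ground, which is the point of the art side of the project.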

The project is designed by students Xiaowei Wang from Harvard’s Graduate School of Design and Deren Guler from Carnegie Mellon, and is designed to involve local residents in data collection. The project relies on public funding and is still raising funds through Kickstarter, a website devoted to creative projects and to obtaining funding for them (The Float project on Kickstarter). The project also has funding from the Black Rock Arts Foundation and the Awesome Foundation.

The project has generated a lot of interest on the Web:

Fighting China’s Pollution Propaganda, with Glowing Robot Kites For the People

Pollution-detecting kites to monitor Beijing's air quality
Glowing Pollution Sensor Equipped Kites Replace Beijing's Stars
Kickstarter Project Plans to Measure Beijing Pollution Using Kite Sensors

These have attracted only a couple of comments and an expression of interest in the results, really.


The project is undoubtedly part of the growing and, in my view, superb trend towards more inclusive community or participatory science (choose whichever term you prefer; Guler uses citizen-science). The ideal of getting local communities involved in data collection, as well as in all other aspects of the research process, is an excellent way to raise awareness of an issue and to educate people about the scientific approach and its problems and potentials. The Float project has involved local communities, young and old, from the start, with workshops in Beijing and involvement in the design of the kites. In terms of how to organise a community-based, participatory science project, it is one that I will advise my students to look at. It is just a shame that the descriptions of the project veer from highlighting the science to highlighting the arts aspects, as if the two are, or need to be, distinct. It should also be remembered that this project, like any project involved in monitoring pollution, is entering the political as well as the scientific arena. Involving local populations is a political act (as is their agreement to be involved), as much as the monitoring of pollution by the American Embassy or the siting of monitoring stations by the Chinese authorities. The local is as political as the national or the international, but the political nature of the act does not necessarily mean the data is politically biased, only that data collection is for a purpose.

As with most community-based projects, however, there is the issue of belief, trust or confidence in the data collected. These projects do tend to illustrate quite nicely the continuing divide between the ‘specialist’ or ‘expert’ and the ‘public’ (I would say amateur, but much of British science in the nineteenth and early twentieth century only developed because of amateurs!). The expert has been trained and accepts certain methods as being appropriate for data collection. Control and standardization are essential in ensuring what is termed ‘intersubjective communication’ between researchers – basically it means: I know what you did because that is how I was trained to do it, so I trust your data as being real. Guler seems to downgrade the status of the data collected even before the project really begins by stating:

‘We’re trying to interact with people on the street and see what they’re trying to do with the information they see. I don’t plan to argue that this is the most accurate data because there are many potential reasons for differences in air quality reports. We want to just keep it up, upload the data, and focus on that more after we come back’.

My impression is that this statement is a great get-out clause for ‘official’ monitoring, be it by the Chinese or atop the American Embassy. I wouldn’t be so pessimistic. The aims of the project in terms of improving public understanding of air pollution, its impact on health and the visualization of pollution through the kites are all excellent and likely to be successful. The data collected is also of value. The ‘official’ pollution monitoring sites probably conform to national or international standards for static sites in terms of equipment and monitoring periods. The kite data does not necessarily provide data comparable to these sites. The kites are mobile and collect data on levels that can be spatially referenced (I assume in four dimensions). They provide a different perspective on atmospheric pollution, as a spatially varying phenomenon, something the official monitoring sites cannot provide. It could even be argued that the kite data provides information on pollution as experienced by the population (although the population is unlikely to move across the sky at the height of the kites!). The important thing to remember is that there is not one single correct measure of atmospheric pollution; there are merely different representations of it. The official static sites have the advantage of clearly defined protocols that ensure the data or information they collect is immediately comparable with that collected at similar monitoring sites globally. The Float project is generating a different and novel set of data or information. This may require a different approach to thinking about the information and its interpretation (Guler seems to suggest this with some hints at triangulation of trends) and to how confidence or belief in the information is assessed, either qualitatively or quantitatively. I will be very interested to see what form the results and interpretation take. Good luck with the project!

Institute of Hazard, Risk and Resilience at the University of Durham

An extremely useful website is that of the Institute of Hazard, Risk and Resilience at the University of Durham. They have just published their first on-line magazine, Hazard Risk Resilience, which outlines some key aspects of their research and is well worth a look (as is their blog, now linked at the side of my blog). In addition, the site contains podcasts on aspects of hazards.


An important research project for the Institute is the Leverhulme-funded project on ‘Tipping Points’. Put simply, ‘tipping points’ refer to a critical point, usually in time, when everything changes at once. The idea has been used in describing and trying to explain things as diverse as the collapse of financial markets and switches in climate. The term ‘tipping point’ (actually ‘tip point’ in the original study) was first used in sociology in 1957 by Morton Grodzins to describe the ‘white flight’ of white populations from neighbourhoods in Chicago after a threshold number of black people moved into a neighbourhood. Up to a certain number nothing happened; then suddenly it was as if a large portion of the white population had decided to act in unison, and they moved. That this was not the result of co-ordinated action on the part of the white population suggested that some interesting sociological processes were at work. (Interestingly, I don’t know if the reverse happens, or if research has been conducted into the behaviour of non-white populations and their response to changing neighbourhood dynamics.) Since about 2000 the use of the term has grown rapidly in the academic literature, much of it put down to the publication in 2000 of ‘The Tipping Point: How Little Things Can Make a Big Difference’ by the journalist Malcolm Gladwell (who says academics don’t read populist books!)

Research suggests that the metaphor of a ‘tipping point’ is a useful one for getting across a lot of the complex and complicated processes and changes that occur in socio-economic, political and physical systems. One focus of the project is on trying to assess whether this metaphor actually describes, quantitatively or qualitatively or both, real properties of systems. Another is concerned with exploring how the metaphor becomes an important aspect of the phenomena being researched, even taking on the character of an agent in the phenomena themselves. Importantly, the project also considers what it means to live in a world where ‘tipping points’ abound and how important anticipatory understanding is for coping with that world.

Tipping Point by Malcolm Gladwell





Virtual Water: Accounting for Water Use

Following on from my last blog on the discovery of a massive aquifer under part of Namibia, I thought it might be useful to consider a key accounting concept for water resources: virtual water. The term ‘virtual water’ was coined by Tony Allan of SOAS and refers to the invisible water, the water that it takes to produce the food and goods we consume, or as Virtual Water puts it:


‘Virtual water is the amount of water that is embedded in food or other products needed for its production.’

(Other websites on virtual water include: Virtual Water)
Some of the figures involved are quite amazing. A kilogram of wheat takes 1,000 litres of water to produce, a cup of coffee takes 140 litres, 1 kg of rice takes 3,400 litres and one car takes 50,000 litres of water to produce. You can even work out your own water footprint using the calculator on the site. There is even an app for your mobile! Additionally, there is information on national water footprints and, importantly, on the idea that virtual water is traded between nations.
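The accounting behind such a calculator is straightforward to sketch with the per-item figures quoted above. The snippet below is a hypothetical illustration, not the official calculator on the site; only the litre figures come from the text.

```python
# Virtual-water bookkeeping using the per-item figures quoted above
# (litres of water embedded in each product). The figures come from
# the text; the calculator itself is a hypothetical sketch.

VIRTUAL_WATER_LITRES = {
    "wheat_kg": 1_000,
    "coffee_cup": 140,
    "rice_kg": 3_400,
    "car": 50_000,
}

def water_footprint(consumption):
    """Total embedded water, in litres, for a basket of items."""
    return sum(VIRTUAL_WATER_LITRES[item] * qty
               for item, qty in consumption.items())

# Two cups of coffee and half a kilo of rice in a day:
daily = water_footprint({"coffee_cup": 2, "rice_kg": 0.5})
print(daily)  # 1980.0
```

Nearly two thousand litres from a modest day's consumption gives a sense of why the invisible water dwarfs the water we actually see ourselves using.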

Mekonnen and Hoekstra (2011) at the University of Twente published a UNESCO report on virtual water over the period 1996-2005. They divided the water footprint into three colours: green, blue and grey. Green water is the rainwater consumed in production, blue water is the surface and ground water consumed in production, whilst grey water is the freshwater required to assimilate the pollutants generated by production. From their analysis they calculated that the global average water footprint for consumption was 1,385 m3 per year per capita, with industrialized countries having footprints of 1,250-2,850 m3 per year per capita, whilst developing countries had footprints in the range 550-3,800 m3 per year per capita. The low values represented low consumption volumes in these countries, whilst the large values represented big water footprints per unit of consumption.

Their key conclusion was that about 20% of the global water footprint for that period was related to production for export. This means that there are large international flows of virtual water, with some countries importing substantial volumes (2,320 billion m3 over the period). In countries where water is scarce there is a tendency to import large amounts of virtual water in food and agricultural products, saving national water resources for key uses that add greater value, such as industrial production. Tony Allan argued in 1997 that the Middle East was one of the first regions to develop this adaptation to resource scarcity. The relatively large volume of international flows of virtual water generates water dependencies which, they suggest, strengthen the argument that issues of local water scarcity need to be considered within a global context.


The significance of this concept for the discovery of the aquifer and its use is that the Namibian reserve has to be viewed within a global context. The development of agriculture and the technical development of the resource are likely to be political decisions and increasingly likely to be geopolitical decisions that have to take into account the regional position of Namibia, the likely trade partners for the virtual water, the geopolitical power of potential partners and the future frictions that could arise as environmental change affects the current international demands and flows of virtual water.

Tuesday, July 24, 2012

Namibian Aquifer: Who Benefits?

A recent BBC article reported on the discovery of a major aquifer in Namibia. The new aquifer is called Ohangwena II and it covers an area of about 70x40 km (43x25 miles). The project manager, Martin Quinger, from the German federal institute for geoscience and natural resources (BGR), estimates that the 10,000-year-old water could supply the current water consumption needs of the region for about 400 years.


The find could dramatically change the lives of the 800,000 people in the region who currently rely on a 40-year-old canal for their water supply from neighbouring Angola. Martin Quinger states that sustainable use is the goal, with extraction ideally matching recharge. The easy (and cheap) extraction of the water under natural pressure is complicated by the presence of a small salty aquifer that sits on top of the newly discovered one. Quinger states that if people undertake unauthorised drilling without following the project's technical recommendations, a hydraulic short-cut could form between the two aquifers, contaminating the fresh water.

In terms of the use of the water he comments that:
‘For the rural water supply the water will be well suited for irrigation and stock watering, the possibilities that we open with this alternative resource are quite massive’.
The EU-funded project also aims to help young Namibians manage this new water supply before its funding runs out.

The discovery is a great, potentially life-changing resource for the region, but the question that arises in my mind is: who is going to benefit from it? The current socio-ecological system in the region is attuned to the amount of water available. The availability of more water could change this, but will it be for the benefit of the current population? A key aspect is the last point made about the EU-funded project – the management of the resource by those in the region. The skills required to manage a large water resource are context dependent. They depend on the uses to which the resource will be put. They require technical and resource-allocation skills that presume a context of educational levels embedded within a culture and location. Acquiring these skills takes time, as people go through the appropriate training and gain the experience that helps in managing the resource. If this expertise does not exist now within the region, then the implication is that external support will be needed and, by implication, paid for.

Another issue is the assumption that the water will be used to improve agricultural production, which I assume (maybe wrongly) means more intensive agriculture. The key questions are then: what type of agriculture, and what additional resources are required to ensure that it works? Thinking of the whole agricultural system as a complex network of relations, the question really is what network of relations will be overlaid onto the existing agricultural network to ensure the success of the new type of agricultural production. More intensive agriculture implies fertilizers, investment and technical know-how, as well as access to markets – regional, national and international – so that funds can be extracted from the new produce. Again, is it likely to be the regional population that is able to conjure up the finance, technical knowledge and all the other parts of the network required to develop this new agriculture? In time, the answer might be yes, but will external agencies, such as the government and investors, permit this time before developing the valuable resource?

This problem with development as seemingly envisaged by the project is illustrated by the comment concerning extraction. The implication is that only people with a specific level of technical ability can extract the water. This implies that a system of permits is likely to be implemented, and so access to the resource will be controlled and restricted. It also implies that permits will be allocated to operators able to meet the technical requirements outlined by the project, and if this expertise does not exist within the region then the operators will have to be external contractors. Such a system is likely to require financing, so value will have to be extracted from the supply of water. To whom will the funding flow, and who will pay for it? How will the regional population receive the water, and what will its price be? I would like to hope that the EU-funded project will enable the management of the resource by the regional population, for the benefit of that population, in the manner that that population sees fit for its own development.