
Monday, October 29, 2012

The L’Aquila Case and Legal Protection for Scientists

Charlotte Pritchard’s recent BBC article (http://www.bbc.co.uk/news/magazine-20097554) raises an interesting question – should scientists stop giving advice and, if not, should they have professional indemnity insurance? This cover, as a lot of eager insurance websites will tell you, is designed to protect professionals and their businesses if clients (or a third party) make a claim against them because they believe they have suffered loss due to non-performance, breach of contract, professional negligence, or all of the above. Insurance cover is up to a maximum limit outlined in the policy and is, presumably, based on the losses for similar types of professional activity in the past and the likelihood or probability of a claim being made against a particular type of professional. Such policies are standard tools of legal protection for architects, engineers, business consultants, insurance brokers (ironic?), solicitors, accountants and independent financial advisers. Pritchard points out that even the Met Office has a professional indemnity self-insurance fund in case a forecaster fails to predict a flood that results in the loss of life (how things have moved on since Michael Fish!).
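As a rough illustration of the actuarial logic behind such a policy – a minimal sketch, assuming a simple expected-loss-plus-loading pricing model, with all figures entirely hypothetical:

```python
# Back-of-the-envelope indemnity premium: expected annual claim cost
# (probability of a claim times the average loss) plus a proportional
# loading for the insurer's expenses and profit. All figures hypothetical.

def annual_premium(claim_probability: float, average_loss: float,
                   loading: float = 0.3) -> float:
    """Expected annual claim cost, marked up by a proportional loading."""
    expected_loss = claim_probability * average_loss
    return expected_loss * (1 + loading)

# e.g. a profession with a 1-in-50 chance per year of a £100,000 claim
print(f"£{annual_premium(0.02, 100_000):,.0f}")  # -> £2,600
```

Real policies are priced on far richer claims histories, but the principle – past losses and the probability of a claim drive the premium – is the same.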
Transferring this type of policy into the academic realm is not unusual; several of my colleagues have such policies when they undertake consultancy work, and many universities’ commercial arms offer such cover. A key question is whether the nature of the information being provided is the same for all these professions, or whether scientific information is of a different type. Is it the information that is the same, or is it the intended use of that information that requires the provider to have legal protection? If I were an engineer advising on a building project, my audience – the investors, the builders, etc. – would employ me to ensure that the result is satisfactory: the building stays up (simplistic, but you get the idea). There is a definite, time-limited and clearly defined outcome that my advice is meant to help achieve. Is this the case for scientific advice about the possibility of a hazardous event? Is the outcome clearly defined, or is there variability between the expectations of the audience and those of the information provider? Aren’t experts in both cases offering their best ‘guesses’ given their expertise and standards in specific areas?
The development and (by implication from Pritchard) the almost essential nature of legal protection for people giving advice tells us a lot about current attitudes and beliefs about science and prediction. Pritchard quotes David Spiegelhalter, Professor of Public Understanding of Risk at the University of Cambridge, as stating:

“At that point you start feeling exposed given the increasingly litigious society, and that's an awful shame…. It would be terrible if we started practising defensive science and the only statements we made were bland things that never actually drew one conclusion or another. But of course if scientists are worried, that's what will happen."

The belief that science can offer absolute statements concerning prediction underlies the issue at L’Aquila. Despite the careful nature of the scientific deliberations, the press conference communicated a level of certainty at odds with the understanding of seismic activity and of the nature of risk in seismology. The belief that seismic events are predictable in terms of absolute time and location is at odds with what is achievable in seismology. By extension, this view assumes that scientists understand the event and how it is caused, and that understanding causation leads to accurate prediction. This ignores the level and nature of understanding in science. Scientists build up a model, a simplification of reality, in order to understand the events and the data that they collect. This model is modified as more information is produced, but it is never perfect. The parts of the model are linked to one another by the causes and processes the scientists believe are important, but these can be modified or even totally discarded as more events, more information, are added. So it is feasible to understand, broadly, how seismic events occur without being able to translate this understanding into a fully functional and precise model of reality that can predict exactly when and where an earthquake will occur.
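To make that concrete – a toy sketch, not a seismological model, assuming a simple Bayesian update with entirely hypothetical numbers – this is roughly how new evidence revises a model’s estimate without ever making it certain:

```python
# Toy Bayes' rule update: how observing evidence (e.g. a tremor swarm)
# shifts the estimated probability of a hypothesis (e.g. a large quake)
# without delivering certainty. All numbers are hypothetical.

def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Posterior P(hypothesis | evidence) from Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

prior = 0.001  # hypothetical background chance of a large quake soon
posterior = bayes_update(prior, p_evidence_if_true=0.5,
                         p_evidence_if_false=0.1)
print(f"{posterior:.3%}")  # -> 0.498%: raised fivefold, far from certain
```

The estimate moves with the evidence, but it never becomes a yes-or-no prediction of exactly when and where – which is the gap between what the model can say and what the public heard.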

If scientists are held legally to account for the inexact nature of the scientific method, then there are major problems for any scientist wanting to provide information or advice to any organisation.

Communicating the inexact nature of our understanding of reality, however, is another issue. If the public and organisations want an accuracy in predictions that scientists know is impossible, then the ‘defensive science’ noted by Spiegelhalter will become the norm in any communication. Bland statements of the risk of an event will be provided and, to avoid blame, scientists and their associated civil servants will always err on the side of caution, i.e. state a risk level beyond the level they would state to colleagues. Even this type of risk communication carries its own risks – stating that it will rain in the south of England on a bank holiday could deter visitors to the seaside, and when that happens couldn’t businesses in coastal resorts sue, or provide their own information (Bournemouth launches own weather site – http://news.bbc.co.uk/1/hi/england/dorset/8695103.stm)? If the reports conflict, then who should the public believe?

Bland science implies communication that scientists perceive to be of least risk to them personally. This could vary from person to person and from institution to institution, so the level of ‘risk’ deemed acceptable to communicate as ‘real’ to the public will begin to vary.

There is no easy answer to this issue, and while there isn’t, legal protection sounds a reasonable way to go if you want to make your scientific knowledge socially relevant. It may, however, be worth thinking about the ideas scientists try to transmit as messages. Three simple questions then spring to mind: who is the transmitter, what is the message, and who is the audience? The scientist (transmitter) will have their own agenda, language and views on the nature of the message. The message itself will be communicated in a specific form along specific channels, all of which can alter its original meaning or even shape its meaning. Likewise, the audience is not a blank, passive set of receivers – they have their own views and agendas, and will interpret the message accordingly. More time spent understanding how the scientific message is communicated may help to ensure that the message is interpreted by the audience(s) in the way the scientist intended.

Friday, October 26, 2012

The ‘Michael Fish Effect’: The L’Aquila Case and Expectations of Science


On 15th October 1987 weather presenter Michael Fish, dressed in the loud-ish tie and bland suit typical of the 1980s, delivered one of the most infamous statements in British science – ‘Earlier on today, apparently, a woman rang the BBC and said she heard there was a hurricane on the way... well, if you're watching, don't worry, there isn't!’ Hours later the most severe storm since 1703 hit the south-east of England, killing 18 people. Although Michael Fish claims to have warned viewers to ‘batten down the hatches’ later in the report, the story of this error of scientific prediction has passed into meteorological folklore. Michael Fish, however, was not blamed for the storm, for its misprediction or for the deaths that followed; in fact the whole incident has taken on a mocking and good-humoured tone that has kept Michael Fish fondly in the public memory. The recent trial in L’Aquila has drawn comparisons with Michael Fish’s pronouncement, but the comparison is not that simple.

The trial and sentencing of seven Italians (six seismologists and a civil servant) by an Italian judge on 22nd October for ‘crimes’ in relation to the L’Aquila earthquake of 6th April 2009 has, rightly, sent shockwaves around the scientific world (http://www.guardian.co.uk/science/2012/oct/22/scientists-convicted-manslaughter-earthquake, http://www.bbc.co.uk/news/world-europe-20025626). The six-year jail sentences (although no sentence will be implemented until at least one appeal under Italian law) are for the crime of multiple manslaughter. The verdict was reached by trial by judge (rather than by jury): after only four hours of deliberation the judge decided that the men were guilty of providing "inexact, incomplete and contradictory information" about whether the small tremors felt in the area in the weeks and months before the larger earthquake were the basis for an earthquake warning. The implication of society persecuting science has even drawn parallels with the trial of Galileo in 1633 (http://www.guardian.co.uk/science/across-the-universe/2012/oct/24/galileo-laquila-earthquake-italian-science-trial).

Behind the initial shock, however, the verdict can also be viewed as being about a failure to communicate risk rather than a failure of science to accurately predict an event (http://www.newscientist.com/article/dn22416-italian-earthquake-case-is-no-antiscience-witchhunt.html). The prosecution in the case made it clear that the accusations were about poor communication of risk and not about the science as such. The New Scientist article makes it clear that the communication of the risk was left to a civil servant with no specialism in seismology. His statement that:
"The scientific community tells us there is no danger, because there is an ongoing discharge of energy. The situation looks favourable."
should have rung alarm bells with the seismologists on the Major Hazards Committee, but none were present at the press conference to correct this simplistic (and potentially incorrect) statement.

Nature offers an even more detailed analysis of the miscommunication of science involved in the statement issued by the civil servant (http://www.nature.com/news/2010/100622/full/465992a.html). Nature looked at the minutes of the meeting held between all seven men on 31st March 2009. In the meeting, none of the scientists stated that there was no danger of a large earthquake, nor did they state that the swarm of small quakes meant there would not be a large one. The prosecution claimed that the statement at the press conference persuaded a lot of people to remain in their homes who would otherwise have left the region, hence the charge of multiple manslaughter.

The whole case highlights, for me, two important issues. Firstly, what are the expectations of science from the public, from government and from scientists themselves? The minutes of the meeting make it clear that the scientists put forward views couched in clear scientific terms of uncertainty and unpredictability about seismic events. Uncertainty and unpredictability are commonplace in science. Recognizing the limits to what we know about the physical environment, and how this impacts on our ability to model what little we do know to produce predictions, is an important aspect of science. Is this acceptance of our ignorance and inability what decision makers or the public want to hear in a crisis? Is the image that these groups have of science rather different from the one scientists have? The expectation of certainty, of clear yes or no answers to specific questions, seems to be an expectation that science cannot fulfil. The statement of the civil servant may reflect his interpretation of the committee discussion, but the terms used are ones that reveal a desire to communicate certainty, a quality science by its very nature cannot provide. Science does not work by finding the truth but rather by eliminating the false. This is a long and painful process of rejecting errors and accepting, for the time being, whatever ideas are left, even though you know that one day these ideas themselves may be altered or rejected in the light of new evidence.

Secondly, the trial highlights that scientists should realise that they work in society and society has expectations of them. Leaving the communication of a complex and inconclusive discussion to a civil servant may have seemed appropriate to the scientists, but it also implies a view that it was not their responsibility. Sitting on a national committee such as the Major Hazards Committee not only means that you are a highly respected member of the scientific community, it also means that you believe there is a social benefit to be gained from your knowledge. That social benefit is drastically reduced if you are unable to communicate effectively to a vital audience, the public. Assuming that you do not have to work at communicating this knowledge, or that it will be communicated accurately for you, is delegating responsibility for your views to someone else. In this case delegation meant loss of control of your views. Science is difficult and science is complex, but just saying this does not mean that you should not try to communicate the issues involved in a complicated subject such as seismology. You may not be providing the answers people want to hear, but then again you are not providing them with simplistic and wrong answers. At least Michael Fish didn’t rely on someone else to communicate his mistake – it was all his own work, just like his tie.

Friday, July 27, 2012

Public Risk Communication

Communicating risk to the general public is a vital task in managing risk. The UK government has produced a leaflet outlining a ‘Practical Guide to Public Risk Communication’. The leaflet is a very short guide to practice. The three key aspects of risk communication are to reduce anxiety around risks, to manage risk awareness and to raise awareness of certain risks. There are five key elements to public risk communication: assembling the evidence, acknowledgement of public perspectives, analysis of options, authority in charge, and interacting with your audience.


Each of these elements is expanded and discussed via a set of questions that organisations should consider. The first element is concerned with establishing the nature of the risk and its magnitude, and with demonstrating a credible basis for the position taken by the organisation. Evidence is paramount in this element, but so too, implicitly, is the question of trust or belief in the evidence. Who provides the evidence, and the basis of that evidence, are as important as clearly articulating the risk. As part of this question of trust, the issue of ambiguity and uncertainty is bound to arise. Despite what politicians may wish, science is inexact and is often laden with uncertainty. How organisations deal honestly with uncertainty can have a huge bearing on the trust they earn and retain between hazardous events.

Understanding how the public understand risk is essential to getting the message across. Lumping everyone together as ‘the public’ may not be that helpful, though, as the leaflet notes. Assuming everyone perceives a hazard in the same way and in a consistent manner may be hoping for too much. The leaflet uses the term ‘risk actors’ as a coverall term for individuals or groups who engage with a risk or who influence others’ approaches to and understanding of risk. Perception, and so the message about risk, should be differentiated, but that differentiation may depend upon the exact mix of risk and risk actors. An issue I would like to raise here is whether, once you are a risk actor, you are no longer a member of the public. A home-owner who has recently experienced a flood may be much more active in their local community in taking steps to reduce the flood risk – does that make them a risk actor, a well-informed member of the public, or simply a member of the public, and does this matter for how risk is communicated to them? Risk actors could be viewed as being biased, as having their own agendas, and so not really reflective of the views of the general public.

Analysing options suggests that organisations have rationally weighed the risks and benefits of managing public risk, as well as the options for action available to them. The last sentence is interesting: ‘the technical and societal interests will need to be reconciled if the solution is to be generally accepted.’ This sentence is made in relation to technical solutions that may not have public sympathy initially. The suggestion that through debate these solutions can come to be accepted implies a very hierarchical and technocratic view of hazards and risk management – the public have to be educated and brought to accept the solution. I may be reading this aspect too cynically, but maybe not.

The authority in charge section, however, adds weight to the idea that this is a technocratic and hierarchical view of hazards (maybe not a surprise in a government publication!). The need for an organisation to determine whether it is appropriate for it to step in to manage risk, and the clear limits of responsibility for risk management, imply a very structured and ordered view of how to manage the world and its associated risks. A telling point is made that exercising authority now is as much about building and maintaining trust as it is about lines of formal authority. Trust, or rather the perception of trust, will dramatically affect the ability of an organisation to manage risk in a sceptical society. The last sentence states: ‘Organisations that are not highly trusted will increase the chances of success by enlisting the help of other organisations – such as independent scientific organisations – who have the confidence of society’. A call to collaboration or a call to objectivity?

So is it all ‘smoke and mirrors’, or does this leaflet help to further risk management? Without a doubt, communicating risk and its management effectively to different audiences is essential, and the leaflet does provide some very good guidance on this. The ideas should not, however, be used uncritically, as they are designed with a very technocratic and hierarchical view of risk management in mind. Examples of public risk communication are also provided, and for flooding the conclusions reflect this bias (starting on page 22 of the report). Risk quantification is sought for the flood hazard, as are software and tools for aiding local risk planning and for managing the possible clash between expectations of more flood defence infrastructure and the new risk-based approach (risk is the focus rather than the hazard itself). Communication about risk and its management is viewed as coming from the Environment Agency, insurance companies and DEFRA – not much about local ideas or communication from the ground up! The link with insurance companies is, however, viewed as potentially corrosive to public trust. This hits at the nub of the issue: the government wants the trust of people to be able to act in a manner it views as appropriate. Actions, of necessity in a complex society, require working with organisations that have their own agendas, and this creates suspicion. Can risk ever be managed without trust, and can you manage anything without eroding some degree of trust somewhere?