Monday, October 29, 2012

Using St Paul’s Erosion Data to Predict Future Stone Decay in Central London

In a recent blog post I mentioned a just-completed research project on a 30-year remeasurement of stone decay on St Paul's Cathedral in central London. A second paper looks at how these data might be used to model decay into the future (http://www.sciencedirect.com/science/article/pii/S1352231012007145 – you need to have an account to get access to the full paper in Atmospheric Environment). Modelling erosion rates into the future tends to use relationships derived from erosion data for small (50 x 50 x 10 mm) stone tablets exposed in different environmental conditions. Using such data and regression analysis, a statistical relationship can be derived between stone loss and changing environmental conditions. These relationships are often referred to as dose-response functions.
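To make the regression step concrete, here is a minimal sketch of how such a dose-response function might be fitted. The tablet data and the simple linear form are invented for illustration; the published Lipfert and Tidblad et al. functions have their own specific forms and coefficients.

```python
import numpy as np

# Hypothetical annual observations for exposed stone tablets:
# mean SO2 concentration (ppb), annual rainfall (m) and measured
# surface recession (microns/year). All values are invented.
so2  = np.array([80, 60, 45, 30, 20, 10, 5, 3], dtype=float)
rain = np.array([0.60, 0.55, 0.62, 0.58, 0.61, 0.57, 0.59, 0.60])
loss = np.array([50, 42, 37, 30, 26, 22, 20, 19], dtype=float)

# Least-squares fit of a simple linear dose-response function:
#   loss = a + b*SO2 + c*rainfall
X = np.column_stack([np.ones_like(so2), so2, rain])
(a, b, c), *_ = np.linalg.lstsq(X, loss, rcond=None)
print(f"loss ~ {a:.1f} + {b:.2f}*SO2 + {c:.1f}*rain (microns/year)")

# The fitted function can then be evaluated under projected
# future pollution and rainfall scenarios.
print("predicted loss at 3 ppb SO2:", round(a + b*3 + c*0.60, 1))
```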


Two equations stand out – the Lipfert and the Tidblad et al. dose-response functions. Using these two equations for the decades 1980-2010, they predict erosion rates of 15 and 12 microns per year respectively, as opposed to the measured losses on St Paul's of 49 and 35 microns per year. The ratio between the measured and the dose-response erosion rates varies from 3.33 in the decade 1980-1990 to 2.75 in the decade 2000-2010 – fairly consistent. The difference between the two measures of decay may result from differences in what they actually measure. The dose-response functions are based on small stone tablets exposed vertically in polluted environments; the weight loss of these tablets is measured and then converted to a loss across the whole surface of the tablet. The micro-erosion meter sites measure the loss of height at a number of points across the same surface on a decadal timescale. Both measures are changes in height, but derived in different ways. What is important is that both methods indicate the same patterns of change in relation to declining sulphur dioxide levels. Both measures of erosion show a decline, in the same direction and, by and large, in proportion to each other. Interestingly, when the dose-response functions are used to work out erosion on the cathedral since it was built, the long-term erosion rate (as measured by lead fin heights relative to the stone surface) is only 2.5 times greater than that predicted by the dose-response functions – more or less the same ratio as those indicated over the last three decades.

The St Paul's data does not imply that dose-response functions do not work – if anything it confirms the patterns in decay they indicate – but it does suggest that using these functions to model decay into the future may require a correction factor, equivalent to the ratio of about 2.5-2.75, to convert their predicted losses to those that will actually be found on St Paul's Cathedral.
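As a back-of-the-envelope check, the sketch below recomputes the measured-to-predicted ratios from the rounded decadal figures quoted above (the paper's own decade-by-decade predictions will differ slightly, hence ratios near, but not exactly, 3.33 and 2.75) and shows how a correction factor of about 2.75 might be applied to a purely hypothetical future dose-response prediction:

```python
# Rounded decadal figures quoted above (microns/year).
measured  = {"1980-1990": 49, "2000-2010": 35}  # St Paul's MEM sites
predicted = {"1980-1990": 15, "2000-2010": 12}  # dose-response functions

for decade in measured:
    print(decade, "measured/predicted =",
          round(measured[decade] / predicted[decade], 2))

# A site-specific correction factor in the 2.5-2.75 range could then
# scale a dose-response prediction for St Paul's. The future value
# below is invented for illustration.
correction = 2.75
future_prediction = 10.0  # microns/year from a dose-response function
print("corrected:", correction * future_prediction, "microns/year")
```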

Atmospheric Pollution and Stone Decay: St Paul’s Cathedral

I have recently published a paper with colleagues from Oxford, Cambridge, Sussex and York on a 30-year measurement of erosion rates on St Paul's Cathedral, London (http://www.sciencedirect.com/science/article/pii/S1352231012008400 – you need to have an account with the journal to access the paper). Pleasingly, the academic work did get some press (http://www.independent.co.uk/news/uk/home-news/pollution-erosion-at-st-pauls-cathedral-in-record-300year-low-8205562.html, http://www.stpauls.co.uk/News-Press/Latest-News/St-Pauls-safer-from-pollution-than-at-any-time-in-its-history) and I even did a very short radio interview on local radio (I was distracted during it because they had just started their afternoon quiz, to which I knew the answer!).


The paper outlines how the rates of erosion (and the rates of surface change) of five micro-erosion meter sites around the cathedral have changed over the decades since 1980 and how this has mirrored a dramatic fall in pollution levels in central London.



Figure 1 Micro-erosion meter site with protective caps being removed for remeasurement

Figure 2 Erosion rates, rates of surface change and environmental variables for central London 1980-2010

The erosion rates have dropped since the closure of Bankside power station in the early 1980s, as atmospheric pollution, indicated by sulphur dioxide levels, fell from 80 ppb in 1980 to about 3 ppb in 2010. Erosion rates fell from 49 microns per year in the decade 1980-1990 to 35 microns per year in the decade 2000-2010. Erosion rates in the decade 1980-1990 were statistically significantly higher than erosion rates in both the decades 1990-2000 and 2000-2010; erosion rates in the decades 1990-2000 and 2000-2010 were statistically similar. Although the decline in erosion rates was not as steep as the fall in pollution levels, erosion rates are now at a level that could be explained by the acidity of 'normal' or 'natural' rainfall alone. 'Normal' rainfall is weakly acidic because of carbonic acid, produced by the reaction of carbon dioxide and water in the atmosphere, giving it an acidity of about pH 5.6.
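As a rough check on that pH figure (using textbook equilibrium constants rather than anything from the paper: a Henry's law constant for CO2 of about 3.4 x 10^-2 mol per litre per atmosphere, a first dissociation constant for carbonic acid of about 4.5 x 10^-7, and a CO2 partial pressure of about 3.9 x 10^-4 atm):

```latex
% CO2(g) + H2O <=> H2CO3 <=> H+ + HCO3-
[\mathrm{H^+}] \approx \sqrt{K_{a1}\,K_H\,p_{\mathrm{CO_2}}}
  = \sqrt{(4.5\times10^{-7})(3.4\times10^{-2})(3.9\times10^{-4})}
  \approx 2.4\times10^{-6}\ \mathrm{mol\,L^{-1}}
\qquad\Rightarrow\qquad
\mathrm{pH} = -\log_{10}[\mathrm{H^+}] \approx 5.6
```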

Erosion rates represent the loss of material from a surface, but not all measured points lost material; some gained height over the measurement periods – this is surface change. Points can gain height for a number of reasons: salts in the stone could expand and push the surface up, lichens and bacteria could form crusts that raise the surface, and eroded material might be deposited in depressions, causing an apparent raising of the surface. The rates of surface change were 44 microns per year in the decade 1980-1990 but fell to 26 microns per year in the decade 1990-2000 (and were only 25 microns per year in the decade 2000-2010). This suggests that rates of surface change fell in a similar manner to rates of erosion and match the drop in sulphur dioxide levels as well.
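To illustrate the difference between the two rates, here is a minimal sketch using invented micro-erosion meter point readings; the paper's precise definitions of the two rates may differ in detail:

```python
import numpy as np

# Hypothetical height changes (microns over one decade) for the points
# of one micro-erosion meter site; negative = lowering (erosion),
# positive = gain (salts, biological crusts, redeposited material).
dh = np.array([-520, -480, -610, -350, 120, -440, 60, -500], dtype=float)
years = 10

# Erosion rate: mean lowering of only those points that lost height.
erosion_rate = -dh[dh < 0].mean() / years

# Rate of surface change: mean change across all points, gains included.
surface_change_rate = -dh.mean() / years

print(f"erosion rate:           {erosion_rate:.1f} microns/year")
print(f"rate of surface change: {surface_change_rate:.1f} microns/year")
```

With these invented numbers the erosion rate (about 48 microns per year) exceeds the rate of surface change (about 34 microns per year), because the points that gained height pull the overall mean towards zero.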

Back in the 1980s, the long-term rates of erosion since the early 1700s were also determined using lead fins. These lead fins were produced when the holes used to raise the stone blocks into the balustrade were filled with lead. Over time the fins became proud of the surface as the stone around them eroded. By measuring the height difference between the fins and the stone (and then dividing by the time of exposure) the long-term rates of erosion can be calculated. The long-term rates from 1690/1700 to 1980 were about 78 microns per year. This suggests that the cathedral experienced much higher erosion in the decades before 1980, and that the erosion rates we have measured from 1980 onwards, and the associated pollution levels, were not as damaging to the cathedral as those experienced in the years up to 1980.
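As a quick sanity check on the arithmetic, roughly 285 years of exposure (taking c. 1695, the midpoint of the 1690/1700 range, to 1980) at 78 microns per year implies fins standing roughly 22 mm proud of the stone:

```latex
\text{rate} = \frac{\Delta h}{\Delta t}
\quad\Rightarrow\quad
\Delta h \approx 78\ \mu\mathrm{m\,yr^{-1}} \times 285\ \mathrm{yr}
\approx 22\,000\ \mu\mathrm{m} \approx 22\ \mathrm{mm}
```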

The L'Aquila Case and Legal Protection for Scientists

Charlotte Pritchard's recent BBC article (http://www.bbc.co.uk/news/magazine-20097554) raises an interesting question – should scientists stop giving advice and, if not, should they have professional indemnity insurance? This cover, as a lot of eager insurance websites will tell you, is designed to protect professionals and their businesses if clients (or a third party) make a claim against them because they believe they have suffered loss due to non-performance, breach of contract, professional negligence or all of the above. Insurance cover is up to a maximum limit outlined in the policy and is, presumably, based on the losses for similar types of professional activity in the past and the likelihood or probability of a claim being made against a particular type of professional. Such policies are standard tools of legal protection for architects, engineers, business consultants, insurance brokers (ironic?), solicitors, accountants and independent financial advisers. Pritchard points out that even the Met Office has a professional indemnity self-insurance fund in case a forecaster fails to predict a flood that results in the loss of life (how things have moved on since Michael Fish!).
Transferring this type of policy into the academic realm is not unusual; several of my colleagues have such policies when they undertake consultancy work, and a lot of universities' commercial branches offer such cover. A key question is whether the nature of the information being provided is the same for all these professions, or whether scientific information is of a different type. Is it the information that is the same, or is it the intended use of that information that requires the provider to have legal protection? If I were an engineer advising on a building project, the audience – the investors, the builders, etc. – employ me to ensure that the result is satisfactory: the building stays up (simplistic, but you get the idea). There is a definite, time-limited and clearly defined outcome that my advice is meant to help achieve. Is this the case for scientific advice about the possibility of a hazardous event? Is the outcome clearly defined, or is there variability between the expectations of the audience and those of the information provider? Aren't experts in both cases offering their best 'guesses' given their expertise and standards in specific areas?
The development and (by implication from Pritchard) the almost essential nature of legal protection for people giving advice tells us a lot about current attitudes and beliefs about science and prediction. Pritchard quotes David Spiegelhalter, Professor of Public Understanding of Risk at the University of Cambridge, as stating:

“At that point you start feeling exposed given the increasingly litigious society, and that's an awful shame…. It would be terrible if we started practising defensive science and the only statements we made were bland things that never actually drew one conclusion or another. But of course if scientists are worried, that's what will happen."

The belief that science can offer absolute statements concerning prediction underlies the issue at L'Aquila. Despite the careful nature of the scientific deliberations, the press conference communicated a level of certainty at odds with the understanding of seismic activity and with the understanding of the nature of risk in seismology. The belief that seismic events are predictable in terms of absolute time and location is at odds with what is achievable in seismology. By extension, this view assumes that scientists understand the event and how it is caused, and that understanding causation leads to accurate prediction. This ignores the level and nature of understanding in science. Scientists build up a model, a simplification of reality, in order to understand events and the data that they collect. This model is modified as more information is produced, but it is never perfect. The parts of the model are linked to one another by the causes and processes the scientists believe are important, but these can be modified or even totally discarded as more events and more information are added. So it is feasible to understand, broadly, how seismic events occur without being able to translate this understanding into a fully functional and precise model of reality that can predict exactly when and where an earthquake will occur.

If scientists are held legally to account for the inexact nature of the scientific method, then there are major problems for any scientist wanting to provide information or advice to any organisation.

Communicating the inexact nature of our understanding of reality, however, is another issue. If the public and organisations want an accuracy in predictions that scientists know is impossible, then the 'defensive science' noted by Spiegelhalter will become the norm in any communication. Bland statements of the risk of an event will be provided and, to avoid blame, scientists and their associated civil servants will always err on the side of caution, i.e. state a risk level beyond the level they would state to colleagues. Even this type of risk communication carries its own risks – stating that it will rain in the south of England on a bank holiday could deter visitors to the seaside, and when that happens couldn't businesses in coastal resorts sue, or provide their own information (Bournemouth launches own weather site – http://news.bbc.co.uk/1/hi/england/dorset/8695103.stm)? If the reports conflict, then who should the public believe?

Bland science implies communication that scientists perceive to be of least risk to them personally. This could vary from person to person and from institution to institution so the level of ‘risk’ deemed acceptable to communicate as ‘real’ to the public will begin to vary.

There is no easy answer to this issue, and whilst there isn't one, legal protection sounds a reasonable way to go if you want to make your scientific knowledge socially relevant. It may, however, be worth thinking about the ideas scientists try to transmit as messages. Three simple questions then spring to mind: who is the transmitter, what is the message and who is the audience? The scientist (transmitter) will have their own agenda, language and views on the nature of the message. The message itself will be communicated in a specific form along specific channels, all of which can alter its original meaning or even shape its meaning. Likewise, the audience is not a blank, passive set of receivers – they have their own views and agendas and will interpret the message accordingly. More time spent understanding how the scientific message is communicated may help to ensure that the message is interpreted by the audience(s) in the way the scientist intended.

Friday, October 26, 2012

The ‘Michael Fish Effect’: The L’Aquila Case and Expectations of Science


On 15th October 1987 weather presenter Michael Fish, dressed in the loud-ish tie and bland suit typical of the 1980s, delivered one of the most infamous statements in British science – 'Earlier on today, apparently, a woman rang the BBC and said she heard there was a hurricane on the way... well, if you're watching, don't worry, there isn't!' Hours later the most severe storm since 1703 hit the south-east of England, killing 18 people. Although Michael Fish claims to have warned viewers to 'batten down the hatches' later in the report, the story of this error of scientific prediction has passed into meteorological folklore. Michael Fish, however, was not blamed for the storm, for its misprediction or for the deaths that followed; in fact the whole incident has taken on a mocking and good-humoured tone that has kept Michael Fish fondly in the public memory. The recent trial in L'Aquila has drawn comparisons with Michael Fish's pronouncement, but the comparison is not that simple.

The trial and sentencing of seven Italians (six seismologists and a civil servant) by an Italian judge on 22nd October for 'crimes' in relation to the L'Aquila earthquake of 6th April 2009 has, rightly, sent shockwaves around the scientific world (http://www.guardian.co.uk/science/2012/oct/22/scientists-convicted-manslaughter-earthquake, http://www.bbc.co.uk/news/world-europe-20025626). The six-year jail sentences (although no sentence will be implemented until at least one appeal under Italian law) are for the crime of multiple manslaughter. The verdict was reached in a trial by judge (rather than a jury trial): after only four hours of deliberation the judge decided that the men were guilty of providing "inexact, incomplete and contradictory information" about whether the small tremors felt in the area in the weeks and months before the larger earthquake were the basis for an earthquake warning. The implication of society persecuting science has even drawn parallels with the trial of Galileo in 1633 (http://www.guardian.co.uk/science/across-the-universe/2012/oct/24/galileo-laquila-earthquake-italian-science-trial).

Behind the initial shock, however, the verdict can also be viewed as being about a failure to communicate risk rather than a failure of science to accurately predict an event (http://www.newscientist.com/article/dn22416-italian-earthquake-case-is-no-antiscience-witchhunt.html). The prosecution in the case made it clear that the accusations were about poor communication of risk and not about the science as such. The New Scientist article makes it clear that the communication of the risk was left to a civil servant with no specialism in seismology. His statement that:
 "The scientific community tells us there is no danger, because there is an ongoing discharge of energy. The situation looks favourable." 
should have rung alarm bells with the seismologists on the Major Hazards Committee, but none were present at the press conference to correct this simplistic (and potentially incorrect) statement.

Nature offers an even more detailed analysis of the miscommunication of science involved in the statement issued by the civil servant (http://www.nature.com/news/2010/100622/full/465992a.html). Nature looked at the minutes of the meeting held between all seven men on 31st March 2009. In the meeting, none of the scientists stated that there was no danger of a large earthquake, nor did they state that a swarm of small quakes meant a large one would not follow. The prosecution claimed that the statement at the press conference persuaded a lot of people to remain in their homes who would otherwise have left the region, hence the charge of multiple manslaughter.

The whole case highlights, for me, two important issues. Firstly, what are the expectations of science from the public, from government and from scientists themselves? The minutes of the meeting make it clear that the scientists put over views couched in careful scientific terms of uncertainty and unpredictability about seismic events. Uncertainty and unpredictability are commonplace in science. Recognizing the limits to what we know about the physical environment, and how this impacts our ability to model what little we do know to produce predictions, is an important aspect of science. Is this acceptance of our ignorance and inability what decision-makers or the public want to hear in a crisis situation? Is the image that these groups have of science a bit different from that scientists have? The expectation of certainty, of clear yes or no answers to specific questions, seems to be an expectation that science cannot fulfil. The statement of the civil servant may reflect his interpretation of the committee discussion, but the terms used are ones that reveal a desire to communicate certainty, a quality science by its very nature cannot provide. Science does not work by finding the truth but rather by eliminating the false. This is a long and painful process of rejecting errors and accepting, for the time being, whatever ideas are left, even though you know that one day these ideas themselves may be altered or rejected in the light of new evidence.

Secondly, the trial highlights that scientists should realise that they work in society and society has expectations of them. Leaving the communication of a complex and inconclusive discussion to a civil servant may have seemed appropriate to the scientists, but it also implies a view that it was not their responsibility. Sitting on a national committee such as the Major Hazards Committee not only means that you are a highly respected member of the scientific community; it also means that you believe there is a social benefit to be gained from your knowledge. That social benefit is drastically reduced if you are unable to communicate effectively to a vital audience, the public. Assuming that you do not have to work at communicating this knowledge, or that it will be communicated accurately for you, is delegating responsibility for your views to someone else. In this case delegation means loss of control of your views. Science is difficult and science is complex, but just saying this does not mean that you should not try to communicate the issues involved in a complicated subject such as seismology. You may not be providing the answers people want to hear, but then again you are not providing them with simplistic and wrong answers. At least Michael Fish didn't rely on someone else to communicate his mistake – it was all his own work – just like his tie.