Tuesday, 31 January 2012

Making Accidents Happen

Perhaps my favourite non-fiction book (classing poetry as fiction) is "Normal Accidents" by the American sociologist Charles Perrow. Most sociologists seem to write a kind of priestly jargon ideal for excluding the 99.9% of the world who haven't got the faintest idea what they're on about, but some American academics (I well remember J.K. Galbraith) write English that is both cultured and accessible. Charles Perrow is one of them.

He analyses the causes of major high-tech accidents and finds certain recurring themes. The more complex the technology, and the faster it operates, the more likely are "unexpected interactions": things coming together that aren't supposed to. This could be as simple as two wires rubbing together when the wiring diagram would give you no warning they might meet. It could be a cleaner's shirt catching on a lever and pulling it (which caused a major incident in a nuclear power station). People tend to react to perceived risk in such systems by adding safety devices, but these add to the complexity of the system and can themselves cause accidents. He quotes a classic example of a state-of-the-art American cargo ship on to which automatic sprinklers were introduced, set to turn on if there was a fire. The ring round the nozzle of one sprinkler was made of the wrong metal. Over time it corroded and set the sprinkler off, which caused a flood, which caused an electrical fault, which started a fire that eventually led to the ship being abandoned.

I found this fascinating, but even more fascinating to me is the human psychology that causes or worsens an accident through misunderstanding of the situation. Of course, people are sometimes faced with dangerous situations where the information they receive is not enough and they have to guess. I don't mean that: I mean situations where the information is quite good enough for a correct interpretation, but the correct interpretation isn't made. That seems to happen for a number of reasons deep-seated in human nature.

We are very good at interpreting situations - patterning, if you like - but once we've arrived at an interpretation, we're very reluctant to abandon it. Thus an experienced (American coastguard) captain interprets the lights of a ship coming the other way as the lights of a ship going the same way but very slowly. The initial mistake is easily made - but as the evidence piles up that he's got it wrong, he fails to reconsider. In this case other crew members did interpret the lights rightly, but feared to question the dictatorial captain's judgement.

That could be compared to the Piper Alpha disaster, where the rig was not closed down because the manager on the site was supposed to get authorisation from Aberdeen and couldn't raise them. Structures of authority, rules and systems meant to deal with relatively ordinary matters are clearly inadequate for major emergencies, yet people stick to them.

We can also trust machines too much: the machine is telling me X, so it must be so, even if all sorts of evidence points the other way. That was at the heart of why the nature of the problem at Three Mile Island was not understood sooner.

Some people, of course, panic while others remain rational even when very scared, and this seems to be mostly a matter of body chemistry (though training can help if it's just a matter of applying a routine emergency response, not of making desperately hard decisions).

Finally, we struggle to believe that something is happening if it's beyond our experience: "It's never happened, so it won't happen." Since, outside the military and emergency services, it's extremely unlikely that a person in charge has experienced, say, a major fire at work, this makes a nonsense of preparation.

So much more I could write on this. Any comments?
