Is the Twitter Joke Trial an example of a society being ‘debugged’?

I just spent an interesting evening with various lawyers, bloggers and activists at the Westminster Skeptics association (@WestSkep on Twitter), discussing the Twitter Joke Trial. I think this is an example of debugging a society: the skeptics are performing a valuable exception-reporting service. We have to be careful, however, not to put in a safety system that accidentally causes more problems than it solves.

This discussion is about the trial of a young man – Paul Chambers – who tweeted a ‘joke’ to his followers in which he expressed his frustration at the snow closure of Robin Hood airport by referring – in jest – to ‘blowing it up’. Although at the time it was not explicitly described as a joke, it was intended as such for Paul’s small group of online followers.

Circumstances cascaded as well-meaning people first noted the comment (during a random internet search) and then escalated it through various chains of command, each of which tended to treat it as ‘not malicious’ or ‘not credible’. However, because of the potential seriousness of the imagined act (a terror attack is one of the most serious consequences that must be considered and investigated), an unstoppable momentum overtook this otherwise forgettable comment.

It is my understanding that the local police force decided to arrest Paul under anti-terrorist legislation, and then passed the case on to the Crown Prosecution Service. The Crown Prosecution Service decided that an offence had been committed – albeit unintentionally – and took the case to court.

Paul was found guilty, and at his subsequent appeal the conviction was upheld.

I am not able to assess the case itself – instead, I am interested in the parallels between a functioning legal system (a social construct, made up of laws and systems that interact in complex and sometimes inexplicable ways) and a large, complex engineered system like a nuclear power station or an oil platform.

In his book Adapt, about the complexity of the modern world and how experimentation and adaptation can allow evolutionary forces to shape better solutions, Tim Harford discusses cases where rigid safety rules, put in place to prevent identified accidents, have the unintended effect of increasing complexity and introducing larger, unidentified safety problems.

Tim Harford provides three examples where safety rules amplified accidents into disasters: the Three Mile Island nuclear accident, the Piper Alpha oil disaster, and the sub-prime financial crisis. In each case, the safety mechanisms failed, causing the problem to grow and become much harder to bring under control.

System design looks to build in safety, using ‘what-if’ scenarios to examine failure cases and adding safety features to mitigate or remove the effect of these failures – thereby (apparently) increasing safety. Where these safety features fail is when they have to deal with unanticipated consequences.

For example, during the Piper Alpha accident, a safety mechanism that should have switched on water pumps to douse the fires was not enabled, because it had its own safety mechanism to protect divers in the water – the pumps would not come on in case they sucked in divers. A manual over-ride for this safety system was built in, but the over-ride was placed in the control module, which had been disabled by the blast and could not be reached. Hence, a mechanism designed to protect individual divers from potential harm prevented another system from kicking in that might have saved over a hundred lives.
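To make the shape of that failure concrete, here is a minimal sketch – purely illustrative, not a model of the actual Piper Alpha control systems, with every class and parameter name invented for the example – of how a diver-protection interlock combined with an unreachable manual override can leave the fire pumps off:

```python
# Illustrative only: a hypothetical fire-water system with a diver-protection
# interlock and a manual override located in a (possibly destroyed) control module.

class FireWaterSystem:
    def __init__(self, divers_may_be_in_water: bool, control_module_reachable: bool):
        self.divers_may_be_in_water = divers_may_be_in_water
        self.control_module_reachable = control_module_reachable
        self.pumps_running = False

    def start_pumps_automatically(self) -> None:
        # Safety interlock: never start the pumps automatically while divers
        # might be near the intakes.
        if self.divers_may_be_in_water:
            print("Interlock active: automatic pump start blocked to protect divers.")
            return
        self.pumps_running = True

    def manual_override(self) -> None:
        # The override exists, but it lives in the control module - if that
        # module is unreachable, the override cannot be used.
        if not self.control_module_reachable:
            print("Override unavailable: control module cannot be reached.")
            return
        self.pumps_running = True


# The disaster scenario: divers assumed to be in the water, control module
# knocked out by the initial blast.
system = FireWaterSystem(divers_may_be_in_water=True, control_module_reachable=False)
system.start_pumps_automatically()   # blocked by the diver-protection interlock
system.manual_override()             # blocked because the override is unreachable
print("Pumps running:", system.pumps_running)  # False - the fires go unfought
```

Each rule is sensible in isolation; it is their interaction under an unanticipated condition that leaves the system worse off.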

The point here is that no matter how carefully we design our legal systems to increase safety and prevent harm, we will also be increasing their overall complexity and introducing unintended side effects. A written constitution, for example, that sets inviolable rules will find itself unable to adapt to the inevitable changes in our society. We need flexibility to be able to adapt.

So how do we know when we need to adapt? We find examples where things are not working as expected – what, in software design, we call bugs: errors in the system that throw exceptions.

The Westminster Skeptics meeting tonight acted as an exception handler for our society; the fact that it was needed tells us the legal system has a bug. The group – the activists and bloggers – are our society’s debuggers, and we should be grateful to them for the service they provide: they allow us to identify the bugs and improve the system.
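In software terms, the role the skeptics are playing looks something like the following – a deliberately simplified sketch of the metaphor, with every rule, name and case invented for illustration – of an exception-reporting layer that surfaces unintended outcomes so the underlying system can be fixed rather than failing silently:

```python
# A sketch of the metaphor only: a rigid rule produces an outcome nobody
# intended, and an exception-reporting layer makes that failure visible.

class DisproportionateOutcome(Exception):
    """Raised when the rules produce a result wildly out of step with intent."""


def apply_rules(case: dict) -> str:
    # Hypothetical rigid rule: any mention of a threat is prosecuted,
    # regardless of context - it cannot tell a joke from a menace.
    if case["mentions_threat"]:
        if not case["intended_as_threat"]:
            raise DisproportionateOutcome(f"conviction for a joke: {case['summary']}")
        return "prosecute"
    return "no action"


def exception_reporters(case: dict) -> None:
    # The bloggers, lawyers and activists: they do not rewrite the rules
    # themselves, they surface the bug so the rule-makers can.
    try:
        apply_rules(case)
    except DisproportionateOutcome as bug:
        print(f"BUG REPORT filed: {bug}")


exception_reporters({
    "summary": "frustrated traveller tweets a joke about an airport closure",
    "mentions_threat": True,
    "intended_as_threat": False,
})
```

The handler does not change the rule; it reports the failure loudly enough that someone with the authority to change the rule can do so.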

We just have to be careful not to introduce a safety mechanism that leads, unexpectedly, to a cure that is worse than the disease.

How do we
