By Andreas Haggman:
Edward Snowden’s revelations have prompted fierce debates in both the intelligence world and the cyber domain more generally. Opinions and analyses on the impact of the revelations can be found at every level of publication, from academic journals to online discussion forums. The outcome of these debates with regard to the long-term operations of intelligence agencies is still unclear. What has already manifested itself, however, is the public relations nightmare resulting from the much-maligned electronic snooping conducted in particular by the United States’ NSA and the UK’s GCHQ signals intelligence agencies. Chiefly thanks to The Guardian’s publications, there have been public outcries of foul play and invasion of privacy on the part of intelligence agencies.
In defence, the UK government’s stance was to assert that those who have nothing to hide have nothing to fear. GCHQ’s digital hoover may sweep up an unprecedented amount of internet traffic, but if you simply form part of the proverbial dust you will be ejected unmolested at the other end. If, on the other hand, you have more sinister objects strewn all over your digital floor, these will be caught in the filter and you will be dealt with accordingly.
Many commentators have noted that this explanation is not sufficient to justify large-scale privacy invasions. In a survey, Daniel Solove collated a number of responses to this issue, with one objection being particularly resonant. The complaint was the blunt question ‘So do you have curtains?’ The reasoning behind this is that if we follow the UK government’s logic, law-abiding citizens have no need to obscure from view what they do in their own homes. Because they don’t fear any reprisals for wrongdoing (since they do no wrong), they have no need to hide their actions.
At first glance the argument is compelling, but the analogy fails because of the inherent problem that it does not distinguish between privacy and secrecy. ‘Privacy,’ Eric Hughes stated in A Cypherpunk’s Manifesto, ‘is the power to selectively reveal oneself to the world.’ The key word here is ‘selectively’. Something that is hidden from everyone is a secret; something that is hidden from people you choose is private. Curtains, by their nature, entail privacy – you can choose when they are open and closed – and no one thinks anything of it when they are closed. However, if they were permanently drawn closed they would be tools of secrecy and, indeed, arouse some suspicion in the neighbourhood. For our purposes, it is this last point which is problematic.
The electronic equivalent of curtains is encryption. With so-called public-key encryption protocols, each person publishes a key that anyone can use to encrypt messages to them, while only their own private key can decrypt those messages – so two people can communicate without any outsider being able to read the content of what is passed between them. This can be used with discretion: sending an unencrypted email equates to having curtains open, and sending an encrypted email entails having them closed. The problem here is that whereas curtains are a societally accepted privacy tool, encryption maintains a dubious status outside cyber security-aware circles. Because the default approach for the vast majority of people is to not actively encrypt their data and communications, those who do encrypt can be viewed with suspicion; especially those who encrypt consistently – that is, have their curtains closed all the time.
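The mechanics behind this are worth seeing concretely. The toy sketch below illustrates the RSA idea underlying many public-key protocols – a published key encrypts, a private key decrypts. It is a deliberately miniature illustration only: the primes, exponents, and message are arbitrary teaching values, and real systems use vetted libraries and keys hundreds of digits long.

```python
# Toy RSA-style public-key encryption (illustrative only -- NOT secure;
# real deployments use vetted cryptographic libraries and large keys).

def make_keys():
    # Two small primes (real keys use primes hundreds of digits long).
    p, q = 61, 53
    n = p * q                   # public modulus
    phi = (p - 1) * (q - 1)     # Euler's totient of n
    e = 17                      # public exponent, coprime with phi
    d = pow(e, -1, phi)         # private exponent: modular inverse of e
    return (e, n), (d, n)       # (public key, private key)

def encrypt(message: int, public_key) -> int:
    # Anyone holding the public key can do this step.
    e, n = public_key
    return pow(message, e, n)

def decrypt(ciphertext: int, private_key) -> int:
    # Only the holder of the private key can reverse it.
    d, n = private_key
    return pow(ciphertext, d, n)

public, private = make_keys()
ciphertext = encrypt(42, public)
plaintext = decrypt(ciphertext, private)
print(plaintext)  # 42
```

The point of the asymmetry is exactly the "curtains" property: publishing the encryption key costs you nothing, because it does not help anyone open what was sealed with it.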
The issue, then, is that in encryption we have tools endowing us with the ability to create privacy in the digital domain, yet our attitude towards these tools means they are thought of as tools for secrecy. This is in contrast to the analogous curtains, which are accepted as tools of privacy.
All of this seems to be a great contradiction. Government organisations and corporations use encryption to protect the data they hold about us, and for this we are thankful; indeed, it is something we have come to demand. Similarly, many email providers encrypt the messages we send without us actively choosing it, which we nevertheless welcome. So we simultaneously demand ever more transparency from these institutions and embrace their use of (perceived) secrecy tools.
This contradictory stance has significant implications for security. If encryption is embraced at an organisational level, governments and corporations are able to maintain the integrity of data, thereby keeping it secure. When this data concerns, for example, national infrastructure or defence details, the security of the data is directly connected to the security of the state. On a personal level, however, if encrypting one’s own data is seen as illegitimate and is not widely practised, the same logic has negative consequences for personal security.
In liberal democratic states this presents a problem. Such states espouse individual values and hold the safety of their people in high regard. If personal security is compromised, upholding those values and that safety falls to the entities whose security remains intact – governments and corporations. However, it is in the public’s interest to maintain some measure of control over its own security, for relying completely on others could be dangerous should the interests of the public and the interests of those other entities unexpectedly diverge.
This line of reasoning is suspiciously Palmerstonian and, I suspect, would sit well with anti-gun control activists (particularly in the US). It could also be argued that at this point the analogy is overstepping its limits: encryption concerns only data on computers, and extrapolating effects in the digital world to the physical world is stretching it too far. But this argument looks at the problem too abstractly. The data concerned often has a direct effect on the physical world, so encrypting it is necessary to maintain personal security.
The point here is that we need curtains. We need them not for any sinister purpose, but to maintain control over our privacy and personal security. In the digital world, encryption offers these curtains. Unfortunately, until the taboo around encryption is overcome, personal security will remain in the hands of other entities. If we want to seize control, it is up to us, collectively, to embrace the protection offered by encryption.
Andreas Haggman is an MA student in Intelligence and International Security at King’s College London. His academic focus is on cyber security, particularly the development of weaponised code and organisational responses to cyber security issues.