OPINION. The British government today published proposed revisions to the Investigatory Powers Bill, which retains elements of the so-called ‘Snooper’s Charter’.
If it secures Parliamentary approval – which is not certain – the Bill will be significant because it requires telecommunications providers and ISPs to retain all customers’ phone and internet data for up to a year, and gives ministers, not judges, the final say over whether to seize records.
So, beyond the vital questions of whether the scheme is acceptable or workable – which are explored in detail below – a key question must be: are ISPs and telecoms providers ready for it, given that they’re being placed in the front line of the government’s plans?
In the wake of the TalkTalk hack just over a week ago, in which unencrypted customer data was stolen, the answer should be clear: no. And if it proves to be the case that a group of children were behind that attack (police have reportedly questioned a 15-year-old boy and other teenagers), then that answer should be shouted from the rooftops of Westminster.
The hack of the provider’s website demonstrated two things. First, that customer records and account information are irresistible to criminals in a world in which private data is increasingly a de facto currency. And second, that there is no legal obligation on internet companies, telcos, and cloud platforms to encrypt such data.
However, businesses can breathe a sigh of relief that the amended Bill will not (at this stage) seek to ban the use of encrypted telecommunications – a proposal that would have driven an ideological tank through the UK’s digital economy. But if such a proposal were to be pushed through Parliament, it would bring Whitehall into head-on conflict with technology companies, such as Apple, BlackBerry, Microsoft, Google, and many apps providers, which are increasingly building encryption into the heart of their offerings.
But back to TalkTalk. The company’s CEO Dido Harding went so far as to claim the Data Protection Act itself as her defence against widespread criticism of the fact that customer data had been stored ‘unsalted’. The Act fails to make data encryption a legal requirement, allowing providers to hide behind vague and poorly worded terms.
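To illustrate what ‘salting’ means in practice – a minimal sketch using Python’s standard library, not a description of TalkTalk’s actual systems – each stored credential gets a unique random salt before hashing, so that identical passwords produce different stored values and precomputed ‘rainbow table’ attacks fail:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Hash a password with a random per-user salt using PBKDF2."""
    salt = os.urandom(16)  # unique salt per record defeats precomputed lookups
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

Storing data ‘unsalted’ – or, worse, entirely unhashed and unencrypted, as appears to have been the case here – means that a single breach hands attackers everything in usable form.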
So this all-too-public failure of a major communications provider to secure customer data means one thing: it will be open season on the internet records of UK businesses and private citizens should the amended Bill pass through Parliament, because their allure will be irresistible to criminals.
Meanwhile, ISPs, telcos, mobile providers, and others, will be expected to secure all that data from day one, even though there is no legal obligation on them to encrypt it. And as dozens of separate private sector entities, their response will inevitably be piecemeal, rather than a co-ordinated national strategy.
By – in effect – leaving it up to the market to decide how best to react to a centralised programme of national intelligence gathering, the government’s plan has the makings of being a dangerous ideological gamble with citizens’ personal security. Common sense suggests that these facts alone risk increasing cybercrime and decreasing national security, rather than bolstering the government’s fight against terrorism, organised crime, and abusers.
Lording it over the Bill?
Away from the TalkTalk boardroom, Dido Harding is Baroness Harding of Winscombe, a member of the House of Lords who has – ironically – recently stated her personal determination to make the Web safer for all, especially for women and children.
In February this year she said in an interview with ThisIsMoney.com: “We’re a democracy and so you want Parliament to make these decisions. In the course of the next couple of years we’re going to have a national debate about it. I want to be part of that [debate], but, my goodness, as an internet service provider chief executive I shouldn’t be the one making the decision, society should.”
Society’s opinion, however, hasn’t been sought. Nevertheless, Harding’s words also ring hollow because her company has failed to protect customer data from malicious intrusion and theft. So it will be interesting to see whether she speaks out in favour of the Bill or opposes it – a classic case of a company being forced to decide which is more important: shareholder value, or taxpayer value?
But will the amended Bill itself protect vulnerable children – as the Home Secretary claims, and as Harding wants? Again, no. Quite the reverse, in fact.
Increased risk to children
First, children’s right to privacy is protected under the UN Convention on the Rights of the Child, so the British government would have no choice but to seek a national security exemption against the Convention – the most widely ratified human rights treaty in history – given that any national data-mining exercise would inevitably gather data about children’s internet activities as well as those of adults.
Second, child-protection workers, teachers, charities, and other childcare professionals all tend to agree on one thing: that once children reach adulthood at 18, they should be given the right to have all of their childhood internet activity ‘forgotten’, so as to prevent their own data from causing them problems later in life, such as with prospective employers. The amended Bill makes that impossible, and therefore acts against the professional opinion of the very people whose job is to protect children in civil society.
Indeed, the danger of citizens being oppressed, in effect, by their own data would be significantly increased by the scheme.
And third, for another very important, but rather uncomfortable, reason: by far the largest distributors of explicit images of people under the age of 18 are minors themselves: the phenomenon known as ‘sexting’. So rather than protect children, the amended Bill may, in fact, criminalise thousands of teenagers for disseminating images of themselves. (This has already happened in both the UK and the US.)
The technical challenges
Then there are the technology challenges of an overt national surveillance programme. IP addresses identify devices, not people. That a device might have been used in suspicious or illegal circumstances does not constitute evidence that the owner has committed a crime, given that anyone might have used it, or hacked into it.
IP addresses can be cloaked; they can be rented off the shelf via (legal) IP-hiding and proxy server platforms that allow users to appear to be anywhere in the world, from Colorado to Korea; and they can be absorbed into malicious botnets without the user having any idea that their computer is being used to send spam or viruses, or to carry out denial-of-service attacks.
Just as significantly, there will soon be untold millions of new IP addresses, thanks to the Internet of Things (IoT): all those smart devices that will be coming online over the next few years under the extended addressing made possible by IPv6. A tsunami of data, in fact, which the police almost certainly lack the resources to deal with.
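The scale of that expansion is easy to understate. A quick back-of-envelope calculation (nothing more than the arithmetic of the two address formats) shows the difference between IPv4’s 32-bit space and IPv6’s 128-bit one:

```python
# Back-of-envelope comparison of the IPv4 and IPv6 address spaces.
ipv4_addresses = 2 ** 32   # roughly 4.3 billion addresses
ipv6_addresses = 2 ** 128  # roughly 3.4 x 10^38 addresses

# IPv6 offers 2^96 times more addresses than IPv4 in total.
expansion_factor = ipv6_addresses // ipv4_addresses

print(f"IPv4 addresses:   {ipv4_addresses:,}")
print(f"Expansion factor: {expansion_factor:,}")
```

Even if only a vanishingly small fraction of that space is ever used by real devices, the volume of connection records a retention scheme would have to store, secure, and search grows accordingly.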
The IoT itself will pose challenges on an unprecedented scale. The security of smart devices, from lightbulbs to cars, fridges, and environmental control systems, has been found wanting in many tests, including some carried out recently by IBM, which successfully used – for example – an MP3 file to remotely disable a car’s brakes, and a building’s HVAC system to gain access to corporate wifi credentials.
What all this means is that the fact that a device within a specific real-world location, such as an office or house, has been used in an illegal or ‘suspicious’ act does not constitute reliable evidence of any wrongdoing on the owner’s part. The IBM tests prove that it’s entirely possible that a hacker could gain control of a household’s broadband account – and any computer connected to it – via an insecure smart lightbulb. And as more and more IoT devices are rushed to market to capitalise on the first wave of consumer interest, it’s inevitable that the security of many will be poor to non-existent.
In short, the technology challenges inherent in any meaningful monitoring and data-mining of internet records in order to catch criminals are so vast and so complex as to make the whole scheme nonsensical. That could mean thousands of false positives and a colossal waste of police resources, with untold numbers of innocent people coming under suspicion.
Indeed, it goes without saying that organised criminals, terrorists, and pornographers are the very people who will have the skills to evade blanket surveillance, or to shift the evidence trail onto innocent ‘bystanders’.
And, of course, someone in a back room will be asked to write the algorithms that identify ‘suspicious’ patterns of behaviour, or ‘suspicious’ website visits and trigger words, because human beings will have neither the time nor the resources to monitor the entire population’s internet activity. Who writes those algorithms – based on what rules, what keywords, and what concepts of ‘suspicious’ behaviour – is a very serious question. And the fact that politicians, not judges, will have the power to hit ‘Enter’ on investigating such activity suggests that those decisions may have a worrying political dimension.
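A toy sketch makes the false-positive problem concrete. This is entirely hypothetical – no real system would be quite this crude, and the keyword list below is invented for illustration – but any keyword-driven flagger suffers the same underlying flaw:

```python
# Hypothetical illustration only: a naive keyword-based flagger of
# 'suspicious' browsing, showing how easily it sweeps up the innocent.
SUSPICIOUS_KEYWORDS = {"pressure-cooker", "fertiliser", "encryption"}

def flag(history: list[str]) -> list[str]:
    """Return every visited page whose URL contains a watch-list keyword."""
    return [page for page in history
            if any(kw in page.lower() for kw in SUSPICIOUS_KEYWORDS)]

# Three perfectly innocent visits: a recipe, a gardening guide,
# and an encyclopaedia article. All three get flagged.
history = [
    "bbc.co.uk/recipes/pressure-cooker-stew",
    "gardeners-world.com/fertiliser-guide",
    "wikipedia.org/wiki/Encryption",
]
```

Every entry in that history matches the watch list, yet none indicates wrongdoing – which is precisely the argument: scaled to an entire population’s browsing records, even a tiny false-positive rate means enormous numbers of innocent people under suspicion.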
The business angle
Now, beyond the alarming lack of preparedness of some ISPs and telcos, few of these unfortunate side effects of the amended Bill will have a direct impact on business, perhaps. However, three things certainly will: the rise of BYOD schemes and ‘shadow IT’ in business; the expanding grey area between business and private use of technology, especially when it comes to unified communications and collaboration (UC&C) tools; and any lack of trust in the UK’s digital economy and in its technology providers.
The first two can be taken together: the lack of any meaningful distinction today between ‘business’ and ‘consumer’ IT. Increasingly, employees use their own devices for business (aka ‘BYOD’), and informally use their own cloud applications and storage facilities within the business (aka ‘shadow IT’, or ‘Bring Your Own Cloud’ [BYOC]). This shifts their private internet usage into the corporate realm, and thus may implicate organisations in any illegal – or merely ‘suspicious’ – acts.
More, the use of encrypted communications, proxy servers, and other tools by enterprises is central to the successful running of the digital economy, especially in areas such as financial services, defence, and any business that revolves around intellectual property – which, in this day and age, is most business. In short, these technologies are core to the UK’s economic health.
But many private individuals are also ‘companies’: self-employed people, freelancers, and so on; anyone who uses limited company status for business, in fact. Will their use of encrypted communications be accepted by the government, or will any exemption apply merely to large enterprises? If the latter, why should large businesses be allowed to encrypt communications, but not anyone else? And what about all those flexible workers who are working remotely from home and accessing core enterprise systems through private networks?
This whole area has so many pitfalls that, again, it will prove to be completely unworkable, because thousands of law-abiding citizens will be using encrypted communications in the home in the normal course of their working lives.
Which brings us to trust.
Why trust is essential
For our digital economy to succeed, citizens need to trust that, in a world of unified communications, digitised governmental services have their best interests at heart. Customers need to trust that their service providers can protect their private data. And private companies need to trust that the UK is a safe place to do business. In each of these areas, the amended Bill falls woefully short and puts that trust at risk.
The impression given by the proposals is that politicians and inexpert officials are rushing surveillance plans through, rewriting and amending them on the fly, with little consideration of the real-world consequences that may follow – and they are doing so against the advice of most technology providers (let alone civil liberties campaigners).
This impression is only reinforced by a new report on the progress of digital government programmes worldwide by consultancy Deloitte Digital.
The report, The Journey to Government’s Digital Transformation, says that one of the signifiers of an “early stage” digital government – rather than of a “developing” or “digitally maturing” one – is a primary focus on cost-reduction, rather than on citizen benefit. Of the more than 1,200 respondents in government organisations across 70 countries worldwide, the Deloitte survey reveals the British government to be by far the most focused on cost-reduction (and therefore, implicitly, the least focused on citizen benefit).
The report even quotes Mike Bracken on the need for organisations to develop a supportive culture beneath technology change. At the time he was interviewed for the report, Bracken was presumably still head of the UK’s Government Digital Service (GDS). However, he resigned in August and has since joined the Co-Operative Group, citing an institutional lack of the culture necessary to effect real digital transformation and citizen benefit. Recently, he has set about recruiting several of his former colleagues (who, we can surmise, may be equally disillusioned with the UK’s digital progress).
In this context, the amended Investigatory Powers Bill should cause real alarm. An ill-informed, low-cost, blunt instrument to fight crime – one that’s politically expedient, technologically unworkable, ignores emerging technologies, and is reliant on a disparate group of private providers to secure citizens’ data (even though there’s no legal obligation on them to encrypt it)? That’s not a solution to anything!
It’s just not right
But there’s another dimension to this proposal, which we cannot ignore: this is simply a scheme that has no place in any society that values freedom of speech and freedom of thought. A culture in which security services police people’s reading habits online at politicians’ behest would be a catastrophic misstep for us all.
In a free society, no one follows citizens around bookshops or libraries noting what books they browse, read, borrow, or buy, or what opinions they read or listen to, because we know that people are not defined by their browsing or by their reading choices. Just as important, we know that to read something is neither to agree with it, nor to condone it.
Reading something online is no different; just because a text is digitised does not somehow make it dangerous, any more than a piece of paper is inherently dangerous. We read to learn, to find out about the world, to enquire, to ask questions, to gain different perspectives. And the roots of the World Wide Web, of course, lie in the linking of information resources to aid the spread of knowledge and education, to the benefit of us all.
Yet sooner or later a politician will utter the immortal words, “The innocent have nothing to fear”, but that will not be the case if these proposals become law – a law from which MPs will, extraordinarily, apparently be exempt.
If the UK adopts blanket surveillance of citizens’ browsing habits, then we are opening the door to a form of digital ‘McCarthyism’ on an unprecedented scale, and this can only damage the UK, damage our society, damage the economy, damage business, undermine trust, undermine digital programmes, and put the UK out of step with the rest of the developed world.