Every time we use our smartphones, we are being tracked. From Facebook’s tracking cookies that monitor users to the Carrier IQ keylogging software found on smartphones, companies and governments are practising digital surveillance. The surveillance is troubling in its own right, but there is a deeper concern. When surveillance is linked to our online behaviour and identity, it can be used to repress and discriminate. To some writers, the internet’s freedom is giving way to a darker possibility: that authoritarian states will use the internet for control and repression. Yet the deeper concern may be what governments do on our behalf, with our tacit consent.
Our desire to benefit from open data makes our privacy vulnerable to tracking and data mining. Only those off the net will be free of this intrusion. Those who prefer to remain digitally active may employ agents or avatars for their online work to retain some privacy. The average citizen, though, will not be so lucky. They will have to suffer a relative loss of privacy and anonymity that those with more resources can avoid. The particular danger from the loss of privacy is that the open data and transparency agenda can encourage digital discrimination such as “weblining.” To protect our privacy and anonymity, and to resist the danger of digital discrimination, we will have to find a way to reassert our electronic political rights and our electronic freedom.
The dystopia predicted or promised by some has not yet emerged. Instead, the electronic frontier is slowly being tamed, but the danger to our privacy and anonymity remains just as great. Now, it appears our privacy is at risk less from an innate authoritarianism than from a government’s search for economic benefits from harvesting personal information. In other words, the promise and economic potential of “big data”, pursued in the name of economic growth, is the greater danger. The way our personal information held by governments is used is changing. In the UK, the government is pursuing a transparency agenda to disclose more government data to encourage economic growth. Governments want to harness the promise of open data and big data. Until recently, the promise of such data has been mostly hidden and relatively unconnected. We are on the cusp of an explosive period in which big data’s potential emerges, as people link big data across services and businesses for economic gain. Big data is most valuable where it involves information about consumers: personal information about our behaviour that can be exploited for economic gain. The value that businesses or governments gain comes at the price of our identities, our lives and our privacy.
Open data appears to be a benign movement because the data, and its storage, are presumed to be used only for benign purposes and to be relatively anonymous. Yet this view is not tenable given the power of computers to unlock identities from digital datasets that are purported to be anonymous. Unlike a paper-based system, which has some physical limits on the searches that can be performed, big data means a person will not know who has the data or what they are doing with it. In the past, one could assume, or expect, that businesses would not have access to government (citizen) information. Now, we cannot be certain that the government is not selling its data. At the same time, one would not expect business databases (health or insurance) to be routinely available to the government without consumers knowing and consenting. Yet the challenge of big data will not be resolved by one piece of legislation or programme. Instead, we need to consider a wholesale change in the government’s attitude and approach to personal information. The United Kingdom has been very determined in this area.
In the UK, government and business databases that store public information are becoming more widely available and increasingly linked. Opening up government data is becoming a reality, which is laudable and helpful. As a guiding approach to public services, open data has important democratic benefits. Transparent public services, where decisions and actions are supported by evidence that can be seen and assessed by the public on its merits, increase the public’s opportunity to hold them to account. In other words, transparency creates accountability. However, the promise of efficiency and effectiveness holds a darker side. The transparency agenda has an important, if less well publicized, impact on privacy. The question is: “will citizens accept a loss of their privacy over government-controlled information in return for economic benefits?” In much the same way that Facebook exploits the privacy of its members, are governments, such as the UK’s, poised to do the same?
The unanswered question is: are privacy rights developed in the paper age ready for an electronic age? Are we now destined to be forever looking over our shoulders, wondering who has tracked and traced our online forays without our consent and without a say in how the data is used?
We are doing it for the customer
Businesses justify digital surveillance of customers in the name of customer satisfaction. If that is the case, why is the customer rarely told? Why are such systems beyond the customer’s control? The way companies justify this surveillance and use of privacy is as disappointing as it is disingenuous. The customer satisfaction excuse is similar to the government’s “national security” excuse: privacy violations and surveillance are only ever done in the name of “national security” or “crime and disorder”. Today the argument is that they are needed for “jobs” and “economic growth”.
Is it too late to turn it off? After the iPhone tracking scandal, in which users’ movements were tracked without their knowledge, the company appeared to get the message and allowed the functionality to be turned off. Can governments turn it off? Perhaps customers have become desensitized to the loss of privacy, and to the way their privacy is used, because they have been incentivized through supermarket “loyalty points” schemes. As mentioned earlier, Facebook operates by exploiting privacy. The problem there is that customers do not know how their information is being mined or tracked, or how far it has been shared and for what purposes. Governments have begun to exploit citizens’ privacy in a similar way. For example, the UK government has shared demographic data relating to the census: anonymized demographic information from the national census is being shared with businesses. See, for example, Francis Maude’s speech to the Demographics User Group (DUG) releasing census demographic data to the group; the DUG has 15 corporate members. Can governments restrain themselves from becoming like Facebook and seeking to harvest users’ (citizens’) information? Perhaps citizens will be asked to “provide” their personal information in return for benefits from the state? Are we already trading our privacy for economic rewards, only now it will be made explicit?
If it is digital, it can be mined and exploited: whether you like it or not
We have digital identities whether we want them or not. Government and consumer databases capture our daily transactions as well as key life-stage events. These databases are becoming more available to businesses, and they are becoming linked. In the past, we could assume that the databases were anonymised, with personally identifiable information removed, to protect our identity. The increase in computing power means this is no longer the case. As a result, we will need different ways to protect our digital privacy. We are no longer anonymous. We need to reassert our political rights to create privacy rights fit for an era of open and big data.
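The claim that computing power defeats anonymisation can be made concrete with a minimal, hypothetical sketch. All the names, postcodes and diagnoses below are invented for illustration; the technique, linking an “anonymised” dataset to a public register on shared quasi-identifiers such as postcode, birth year and sex, is the well-documented linkage attack.

```python
# Hypothetical sketch: re-identifying an "anonymised" dataset by joining it
# with a public register on shared quasi-identifiers. All data is invented.

# An "anonymised" record set: names removed, quasi-identifiers retained.
anonymised_health = [
    {"postcode": "DH1 3LE", "birth_year": 1975, "sex": "F", "diagnosis": "asthma"},
    {"postcode": "SW1A 1AA", "birth_year": 1982, "sex": "M", "diagnosis": "diabetes"},
]

# A separate public dataset (e.g. an electoral roll) with names attached.
public_register = [
    {"name": "Jane Doe", "postcode": "DH1 3LE", "birth_year": 1975, "sex": "F"},
    {"name": "John Smith", "postcode": "SW1A 1AA", "birth_year": 1982, "sex": "M"},
]

def reidentify(anon_rows, register):
    """Join the two datasets on the quasi-identifiers they share."""
    matches = []
    for anon in anon_rows:
        for person in register:
            if all(anon[k] == person[k] for k in ("postcode", "birth_year", "sex")):
                matches.append({"name": person["name"], "diagnosis": anon["diagnosis"]})
    return matches

print(reidentify(anonymised_health, public_register))
```

Neither dataset on its own reveals who has which diagnosis; the join does. This is why stripping names from a database, the paper-age notion of anonymity, no longer guarantees it.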
In the big data age, our data can be tracked, traced and mined for value and meaning. Data mining has been around for some time, but it has usually been limited in scope (typically a company’s own customer database) and rarely connected across industries or between businesses. What is new is that governments are beginning to exploit their data for economic gain, either through their own analysis or by making it available to businesses. In the past, companies had to mine their own customer data or pay large fees to access the data of other companies or credit agencies in their search for added value. In the big data era, however, cloud computing linked to the availability of, and access to, databases means that the situation has changed fundamentally. The internet and social media mean that more personal information is available: people are publishing more of their personal information in forums or on Facebook pages, where it is available for analysis. At the same time, governments and businesses are mining their own data as well. The issue is framed both negatively and positively. Negatively, such data mining is said to be needed to identify and stop the next terrorist attack. Positively, it is said to be needed to unlock economic benefits. Governments will seek to harvest personal information in the same way as Facebook so they can benefit commercially. Put simply, the argument is that businesses (and governments) have to harness “big data” to maintain a commercial and economic comparative advantage.
Aside from the mining of big data for economic benefits, the economic approach to personal information presented by big data and open data represents a major challenge to privacy. The power of big data allows organisations to create better models of individuals and their behaviour. At the same time, it shows the scale of the technology and techniques available for scrutinising an individual through their personal information. The UK public were shocked and dismayed by the scale of phone hacking at the News of the World, yet that pales in comparison to what can be obtained through a focused approach using techniques such as big data mining. By comparison, the private investigators were using relatively amateurish methods (exploiting a basic access-code flaw) to access saved phone messages. The technologies and techniques available, both legally and illegally, to businesses and governments offer far deeper insight into a person’s digital identity.
Only those without a digital identity will be free?
Citizens have to become aware of these issues to reassert their political rights in the digital realm. Without that awareness, citizens cannot assert their political rights or support legislation that will protect them. One proposed right, the right to be forgotten, is an attempt to address some of these concerns, but it falls short because it does not address the underlying issue. The issue is not how long businesses or governments hold the data, or even whether they hold it; the problem is that the citizen cannot control or influence what is done with the information. What has not been solved, although governments are groping towards a solution, is how to protect privacy in the digital era while allowing the commerce that benefits from it to continue. The debates over intellectual property rights, seen in such legislation as SOPA, PIPA, and ACTA, are proxies for the need to protect our original property right: our privacy.
The power of big data mining means that only those without a digital identity, or those with a strong ability to hide or control their digital identity, may be free. People off the grid (such as the Amish) live and thrive without recourse to its advantages, yet they are a minority. Still, anyone willing to live with that technological austerity can have that kind of freedom. The alternative is to create digital avatars, or employ an agent, so that a person can act electronically yet maintain their privacy. For most of us, neither technological austerity nor technological prosperity is viable. For a small group, the issue will be resisting the digital surveillance implicit in big and open data. In any case, we are all trying to reassert our political rights, and to do this we will have to reshape our online identities to protect our privacy. The tension, though, is whether we have become desensitized because we want the “benefits” that the commoditisation of our privacy (Facebook, store loyalty cards) creates. In that sense, citizens may be unable to protect themselves from such manipulation. A more cynical view suggests that citizens will have to surrender their privacy to obtain state benefits.
We face a difficult choice if we are to retain some control over our digital identity. We can become completely public, publishing details of our lives and thereby giving up our anonymity and reducing our private life to banality, or we can attempt to assume digital personae, electronic identities that allow us to maintain a semblance of a private (electronic) life. Yet neither will be sufficient on its own for the vast majority of citizens. Legislation to protect privacy and our digital freedom will be needed to tame the electronic frontier. In the UK, the tension between privacy and “transparency” appears particularly problematic. The UK government’s approach to big data and transparency suggests that these need not be in tension with privacy. The government commissioned Kieron O’Hara to investigate the issue. His report, Transparent Government, Not Transparent Citizens, seems to offer a robust defence of privacy. Yet a closer reading leads one to realize that it paves the way for the government to trade the public’s privacy, under the putative goal of transparency of public services, in the hope of economic gain. In this regard, the ethos of open data, improved public services and economic growth, is being offered as a reward, but at the cost of privacy. Moreover, it allows big and open data to become tools of discrimination. Open and big data threaten our privacy and, ultimately, our freedom, because we cannot control them.
To make the public aware of the danger from databases, the Joseph Rowntree Reform Trust published a report in 2009 arguing that the UK had become a database state. The current government is seeking to harness those databases because it sees the economic benefits of using this information. To that end, the O’Hara report considers the trade-off between privacy and transparency in such databases. The question is to what extent you, the public, are willing to accept your privacy being compromised by the state in return for “economic growth”. Open data is not going to stop, so the question is whether it can be harnessed for positive outcomes. If the public are unaware of the implicit dangers, they cannot demand or support legislation that will protect their electronic identities and rights.
One way that government data is exploited is through the census. The Output Area Classification distils the 2001 census to its lowest available level. In doing so, it allows a person to create a profile for an area based upon reported census information. In a professional capacity, Experian and other credit reference agencies use the census data, amongst other data sets, to map people according to their postal codes. When businesses do this, it is called market segmentation. The danger is that someone could use such a tool to look at your neighbourhood (or any neighbourhood) and make assumptions (or draw correlations with other databases) about you to your detriment. They could then make decisions without you, or anyone else in the neighbourhood, knowing they were based upon your area’s profile. In the United States, when digital information is used to discriminate, it is called weblining.
Are Facebook and Google “enabling evil”? How prevalent is digital discrimination like “weblining”?
In the United States, banks once discriminated against minorities by refusing to lend to them. An area would be marked out with a red line on a map, a process called “redlining”, and anyone living in that area would face higher interest rates or be refused a loan. The process was ruled discriminatory and declared illegal. In the digital age, a similar process is called weblining. Although there are positive goals associated with using demographic information to identify customers, a process called market segmentation, it can be used for less benign reasons.
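The mechanics of weblining can be shown with a minimal, hypothetical sketch. The postcode prefixes, segment labels and interest rates below are all invented; the point is that the decision is driven entirely by an area profile, never by the individual applicant.

```python
# Hypothetical sketch of "weblining": a quote driven by an area profile
# rather than by the individual. All mappings and rates are invented.

# Invented postcode-prefix -> demographic segment mapping,
# of the kind a market-segmentation tool might produce.
area_segments = {
    "AB1": "affluent",
    "CD2": "struggling",
}

# Invented pricing table keyed on the segment, not the person.
segment_rates = {
    "affluent": 0.04,    # offered a 4% loan rate
    "struggling": 0.09,  # same product at 9%, purely from the area profile
}

def quoted_rate(postcode_prefix):
    """Quote a rate based only on where the applicant lives."""
    segment = area_segments.get(postcode_prefix, "unknown")
    return segment_rates.get(segment)

# Two applicants with identical finances receive different quotes:
print(quoted_rate("AB1"))  # 0.04
print(quoted_rate("CD2"))  # 0.09
```

Nothing in `quoted_rate` ever inspects the applicant’s own record, which is exactly what made redlining discriminatory, and the applicant has no way of knowing the quote was derived from their neighbourhood’s profile.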
The challenge going forward is how we can stop personal information from being exploited and used to discriminate. What we put on the web is only the tip of the personal information iceberg. The power of big data in terms of personal information is immense. Consider the following situation. Imagine if all your previous credit card transactions, all mobile telephone calls, and all emails were cross-referenced and analysed against your medical records, your employment records and those of your friends and family. In that scenario, do you really think you are going to be forgotten by the digital age? When decisions are made about you on the basis of that information, whether it is accurate or inaccurate, how will you know? How will you change it?
What can be done?
- The public need to become aware of the issue so they can reassert their information rights and protect their digital privacy.
- Companies that use personal information must be explicit about how they are using it and why. A user should not have to resort to legal tools (such as the Data Protection Act in Europe) to find out what information a company holds and how it uses it.
- Companies need to be held to account for their social responsibility regarding privacy.
- Legislation is needed because self-regulation is not working. SOPA, PIPA, and ACTA may be crude, but they reflect a problem that major web companies have not solved.
What are your opinions surrounding privacy online? What do you make of the increasing ability of companies like Google and Facebook to employ ‘digital surveillance’ on us for profit? What do you make of SOPA, PIPA and ACTA in relation to these issues?
Lawrence Serewicz is trying to understand and explain the American idea. The question is whether America will change the world before the world changes America. He is the author of the book America at the Brink of Empire: Rusk, Kissinger, and the Vietnam War. On Twitter he is known as @lldzne.
See also on the PostDesk blog: Disconnect co-founder interview: People will pay for tools to protect privacy, Facebook still tracks us when logged out
By the same author: Sex trafficking in the digital age: is it becoming a ‘legitimate’ business?