“When businesses are allowed to gather personal information, predict your behaviour and make decisions based on it, this is the information wild west. Welcome to the nightmare.”
Richard Hollis, director of information security consultancy the Risk Factory, believes that the ability of big data and predictive analytics systems to profile customers is generating “huge problems” in privacy terms for both online and offline consumers – problems that are, as yet, poorly understood. He explains:
“It’s all about the data these days and everyone wants it – states, governments, businesses. And people don’t understand the implications because they want cheap hotel rooms and the like. But the big question is, who owns the data?”
For example, if Hollis went to an online pharmacy to buy some medicine, would his activity information belong to him or the shop in question – particularly if it was going to be used for marketing purposes to personalise offers in a bid to get him to buy more in future? He warns:
People are completely in the dark. They don’t know how much their data is worth to companies and they can’t see the negative consequences of a lack of transparency. They don’t understand that companies are in business purely to make money and they have no sense that they own this personal information – the colour of their hair, the size of their feet or whatever. And they won’t until they’re made to face it, probably as a result of a huge breach-of-privacy lawsuit.
Hollis also cites the now almost mythical tale of a teenage girl whose pregnancy US discount retail chain Target deduced from her buying habits before her own father knew.
The story, uncovered by Charles Duhigg, staff writer for The New York Times, as part of the research for his book on the science of habit formation entitled The Power of Habit: Why We Do What We Do in Life and Business, saw the aforementioned furious father stalking into a Target store outside Minneapolis to confront the manager.
Clutching coupons for maternity clothes, baby gear, cots and the like that had been sent to his daughter who was still in high school, he questioned the chain’s apparently irresponsible behaviour and queried whether it was trying to encourage her to become pregnant.
The manager apologised and then called the dad a few days later to do so again, only for the father to admit, somewhat shamefacedly, that his daughter was expecting after all and the baby was due in August.
Understanding shopping habits
But in the same article, Duhigg also revealed some of the inner workings of the process. Target, described as one of the “smartest” organisations in the US at predictive analytics, assigns each shopper who uses a loyalty card a unique personal code, known internally as a Guest ID number.
This number is attached to every online and offline interaction they have with the company, from filling in surveys to buying goods with a credit card. It is also linked to demographic information such as their age, marital status, home address and estimated salary.
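The mechanics Duhigg describes amount to a single profile store keyed by Guest ID. A minimal sketch of that idea follows – the field names, event types and the ID format are illustrative assumptions, not Target’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class GuestProfile:
    """Hypothetical shopper profile keyed by a Guest ID (names invented)."""
    guest_id: str
    demographics: dict = field(default_factory=dict)   # e.g. age, marital status
    interactions: list = field(default_factory=list)   # surveys, purchases, clicks

profiles: dict = {}

def record_interaction(guest_id: str, event: dict) -> None:
    # Every online or offline touchpoint is attached to the same ID,
    # so purchases, surveys and web visits accumulate in one place.
    profile = profiles.setdefault(guest_id, GuestProfile(guest_id))
    profile.interactions.append(event)

record_interaction("G-1042", {"type": "purchase", "item": "unscented lotion"})
record_interaction("G-1042", {"type": "survey", "topic": "baby products"})
```

The predictive step – spotting, say, a pregnancy from a run of such purchases – would then operate over each profile’s accumulated interaction history.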
Although Target refused to identify what other demographic information it had bought in from third parties, other possibilities include ethnicity, job history, education and the websites they visit. And Target is scarcely alone. As Duhigg says:
“Almost every major retailer, from grocery chains to investment banks to the US Postal Service, has a ‘predictive analytics’ department devoted to understanding not just consumers’ shopping habits but also their personal habits, so as to more efficiently market to them.”
As if to illustrate this point, another story that hit the headlines only a few months later revealed how big data and predictive analytics tools were being used to manipulate online pricing based on basic demographic information – and that was as much as three years ago in the relatively early days of such activity.
The Wall Street Journal discovered that online travel agency Orbitz Worldwide was showing Apple Mac users more costly travel options than PC-based consumers because the former tended to spend as much as 30% more per night on hotels.
The company justified its actions by saying that it was not showing the same room to different shoppers at different prices – it was merely ranking results differently – and customers could still opt to sort results by price if they so chose.
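Orbitz’s distinction – re-ranking rather than re-pricing, triggered by a demographic signal – can be sketched in a few lines. The Mac heuristic and the 30% figure come from the reporting; the function, data and user-agent check are illustrative assumptions:

```python
def rank_hotels(hotels: list, user_agent: str) -> list:
    """Re-rank (not re-price) results for users inferred to spend more.

    Sketch only: Mac users, who reportedly spent up to 30% more per
    night, see pricier options first. The prices themselves are
    identical for everyone, and sorting by price remains available.
    """
    pricier_first = "Macintosh" in user_agent
    return sorted(hotels, key=lambda h: h["price"], reverse=pricier_first)

hotels = [
    {"name": "Budget Inn", "price": 80},
    {"name": "Grand Hotel", "price": 240},
]
rank_hotels(hotels, "Mozilla/5.0 (Macintosh; ...)")  # Grand Hotel listed first
```

The point the sketch makes concrete is Bernal’s: nothing on the page tells the shopper that the ordering, rather than the price, has been personalised.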
While Orbitz had done nothing illegal, Dr Paul Bernal, a lecturer at the University of East Anglia (UEA) Law School in the UK, believes that such behaviour potentially sets a dangerous precedent.
“When you log on online, you only see the prices that are offered to you. You don’t see what anyone else is offered or any kind of default pricing. You only see the personalised stuff and that’s true of nearly every internet service.”
Lack of transparency
The problem is that this lack of transparency could lead to abuses of commercial power. Bernal explains:
“If, for example, you work out from someone’s shopping data that they’re diabetic because they buy sugar-free chocolate etc, incentives could work both ways. So they could be offered a lower price as they’re a long-term customer or a higher price for a one-off sale. Legally, it’s not clear if this is OK or not, but it’s classed as sensitive personal data under data protection law. So the question is would it be right to have legislation that prevented this kind of discrimination?”
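The two-way incentive Bernal describes can be made concrete with a toy pricing rule. Everything here – the inferred flag, the loyalty test, the percentages – is invented for illustration; no retailer is known to use this exact logic:

```python
def personalised_price(base: float, profile: dict) -> float:
    """Toy sketch of Bernal's point: the same inferred sensitive trait
    (here a hypothetical 'inferred_diabetic' flag derived from shopping
    data) can cut or raise the price depending on the seller's goal."""
    if profile.get("inferred_diabetic"):
        if profile.get("long_term_customer"):
            return round(base * 0.9, 2)   # retention discount
        return round(base * 1.1, 2)       # one-off sale markup
    return base

personalised_price(10.0, {"inferred_diabetic": True, "long_term_customer": True})
```

Both branches rest on the same piece of inferred health data – which is precisely what makes its status under data protection law, as sensitive personal data, so awkward.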
Bernal also points to supermarket chain Tesco’s introduction of screens using facial scanning technology at its 450 petrol stations across the UK.
The OptimEyes screens, which were developed by Lord Alan Sugar’s Amscreen company, are positioned by the till and scan the eyes of queuing customers to determine their age and gender, before displaying tailored ads to them. They also adjust their choice of adverts based on time and date and monitor customers’ purchases. Bernal says:
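The selection logic behind such screens can be illustrated with a minimal rule table. Amscreen’s actual rules are proprietary; the categories, thresholds and ad names below are invented purely to show the shape of the decision:

```python
def pick_ad(age: int, gender: str, hour: int) -> str:
    """Toy ad selector keyed on estimated age, gender and time of day.

    Illustrative assumption only: real systems like OptimEyes do not
    publish their targeting rules, and these categories are invented.
    """
    if hour < 10:
        return "coffee promotion"        # morning queue
    if gender == "female" and 18 <= age < 35:
        return "cosmetics offer"
    if age >= 50:
        return "insurance ad"
    return "generic fuel discount"

pick_ad(28, "female", 14)
```

Even this crude version shows why the technology raises eyebrows: the customer is classified and targeted before reaching the till, without any interaction on their part.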
“The point is that online is merging into offline more and more, and if services are immediately accessible in the real world, it just adds another dimension to things. It’s not to say that any of this is necessarily bad, but you have to be aware of the potential downside before you say ‘don’t bother regulating it’. And today the law and thinking around it are lagging far behind.”
One of the main safeguards in this area at the moment is based around data protection law. While the US, unlike Europe, has little in the way of overarching data protection legislation, it does have individual laws covering health, children, social security numbers and the like. Consumer watchdog the Federal Trade Commission has also been very active against companies that have misrepresented their use of customer information.
The European Union, on the other hand, is in the process of updating its existing laws, although the new rules are not expected to come into force until 2018 at the earliest. Apparently the most lobbied legislation ever, the proposed regulation covers everything from customer profiling to tightening the requirements for gaining customer consent to use personal data. But it is still unclear whether the pro- or anti-privacy camp will prevail.
The problem, says Nicola Fulford, head of data privacy and protection at technology law firm Kemp Little, is that:
“The law is always playing catch-up with technology. It’s very hard for it to be adequate in anything more than general principle terms. Data protection laws safeguard up to a point and there are some rights of redress, but consumers have to be aware that their privacy is being invaded first. And most enforcement action so far has been around data breaches – there have been very few fairness cases.”
One such case, though, concerned taxis and other private hire vehicles in Southampton, where the City Council’s policy requiring drivers to record all conversations and images on CCTV whenever their cars were in use, ostensibly for safety reasons, was ruled “disproportionate” [http://www.bbc.co.uk/news/uk-england-hampshire-18982854].
In this instance, UK data protection watchdog, the Information Commissioner’s Office, said that the Council had “gone too far” as the safety of passengers and drivers had to be balanced against “the degree of privacy that most people would reasonably expect in the back of a taxi cab”.
But Pete Wood, chief executive of information security consultancy First Base Technologies, is less concerned about such issues and more about the creation of data “supersets”:
“I don’t see the commercial merit for retailers in abusing your personal information – they want more business and the way to get it is to target you more cleverly. In my view, it’s just part and parcel of living in a capitalist society.”
The real fear for him is what happens if the data isn’t secured properly and gets into the wrong hands as a result:
“Because this kind of information is so all-encompassing, if it leaks the danger is that someone with a less ethical mindset combines it with other online data such as the voters’ or housing register and is able to form conclusions about individuals. There’s a natural paranoia about not wanting corporations to hold lots of information about you, although Elvis has left the building there. But it’s when all of this data is combined to make a superset that it gets really scary.”
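The “superset” Wood warns about is, in technical terms, a record-linkage join across datasets that are individually innocuous. A naive sketch follows – the records, field names and join key are invented for illustration:

```python
# Two hypothetical partial datasets: a leaked retail profile and a
# public register. Neither alone reveals much; joined, they do.
retail_leak = [
    {"name": "A. Smith", "postcode": "SO14", "purchases": ["insulin kit"]},
]
voter_roll = [
    {"name": "A. Smith", "postcode": "SO14", "address": "12 High St"},
]

def link(a: list, b: list) -> list:
    """Naive join on (name, postcode): each match fuses two partial
    records into a fuller profile than either source held alone."""
    index = {(r["name"], r["postcode"]): r for r in b}
    return [
        {**r, **index[(r["name"], r["postcode"])]}
        for r in a
        if (r["name"], r["postcode"]) in index
    ]

link(retail_leak, voter_roll)
```

Real-world linkage attacks use fuzzier matching than this exact-key join, but the principle – combination, not any single leak, doing the damage – is the same.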
On the positive side though, at least such issues are starting to come out into the open more to be discussed. As Bernal concludes:
“Since the Snowden revelations, everyone has been worried about the security services, but corporations gather far more information with more direct implications for our lives – and the security services often get access to it too, so it’s a double whammy. But whether the situation is harmful or harmless, it’s something that we should know about, because building the infrastructure to enable organisations to know everything about us has implications.”
The time to debate these privacy questions – at societal, governmental and international level – is now, rather than waiting for a major breach-of-privacy lawsuit to set a legal precedent.