Lab rats
Do you remember how you were feeling between 11 and 18 January, 2012? If you’re a Facebook user, you can scroll back and have a look. Your status updates might show you feeling a little bit down, or cheery. All perfectly natural, maybe. But if you were one of 689,003 unwitting users selected for an experiment to determine whether emotions are contagious, then maybe not. The report on the findings was published in June 2014: "Experimental evidence of massive-scale emotional contagion through social networks". How did Facebook do it? Very subtly, by adjusting the algorithm behind the selected users’ news feeds. One half had a reduced chance of seeing positive updates; the other, a reduced chance of seeing negative ones. Would users be more inclined to feel positive or negative themselves, depending on which group they were in? Yes. The authors of the report found – by analysing the posts of the people they were experimenting on – that emotional states can indeed be transferred to others, "leading people to experience the same emotions without their awareness".
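A minimal sketch of how such an intervention might be wired up, assuming a toy sentiment scorer, per-user random assignment and a probability of withholding matching posts (the function names, word lists and probabilities here are illustrative, not Facebook's actual system):

```python
import hashlib
import random

# Toy sentiment scorer: +1 for positive, -1 for negative, 0 for neutral.
def sentiment(post_text: str) -> int:
    positive = {"great", "happy", "love"}
    negative = {"sad", "awful", "hate"}
    words = set(post_text.lower().split())
    if words & positive:
        return 1
    if words & negative:
        return -1
    return 0

def assign_condition(user_id: str) -> str:
    """Deterministically split users into the two experimental arms."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 2
    return "reduced_positive" if bucket == 0 else "reduced_negative"

def filter_feed(user_id: str, candidate_posts: list[str], omit_prob: float = 0.3) -> list[str]:
    """Withhold a fraction of posts whose sentiment matches the user's condition."""
    target = 1 if assign_condition(user_id) == "reduced_positive" else -1
    feed = []
    for post in candidate_posts:
        if sentiment(post) == target and random.random() < omit_prob:
            continue  # this post never reaches the user's feed
        feed.append(post)
    return feed
```

The point is not the mechanics but the asymmetry: the user sees an ordinary-looking feed and has no way of knowing a condition has been applied.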
It was legal (see Facebook’s Data Use Policy). Ethical? The answer to that lies in the shadows. A one-off? Not likely. When revealed last summer, the Facebook example created headlines around the world – and another story quickly followed. On 28 July, Christian Rudder, a Harvard math graduate and one of the founders of the internet dating site OkCupid, wrote a blog post titled "We Experiment on Human Beings!". In it, he outlined a number of experiments they performed on their users, one of which was to tell people who were "bad matches" (only 30 per cent compatible, according to their algorithm) that they were actually "exceptionally good for each other" (which usually requires a 90 per cent match). OkCupid wanted to see if mere suggestion would inspire people to like each other (answer: yes). It was a technological placebo. The experiment found that the power of suggestion works – but so does the bona fide OkCupid algorithm. Outraged debates ensued, with Rudder on the defensive. "This is the only way to find this stuff out," he said in one heated radio interview. "If you guys have an alternative to the scientific method, I’m all ears."
Two months after the hullabaloo, Rudder published a book, Dataclysm: Who We Are (When We Think No One’s Looking), which described how OkCupid keeps the messages you send to potential dates – and the sections of those messages that you erase before you send. The theory? Data is "how we’re really feeling". Data is, Rudder determines, a path to sociological truths that – by implication – are just as valid whether they’re gathered through experimentation, cached from something deleted, or harvested from public tweets. Is he right? Do we consent to being experimented on in order to improve the systems we use? What about when the people behind social media sites play with our emotions? We did, after all, tick the Terms and Conditions box.
The debate, says Mark Earls, should primarily be about civic responsibility, even before the ethical concerns. Earls is a towering figure in the world of advertising and communication, and his book Herd: How to Change Mass Behaviour by Harnessing Our True Nature was a gamechanger in how people in the industry thought about what drives us to make decisions. That was a decade ago, before Facebook's rise, and it’s increasingly clear that his theories were prescient.
He kept an eye on the Facebook experiment furore and was, he says, strongly opposed to the whole thing. "They’re supporting the private space between people, their contacts and their social media life," he says. "And then they abused it."
Earls posits that because there wasn’t really a need for the experiment – social science had already figured out that emotions are contagious – there was another agenda at play. "What Facebook were trying to do, I suspect, was to create some sense of being proper social scientists themselves. Fair enough," he says. "But it was also to tell advertisers quite how powerful they are." That, he feels, is key. "As they say in All the President’s Men: follow the money." Facebook has around $3bn in ad revenue per quarter. For casual online users, then, the mantra has long been: if it’s free, you are the product. As you haemorrhage layer upon layer of data, it is your attention and your profile that are being sold to advertisers.
"In no other area of the media in Europe, North America or Asia, is the information about the reach of a particular media and its effectiveness owned by the media owner as opposed to owned by joint industry bodies," Earls explains. "They [Twitter, Facebook et al] all own the data and they won’t release it. They don’t want themselves compared to anyone else." The industry and the regulators need to realise this is important, he says. "The only way through is not only government-led – because governments are scared of these organisations as well – but to get a consensus between industry and policy makers: collaboration." What’s true for television, print and radio is absent online. There is, Earls says, a kind of hubris in the digital world that comes with regulation being almost entirely down to the company. "These nice, smiley people in Silicon Valley aren’t actually your friends – they’re business people," he says. "The big problem is, they need to make money. This is why Facebook is messing around with timelines."
John Oates, ethicist with the Academy of Social Sciences and the British Psychological Society, has been vigorously debating the ethics of online experimentation. As one of the authors of the paper Generic Ethics Principles in Social Science Research, he thinks there is scope for a voluntary code of conduct. Vociferous in his belief that ethical considerations need to become the bedrock of this new era, he co-wrote a letter to the Guardian in the wake of the Facebook revelations, decrying the practice.
"We were not over-egging the case," he says, speaking to me from his office at the Open University, where he lectures on Psychology. Oates references highly vulnerable people who rely on social media for supporting their self esteem. "I have one particular person quite close to me who is disabled and relies on giving and receiving likes from Facebook," he says. "There are other vulnerable people with mental health issues who find this crucial. To manipulate the balance of positive and negative information flowing to them is risky in terms of people who are very close to the edge."
Ofcom in the UK has channels of redress for people who think their human rights have been infringed online – but these too are on shifting sands, given that users of social media have already given a de facto form of consent, just by signing up. In Oates’s guidelines, there is a deliberate move away from informed consent towards what he calls "valid consent". While acknowledging the complexity, he also feels it’s "impossible to fully consent to something in advance". He points to the wider implications, too, believing that witnessing the potentially unethical actions of big corporations can desensitise people. If the Wild West is just a click away from our desktop, then anything goes.
What is interesting is not so much what happens to the data we leak with every step (we know it is bought and sold, analysed and stored), but its contribution to how we can be influenced to act in the future. A group of US-based professors and researchers published a research letter in 2012. Entitled A 61-Million-Person Experiment in Social Influence and Political Mobilization, it detailed exactly that: on 2 November 2010, the day of the US midterm elections, a randomised controlled trial was run involving all US Facebook users aged 18 and over who logged in that day. Users were randomly assigned to a "social message" group, an "informational message" group or a control group. The first two groups received a message at the top of their news feeds encouraging them to vote.
The "social message" group, however, also had the opportunity to click an "I voted" button, and to see a selection of profile pictures of friends who had also voted. Clicking that button allowed the researchers to "measure political self-expression, as it is likely to be affected by the extent to which a user desires to be seen as a voter by others". Matching those 6.3 million users to voter records (which are publically available) allowed the researchers to posit that "seeing faces of friends significantly contributed to the overall effect of the message on real-world voting". People wanted to be seen. The results suggested that the Facebook social message increased turnout directly by around 60,000 voters, and indirectly through social contagion by another 280,000 voters. "This means," the report stated, "it is possible that more of the 0.60 per cent growth in turnout between 2006 and 2010 might have been caused by a single message on Facebook."
Social contagion, then, determined by a social experiment – and all the subjects were unwitting. This test was only about whether or not people were encouraged to vote, but does it open up the possibility that they could be persuaded to vote for a particular party? Dr Harry Brignull is an independent UX (user experience) consultant based in the UK. As well as curating the website darkpatterns.org, which alerts consumers to the sleight-of-hand tricks sites use to get you to accidentally purchase more than you intended, he holds a PhD in Cognitive Science, and is experienced in academic research. The internet, he says, is "one big psychology experiment". Brignull points out that until recently, if you wanted to run a psychology experiment, you’d need a lab, participants, lots of equipment and – crucially – approval from an ethics committee. "One of the biggest changes we’ve seen on the internet since 2000 is the fact that it’s now incredibly easy to run psychology experiments on your customers at huge scale, without them even knowing," he says. "Today, we are all being psychologically evaluated constantly by people who want to make money from us. This isn’t inherently a bad thing, but there is a big risk that this new power will be used irresponsibly."
It’s the availability of this technology that enables researchers to make ethical mistakes, Brignull believes. "If it’s your job to run a high volume of these studies, you’re going to get slightly numbed to the subtle ethical differences between one study and another." He points out that in a cognitive science or psychology department, there would be "tight ethical guidelines and you’d be trained in what’s allowed and not allowed. When doing these studies in a commercial environment, the goal is to make money." And there’s no ethics committee.
Brignull talks about A/B testing, wherein websites serve up different versions of their pages at random – adjusted wording, colours, layout – and test which variant encourages us to click through, make a purchase, or do whatever the business wants users to do. He’s quick to say that A/B testing is nothing new – it’s around 100 years old, and the digital version is not much different from what the advertising agency Ogilvy & Mather promoted back in 1967, designing adverts differently according to what newspaper they appeared in and seeing which elicited the greatest response. "You could compare the old print approach to a smallholding farm in the 60s," Brignull says. "What we’ve got now is a factory farm. What they’re doing is more or less similar, but it’s the sheer scale of it that’s terrifying."
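A minimal sketch of the digital version Brignull describes, assuming a crude fifty-fifty split between two page variants and a two-proportion z-test on click-through counts (variant names and numbers are illustrative):

```python
import math

def assign_variant(user_id: int) -> str:
    """Crude deterministic split for the demo; real systems typically hash the user ID."""
    return "A" if user_id % 2 == 0 else "B"

def z_statistic(clicks_a: int, visitors_a: int, clicks_b: int, visitors_b: int) -> float:
    """Two-proportion z-test: is variant B's click-through rate genuinely different?"""
    rate_a = clicks_a / visitors_a
    rate_b = clicks_b / visitors_b
    pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    return (rate_b - rate_a) / se

# Illustrative counts: |z| above roughly 1.96 is significant at the usual 5 per cent level.
z = z_statistic(clicks_a=480, visitors_a=10_000, clicks_b=560, visitors_b=10_000)
print(f"z = {z:.2f}")
```

Run at the scale Brignull describes, even tiny differences between variants become statistically detectable, which is exactly what makes the factory-farm comparison apt.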
A/B testing is widely credited with helping Obama win the election. Dan Siroker, CEO and cofounder of Optimizely, served as director of analytics on the 2008 campaign. Siroker applied the principle his company now builds on, optimising websites via A/B and multivariate testing. What worked for commerce – test which version of your website encourages online visitors to stay, and buy – worked for politics. He set to work on Obama’s campaign website, with the aim of converting the casual browser into a signed-up subscriber. The team tested a variety of choices – black-and-white photo of Obama versus turquoise, still image as opposed to video footage, "learn more" button rather than "sign up". The data spoke, and so did the final results: 4m email addresses, thanks to the new site, and over $100m raised in funds.
Are we being tricked, though? Rory Sutherland, current vice chairman of Ogilvy & Mather, is circumspect about criticism. "We should be careful before we get too condemnatory about testing," he says. "We ourselves, if you believe in evolution, are also a product of a long experimental programme of testing and modification. Nature is itself a great big biological and social science experiment, after all."
Sutherland is interested, though, in the bigger picture of acceptance and consent, particularly in how we take what we’re given when it comes to social media. "How happy should we be in any case that it is an algorithm which determines what news we see from our friends?" he asks. It is the malaise of complicity: we hand ourselves and our data over to companies and allow them to manoeuvre us into certain actions.
Take, for instance, the UK-based, global customer science company Dunnhumby. The company works with giants such as Macy’s and Coca-Cola in the US, and Tesco in the UK. Giles Pavey, their chief data scientist, discusses how they encourage loyalty in Tesco customers. "When people are using their Clubcard and shopping online, there’s a treasure trove of information," he says. "I believe that in the future there are only going to be two viable strategies… cheapest or most relevant." Being relevant is a tricksy business, and works best if the company knows what the customer needs before they do. Thus, Dunnhumby’s website claims the group can "predict the future".
Tesco, via your Clubcard, will of course have details about your location, favourite items, when and where you shop (they don’t, Pavey says, sell on this information). An algorithm can see that you tend to buy healthy food and that on Saturday, say, it’s likely to rain in your area. Cue a recipe for a heartwarming soup on a rainy day, plus a discount voucher. On Saturday, it is indeed raining, so you nip to Tesco and make yourself a soup. How much were you manipulated, and how much was this something you might have done anyway?
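As a sketch only: the kind of rule Pavey describes can be reduced to a join between a shopper's purchase profile and a local forecast (the names, fields and the rule itself below are hypothetical, not Dunnhumby's actual models):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Shopper:
    clubcard_id: str
    buys_healthy_food: bool   # derived from purchase history
    region: str

def pick_offer(shopper: Shopper, saturday_forecast: dict[str, str]) -> Optional[str]:
    """Pair a shopping profile with a local forecast to choose this week's offer."""
    if shopper.buys_healthy_food and saturday_forecast.get(shopper.region) == "rain":
        return "Warming vegetable soup recipe + voucher for the ingredients"
    return None  # no targeted offer this week

print(pick_offer(Shopper("c123", True, "Leeds"), {"Leeds": "rain"}))
```

In practice such rules would be learned from millions of baskets rather than written by hand, but the shape of the decision is the same: prediction first, nudge second.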
Pavey argues, "just because things are technically possible, you don’t pursue them in blinkered way." He says if a customer changes their buying behaviour, stocking up on folic acid tablets and unperfumed creams, there isn’t an algorithm to send them an email saying: "It looks like you’re pregnant!". "There are often cultural reasons to override science," he says. This self-regulation is either ethical or commercially minded, given that Target, the retail monolith in the US, got into hot water for setting up algorithms which scored shoppers on a "pregnancy predictor" scale. Cue vouchers for baby products. As Charles Duhigg reported for the New York Times in 2012, the methodology of capitalising on Big Data was high on innovation and low on forethought: women were freaked out. One man came to the store, angry that his teenager was sent maternity promotions – only to discover that Target had worked out his daughter was pregnant before he had. Target changed their approach: still targeting women they thought were pregnant, but mixing up the baby vouchers with other goods, so that it looked random. One executive told Duhigg: "We found out that as long as a pregnant woman thinks she hasn’t been spied on, she’ll use the coupons… As long as we don’t spook her, it works." It was, in effect, an experiment, and the results were beneficial, as they showed a clear way to progress with marketing goods. They were still targeting the right people, just in a different way. Did it matter that women felt it was creepy, and a store broke news of a pregnancy before the family knew? Those questions, in the light of commerce, are strategic rather than ethical.
Perry Marshall is an online marketing strategist and entrepreneur, based in Chicago. He speaks of the 80/20 rule, or the Pareto principle: 20 per cent of your customers represent 80 per cent of your business. Marshall, as you might expect, is candid. "An unbelievable amount of information is available for companies who are willing to pay for it," he says. Experimenting, he says, is just like fishing. "We’re looking for fishing holes that have fish. Groups of people who respond favourably. What part of the lake do I cast my line in where the fish are biting? In this case, it’s combinations of demographic or psychographic indicators that are the marketing equivalent of that location in a lake."
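A toy illustration of the arithmetic behind the rule, with made-up revenue figures: what share of total revenue comes from the top 20 per cent of customers?

```python
def top_share(revenues: list[float], top_fraction: float = 0.2) -> float:
    """Share of total revenue contributed by the top `top_fraction` of customers."""
    ordered = sorted(revenues, reverse=True)
    cutoff = max(1, int(len(ordered) * top_fraction))
    return sum(ordered[:cutoff]) / sum(ordered)

# Made-up annual revenue for ten customers.
revenues = [4000, 2500, 900, 300, 250, 200, 150, 120, 100, 80]
print(f"Top 20% of customers contribute {top_share(revenues):.0%} of revenue")
```

On figures like these the top two customers account for roughly three-quarters of the total, which is the lopsidedness Marshall's fishing-hole metaphor is pointing at.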
There is a backlash. The social networking site Ello will never, says Paul Budnitz, CEO and cofounder, experiment on its users. Based in Seattle, the company has no adverts, providing a free service and relying on paid-for features for revenue. Budnitz spoke to me over email, and wouldn’t disclose how many people have joined, or be drawn on how they’re seen in the social media industry ("Ello is a social network. Facebook is an advertising platform. We’re not competing…"); but he did say there were "thousands of people writing in requesting features they would be willing to pay for." In the same vein is DuckDuckGo, a search engine that doesn’t track users. Launched in 2008, it’s the brainchild of Gabriel Weinberg, an MIT graduate. "Your search history is your most personal data on the internet because you share your financial and health problems with your search engine without even thinking about it," he says. "Additionally, this information was increasingly being handed over to governments and marketers. Since you don’t need to track people to make money in web search, I decided to just not track people because that is the better user experience." There is a big disadvantage, he says, for users of companies that do personalisation experimentation. They are "trapped in a filter bubble, and seeing only points of view that one agrees with, and less and less opposing viewpoints."
One man who works hard to optimise the filter bubble is Ben Morris. He’s the President of Kristalytics, a US marketing company that holds an enormous amount of data. He outlines its reach, over the phone, while driving between Austin and San Antonio in Texas: "We have the entire US residential consumer file in-house," he says. "It’s 253 million individual records, 156 million household records, and we have over 1000 data points of enhancement from over 40 different licensors so we have information about behaviour, income, psychographics, demographics, geographics, and we use that for modelling. We also have over a billion emails and we have a look-up table approaching 43 billion IPs." Kristalytics strip off the personal details of individuals, putting each person into what they call a "psychographic segment". It’s not about what you’re selling, Morris says. And then, succinctly summing up the context into which the casual online browser strays, he says: "It’s about the attitudes and the beliefs and the interests of the people who are buying your product or services. If you can [use] our demographics or psychographics to be able to pique their curiosity, then you will succeed in marketing."