Can search engine rankings swing elections?
A few years ago I had a hunch about search rankings. It was well known that people click more on higher-ranked search results – partly out of laziness and partly because they trust them. Could that trust, I wondered, cause people’s opinions to change on issues on which they were undecided? Could it, perhaps, even change how people vote?
Randomized, controlled experiments Ronald Robertson and I began conducting in early 2013 demonstrated that search rankings do indeed have such powers. When one candidate is consistently favoured in search results – in other words, when high-ranking results link to web pages that make that candidate look better than his or her opponent – this can, in a large country, push millions of votes to the higher-ranked candidate, shifting up to 80 per cent of undecided voters in some demographic groups. Because this is such a large effect and because many elections are won by small margins, we believe that search rankings have been determining the outcomes of upwards of 25 per cent of the world’s national elections for years now, with increasing impact each year as internet penetration has increased.
We labelled the power that search rankings have to shift people’s opinions the Search Engine Manipulation Effect, or SEME (pronounced ‘seem’). In numerous experiments with more than 10,000 participants from 39 countries, SEME has proved to be one of the most powerful and dangerous effects ever discovered in the behavioural sciences. It is powerful because of the big shifts it produces, and it is dangerous because it is virtually invisible as a source of influence.
Very few people, we have learned, can detect bias in search rankings, and the few who can detect it shift even farther in the direction of the bias. When sources of influence are invisible, people mistakenly conclude that they are not being influenced at all – that they have made up their own minds. Thus SEME works almost as a form of mass hypnosis – shifting the opinions of large numbers of people without them having any idea they are being manipulated.
We have evidence now that SEME can shift attitudes, beliefs and opinions about almost anything, not just about political candidates. In recent experiments we were able to shift people’s viewpoints about fracking (safe or dangerous?), homosexuality (in the genes or a matter of choice?), and artificial intelligence (beneficial or a threat to humanity?) by between 25 and 36 per cent after a single online search. We have also learned that people shift even more when exposed to biased search rankings repeatedly – say, when rankings favour one candidate in searches conducted over a period of weeks or months before an election. We are also learning more about the demographic factors: some individuals are highly susceptible to this form of influence – in other words, are highly trusting of search rankings – even though no-one has the slightest idea how search rankings are actually generated.
SEME is powerful because, unlike other list effects, it is supported by a daily regimen of conditioning. Most of our searches, after all, are for simple facts: where was Winston Churchill born? Invariably, we find the answer to such queries in the top search position, which teaches us, over and over again, that what is higher on the list is better and truer. When we finally conduct a search relevant to an issue we are truly perplexed about, the conditioning is irresistible: what’s higher is better, what’s higher is truer. We are so confident about this that 50 per cent of our clicks go to just the top two items on the list, and we very rarely leave the first page of results for the mysterious wasteland that lies beyond.
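The click concentration described above can be made concrete with a small sketch. The click-through rates below are illustrative numbers, not data from our studies; they simply encode the essay's observation that the top two results attract about half of all clicks, and show how a ranking that stacks one candidate's pages at the top hands that candidate most of the audience.

```python
# Hypothetical position-bias sketch. The CTR curve is an assumption chosen so
# that the top two slots capture 50 per cent of clicks, as described above.

CTR = [0.30, 0.20, 0.12, 0.09, 0.07, 0.06, 0.05, 0.04, 0.04, 0.03]  # ranks 1-10

def exposure(ranking, ctr=CTR):
    """Share of clicks each candidate's pages receive under a given ranking.

    `ranking` is a list of 'A' or 'B' labels, one per result slot,
    marking which candidate each result favours.
    """
    share = {"A": 0.0, "B": 0.0}
    for slot_ctr, candidate in zip(ctr, ranking):
        share[candidate] += slot_ctr
    return share

alternating = ["A", "B"] * 5          # candidates alternate down the page
biased      = ["A"] * 5 + ["B"] * 5   # candidate A owns the top half

print(exposure(alternating))  # A still edges ahead (it holds rank 1)
print(exposure(biased))       # A captures roughly four clicks in five
```

Note that even the alternating list is not neutral: whoever occupies the first slot comes out ahead, which is why an 'equal time' rule for rankings is harder to define than it sounds.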
The Google problem
With 90 per cent of online search controlled by just one company in many countries worldwide – including most countries in Europe – a handful of executives in Mountain View, California, has more power over humankind than a small group of people has ever had before. Google’s leaders have the power not just to flip elections but to impact what more than a billion people think, do and say every day. Are they deliberately manipulating search rankings to exercise such power?
Without a whistleblower or warrant to sort things out, there is no way to know for sure, but anti-trust actions brought against Google by both the European Union and India are based on evidence that Google’s search rankings systematically favour Google products and services over those of their competitors – in other words, that they are indeed biased in ways that serve the company. An internal investigation by the US Federal Trade Commission in 2012 came to a similar conclusion, and a recent study by Nicholas Diakopoulos and others concluded that Google’s search rankings in the US consistently favour the Democratic Party.
It should surprise no-one that Google would programme its search algorithm – the computer program that selects and orders the search results – to serve the company. It is a profit-making enterprise, and court decisions in the US have even ruled that search engine companies have free rein in ordering search results – that search results are an expression of a company’s right to free speech.
The algorithm problem
As I have thought about this issue over the years, however, I have found myself more concerned about the search algorithm than the executives. While it is true that company executives might have good reason to support a particular political party, I suspect that the vast majority of search rankings that we see are truly ‘organic’. (For the uninitiated, ‘organic’ is the slick marketing term Google officials use to describe search results that happen ‘naturally’, without deliberate intervention by Google software engineers. Organic, as we all know, is good.)
Even though Google acknowledges that its engineers make manual adjustments to the search algorithm more than 400 times a year, I suspect the results of almost all of the three billion searches people conduct every day are not specifically predictable by the engineers. Algorithms have a way of taking on their own lives, so to speak, and this must surely be the case for Google’s search algorithm.
This, I feel, is truly disturbing. When the algorithm selects web pages from among the 45 billion currently in Google’s index and then posts links to those web pages in a certain order, it necessarily elevates some perspectives over others. That is its job, after all. It is certainly not programmed with an ‘equal time’ rule. Thus the thoughts, beliefs, attitudes, opinions and behaviours of a billion people are being impacted invisibly every day by a computer program.
This could provide an intriguing plotline for a science fiction novel, but it is not so far-fetched when you begin to realize that we are the characters in that novel. And who, or what, is the author?
Other new sources of influence
SEME, it turns out, is just the beginning of the story. In a book I am currently writing, I am looking at a wide range of new sources of influence that new technologies have brought into being in recent years, some of which, I believe, emerged accidentally. In an ethically questionable experiment that Facebook conducted in the US on election day in 2010, for example, 60 million Facebook users were sent ‘Go out and vote’ reminders. As a result, 340,000 people voted who otherwise would not have.
In 2014, Harvard law professor Jonathan Zittrain published an article in the New Republic pointing out that if Facebook chose to send such a message exclusively to people who supported one candidate, that could easily flip a close election; he dubbed this kind of manipulation ‘digital gerrymandering’. Like SEME, this is entirely invisible to users and, again like SEME, it leaves no paper trail, which gives the offending company complete deniability.
In late March 2016, the popular hook-up site Tinder, which serves more than 60 million users every month, adjusted its simple swipe-left-or-right date selection routine for use in selecting the right US presidential candidate. Based on a few simple questions about where you stand on various issues (swipe left for pro-immigrants, right for kick them all out), Tinder quickly tells you which candidate is your best match. But who wrote this computer program, and couldn’t it easily be gamed to favour one candidate? How would one know one way or another?
I could go on. The internet is brimming with shiny new methods for manipulating people invisibly and on a massive scale. I haven’t even mentioned some of the more obvious new forces bearing down upon us, such as targeted advertising supported by detailed personal profiles thick enough to choke a large horse. And then there are the new dirty digital tricks that have been used to fix elections, beautifully documented in a recent report by Bloomberg Businessweek called ‘How to Hack an Election’. There are also the disturbing connections between some of the glittering new high-tech companies and the US intelligence community. A recent report by British investigative journalist Nafeez Ahmed suggests that the Google search engine was in part the child of the NSA and CIA. They were looking for a perfect tool for spotting threats to national security. What could be better than a search engine that serves as a gateway to the entire internet and that tracks and records everything we do, including the search terms we use and the websites we visit?
I have only scratched the surface here, but the bottom line is that the democratic system of government – or at least a meaningful form of it – is very much in jeopardy at the moment; and so, for that matter, is our personal freedom. When, through ignorance or inaction, we give up more and more of our ability to make truly free choices – giving leaders we didn’t elect and computer programs we can’t even begin to understand the power to manipulate us – we are heading down a dangerous path.
Identifying threats like SEME and imposing regulation where it will help are small catch-up steps that are better than nothing. But given how quickly things are changing, I think we need to think bigger. I think the only way we have any chance of protecting ourselves from what will surely be an endless parade of new high-tech manipulative tools is to create new organizations and agencies that exist specifically a) to scrutinize new technologies as they arise, searching for manipulative mechanisms that may or may not yet have names; and b) to anticipate the creation of such technologies based on reasonable extrapolations from what was developed last week. If this sounds like an element in a science fiction plot – like a high-brow version of The Minority Report, perhaps – well, yes, I suppose it is. Remember that we are indeed characters in such a story at this very moment. The only question is whether we decide to take a hand in writing that story rather than allowing ourselves to be overwhelmed by forces we have trouble understanding.
Robert Epstein is a senior research psychologist at the American Institute for Behavioral Research and Technology in California, former editor-in-chief of Psychology Today, and author of 15 books. He is currently working on a book called The New Mind Control.
Dr Robert Epstein © 2016. All rights reserved. Do not quote or cite without permission of the author.