How we are gulled

VB: How would you describe the disinformation landscape today?
PP: In the digital age our assumptions about what guaranteed a democratic information environment have been turned upside down. We thought it was ‘freedom of expression’, pluralism, the marketplace of ideas and media going across borders. On the other side you had the totalitarians, the dictators of the Cold War, who had censorship and just one newspaper.
But ‘freedom of expression’ is now used by authoritarian regimes and dodgy democracies to swamp the information environment with disinformation and chaos. Today’s autocrats use online armies of trolls, cyber militias and online mobs to attack opposition and drown out criticism. Pluralism has tipped over into polarization and partisanship so vicious that mature democracies risk not being a model for anyone any more.
At the same time, you have these tech companies that are dominating the landscape, that are outside of any kind of democratic control or oversight and whose algorithms are making everything worse. So, that’s the systemic crisis and it needs big interventions.
What big interventions?
We need regulatory change: changes to online design and to the kind of online spaces we have debates in. I don’t think we should be having our political debates on Facebook. We need to think about the role of public-service-spirited media in the digital age. Information flowing across borders sounds great but it’s very clear that the way the digital environment is built makes hostile-state or extremist subversion way too easy. Rules of engagement need to be worked out.
What we need is joined-up thinking that goes beyond: ‘Let’s pass a law on fake news’ (which is stupid) or ‘let’s create a fact-checking organization’ (which is virtuous but feeble). We’re all rushing around looking at little bits of it. What we lack is the vision and the language, almost, to understand what to do with it. Basically, we’re witnessing the structural disintegration of the public sphere; we need a structural rebirth. It’s going to take a fusion of talents to do it: legal, tech, but also media-journalist-type people to think about what our role is as well.
What difference have Covid-19 and the 2020 US election made?
After saying they would never do anything about fake news, the tech platforms were suddenly going: ‘There’s fake news going round saying you should drink bleach. We’ll do something about that.’ Partly they didn’t want people dying, but also it was to show: ‘We can regulate ourselves; we don’t need regulation.’ But I don’t think anybody is buying that.
We’ve always accepted censorship and propaganda around public health. But outside these concrete, public-health risks and inside the arena of ‘political speech’ it gets complicated.
The platforms say: ‘We are neutral... People are just saying it.’ Facebook says: ‘We’re not touching that. It’s part of free speech.’ The push-back to this is: ‘No. These are algorithmic decisions.’ Platforms are already intervening by algorithmically pushing content higher or lower. Everything we see online is there because of some sort of design decision.
So, the first step is transparency. We need algorithmic auditing and accountability so that we can make a judgement. We need an arbiter or regulator, independent of government and democratically accountable. This will lead to fights and messy democratic conversation. But at least that process will be transparent.
Regarding the election: there was a lot more understanding of hostile information and disinformation campaigns than in 2016. The media did not fall so readily for hack-bait. Tech companies have become more aware of problematic campaigns, and Trump was removed from platforms after the storming of the Capitol for incitement to violence, which is illegal. But legislation on online electoral advertising is still lacking. Online ads should, if anything, be more transparent than TV ads: you should know not only who is behind an ad and who paid for it, but why you are being targeted. You should be able to see immediately what other ads this party is showing to other people; are they using mixed messaging?
You’ve talked about information overload and the creation of ‘censorship through noise’. Can you expand on this?
What’s really new about the internet is not disinformation but its ability to amplify it and target it at scale. There’s no problem with one person, or even one government, saying ‘you are fake news’ to a critical newspaper. The problem is thousands of fake accounts saying it, giving the feeling that it’s organic when actually it’s designed. The ability to amplify, target and spread can just drown out the truth, drown out opposition voices, make people confused, disorientated, dispirited.
Disinforming content is nothing new. What’s new is disinforming, dissembling behaviour: something that looks like a person online but is actually hundreds or thousands of fake accounts; something that looks like a genuine news site but is actually 5,000 fake accounts all saying the same thing at the same time in order to game the Google algorithm. But you don’t know that. We can make certain types of deceptive behaviour illegal without encroaching on freedom of expression, and I think we should.
Are you saying that there’s too much focus on content, not enough on these distribution and design issues?
Precisely. There’s not that much we can do about content, to be frank. We can’t regulate what people say online. We’ve always had crazy people on street corners saying stuff. We’ve always accepted this as part of democracy.
The problem is targeting that’s not transparent; campaigns that don’t look like campaigns; algorithmic boosting of whatever gets the most clicks rather than of accurate content. That’s where regulation needs to be focused.
How should platforms become transparent?
One way is transparency for the users: ‘I’m online, why is this content being directed at me?’ Another is public, government and academic oversight of the tech infrastructure, the algorithms.
Before the US election, Trump’s people would say Google is pushing conservative news down and liberal news up. Maybe it was. We just don’t know. We can’t have these issues, our whole public sphere, sitting in a black box. It eats away at trust. It has to be opened up.
Algorithms should be showing people a genuine mix of content. If you punch, say, ‘Syria’ into YouTube you should not be getting wall-to-wall pro-Assad, RT [international channel controlled by the Russian state] ‘White Helmets are Al Qaeda’ disinformation just because the activists and the Kremlin are actively gaming YouTube.
What can we do about polarization in general? Tell us about your work with the Arena Initiative.
We’ve been working largely with media. Can they create content that inspires less toxic debate and reaches audiences that are currently living in different information environments?
We’ve been looking at Ukraine and the culture wars there: people with very different attitudes to the Second World War and to the Soviet Union. Are there commonalities to build a proper conversation around?
We’ve been doing work in Hungary: looking at whether Orbán’s culture war is actually reflected in what people care about. And we’ve been working in Italy with Corriere della Sera exploring what might make the conversation about migration less toxic.
These challenges are all very different; there’s no magic formula. But if we unite sociology with journalism, if we give journalists more understanding of the effects of their editorial choices, hopefully we will get a new generation of media who will take more responsibility for what they are doing to the fabric of democracy.
The problem is that financial incentives point in the other direction. They are: be as scandalous as possible, be as polarizing as possible, fire up your side against the other side. You’re going to get quicker likes and shares and easier subscription if you say: ‘This is your identity based on hating the other identity, come and join our community.’
Disinformation and polarization often operate at an emotional, gut level. How can we rationally engage with this?
We’ve got to! People feel disempowered for various reasons and [populist leaders] are giving them a fake sense of empowerment. They are satisfying a democratic craving, but in a really malign and disingenuous way, saying: ‘I’ll take back control for you.’
We have to address that sense of lack. What are the ways of having your democratic voice and feeling empowered? Are there new ways of voting, new ways of engaging? We have to think how we use the tech space to give genuine control.
There is still a demand for democracy. Populist leaders are meeting it in one way, we have to meet that demand in another way. It is tough; we shouldn’t underestimate that.
But there is also a joy in co-operation, in overcoming divides, and a deep human need for healing. I think people might be a little bit sick of polarization. In focus groups we hear: ‘We don’t want to live in a society of hate. We want to talk to people.’ A demand for a different kind of tone. Also, when it comes to solving problems like the pandemic, polarizing politicians have been pretty shit.
Peter Pomerantsev is a journalist and academic who has served on various expert bodies looking into regulating digital media. His prizewinning book, This Is Not Propaganda, was published in 2019. He is interviewed here by Vanessa Baird.
This article is from the March-April 2021 issue of New Internationalist.