Report pinpoints role of likely Russian troll networks in European election disinformation

A network of accounts flooded social media with disinformation in the run-up to the European Parliamentary elections, a new report has found.

The report was commissioned by the Social Democrats in the European Parliament (S&D) grouping, together with the Dutch GroenLinks-PvdA delegation, and produced by disinformation specialists Trollrensics.

It reveals that organised networks of thousands of accounts, which the researchers believe are likely of Russian origin, actively influenced public opinion on X in France and Germany during the elections, while voters in the Netherlands and Italy, as well as the English-speaking public, were also affected by the troll networks.

Trollrensics’ data analysis showed that at least 20% of all tweets about the French far-right politician Éric Zemmour came from this troll network, for example. However, the research company estimated that the actual percentage is significantly higher, as the networks manipulated the X algorithm to amplify specific themes.

The research also found that German political party AfD received a huge boost thanks to the troll army. At least 10.7% of the tweets about the AfD came from the disinformation network.

The network focused mainly on spreading pro-Russian propaganda, anti-vaccination narratives and anti-LGBTIQ+ messages.

Thijs Reuten, an MEP for the S&D, said, “We commissioned this independent study as we were curious about the extent of online foreign interference and how measurable it is – especially because this sometimes seems so hard to ascertain. This study has shown that significant influence took place during the European elections. Troll armies managed to make topics trend and at the same time make certain news reports less visible.”

Reuten added, “This clearly shows our democracy is vulnerable and that foreign powers are willing to spend a lot of money and effort to sow division in our population. We need to defend ourselves better against such organised attempts of foreign interference. I expect the European Commission and the intelligence services to be on top of this. Our open society is in danger if troll armies are able to manipulate social media and, therefore, the public debate”.

The report confirms concerns from European groups that large-scale troll networks from Russia were attempting to influence the outcome of the elections.

X marks the spot where Israel-Hamas disinformation wars are being fought

The tragedies unfolding in Israel and Gaza are putting the social media platform X to the test – a test that X keeps failing. X, formerly known as Twitter, has elevated disinformation alongside fact-based reports on the conflict, ranging from graphic images created through AI and video game footage to a plethora of recycled clips from Syria’s decade-long conflict.

Yet X’s disinformation overload should have been expected. Since Elon Musk’s acquisition of the platform, it has undergone a series of algorithmic and “aesthetic” changes that have upended the credibility of its content. Instead of boosting posts from experts and on-the-ground reporting, X’s algorithm promotes accounts that pay for a verification checkmark through Twitter Blue subscriptions. This “pay to play” model has boosted the accounts of bots and propagandists, and has enabled disinformation to go viral in a short amount of time.

Musk’s favourite sycophants are being rewarded for their click-baiting methods amid the violence in Israel and on the Gaza Strip. Mario Nawfal – an obscure businessman who gained a following on X through Musk’s endorsements – posted a 2020 YouTube video showing Turkish missiles fired in northern Syria. Nawfal falsely captioned the video: “Salvo of rockets fired by Hamas from the Gaza Strip towards Israel”.

His message was tagged with a “community note” – an X fact-check system implemented through crowd-sourcing. But the post remained up, highly visible because of X’s algorithm. As of this writing, the post had more than two million views.

Musk’s own attention-seeking posts amid the violence demonstrate the same dynamic. “All Tesla superchargers in Israel are free,” Musk posted. But his gesture was not all about altruism. There was a caveat: the post was restricted to replies from paid subscribers only.

Of greater concern is the platform’s role and influence in spreading distortion and disinformation. Musk bought Twitter for his own ideological reasons and has viewed himself as at war with “woke” values, which he argues erode the foundations of democracy. Through his personal crusades he has aligned himself with far-right ideologues and authoritarian leaders – and in turn he has garnered their loyalty.

Musk has made other changes on X that have also had a profound impact on how facts are represented. Earlier this month, X removed media-composed headlines from news articles. Musk argued the change would “greatly improve the aesthetics” of the platform. But now users are shown images without context, allowing bots, propagandists and even meme accounts to fill in the blanks with unsubstantiated claims. The result is an alternative reality where conspiracies reign over fact.

As Twitter, the platform was a digital democratiser that gave voice to ordinary citizens beyond the confines of traditional media. In times of political upheaval or natural disaster, Twitter had a reputation for delivering on-the-ground reporting and firsthand accounts in real time.

Now it is Musk’s personal megaphone promoting his political views and business interests.

X’s representation of the events in Israel and Gaza reveals that the platform’s strengths as a vehicle for truth-telling have been eroded and are all but gone. Will we heed the warnings of Twitter’s demise, or even realise the impact X now has on how we see the world?

The woman exposing the propaganda puppet masters

Dr Emma Briant, one of the key researchers who uncovered the Cambridge Analytica scandal in 2018

The vortex of misinformation, conspiracy theories, hatred and lies that we know as the unacceptable face of the internet has been well documented in recent years. Less well documented are the players behind these campaigns. But a small and growing group of journalists and researchers is working on shining a light on their activities. Dr Emma Briant is one of them. The professor, currently an associate at the Center for Financial Reporting and Accountability, University of Cambridge, is an internationally recognised expert who has researched information warfare and propaganda for nearly two decades. Her approach is distinctive: she doesn’t research just one party in the information war. Instead, Briant considers each opponent, even those in democratic states, with a breadth and level of detail that matters. As she tells me, you miss half the story if you concentrate on single examples.

“This is a world in which there is an information war going on all sides and you can’t understand it without looking at all sides. There isn’t a binary of evil and pure. In order to understand how we can move forward in more ethical ways we need to understand the challenge that we are facing in our world of other actors who are trying to mislead us,” Briant says.

“There are powerful profit-making industries that are reshaping our world. We need to better research and understand that, to not simply expose some in isolated campaigns like they are just bad apples,” she adds.

Briant is perhaps best known for her work on Cambridge Analytica. She was central to exposing the data scandal involving the firm and Facebook at the time of the 2016 US election. So what drove her to this area of research?

“My PhD looked at the war on terror and how the British and Americans were coordinating and developing their propaganda apparatus and strategies in response to changing media forms and changing warfare. Now that led me to meet Cambridge Analytica or rather its predecessor, the firm SCL group. Cambridge Analytica were using the kind of propaganda that had been used in the military, but in this case in elections, in democratic countries.”

The groundwork for this research was laid much earlier, when Briant lived as a child in Saudi Arabia around the time of the Gulf War. She was shocked to find lines and lines of Western newspapers censored with black pen, to the point you couldn’t read them, and pro-US and anti-Iraq propaganda everywhere.

“I was amazed by the efforts at social control,” she said.

Then, during her first degree, she studied international relations and politics when 9/11 happened and, as she says, “the world changed”.

“I was really very concerned about what we were being fed, about the spin of the Iraq war,” says Briant.

Like many she was inspired by a teacher, in her case Caroline Page.

“[Page] wrote a book on Vietnam and propaganda, and she had interviewed people in the American government and I was amazed that a woman could just go over to America and interview people in politics and in government and get really amazing interviews with high level officials. This really inspired me.”

Briant was motivated by both Page’s example and her specific work.

“She wanted to really find out what was going on and understand the actors behind the propaganda. And that is what really fascinates me most. Who’s behind the lies and the distortions? That’s why I’ve taken the approach that I have, both in looking at power in organisations like governments and how that’s deployed, and looking at how we can govern that power in democracies better.”

Because of Briant’s all-sided approach, she says she can attract the ire of people across the spectrum. People who focus only on Russia, for instance, might dislike that Briant critiques the British government. Conversely, critics of the UK and US governments question whether she should challenge Russian or Chinese propaganda. But, as she reiterates, “it’s really important to have researchers who are willing to take on that difficult issue of not only understanding a particular actor but understanding the conflict, protecting ordinary people and enabling them to have media they can trust and information online which is not deceptive.”

Criticism of her work has at times taken on a sinister edge. Briant is, sadly, no stranger to threats, trolling and other forms of online harassment.

“It’s very difficult to even just exist online if you’re doing powerful work, without getting trolled,” Briant says.

“The type of work that I do, which isn’t just analysing public media posts and how they spread, but is also looking at specific groups’ responsibilities and basically researching with a journalistic role in my research, that kind of thing tends to attract more harassment than just looking at online observable disinformation spread. Academics doing such work require support.”

Briant cites the case of Carole Cadwalladr, a journalist at the Guardian, as an example of how online campaigns are used to silence people. Like Briant, Cadwalladr turned the spotlight on those behind the misinformation that spread in the lead-up to the EU referendum. Cadwalladr experienced extreme online harassment, as well as a lengthy and very expensive legal battle. Brought by Arron Banks, the case had all the hallmarks of a SLAPP, a strategic lawsuit against public participation: a lawsuit with little to no legal merit, whose purpose is instead to silence the accused by draining them of emotional, physical and financial resources.

Briant has not been the subject of a SLAPP herself but has experienced other attempts to threaten, intimidate and silence her. Meanwhile, the threat of lawfare lingers in the background and has affected her work.

“Legal harassment has a real impact on what you feel like you are able to say. At one point after the Cambridge Analytica scandal it felt like I couldn’t work on highly sensitive work with a degree of privacy without the threat of being hacked or legal threats to obtain data or efforts to silence me. You cannot develop research on powerful actors and corrupt or deceptive activities as a journalist or a researcher without knowing your work is secure,” Briant says.

The ecosystem might be changing. New legislation has been proposed that will make using SLAPPs harder in the UK, where they are most common (the US, by comparison, has laws in place to limit them). But, as Briant highlights, there is more than one way to skin a cat.

“I don’t think people really understand the silencing effect of threat, not necessarily even receiving a letter but the potential of people to open up your private world. The exposure of journalism activities before an investigation is complete enables people to use partial information to misrepresent those activities; it can even put sources at risk,” she says.

While Briant believes these harassment campaigns can affect anyone doing the sort of work that she and Cadwalladr do, she says we can’t ignore the gender dynamic.

“Trolling and harassment affects a lot of different women and women are much more likely to experience this than men who are doing powerful work challenging people. This is just true. It’s been shown by Julie Posetti and her team, and it’s also the case if you look at other minorities or vulnerable communities.”

Of course if Briant was just a bit player people might not care as much. Instead, Briant has given testimony to the European Parliament and had her work discussed in US Congress. She’s written one book, co-authored another and has contributed to two major documentary films (one being the Oscar-shortlisted Netflix film The Great Hack). In today’s world, the attacks she has received have become part of the price people are paying for successful work. Still it’s an unacceptable price, one that we need to speak about more.

Briant is doing that, as well as more broadly carrying on with her research. She’s also writing her next two books, one of which revisits Cambridge Analytica. In Briant fashion, it places the company in a wider context.

“I’m looking at different organisations and discussing the transformation of the influence industry. This is really a very new phenomenon. Digital influence mercenaries are being deployed in our elections and are shaping our world.”

At arm’s length – A stronger role for government?

What can EU and national governments do to stop disinformation originating from foreign actors and domestic extremists? Could new rules pose a future risk for traditional media at the hands of populist leaders? What role does the media play in maintaining unity and liberalism in Europe? What limits – if any – can be extended to social media?

Join Index chief executive Jodie Ginsberg for a panel discussion.

Part of the FT Future of News Europe conference

Book tickets here

When: Tuesday November 26, 4.10pm
Where: Amsterdam