Taiwanese protestors at a march in June attacked pro-Chinese media and called for tighter regulation.
A report in the Economist describes how China seeks to influence Taiwan’s election, scheduled for next January. As an example it cites a story that reached United Daily News, one of Taiwan’s top newspapers, revealing leaked documents in which the US asked that biological weapons be manufactured at a lab run by the island’s defence ministry. Soon these documents were shown to be pretty crude Chinese fakes, but by then the story had spread to Taiwanese talk shows and influencers. Eventually the story was embellished further, turning it into a plot to collect 150,000 samples of Taiwanese blood to develop a special virus to kill ethnic Chinese.
These stories were part of a wider campaign to persuade the Taiwanese to turn away from the US and instead build stronger ties with China. Taiwan’s government was so concerned by the potential impact of this campaign that it established a task force to address the problem of disinformation. It worried that it would take only 7% of voters to be swayed by this and other claims (including a fabricated plot to blow up a Taiwanese chipmaker) to change the election results.
Chinese disinformation drew on pre-existing fissures within Taiwanese society, including the fear of being abandoned by the US. Yet, as the Economist observed at the conclusion of its report, much of its messaging fell flat. Although it targeted the Democratic Progressive Party, supportive of the US alliance, its candidate, William Lai, is leading in the polls. Whatever doubts might have been created about the US, there are more about China (unsurprising given that it has recently been surrounding Taiwan with ‘warplanes and warships’). A 2022 survey found that 34% of respondents agreed that America is a ‘credible’ country, but only 9% said the same of China.
This episode confirms a feature of modern international conflict. A lot of effort goes into spreading rumours and fake news. Although some of it sticks, the political impact is still slight. This is the conclusion I came to with my co-author, Heather Williams, Senior Fellow at the Center for Strategic and International Studies (CSIS) in Washington, in a book just published by the International Institute for Strategic Studies. It is entitled Changing the Narrative: Information Campaigns, Strategy and Crisis Escalation in the Digital Age. (Details of how to purchase at a discount are at the end of the post!)
It looks at the role of social media in contemporary international conflict, including the efforts by states such as Russia and China to change the politics and perceptions of other countries, taking advantage of the speed with which information, both faked and accurate, can spread in the digital age.
Our China chapter, which mainly focuses on Beijing’s attempts to control the changing narratives on Covid, opens with a discussion of past efforts to influence Taiwan’s elections. A digital campaign launched in the lead-up to the 2016 Taiwanese presidential election included spamming the Facebook account of the frontrunner and eventual winner, Tsai Ing-wen, with tens of thousands of posts demanding that Taiwan and mainland China be reunified. Then in the 2018 local elections, anti-Tsai propaganda appeared on numerous social-media platforms, including Twitter, and was picked up by legitimate news sources and widely disseminated. It happened again in 2020, when some 59 YouTube channels, with tens of thousands of subscribers and millions of views, promoted a pro-Beijing narrative. Yet Tsai emerged unharmed and was re-elected.
The bio-weapons theme is also familiar. In January 2021, as China came under pressure to explain the origins of the Covid pandemic in Wuhan, a Chinese government spokesman Hua Chunying said at a press conference:
‘I’d like to stress that if the United States truly respects facts, it should open the biological lab at Fort Detrick, give more transparency to issues like its 200-plus overseas bio-labs, invite WHO [World Health Organization] experts to conduct origin-tracing in the United States’.
The bio-labs theme has also long been a staple of Russian propaganda, with claims that the US had secret bioweapons labs in former Soviet states, particularly Ukraine. The Iranian regime also pushed the story, retweeting these allegations on 23 March 2020. Russia amplified the claim as part of its attempts to justify its full-scale invasion of Ukraine. It was one of the few such claims that gained some international traction, if only because it was also being promoted by China in its efforts to portray the US as the source of COVID-19. On 8 March 2022 the Chinese foreign ministry claimed that the US had 336 biological labs in 30 countries under its control.
There is no doubting the energy China puts into promoting misinformation in Western countries. A recent report by the UK Parliament’s Intelligence and Security Committee outlined the many ways that China seeks to exert influence in the UK, and the importance of promoting its own narratives, including on Covid. The Committee observed that:
‘China has worked hard on disinformation. It has greatly exaggerated its work to counter the virus and develop vaccines, and has sown seeds of doubt about the origins of the virus, to make the world believe that China was not at fault.’
Assessing the Impact of Social Media
While there are many examples of the abuse of social media, and of wider disinformation campaigns attracting attention, it is not always easy to judge their effectiveness. To try to do so was one objective of this book. Another was to explore how and in what ways the reach and immediacy of social media might encourage its use as an instrument in either calming or escalating major crises. We examined four cases: the crisis between India and Pakistan in 2019 over Kashmir; the crisis between the US and Iran in 2020, following the assassination of Qasem Soleimani; China’s management of the Covid pandemic, 2020-2022; and Russia’s war with Ukraine, 2013-2023.
The social media age is still in its infancy. Twitter opened for business in 2006, two years after Facebook and a year before Apple launched the iPhone. (Smartphones now account for some 80 percent of social media activity.) So as a form of communication it is barely two decades old. But it is also still developing. It was used quite differently, for example, by the candidates’ teams in the 2012, 2016 and 2020 US presidential elections, as their understanding of platforms such as Facebook became progressively more sophisticated and their messages more targeted. So what happened in the past may not be a reliable guide to the future. Actors learn from experience, adapt, and innovate.
It is also salutary to recall that social media was at first expected to pose a challenge to the ability of authoritarian governments and leaders to control the information and ideas available to their populations (remember the ‘Arab Spring’). This is now seen as somewhat utopian, as these governments took steps to limit access to social media and control its content (for example, China’s ‘Great Firewall’) while using it to promote their own narratives, often with strong nationalist themes.
Our project began as we considered the implications of Donald Trump’s continued use of tweets to make his points even after becoming president. This meant that whatever he tweeted, however eccentric or inflammatory, had to be treated as an authoritative statement of the US position. One of the more notorious tweets came after North Korean leader Kim Jong-un expressed confidence in his 2018 New Year’s Day address that the United States was deterred because ‘a nuclear button is always on the desk of my office’. Trump retorted on Twitter:
‘North Korean leader Kim Jong Un just stated that the ‘Nuclear Button is on his desk at all times.’ Will someone from his depleted and food starved regime please inform him that I too have a Nuclear Button, but it is a much bigger & more powerful one than his, and my Button works!’
One American analyst, Jeffrey Lewis, resorted to fiction to demonstrate how this tendency of Trump’s could exacerbate a crisis if an ill-disciplined tweet were taken to mean more than intended. In his novel, Kim misinterprets a tweet from Trump boasting that ‘LITTLE ROCKET MAN WON’T BE BOTHERING US MUCH LONGER’ as meaning that an attempt to destroy the North Korean regime is under way, with catastrophic results.
For a while, while not actually triggering wars, it seemed as if Twitter might become a significant mode of diplomatic communication. Many leaders, including some, such as Iran’s Ayatollah Ali Khamenei, who would never allow it to be used as a free means of communication in their own countries, released tweets that gained notice by ignoring normal diplomatic courtesies and through their rapid and widespread dissemination.
There is certainly more sensitivity now to the risks of disinformation, which has led a number of actors, such as Bellingcat, to enter the ecosystem to expose the fakery. The implosion at Twitter/X represents the latest stage in the evolution of this technology. The growing impact of content generated through artificial intelligence may well represent the next stage. In retrospect, the concern about Twitter diplomacy aggravating a crisis may have largely been a Donald Trump phenomenon, as he tweeted impetuously and without consultation. It was a function of the character of the man rather than the medium. But the potential remains, and he could become president again.
In general, when it comes to impact, we are talking about processes that may not change that much. This is because of the ‘stickiness’ of our beliefs. Our minds are not blank sheets of paper on which outsiders can write what they like but are shaped by the milieux in which we live: our inherited cultural traditions, including language; the quality and nature of our education; our interactions with family and friends; and our responses – both emotional and rational – to new events or information.
We can also recall that before social media, information was disseminated effectively, and often subversively, through pamphlets, radio, and cassettes. In the 1990s there was much discussion of the impact of TV news channels, described as the ‘CNN effect’, with suggestions that the 24/7 news cycle was adding to the pressure on politicians to respond to events without appropriate reflection, and concerns that striking images of the horrors of war could lead to ill-considered interventions.
Our research led us to five broad findings:
1. Information campaigns, including those on social media, are only one part of wider political struggles shaped by a variety of factors. A tweet alone cannot escalate a crisis. Rather, at times of crisis, information campaigns interact with other, more traditional forms of military and economic power.
2. The ease with which public perceptions can be shifted by information campaigns should not be exaggerated, especially when they come from foreign or anonymous sources and challenge established narratives. Crisis messages are most effective with domestic audiences, where they are likely to reinforce existing views. This is what happened during the 2019 Indo-Pakistan crisis, which served to strengthen the Bharatiya Janata Party (BJP) government led by Indian prime minister Narendra Modi, without making much difference to Pakistani behaviour.
3. States cannot therefore be confident that information campaigns will have the desired effects. While conventional wisdom suggests that Russia, China and Iran have taken full advantage of the digital environment to improve their international standing and gain support for their views, and we do not doubt their investment in this effort, there is little hard evidence that they make a huge material difference. To the extent that they do so, it is because they aggravate existing problems in the target society. This was the case with Russian interference in the 2016 US presidential election – the most commonly cited example of success. Even then, the most effective items of ‘fake news’ directed against Hillary Clinton originated in the US.
4. Information campaigns are limited as strategic instruments because of the difficulty of anticipating the effects that messages will have, even when sent by a political leader. Messages can be interpreted in different ways, and audiences also vary. The environment is noisy and chaotic so it can be arbitrary which messages get the most notice and amplification.
5. Social-media campaigns are also hard to control. This is the case even with domestic audiences. They can, for example, create a ‘rally round the flag’ effect that could have unintended international consequences by increasing domestic pressure on an authoritarian leader to escalate.
So when an item goes ‘viral’, whether it has political effects will still depend on the interests at stake and the available policy options. Social media adds another layer of complexity to the management of crises and the conduct of conflicts, but one that is not necessarily transformational. In addition, as China demonstrated with its claims surrounding Covid, there can be a risk of straying so far from observable facts that credibility is lost, and a contrast opens up between official narratives and the experiences of ordinary people. This was the case with Beijing’s assertions about the necessity and success of its Covid quarantines. The result was an explosion of popular discontent that eventually forced a quick change of policy.
Russo-Ukraine War
When we looked at the Russo-Ukraine conflict, the primary impact of digital information operations was to reinforce pre-existing views. Russians who continue to give support to the invasion, even if tepid, are more likely to get their information from television, which is full of conspiracy theories and bluster about the war, than from social media. That does not mean that bad news does not on occasion penetrate.
Ukraine can be said to have been more effective in its information campaigns, especially in the West (the position in many non-Western countries is more complex). Here they had the advantage of a sympathetic audience rather than one inclined to be suspicious. Their domestic audience needed no persuasion that they were the victims of aggression. In addition, their story was compelling and, unlike the Russian version of events, fitted verifiable facts. This was a story about a country being brutally attacked and needing all the help it could get.
Ukrainian bloggers had a strong presence on social media, providing up-to-date reports about battlefield developments and posting dramatic videos of Russian units being attacked, often against a background of pop music. As the war dragged on, these increasingly showed death and injury in explicit detail. Others provided translations of Russian posts (often from Telegram) detailing how badly things were going for Russian forces, or the maltreatment of mobilised soldiers.
While most of the information was accurate it was also selective. There were few videos of weapons missing targets or of Ukrainian casualties (official control over information was tight) so it still gave a partial view of events. Some bloggers did no more than cheerlead and were prone to optimism that at times merged into disinformation, but still a number gained reputations for reliability and objectivity and attracted large followings. If there was a lesson it was that candour is important when it comes to gaining trust.
Among the pro-Russian bloggers were some who analysed military developments in a way that acknowledged Russian difficulties even while emphasising Ukrainian losses and insisting on the inevitability of Russian victory. Many were close to the military but frustrated with the levels of incompetence, and were unsparing in their criticism of the high command. There were also many bots that just repeated standard Russian talking points, easily identifiable because they were anonymous and had few followers. Overall, the Russian information strategy was no more successful than its military strategy. This was despite a doctrinal belief within the Russian security establishment of the importance of psychology and manipulating information to shape perceptions.
Social Media and Democracy
It is important to stress again that we are not denying the existence of these campaigns, the inherent malevolence of many of them, or their occasional influence. But we do urge a sense of perspective. They are part of wider struggles in which other instruments of power are usually more important. For example, in Russia’s efforts to deny that the regime of Bashar al-Assad used chemical weapons against rebels in the Syrian civil war, and to discredit the ‘White Helmets’ group, what mattered more than the disinformation, unpleasant though that was, was Russia’s institutional advantages as a permanent member of the UN Security Council, which enabled it to undermine and even block international investigations.
Most importantly, while these campaigns can be aggravating, they feed off pre-existing divisions within our societies and diminished confidence in elites and systems of governance. If we were better able to address the sources of division within our societies, there would be fewer opportunities for others to exploit them. Countries such as China, Iran and Russia are opportunists. The best way to combat them, therefore, is to strengthen our own democratic practices.
Democratic governments are expected to be able to absorb hostile commentaries without repression, on the assumption that in the end their fate will be decided through free elections. In our societies the information environment, populated by a mixture of traditional and social media, cannot be controlled and can only be regulated with difficulty. It would be naive to suggest this always keeps the public accurately informed or that the system cannot become subject to serious distortions as a result of ownership issues surrounding major outlets and platforms. Nonetheless, there is still a range of voices to be heard, and flexibility remains, both in holding governments to account and in addressing major issues.
The digital information environment is complex and confusing. It has a dark side that will always be difficult to eradicate. There will always be consumers of conspiracy theories, online gambling and pornography. Fake news, in some cases spread by fake personalities, will continue to exist. Even if damaging online narratives supporting the policies of hostile powers have little basis in reality, they will sometimes take hold. New technologies enabled by artificial intelligence, such as ‘deepfakes’ and large language models, may make disinformation easier to produce and harder to identify.
But despite these existing and emerging risks, we believe that using information campaigns to change established views in areas of public concern is rarely straightforward. This is especially so when there is widespread awareness of the risks posed by personal attacks and demagogic politics, and a readiness to mitigate their effects by pushing back against the fakery and calling out the perpetrators. Lastly, amid these concerns, we can still appreciate the benefits that the digital information environment brings for news, entertainment and social connections. After all, that is how you are able to read about this book.
For a 20% discount on the hard copy of Changing the Narrative: Information Campaigns, Strategy and Crisis Escalation in the Digital Age (normally £16.99), please enter code ADELCTN at checkout when you order from Routledge.com.
I am featured about 50 minutes into "After Truth," a documentary by Andrew Rossi, discussing my work in the 2017 US Senate special election in Alabama. We used real information about Roy Moore that was flying under the radar of Republican voters. What we learned in that experiment is that such campaigns will only ever have a real effect on an extremely close election. They do not really move the needle very much. Instead, the most damaging use of such a campaign is disclosure after a very close election. In other words, you can't really get candidates elected, but you can undermine the winner of a close election.