Roughly a year and a half ago, Twitter announced that it had removed 8,558 accounts that had allegedly engaged in coordinated activity to manipulate the platform and promote the ruling Serbian Progressive Party (SNS) and its leader.
Twitter’s removal of this large network of accounts shows that the public in Serbia is a target of political astroturfing, a term for communication activities by interest groups disguised as authentic expressions of support from ordinary citizens. Similar phenomena go by other names; the domestic public, for example, often calls inauthentic communication in the digital space botting or trolling.
How well do we understand what these activities are and what their political implications might be?
Manipulating public opinion is not legitimate political marketing
After Twitter’s announcement, a KRIK journalist asked President Vučić who the people engaged in promoting the SNS were, how they were paid, and whether this expense had been reported to the Anti-Corruption Agency.
“I have no idea, I’m not on Twitter,” the president replied, adding: “I don’t care about that, ordinary people support me even without Twitter. I have no idea what you are talking about.” Statements like these attempt to downplay such activities or to portray them as legitimate political marketing.
Twitter, in contrast, defines the problem in military terminology, using the term information operation for the activities of state-linked actors aimed at manipulating the platform and creating an inauthentic influence campaign. Twitter thus connects such activities with techniques of psychological warfare, in which the battle for public opinion is waged through manipulation rather than argument.
Similarly, researchers at the Oxford Programme on Technology and Democracy use the term cyber troops for actors associated with governments or political parties whose task is to manipulate public opinion through social media. The Oxford team’s reports list Serbia among the countries where evidence of cyber troops was found.
After Twitter published the contents of the deleted accounts, the Stanford Internet Observatory analysed their activities. The findings of this study are consistent with reports from professional media and other researchers and point to two basic network strategies. The first was to promote the regime. This was achieved by retweeting posts from the most important accounts (such as @avucic, @sns_srbija or @InformerNovine) and by frequent use of suitable hashtags such as #vucic, #sns or #brzejacebolje. In addition, the network promoted pro-regime content beyond Twitter by sharing links to sites that publish such content (e.g. sns.org.rs, vucic.rs, informer.rs, alo.rs and pink.rs). The second strategy was a negative campaign against political opponents. One tactic here was to co-opt opposition hashtags (e.g. #poceloje) in order to steer the discussion in the desired direction; another was organized support for anti-opposition tweets.
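To make the notion of coordinated amplification more concrete, here is a minimal sketch of how such patterns could be surfaced from a dump of tweets. The record format and field names (user, retweet_of, hashtags, urls) are hypothetical simplifications, not the Stanford Internet Observatory’s actual data or method; heavily skewed counts toward a handful of accounts, hashtags and domains are one signal, among many, that analysts look for.

```python
from collections import Counter

# Hypothetical, simplified records standing in for a takedown dataset;
# real analyses work with far richer metadata (timestamps, client apps, etc.).
tweets = [
    {"user": "acct_001", "retweet_of": "avucic", "hashtags": ["vucic"], "urls": []},
    {"user": "acct_002", "retweet_of": "sns_srbija", "hashtags": ["sns", "brzejacebolje"], "urls": ["sns.org.rs"]},
    {"user": "acct_001", "retweet_of": "InformerNovine", "hashtags": ["vucic"], "urls": ["informer.rs"]},
]

# Count which accounts get retweeted, which hashtags recur,
# and which external domains the network pushes.
retweet_targets = Counter(t["retweet_of"] for t in tweets if t["retweet_of"])
hashtag_counts = Counter(h for t in tweets for h in t["hashtags"])
shared_domains = Counter(u for t in tweets for u in t["urls"])

print(retweet_targets.most_common(5))
print(hashtag_counts.most_common(5))
print(shared_domains.most_common(5))
```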
One important problem with such activities is the lack of transparency, as the KRIK journalist’s questions show. No direct link between the SNS or Aleksandar Vučić and the deactivated network of accounts has been established, which made it easy for the President to avoid answering. When states engage in non-transparent and problematic digital activities, whistleblowers and independent journalists can play a significant role in detecting them.
The testimonies of people who claim to have been hired for the SNS’s astroturfing campaign offer a first insight into how the so-called SNS internet teams work. According to reporting by Južne Vesti, BIRN and other media, a stable and centralized structure lies behind these activities. It relies on public-sector employees, most often with precarious employment status, who implement the digital strategy of the party leadership during working hours and in their free time. They have access to software through which they receive instructions on which activities to focus on. Such reports suggest that these are neither legitimate nor harmless activities. We are not talking about active citizens participating in political discussion, but about organized sabotage of political discourse through the creation of a false image of public opinion.
Manipulation of public opinion can have significant political consequences
So far there has been little public discussion of how such activities can influence the political opinions and behaviour of ordinary social media users. The key to understanding the possible political consequences is the public perception that the pro-regime narrative in the described astroturfing campaign is created, supported and spread by ordinary people. This perception affects how citizens assess the general mood of the public on given issues. The perceived state of public opinion can, in turn, influence the formation of attitudes, the persistence of existing attitudes, and citizens’ willingness to express their opinions and otherwise participate in political processes.
A study conducted in Germany found that exposure to comments on social networks affects how citizens assess public opinion in the debate on migration. Citizens’ readiness to join the discussion on this topic depended on whether they assessed their own position as belonging to a minority or a majority. A study from Great Britain showed that the use of tweets in media reporting (which also happened with astroturfing comments in national tabloids) affected citizens’ perception of the majority opinion about a large infrastructure project and the expansion of the National Health Service. The effect on their perception of the majority opinion led to a shift in attitudes in the expected direction.
Another recent study in Germany examined how pro-Russian astroturfing comments affect citizens’ attitudes on issues such as the poisoning of Sergei and Yulia Skripal. As in the previous examples, exposure to such comments influenced respondents’ assessments of how the public at large understood Russia’s responsibility for this and similar events. The perception of the prevailing public opinion then led respondents, on average, to attribute less responsibility to Russia. Among respondents who were highly confident of Russia’s involvement at the start of the experiment, exposure to astroturfing comments and the resulting perception of public opinion weakened that attitude.
The mechanisms that explain such findings are grounded in classical theories of media effects, and the examples above illustrate a growing body of research finding support for these theories in the digital environment. A classic theory of public opinion formation is the spiral of silence. In short, the mechanism it describes rests on the fear of social isolation: people constantly monitor their environment (through the media, personal contacts and social networks) to assess where public opinion stands on controversial social issues. When people conclude that they hold a minority position, the fear of isolation leads them not to express their views publicly. Over time, the opinion recognized as the majority view becomes ever more dominant, while alternative narratives become increasingly hard to find in public discourse.
Exemplification theory explains why the comments of ordinary citizens in the media and on social networks matter in this process. Audiences notice and easily remember the “authentic” comments of citizens, which they do not associate with the persuasion attempts routinely expected from other actors in the public sphere. Relying on inductive learning, citizens conclude that the examples they have been exposed to, and perceive as authentic, are part of a broader trend. The more such examples there are, the more open the audience is to their arguments.
Exposure to the described astroturfing campaign in Serbia could lead audiences without stable ideological and party preferences to conclude that the comments they saw represent a generally accepted opinion and that the arguments presented in them are convincing. One potentially vulnerable group is young people, especially those under 24, for whom social networks are the primary source of information. They are also the part of the electorate with relatively little political experience and less stable political preferences.
Such an astroturfing campaign could also give other citizens the impression that public opinion has already been settled and that all political engagement is in vain, leading them to self-censor and withdraw from various forms of political participation. Fear of isolation is only one of the mechanisms behind the tendency to self-censor in such cases. The possibility of direct intimidation through the observation of orchestrated attacks on dissenters should not be overlooked either. A study from Finland, for instance, showed that respondents refrained from commenting on topics related to Russia for fear of becoming targets of the Russian trolls whose activity in that country had increased.
Another possible effect of astroturfing is the steering of public attention. In its basic formulation, agenda-setting theory predicts that citizens attach more importance to a topic the more prominently it features in the media. In today’s digital environment, social networks have broken the traditional media’s monopoly on setting the agenda. Astroturfing campaigns could be used to draw public attention to favourable topics (e.g. accusations of corruption against the opposition) and to push unpleasant topics out of public discourse, such as extreme social stratification, the increasingly difficult position of workers, problems in health care and education, or ever more serious environmental problems.
In conditions where the economic interests of technology corporations and a lack of political will make a systemic solution to the problem unlikely, it is important to understand what consequences astroturfing can have for collective decision-making in our society. The assumptions about possible consequences listed here are formulated on the basis of relevant theories and the findings of empirical research; testing them empirically in our own context is necessary for more precise conclusions.
A situation in which public discourse treats bots and trolls on the Internet as normal and accepts their activity as inevitable must be prevented. Just as fake news is a perverted inversion of the idea of professional journalism in public information, political astroturfing is a perverted inversion of the idea of mass advocacy in a democracy. Both increase epistemic uncertainty, making it ever harder to discern what is true and what is authentic activity, because everything is subject to manipulation. Such circumstances provide ideal conditions for the spread of apathy. And as citizens withdraw en masse from political life, it becomes easier for the political elite to secure support for the status quo by other, undemocratic means.
Translation: Iskra Krstić
This article was originally published in Serbian on Sep 21, 2021.