Trolls, bots and everyone else: Online disinformation campaigns and 2019 presidential elections in Ukraine

Urman, Aleksandra; Makhortykh, Mykola (3 September 2019). Trolls, bots and everyone else: Online disinformation campaigns and 2019 presidential elections in Ukraine (Unpublished). In: EuroCSS 2019. Zurich, Switzerland. September 2-4.

Full text not available from this repository.


Today, online disinformation campaigns are increasingly employed to manipulate public opinion. Their targets vary, but political elections are among the most prominent. Coordinated disinformation efforts were traced in recent elections in the US (Bessi & Ferrara, 2016; Faris et al., 2017), France (Ferrara, 2017) and Italy (Cresci et al., 2017). The purposes of these efforts ranged from attacking specific candidates (Ferrara, 2017) to fostering negative attitudes towards certain social groups (Bennett & Livingston, 2018). In doing so, disinformation campaigns corrode the foundations of democratic systems and increase societal polarization by dividing citizens along partisan lines (Tucker, 2018).

Research on online disinformation during political elections focuses on two categories of agents: automated and human. The former are automated social media accounts (known as social bots or sock puppets) used to generate large volumes of content supporting or attacking candidates and their sympathisers (Bessi & Ferrara, 2016; Ferrara, 2017). The latter are human actors who disseminate false information to condemn (i.e. “troll”) or praise (i.e. “elf”) candidates and their supporters (Bradshaw & Howard, 2017; Abdullina, Ageeva & Artamonova, 2018). Until now, however, these two categories of agents have usually been discussed separately, whereas in practice organized disinformation campaigns often involve both.

In our paper, we analyze the involvement of both automated and human agents in online disinformation efforts during the 2019 presidential elections in Ukraine. There are two reasons for this choice of case study. Firstly, as part of the ongoing Russian-Ukrainian conflict, Ukraine is frequently targeted by online disinformation campaigns sponsored by Russia (Mejias & Vokuev, 2017). Given the importance of the presidential elections for the further course of the conflict, it is highly probable that such campaigns would occur. Secondly, under conditions of ongoing information warfare, domestic Ukrainian actors increasingly adopt disinformation techniques to target their political opponents (Zhdanova & Orlova, 2017), which further increases polarization in Ukrainian society.

To examine the interplay between human- and bot-produced disinformation and polarization in Ukraine, we address the following research questions: How much online content was produced by bots and trolls, compared with ordinary users, about specific candidates? How did messages produced by bots and trolls differ in format and purpose? What was the impact of the disinformation campaigns, and were trolls or bots more effective?

Item Type:

Conference or Workshop Item (Paper)


03 Faculty of Business, Economics and Social Sciences > Social Sciences > Institute of Communication and Media Studies (ICMB)

UniBE Contributor:

Urman, Aleksandra, Makhortykh, Mykola


000 Computer science, knowledge & systems > 070 News media, journalism & publishing
300 Social sciences, sociology & anthropology
300 Social sciences, sociology & anthropology > 320 Political science




Mykola Makhortykh

Date Deposited:

30 Sep 2019 17:31

Last Modified:

05 Dec 2022 15:31

Uncontrolled Keywords:

bots, trolls, polarization, Ukraine, elections

