Algorithmically Curated Lies: How Search Engines Handle Misinformation about US Biolabs in Ukraine

Kuznetsova, Elizaveta; Makhortykh, Mykola; Sydorova, Maryna; Urman, Aleksandra; Vitulano, Ilaria; Stolze, Martha (24 January 2024). Algorithmically Curated Lies: How Search Engines Handle Misinformation about US Biolabs in Ukraine (arXiv). Cornell University. doi:10.48550/arXiv.2401.13832

2401.13832.pdf - Published Version
Available under License Creative Commons: Attribution (CC-BY).


The growing volume of online content prompts the need for algorithmic systems of information curation. These systems range from web search engines to recommender systems and are integral for helping users stay informed about important societal developments. However, unlike journalistic editing, algorithmic information curation systems (AICSs) are known to be subject to different forms of malperformance, which make them vulnerable to manipulation. The risk of manipulation is particularly prominent when AICSs have to deal with information about false claims that underpin propaganda campaigns of authoritarian regimes. Using the Russian disinformation campaign concerning US biolabs in Ukraine as a case study, we investigate how one of the most commonly used forms of AICSs - i.e. web search engines - curates misinformation-related content. To this end, we conduct virtual agent-based algorithm audits of Google, Bing, and Yandex search outputs in June 2022. Our findings highlight the troubling performance of search engines. Even though some search engines, such as Google, were less likely to return misinformation results, across all languages and locations the three search engines still mentioned or promoted a considerable share of false content (33% on Google, 44% on Bing, and 70% on Yandex). We also find significant disparities in misinformation exposure based on the language of search, with all search engines presenting a higher number of false stories in Russian. Location matters as well, with users from Germany being more likely to be exposed to search results promoting false information. These observations stress the possibility of AICSs being vulnerable to manipulation, in particular in the case of unfolding propaganda campaigns, and underline the importance of monitoring the performance of these systems to prevent such manipulation.
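The abstract refers to virtual agent-based algorithm audits, in which automated agents submit identical queries to search engines and the returned result pages are recorded for later coding. As a rough, hypothetical illustration only (not the authors' actual pipeline), the Python/Selenium sketch below shows how such an agent might collect the top result links from Google, Bing, and Yandex for a biolab-related query; the query string, engine URL templates, and the generic link selector are assumptions for illustration.

```python
# Hypothetical sketch of a virtual agent-based search audit:
# one browser agent issues the same query to several search engines
# and stores the returned result URLs for later misinformation coding.
# Engine URLs and the CSS selector are illustrative assumptions.
from selenium import webdriver
from selenium.webdriver.common.by import By

QUERY = "US biolabs in Ukraine"
ENGINES = {
    "google": "https://www.google.com/search?q={}",
    "bing": "https://www.bing.com/search?q={}",
    "yandex": "https://yandex.com/search/?text={}",
}

def collect_results(max_links: int = 10) -> dict[str, list[str]]:
    """Open each engine's results page and record the top result URLs."""
    driver = webdriver.Firefox()  # a single browser instance acts as the 'agent'
    results: dict[str, list[str]] = {}
    try:
        for engine, template in ENGINES.items():
            driver.get(template.format(QUERY.replace(" ", "+")))
            # Generic selector for outbound links; a real audit would need
            # engine-specific selectors and consent-dialog handling.
            links = driver.find_elements(By.CSS_SELECTOR, "a[href^='http']")
            results[engine] = [a.get_attribute("href") for a in links[:max_links]]
    finally:
        driver.quit()
    return results

if __name__ == "__main__":
    for engine, urls in collect_results().items():
        print(engine, len(urls), "links collected")
```

An actual audit of this kind would additionally vary the agents' language and location (the factors the study finds to matter), randomize timing to avoid personalization carry-over, and feed the collected links into a separate misinformation-coding step.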

Item Type:

Working Paper

Division/Institute:

03 Faculty of Business, Economics and Social Sciences > Social Sciences > Institute of Communication and Media Studies (ICMB)

UniBE Contributor:

Makhortykh, Mykola, Sydorova, Maryna, Urman, Aleksandra

Subjects:

000 Computer science, knowledge & systems
000 Computer science, knowledge & systems > 070 News media, journalism & publishing
300 Social sciences, sociology & anthropology

Series:

arXiv

Publisher:

Cornell University

Language:

English

Submitter:

Mykola Makhortykh

Date Deposited:

12 Feb 2024 10:07

Last Modified:

12 Feb 2024 10:16

Publisher DOI:

10.48550/arXiv.2401.13832

ArXiv ID:

2401.13832

Uncontrolled Keywords:

war, Ukraine, Russia, disinformation, conspiracy theory, biolabs, US, web search, algorithm, audit

BORIS DOI:

10.48350/192757

URI:

https://boris.unibe.ch/id/eprint/192757
