Abstract: Following the annexation of Crimea a decade ago, Russian warfare tactics have evolved to adapt to technological advancements and make up for shortfalls in conventional military domains; among them is the use of Russian television and Telegram channels to influence the Ukrainian public. As Maschmeyer et al. argue in “Donetsk Don’t Tell – ‘Hybrid War’ in Ukraine and the Limits of Social Media Influence Operations”, television outperforms social media in supporting Russia’s influence operations in Ukraine. However, given the stringent broadcasting restrictions introduced after the study was conducted, as well as its theoretical shortcomings, such a statement is not criticism-proof.
Problem statement: With the technological advances in the war in Ukraine, how can one objectively assess the role of television and social media?
So what?: Considering the theoretical and methodological pitfalls, it is crucial to rerun the analysis, whose findings could substantially influence the regulation of traditionally disregarded media.

Source: shutterstock.com/Christelle Neant
The Rise of Influence Operations
Following the annexation of Crimea a decade ago, Russian warfare tactics have evolved to adapt to technological advancements and make up for shortfalls in conventional military domains.[1] Spanning from propaganda disseminated through traditional media to AI and deepfakes,[2] hybrid techniques have been carried out to garner Ukrainian and global support and motivate the use of force in the region. Namely, influence operations (IO) are a key component of hybrid warfare and are defined as “the integrated employment, during military operations, of information-related capabilities in concert with other lines of operation to influence, disrupt, corrupt, or usurp the decision-making of adversaries”.[3] Their aim is to shape perceptions through information channels, either to favour the actor leading the operation, to disserve the opposing side, or both.[4]
However, the effectiveness of these non-military means of warfare is not yet a given. In “Donetsk don’t tell – ‘hybrid war’ in Ukraine and the limits of social media influence operations”, Lennart Maschmeyer, Alexei Abrahams, Peter Pomerantsev, and Volodymyr Yermolenko attempt to prove the advantage of television in hybrid warfare.
The research was carried out by testing five hypotheses:[5]
H1: The more people agree with narratives, the more likely their foreign policy preferences are to align with the sponsor’s interests.
H2: Audiences of partisan television channels are more likely to be exposed to narratives than audiences of partisan social media channels.
H3: Audiences of partisan television channels are more likely to agree with narratives than audiences of partisan social media channels.
H4: State sponsorship/content origin of influence operations via social media is less likely to be attributable than via television.
H5: Partisan television channels are likely to reach a larger audience than partisan social media channels.
Through the case analysis of anti-Western information operations in Ukraine and the statistical testing of the five hypotheses, the authors conclude that television has a relative advantage in disseminating propaganda. Namely, television’s persuasiveness at a national level effectively diminishes the impact of new technology and social media in influence operations.
Hybrid War
In a scholarly landscape where research on the effects of political disinformation is scarce, the article by Maschmeyer et al.[6] delves into experimental evidence to provide more insight into the means of influence operations in current conflicts. Namely, the analysis primarily compares the resonance and persuasiveness of social media, taking Telegram[7] as a case study, and television broadcasts, limiting the geographical scope to Ukraine. Considering the plethora of platforms, the authors aptly selected Telegram, comparable to partisan television in that it is scarcely regulated or fact-checked.
Methodology-wise, the researchers employed content analysis and quantitative analysis mixed with national survey findings to examine the linkages between agreement to narratives, foreign policy preferences, and media consumption (divided into five hypotheses).[8]
As for the latter, anonymised computer-assisted telephone interviews were administered in February 2020 to 903 respondents across Ukraine, aged 18-60+. The survey included questions concerning individual political values and perceptions, views on global alliances, and most importantly, the respondents’ consumption and belief in news sources and anti-Western disinformation narratives.
The authors picked 15 anti-Western tropes propagated both by biased Telegram accounts and television channels. The narratives encompassed statements on Ukrainian dependence on Western economic institutions, and their alleged toxicity for Ukrainians–e.g. “the EU uses Ukrainians for low-paid labour” and “Ukraine and Russia are equally responsible for the war in Donbas”.[9]
In this case, the purpose of anti-Western operations is evident, as anti-Westernism is the main cleavage in Ukrainian politics.[10] Putin has widely used the contestation of Western institutions and standards to justify and redefine transgressions of State sovereignty in the Middle East and Eurasia.[11] The case of Ukraine, more specifically, offers fertile ground for research on the link between anti-Western sentiments and agreement with Russian foreign policy. It has been argued, in fact, that the West played a crucial role in the 2014 political crisis and, alongside NATO and EU expansion, has been a root cause of the tension and subsequent escalation.[12]
By applying regression methods to the survey findings and analysing them with both OLS and Logit, Maschmeyer’s study is backed by positive and statistically significant correlations that hold regardless of the chosen estimator. However, other methodological choices are not as criticism-proof.
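The mechanics of this robustness check can be illustrated with a minimal sketch on synthetic data (all sample sizes and probabilities below are invented, not the survey’s figures): for a single binary exposure variable, the OLS slope of a linear probability model and the Logit coefficient both have closed forms, and a genuine relationship should show up with the same sign under either estimator.

```python
import random
import math

random.seed(42)

# Synthetic stand-in for the survey: binary exposure to partisan media
# and a binary pro-Russian preference. A positive effect is built in,
# so both estimators should agree in sign; all numbers are invented.
n = 2000
rows = []
for _ in range(n):
    exposed = random.random() < 0.4
    p = 0.55 if exposed else 0.25          # exposure raises agreement
    rows.append((1.0 if exposed else 0.0,
                 1.0 if random.random() < p else 0.0))

# OLS slope (linear probability model), closed form for one regressor
mx = sum(x for x, _ in rows) / n
my = sum(y for _, y in rows) / n
beta_ols = (sum((x - mx) * (y - my) for x, y in rows)
            / sum((x - mx) ** 2 for x, _ in rows))

# Logit coefficient: with one binary regressor the MLE is exactly the
# log odds ratio between exposed and unexposed groups
p1 = sum(y for x, y in rows if x == 1.0) / sum(1 for x, _ in rows if x == 1.0)
p0 = sum(y for x, y in rows if x == 0.0) / sum(1 for x, _ in rows if x == 0.0)
beta_logit = math.log(p1 / (1 - p1)) - math.log(p0 / (1 - p0))

print(f"OLS slope: {beta_ols:.3f}  Logit coefficient: {beta_logit:.3f}")
```

For a binary regressor the OLS slope is simply the difference in agreement rates between the two groups, which is why the two estimators, despite different scales, cannot disagree on direction; this is the sense in which significance “regardless of the chosen statistical process” is unsurprising.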
To better understand the gaps in this work, the analytical process and findings for each hypothesis will now be investigated under separate subheadings. An exception will be made for Hypothesis 4 (state sponsorship via social media is less likely to be attributable than via television), as it is irrefutable: most platforms are now more than ever populated by millions of bots and sock puppets hidden behind a veil of anonymity, an intrinsic characteristic that makes it difficult to attribute legal responsibility for IO.[13] For instance, the 300-platform-wide network “Secondary Infektion” has been traced back to Russian territory, though State responsibility has not been determined.[14] Moreover, H2 and H3 will be discussed together to avoid repetition of the main counterargument.
H1: The more people agree with narratives, the more likely their foreign policy preferences are to align with the sponsor’s interests.
Maschmeyer’s belief that influence operations sway political preferences seems less straightforward than illustrated. In fact, analyses of the Ukrainian case and of Serbian state-controlled radio in Croatia underline that exposure to propaganda leads to greater polarisation and radicalisation rather than belief change.[15] More precisely, Russian television has been found by Peisakhin and Rozenas[16] to persuade voters who held pre-existing pro-Russian views, but it was ineffective in changing the political opinion of pro-Western individuals.
Therefore, it is crucial to consider how prior views could influence the correlation between narrative agreement and foreign policy. Undeniably, pro-Russian sentiment has changed following the invasion of Ukraine. However, the data was collected and analysed at the beginning of January 2021, when baseline agreement with Russian foreign policy was still far from uniform, especially in eastern oblasts.[17] According to data gathered by Goodwin and Jackson,[18] Luhansk and Kherson’s populations displayed roughly 20% and 10% support for unification with Russia, respectively. Overall, positive attitudes towards the neighbouring country were shared by 53% of respondents in the east and 45% in the south of Ukraine. With the invasion of 2022, those figures plummeted to 4% and 1%, respectively, while support for Ukrainian alignment with NATO and Western institutions doubled in the same oblasts.[19]
When analysing Maschmeyer’s[20] online appendix, the tables–despite being divided by oblast–show a single regression run for Ukraine as a whole. To better understand regional differences and account for pro-Russian sentiment before and after the war, it would be beneficial for the field to run separate regressions for individual regions and verify these relationships.
In sum, while it is logically irrefutable that alignment with Western institutions is linked to the absence of anti-Westernism, key elements are missing from the analysis, such as consideration of pre-existing bias and, more specifically, a variable capturing the disinformation source.
Hence, the analysis of the first hypothesis displays a strong correlation but no causation. Since causation ought not to be presumed from correlation alone,[21] the research would benefit from deeper attention to confounding variables such as predisposition to Russian narratives, personal bias, and geographical location.
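The confounding concern can be made concrete with a hedged toy simulation (the regions, probabilities, and effect sizes below are invented for illustration, not estimated from the study): when region drives both exposure to partisan media and agreement with its narratives, a pooled regression reports a strong positive slope even though exposure has no effect at all within either region.

```python
import random

random.seed(7)

# Hypothetical confounding sketch with invented numbers: region drives
# BOTH exposure and agreement, while exposure itself has no effect.
rows = []
for _ in range(4000):
    east = random.random() < 0.5
    exposed = random.random() < (0.7 if east else 0.2)
    agree = random.random() < (0.6 if east else 0.15)  # independent of exposure
    rows.append((east, 1.0 if exposed else 0.0, 1.0 if agree else 0.0))

def slope(data):
    """Closed-form OLS slope for a single regressor."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    return (sum((x - mx) * (y - my) for x, y in data)
            / sum((x - mx) ** 2 for x, _ in data))

pooled = slope([(x, y) for _, x, y in rows])           # ignores region
within = [slope([(x, y) for e, x, y in rows if e == flag])
          for flag in (True, False)]                   # controls for region

print(f"pooled slope: {pooled:.3f}")
print(f"within-region slopes: {within[0]:.3f}, {within[1]:.3f}")
```

The pooled slope is strongly positive while both within-region slopes hover around zero, which is precisely why a single nationwide regression cannot distinguish persuasion from regional predisposition.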
Lastly, even if causation were proven, as the authors suggest in the conclusions, its effect would be limited, persuading only 10% of users to align with pro-Russian foreign policy perspectives.[22]
H2: Audiences of partisan television channels are more likely to be exposed to narratives than audiences of partisan social media channels.
H3: Audiences of partisan television channels are more likely to agree with narratives than audiences of partisan social media channels.
Literature on hybrid warfare and media studies underlines that television is more persuasive than social media. The authors attribute this to its repetitive and centralised nature, as well as its credibility: traditional media are preferred as a main source of information because they are perceived as more authoritative and truthful.[23]
The main criticism of this section is advanced by the authors themselves, who state:
“This finding keeps alive the possibility that exposure causes agreement, but could just as well imply that Ukrainians predisposed to agree with Russian narratives actively seek out pro-Russian media, where they are subsequently exposed to the narratives (agreement causes exposure).”[24]
With this, confirmation bias comes into play: the media user acquires content that does not contradict their preferences,[25] thus directly seeking out disinformation. This makes it difficult to determine the actual pervasiveness of influence operations.
Such arguments reinforce the idea that the study would benefit from oblast-specific regressions and from attention to pre-existing positive attitudes towards Russia (at both the individual and provincial levels), which were accounted for in the survey structure[26] but not included in the regression.[27]
Moreover, the study considers Ukrainian TV channels, ignoring the previous presence of Russian television in eastern Ukraine, whose repetitive broadcasts contributed to strengthening support for pro-Russian candidates.[28] While the research data does not account for such influence, it would be interesting to discover whether prolonged exposure to those narratives in adulthood–considering that the majority of respondents were at least 20 in 2014–shaped their perceptions to the point of constituting substantial prior polarisation.
Considering all these caveats would lead to a more specific identification of causal relations that go beyond mere correlation.
H5: Partisan television channels are likely to reach a larger audience than partisan social media channels.
While the perks of television expressed in the above analysis are undeniable and confirmed by other scholars,[29] the participants’ age distribution and media consumption habits raise suspicions. Precisely, participants were described as somewhat evenly distributed across the following categories:
- 18-29 (18.1% of respondents);
- 30-44 (28.5%);
- 45-59 (25.5%); and
- 60+ (28%).[30]
However, the youngest bracket is absent from all the regression tables reported in the online appendix.

Source: Maschmeyer et al., 2023 – online appendix, 33.
In the argumentation, the exclusion of younger brackets matters substantially for a statistical understanding of Telegram consumption habits and influence. Namely, according to Statista,[31] most Telegram users in Ukraine at the time of the survey were aged 18-29, while individuals over 60 made up a mere 2%.

Source: Share of social media users in Ukraine in January and February 2021, by age group; Statista 2022.
While alternative regressions, such as by age range, cannot be run due to a lack of raw data, it can be presumed that the outcome of the H5 test would differ: the relevance and resonance of social media may be downplayed in the regression and investigation. The absence of data from the younger portion of respondents would substantially influence the overall findings since, as external data shows, this is the cohort with consistently higher social media usage. In sum, properly including younger social media users would provide a more comprehensive (or different) account of media audiences.
Three Pitfalls
Having analysed the paper, dissecting the methodology behind each hypothesis, three main pitfalls can be identified.
Firstly, the paper fails to consider prior partisanship that could skew the results on agreement with propaganda. Individuals in eastern oblasts such as Luhansk and Donetsk were likely to agree with Russian foreign policy before exposure at the time the survey was administered. The data could therefore reflect this imbalance, as shown in the H1 regression by Luhansk’s coefficient (0.6166) compared to Kyivska’s (0.3490). Geographical location, together with previous exposure to Russian television channels in and after 2014, has been found by scholars to generate prior bias that exposure to like-minded narratives tended to radicalise.[32]
Secondly, another factor that has not been appropriately addressed in the statistical research is the 18-29 age bracket of respondents, deemed the most active on social media, especially Telegram. For this reason, appropriately considering the full age range to which the survey was administered could lead to more well-rounded findings.
In the survey, individuals aged 18-29 accounted for nearly 20% of respondents, making their impact on the regressions substantial, especially considering their stark preference for Telegram compared to the remaining three age brackets. Therefore, re-running the calculations with younger respondents’ data included could shift the conclusions, leading to a more objective account of the propagandistic potential of traditional media. Precisely, the limited selection of Telegram channels, the scarce content analysis of their messages, and the exclusion of social media-oriented generations risk skewing the results for H2 and H5. Including younger groups would not necessarily disprove the effectiveness of television tout court, but would likely reveal age-dependent preferences, with television holding a relative advantage in influencing less tech-savvy brackets and Telegram shaping opinions among those aged 18-29.
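As a hedged back-of-the-envelope illustration of why the bracket matters, a simple post-stratification sketch can be run on the survey’s published age shares. The per-bracket Telegram usage rates below are assumptions made for this sketch (only the facts that 18-29-year-olds dominate Telegram use and that over-60s account for roughly 2% are sourced); dropping the youngest bracket mechanically deflates any estimate of Telegram’s reach.

```python
# Age shares come from the paper's appendix; per-bracket Telegram usage
# rates are ASSUMED for this sketch, not survey or Statista values.
survey_share  = {"18-29": 0.181, "30-44": 0.285, "45-59": 0.255, "60+": 0.28}
telegram_rate = {"18-29": 0.60, "30-44": 0.35, "45-59": 0.20, "60+": 0.02}

# Reach estimate using the survey's full age composition
full_sample = sum(survey_share[a] * telegram_rate[a] for a in survey_share)

# Same estimate after dropping the 18-29 bracket (as the appendix
# regressions implicitly do) and renormalising the remaining shares
older = {a: s for a, s in survey_share.items() if a != "18-29"}
total = sum(older.values())
no_young = sum((s / total) * telegram_rate[a] for a, s in older.items())

print(f"estimated Telegram reach with 18-29: {full_sample:.3f}")
print(f"estimated Telegram reach without 18-29: {no_young:.3f}")
```

Even with these invented usage rates, the direction of the bias is structural: excluding the heaviest-using cohort can only understate social media’s audience relative to television’s.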
Finally, as the authors themselves acknowledge throughout the work, the main pitfall is that the study is centred on correlation between variables rather than causation. It would thus be interesting to analyse the data further to potentially uncover causal linkages, which would strengthen the study’s arguments by unveiling the underlying mechanisms.
For instance, a strong correlation was found for H1. Establishing causation, however, would imply that exposure to partisan TV itself drives agreement with the narratives, rather than variables such as regional bias. This would translate into more easily enforceable solutions, as it is more feasible to interrupt the influence of television on individuals than to counteract personal bias or geographical location.
Overall, a confirmation that television singlehandedly shapes individual political choices would allow governments targeted by influence operations, such as Ukraine’s, to ensure factual and uninfluenced information flows. Practically, such studies hint at where to direct fact-checking efforts and the flagging of false content (e.g. focusing on TV rather than social media censorship), or serve as a legal basis to justify the limitation and sanctioning of partisan or adversary TV channels. In the field of international politics and policy, Maschmeyer’s findings could substantially influence the regulation of traditionally disregarded media. Subsequently, countries and institutions could silence malicious broadcasting services and economically sanction their owners, as occurred with Taras Kozak, a former pro-Russia Ukrainian deputy sanctioned for undermining Ukrainian sovereignty. Kozak controlled three news channels (NewsOne, 112 Ukraine and ZIK), which were shut down as part of the sanctions imposed on him on February 02, 2021.[33]
Admittedly, such research cannot be replicated in the current or future scenario: first and foremost, public opinion has radically shifted in favour of the West and, more practically, the propaganda broadcasts taken into account were blocked in 2021 by the National Security and Defence Council[34] and their owner sanctioned.[35] Therefore, this research may not be useful for understanding the current landscape of Russia and Ukraine, but could become a blueprint for new studies and policies.
Even though they no longer apply to the case of Ukraine, these findings can be extended to other cases of cross-national tension and external interference: Peisakhin and Rozenas,[36] for instance, reference the case of Shi’a television posing a threat to Sunni governments.
However, the discovery of a neutral (and, through cross-cutting exposure, sometimes beneficial) effect of social media could lead to a destigmatisation of new media and, potentially, to the use of their algorithms to spread information more widely.
As modern war-making techniques revolve around influence operations, further research can teach us how to use media as offensive weapons and defensive techniques to combat narrative warfare.
Matilde Bufano is a Master’s student in the joint MSc in International Security Studies at the University of Trento and Scuola Superiore Sant’Anna. Her research interests include information warfare, OSINT, digital public diplomacy, and strategic communications in conflict and international politics. The views contained in this article are the author’s alone and do not represent the views of the University of Trento and Scuola Superiore Sant’Anna.
[1] Bettina Renz, “Russia and ‘hybrid warfare’,” Contemporary Politics, 22:3 (2016) 283–300, DOI: 10.1080/13569775.2016.1201316.
[2] Deepfakes can be defined as ‘AI-generated hyper-realistic pictures and manipulated videos’, according to Lundberg and Mozelius (in The potential effects of deepfakes on news media and entertainment, AI & Soc (2024), https://doi.org/10.1007/s00146-024-02072-1).
[3] “Influence Operations,” Committee on National Security Services (2022) in Glossary, CNSSI 4009-2015, accessed December 05, 2024.
[4] Arild Bergh, “Understanding Influence Operations in Social Media: A Cyber Kill Chain Approach,” Journal of Information Warfare 19, no. 4 (2020): 110–31, https://www.jstor.org/stable/27033648.
[5] Hypothesis will be shortened to H from here on.
[6] Lennart Maschmeyer, Alexei Abrahams, Peter Pomerantsev, and Volodymyr Yermolenko, “Donetsk Don’t Tell – ‘Hybrid War’ in Ukraine and the Limits of Social Media Influence Operations,” Journal of Information Technology & Politics 22, no. 1 (2023): 49–64, doi:10.1080/19331681.2023.2211969.
[7] Telegram is a cloud-based messaging app, deemed as more secure due to various levels of encryption. The platform, owned by Russian entrepreneurs Pavel and Nikolai Durov, has been chosen by the authors since it is a relevant and anonymous source of disinformation (Britannica, n.d; DFRLab, 2020; Osadchuk, 2020 in Maschmeyer, 2023).
[8] Idem.
[9] Maschmeyer et al., “Donetsk Don’t Tell”, 7.
[10] Timothy Frye, “What Do Voters in Ukraine Want?: A Survey Experiment on Candidate Ethnicity, Language, and Policy Orientation,” Problems of Post-Communism 62, no. 5 (2015): 247–257.
[11] Moritz Pieper, “‘Rising Power’ Status and the Evolution of International Order: Conceptualising Russia’s Syria Policies,” Europe-Asia Studies 71 no. 3 (2019): 365–87, doi:10.1080/09668136.2019.1575950.
[12] John J. Mearsheimer, “Why the Ukraine Crisis Is the West’s Fault: The Liberal Delusions That Provoked Putin,” Foreign Affairs 93, no. 5 (2014): 77–89, http://www.jstor.org/stable/24483306.
[13] Carly Nyst and Nicholas Monaco, State-Sponsored Trolling: How Governments Are Deploying Disinformation as Part of Broader Digital Harassment Campaigns, (Palo Alto, CA: Institute for the Future, 2018).
[14] Facebook, Threat Report The State of Influence Operations 2017-2020, (2021), https://about.fb.com/wp-content/uploads/2021/05/IO-Threat-Report-May-20-2021.pdf.
[15] Stefano DellaVigna, Ruben Enikolopov, Vera Mironova, Maria Petrova, and Ekaterina Zhuravskaya, “Cross-Border Media and Nationalism: Evidence from Serbian Radio in Croatia,” American Economic Journal: Applied Economics, 6 no. 3 (2014): 103–32, DOI: 10.1257/app.6.3.103.
[16] Leonid Peisakhin and Arturas Rozenas, “Electoral Effects of Biased Media: Russian Television in Ukraine,” American Journal of Political Science 62, no. 3 (2018): 535–50, http://www.jstor.org/stable/26598765.
[17] Ukrainian administrative divisions, also referred to as regions in English.
[18] Carissa Goodwin and Dean Jackson, “Global Perspectives on Influence Operations Investigations: Shared Challenges, Unequal Resources,” Carnegie Endowment for International Peace, accessed February 02, 2025, https://carnegieendowment.org/research/2022/02/global-perspectives-on-influence-operations-investigations-shared-challenges-unequal-resources?lang=en.
[19] Idem.
[20] Maschmeyer et al., “Donetsk Don’t Tell”, online appendix.
[21] Julia M. Rohrer, “Thinking Clearly About Correlations and Causation: Graphical Causal Models for Observational Data,” Advances in Methods and Practices in Psychological Science 1, no. 1 (2018): 27–42, doi:10.1177/2515245917745629.
[22] Maschmeyer et al., “Donetsk Don’t Tell.”
[23] Thomas J. Johnson and Barbara K. Kaye, “Reasons to believe: Influence of credibility on motivations for using social networks,” Computers in Human Behavior, 50, (2015): 544–55.
[24] Maschmeyer et al., “Donetsk Don’t Tell”, 8.
[25] Ivan V. Kozitsin and Alexander. G. Chkhartishvili, “Users’ Activity in Online Social Networks and the Formation of Echo Chambers,” 2020 13th International Conference “Management of large-scale system development” (MLSD)(2020) 1-5, doi: 10.1109/MLSD49919.2020.9247720.
[26] Maschmeyer et al., “Donetsk Don’t Tell”, online appendix, section 1.
[27] Maschmeyer et al., “Donetsk Don’t Tell”, table 7.
[28] Olena Svevchenko, “Influence of the Russian TV product on the Ukrainian audience,” East of Europe 4, no. 1 (2018), DOI:10.17951/we.2018.4.1.137.
[29] Thomas J. Johnson and Barbara K. Kaye, “Reasons to believe: Influence of credibility on motivations for using social networks,” Computers in Human Behavior, 50, (2015): 544–55;
Roy L. Behr and Shanto Iyengar, “Television News, Real-World Cues, and Changes in the Public Agenda,” Public Opinion Quarterly, Volume 49,no 1, (1985), 38–57, https://doi.org/10.1086/268900;
Stergios Fotopoulos, “Traditional media versus new media: Between trust and use,” European View 22, no. 2 (2023): 277–286, https://doi.org/10.1177/17816858231204738.
[30] Maschmeyer et al., “Donetsk Don’t Tell,” online appendix, section 1.
[31] “Share of social media users in Ukraine in January and February 2021, by age group,” Statista, 2022, accessed February 02, 2025, https://www.statista.com/statistics/1256255/most-popular-social-media-by-age-ukraine/.
[32] Leonid Peisakhin and Arturas Rozenas, “Electoral Effects of Biased Media: Russian Television in Ukraine,” American Journal of Political Science 62, no. 3 (2018): 535–50, http://www.jstor.org/stable/26598765.
[33] “Ukraine Sanctions ‘Russian Trojan Horse’ Media Assets Associated With Putin Friend,” Radio Free Europe (February 03, 2021), accessed February 24, 2025, https://www.rferl.org/a/ukraine-russia-television-stations-sanctions-putin/31083423.html.
[34] “Ukraine bans pro-Russian TV stations,” DW, (March 02, 2021), accessed February 02, 2025.
[35] “Hacked Pluralism: How Ukrainian TV Channels Were Used to Spread Russian Propaganda Messages,” Ukraine World (February 22, 2021), accessed February 02, 2025, https://ukraineworld.org/en/articles/infowatch/ukrainian-tv-channels-were-used-spread-propaganda-messages.
[36] Peisakhin and Rozenas, “Electoral Effects of Biased Media.”