
    Troll Farms: The Mercenaries of Online Disinformation?


    Troll farms now occupy a growing place in online communication and influence strategies. These public or private structures coordinate networks of accounts designed to amplify certain narratives, steer debates, or destabilize opponents. Present in Russia, China, India, Turkey, Israel and Saudi Arabia, they illustrate the rise of a new tool of political and geopolitical action based on information control and perception management.

    “They don’t invent these topics. They spot tensions already present on Facebook or Twitter, then they amplify them.” That’s how Professor Sarah Mendelson, a specialist in public policy, sums up the methods used by troll farms. From coordinated disinformation campaigns to state-run digital armies, troll farms have become one of the most discreet yet formidable tools of modern conflict. Born in the margins of the web about a decade ago, these structures—combining propaganda, manipulation and automated technologies—have become so professionalized that they now form genuine “private armies” of information. From Russia to China, Israel to the Philippines, they shape public debate, infiltrate social networks, and now play a central role in influence warfare between powers.

    A troll farm (or “troll factory”) is an organized structure, public or private, that coordinates individuals—and often bots or automated scripts—to produce, disseminate and amplify content (posts, comments, rumors, videos, memes) on social media in order to influence opinion, destabilize opponents or promote a specific narrative. These operations often exist at the intersection of political communication, propaganda and cyberwarfare. Their methods include the creation of fake accounts, the use of paid commentators, and automated dissemination bots.

    The phenomenon is not new. In Russia, as early as 2011–2012, pro-government commenting networks were active during protests against Vladimir Putin’s return to power. A progressive militarization of online discourse became visible from 2014, during the war in Donbas, when disinformation campaigns began to take shape as a strategic influence tool. The model went global from 2015 onwards, following revelations about the Internet Research Agency (IRA), a Russian troll farm.

    Russia is not alone, however. In the West, industrial-scale manipulation of online opinion entered the public eye with the Cambridge Analytica affair. The scandal erupted in 2018 when investigations revealed that the company had exploited, without consent, the personal data of some 87 million Facebook users for political purposes. That data was used to design microtargeted advertising aimed at influencing voters during campaigns such as Brexit and Donald Trump’s 2016 election.

    China, for its part, institutionalized a pro-government commenting army as early as the 2010s—the so-called “50 Cent Army” (wumao)—reportedly mobilizing hundreds of thousands of people to relay Communist Party propaganda on social networks. Over time, this ecosystem expanded: private companies, specialized communication agencies and clandestine entities now offer online influence services across the globe, blurring the line between private contractors and state-run disinformation networks. The phenomenon has now gone fully global.

    Several state or semi-private entities have been identified as operating troll farms or sophisticated influence campaigns. One of the best known is the Internet Research Agency (IRA). Founded in Saint Petersburg and linked to Kremlin insiders, it has long been at the center of investigations into digital interference. Its employees, working under strict posting quotas, created fake Twitter, Facebook and Instagram accounts and flooded the Anglophone and international spheres with pro-Russian content.

    For instance, a former employee, Lyudmila Savchuk, reported having to produce a set number of political posts and comments per day, according to precise quotas.

    Another notable player is the Social Design Agency (SDA), a newer and particularly aggressive structure. Leaked internal documents in 2024 revealed how this Moscow-based agency, linked to the Kremlin, coordinates influence campaigns across Europe, producing memes, videos, and comments with country-specific quotas. It is sometimes described as a modernized version of the troll factory—more flexible, more connected, and capable of hybrid propaganda operations.

    These leaks showed how the agency had planned to influence European elections by supporting far-right parties, such as France’s National Rally, by planting anti-Ukraine narratives online and coordinating disinformation campaigns that mimic local media.

    Russia is far from alone in exploiting troll farms.

    In Saudi Arabia, the so-called “Saudi Troll Army” designates a vast network of coordinated accounts dedicated to defending the monarchy and silencing dissent. This system, linked to Crown Prince Mohammed bin Salman’s entourage, has led massive harassment campaigns against journalists, human rights activists and political opponents—particularly after the assassination of journalist Jamal Khashoggi.

    In Israel, “Team Jorge,” a private Israeli company, was exposed in 2023 following a journalistic investigation. The firm offers electoral manipulation services, including the creation and control of fake social media profiles, hacking operations and the dissemination of disinformation content.

    Other powers also exploit this lever—China, India, and Turkey among them. China’s “50 Cent Army” mobilizes its paid commentators to divert attention from sensitive issues—such as Xinjiang, Hong Kong or Taiwan—and to promote a positive image of the regime. Integrated into public institutions, these networks are also used abroad to influence Chinese diaspora communities and counter criticism of the government.

    In Turkey, the “AK Trolls,” linked to President Recep Tayyip Erdoğan’s Justice and Development Party (AKP), form a militant digital army overseen by government communication services. These coordinated accounts target opponents, journalists, and critical academics while amplifying official propaganda—particularly regarding military operations in Syria and domestic politics.

    In short, the goal of these troll farms is to fragment or shape public opinion within a targeted audience.

    Troll farms today operate as true private armies of disinformation. States use them to wage outsourced information wars while maintaining plausible deniability. This setup allows them to distance their official image from manipulation campaigns carried out in their name: if a campaign is exposed, they can deny direct responsibility.

    China, for example, mobilizes its “50 Cent Army” to promote its positions on human rights, Taiwan and international affairs, while Iran, North Korea and several Gulf monarchies also orchestrate foreign disinformation operations. In Saudi Arabia, these digital networks serve to silence dissenting voices.

    Having become a central instrument of contemporary conflicts, troll farms allow their sponsors to control narratives, fragment public opinion and impose their version of events. Once activated, these online armies are nearly impossible to neutralize or dismantle. Though seemingly private, these “mercenaries 2.0” remain closely tied to state interests, following political agendas and serving alternately as tools of domestic repression and instruments of external influence.

