Funders keep missing disinformation’s real battleground

By Francis Allan L. Angelo
A new peer-reviewed paper published this month in Information, Communication & Society puts a name to something many of us in Global South newsrooms have felt for years but could not always articulate cleanly: the counter-disinformation industry, for all its billions, has been asking the wrong questions and funding the wrong answers. (Ong & Jackson, 2026)
Jonathan Corpus Ong of the University of Massachusetts Amherst and Dean Jackson of the University of Pittsburgh spent two years running workshops and interviews with over 100 civil society leaders, tech policy experts, and researchers across the Philippines, Brazil, India, Indonesia, Kenya, South Africa, and beyond. Their conclusion is blunt. The mainstream counter-disinformation field — dominated by Global North donors, tech companies, and elite universities — has been disconnected from the priorities of the communities it claims to help and has worsened the power imbalance between funders and the people doing the actual work on the ground.
This matters right now because USAID, which bankrolled a significant chunk of global information integrity work and health programs, has been gutted. A Lancet study published in February 2026 projected that global aid cuts could lead to at least 9.4 million additional deaths by 2030. And it matters because the lessons Ong and Jackson draw do not just explain what went wrong with disinformation funding — they explain what is going wrong with the entire architecture of international tech-for-good programming, including the rush into artificial intelligence governance.
The scalability trap
The pattern, by now, is familiar. A donor — whether a Silicon Valley foundation, a European government agency, or a US federal grant program — puts out a call for proposals. The proposals that win are invariably the ones promising scale: a dashboard that monitors millions of social media posts, a media literacy curriculum that can be deployed in 15 countries, a fact-checking database with a slick interface. These are the interventions that look good in a slide deck and generate tidy metrics for a donor’s annual report.
What does not win? Community dialogues. Trust-building exercises in rural barangays or favelas. Narrative change campaigns that take two years to show results instead of two months. Grassroots organizing among indigenous communities or fisherfolk who are most vulnerable to political manipulation but least likely to show up on a Facebook monitoring tool.
One Philippine participant in the Ong-Jackson study said it plainly: “Funders are obsessed with tools that are scalable. It’s not sexy to do community dialogues.” A Brazilian researcher added a darker observation — that after election cycles end, organizations are forced to shut down projects and lay off staff, even when outcomes were considered successful. Funding is pegged to election calendars, not to the slow, unglamorous work of rebuilding civic trust. (Ong & Jackson, 2026, p. 10)
The numbers back this up. Between 2017 and 2021, private philanthropy alone funneled more than USD 1 billion into so-called information ecosystem work for aid-recipient countries (Ordóñez, 2024). Under the Biden administration, at least USD 267 million in federal grants went to disinformation research (Bernstein, 2024). Yet when Ong and Jackson asked their participants what they actually needed, the answer was not more dashboards. It was more presence — more in-person, on-the-ground engagement with the communities where disinformation does the most damage.
There is something perverse about a system that builds monitoring tools for election-season disinformation in Manila or Nairobi, then pulls the funding once the votes are counted. The disinformation doesn’t stop after elections. The grievances that fuel it don’t stop. The troll farms don’t close shop. Only the money disappears.
The proxy war problem
If the scalability trap is the first failure, the geopolitical hijacking of counter-disinformation is the second — and arguably the more dangerous one.
Ong and Jackson document how Global North government funding has systematically steered researchers in developing countries away from local, homegrown concerns and toward geopolitical narratives about “foreign influence operations” and “malign authoritarian influence” — code for Chinese and Russian propaganda. One Brazilian participant in the study was characteristically direct, calling such work “a war that doesn’t deal with our problems … If I were making a list of priorities, this would probably be 73rd on my list.” (Ong & Jackson, 2026, p. 12)
That frustration makes sense when you consider the actual information landscape in most Global South countries. The biggest disinformation threats in the Philippines, for instance, are not Russian bots. They are homegrown troll networks servicing local politicians, commercial clickfarm operations, and patronage-driven media manipulation. In Brazil, the most damaging disinformation around the 2022 elections came from domestic extremist networks, not foreign intelligence services. In India, the ruling party’s own digital ecosystem dwarfs anything Moscow or Beijing has deployed in the country.
But when Washington and Brussels set the funding agenda, the money follows their threat perceptions, not ours. Scarce research capacity gets diverted to monitoring Chinese state media influence in Southeast Asia when there are more urgent questions about, say, how local political dynasties weaponize Facebook groups during barangay elections or how agricultural disinformation depresses commodity prices.
And then there is the hypocrisy problem. In June 2024, Reuters revealed that the Pentagon ran a covert anti-vaccination campaign in the Philippines during the height of the COVID-19 pandemic. Using at least 300 fake social media accounts impersonating Filipinos, the US military sought to discredit China’s Sinovac vaccine — at a time when Sinovac was the only vaccine available to most Filipinos and the country was recording one of the worst death rates in Southeast Asia. By mid-2021, only 2.1 million out of 114 million Filipinos were fully vaccinated. Former Philippine health secretary Esperanza Cabral stated that the campaign likely contributed to unnecessary deaths.
The Ong-Jackson paper notes the near-total absence of public indignation and follow-up investigation into this episode, both in the Global North and in the Philippines. The chilling effect is real. It is difficult for researchers and officials in US-allied countries to call out American information manipulation when they have been, as the paper puts it, “historically positioned as being ‘obliged to be grateful’ for their financial aid.” (Ong & Jackson, 2026, p. 12)
You cannot credibly fund “information integrity” programs in the Philippines while your own military ran fake accounts there to suppress vaccine uptake. And yet, no one with funding power seems to have reckoned with that contradiction.
It was never just a tech problem
The deepest insight in the Ong-Jackson paper is not about funding structures or geopolitics. It is about what disinformation actually is. The authors argue that Global South practitioners see disinformation not primarily as a content problem or a technology problem, but as a broader economic and social problem tied to inequality, extraction, and precarious labor. (Ong & Jackson, 2026, p. 13)
This reframing matters enormously. If disinformation is fundamentally about poverty, grievance, exclusion, and platform business models that monetize outrage, then no amount of fact-checking will solve it. Fact-checking is fine as a practice — every newsroom should do it. But treating it as the primary intervention against disinformation is like treating a fever with cold compresses while ignoring the infection.
The paper mentions digital boycotts — movements like Sleeping Giants Brazil that target the advertising revenue streams sustaining disinformation sites — as a model that has emerged on at least three continents. These boycotts work because they attack the commercial incentives behind disinformation rather than playing whack-a-mole with individual false claims. They address the business model, not the symptom.
And consider the labor dimension. Global South countries supply the low-wage, precarious workers who staff both commercial disinformation operations and the content moderation systems meant to combat them. Indonesian clickfarm workers produce political propaganda for overseas clients (Lindquist & Weltevrede, 2024). Filipino content moderators, paid a fraction of their Silicon Valley counterparts, review the most traumatizing material the internet produces. The same countries that bear the brunt of disinformation’s harms also supply the cheap labor that sustains the information economy. That’s not a content problem. That’s a structural one.
What comes next
Ong and Jackson don’t pretend to have all the answers, and their paper is honest about the constraints of their own project, which was funded by the Carnegie Corporation, Luminate Group, and the Open Society Foundations — the very kind of Global North philanthropies they critique. But they offer a framework that is more useful than most.
First, fund longer. Election-cycle funding creates boom-and-bust dynamics that destroy organizational capacity. If a counter-disinformation group has to lay off trained staff after every “successful” election, you are not building resilience. You are renting it.
Second, fund deeper. Community-driven approaches — grassroots dialogues, narrative change campaigns, local coalition-building — are harder to measure and slower to show results. They are also the only interventions that address the trust deficit at the heart of disinformation vulnerability. Funders need to adjust their timescales, their theories of change, and their appetite for risk.
Third, fund sideways. South-to-South collaboration — Brazilian activists sharing lessons with Filipino journalists, Kenyan researchers comparing notes with Indonesian civil society — produces insights that top-down, Global North-to-Global South programming never will. The paper’s own workshops, where Brazilian participants debated the wisdom of their partnership with the electoral court against skeptical colleagues from Thailand and India, are a model of what this looks like in practice.
Fourth, stop exporting one-size-fits-all regulation. The contrast between Brazil’s successful partnership with its Tribunal Superior Eleitoral and Southeast Asian participants’ fears of abusive anti-fake news laws is a powerful reminder that context determines whether government regulation helps or hurts. In countries where the state itself is the biggest source of disinformation, handing it more power to police online speech is not reform. It is surrender.
The paper closes with a warning that applies far beyond disinformation. As funding streams reconstitute after USAID’s destruction and as the world rushes into AI governance, the same mistakes are being set up: the Global South used as a testing ground, local voices sidelined, commercial interests disguised as development, and the comfortable fiction that better technology alone can fix problems rooted in power and inequality.
We’ve seen this movie and know how it ends. The question is whether anyone with funding power is willing to change the script.