
Exploring alternatives to a war on mis- and disinformation

Reos Partners
May 2022

By Lisa Rudnick and Mille Bojer

For more than 40 years after Richard Nixon declared the “war on drugs” in 1971, the dominant narrative in the international drugs regime was that the production and use of drugs must be combatted, prohibited, criminalised, and eradicated. Alternative narratives were essentially taboo. By the early 2010s, this narrative was increasingly challenged: the Global Commission on Drug Policy, which included former presidents of Brazil, Chile, Mexico, and Colombia, called for “breaking the taboo” on discussing policy alternatives. In 2012, the Summit of the Americas mandated the Organization of American States (OAS) to explore new approaches. The OAS invited Reos Partners to support this work by facilitating a transformative scenarios process on the future of the drug problem in the Americas.

These scenarios, published alongside an analytical report on the drug problem, were credited with breaking the taboo. They explored ways to address underlying social and economic dysfunctions, named the value of learning from alternative regulatory regimes, and shed light on the unfair costs of the dominant strategy.

In 2021, the US armed forces withdrew from Afghanistan, 20 years after George W. Bush coined the term “war on terror” in a speech to Congress in the wake of 9/11. The war on terror quickly became contentious, primarily because it was used to suppress civil liberties, and the Obama administration that followed avoided the term. Nevertheless, it persisted in political rhetoric, media, and general discourse, and a repressive approach was consistently applied. Reos Partners recently facilitated a transformative scenarios process with 25 Afghan civil society leaders, who see alternatives to the war on terror.

This “war on…” framing has been on our minds lately. Over the past year, we have been immersed in an ecosystem of actors working to address the rise of misinformation¹ and of tactics for using information to cause harm. We have been struck by the combative language of the media, of the event landscape, and of our own peers when discussing the problem of misinformation and what should be done about it. Such language tells us this is about “combating misinformation,” “fighting misinformation,” “countering misinformation,” “spotting bad actors,” and, yes, a “war on misinformation.”

While we recognise that the “weaponisation² of information” is employed by many conflict actors as a tactic of war, and that disinformation³ plays a central role in the present war in Ukraine, we can’t help but wonder whether we are drawing the lessons we should from previous “wars” on complex and emergent phenomena. According to Alexei Abrahams and Gabrielle Lim:

“We are repeating the mistakes of the war on terror, prioritizing repressive, technologically deterministic solutions while failing to redress the root sociopolitical grievances that cultivate our receptivity to misinformation in the first place.”

(In the case of the war on drugs, this sentence might read: repressive and prohibitive solutions were prioritized while the root socioeconomic conditions that cultivate susceptibility to drug abuse and vulnerability to recruitment into organised crime networks went unaddressed.)

Are we asking ourselves often enough what our propensity to adopt and share misinformation is symptomatic of?

What the war on terror, the war on drugs, and the war on misinformation have in common is that they are all wars against symptoms of underlying social, economic, and political drivers, disguised as battles against reified threats: terrorism, drugs, and “bad” information. In this way, they turn these threats into enemies and engender policy orientations and solutions that can readily be used to erode civil liberties and constrain civic space. Further, by using polarising language and repressive tactics, they risk fueling the very phenomena they aim to address.

So what can we do differently?

There is certainly an important role for countermeasures such as fact-checking, debunking, counter-messaging, and rumor tracking in ensuring our information ecosystems are healthy and safe. But unless we go beyond repressing “bad” information and replacing it with “good” information, and attend to the roots of the problem, our vulnerabilities will grow and evolve.

Applying a systemic approach to mis- and disinformation is one way of doing this. In recent months, we designed and facilitated a convening series⁴ that did just that. With a multi-stakeholder group of more than 80 thought leaders, funders, and implementers, we took up different lenses to explore the issue, and through them identified key leverage areas for shifting the underlying drivers of mis- and disinformation, as well as game-changing strategies for doing so. Though not exhaustive, this set of leverage areas indicates how a different framing of the problem might focus our attention by bringing systemic drivers into view.

In line with these leverage areas, Reos Partners is currently pursuing three systemic strategies to help address some of these key drivers:

1. Shifting the feedback loop between mis- and disinformation and the erosion of social cohesion 

In the digital age, mis- and disinformation spread further and faster. Through techniques such as micro-targeting, digital information technologies combine with human behaviour to rapidly amplify grievances, exacerbate tensions, and deepen mistrust. We can see these dynamics playing out in Ukraine as well as in other conflict contexts such as Myanmar, Ethiopia, and Syria.

Such conditions play an important role in the erosion of social cohesion through a pervasive feedback loop: mis- and disinformation exploit existing social tensions and fears, eroding social cohesion; and weakened social cohesion, in turn, makes us more susceptible, as individuals and communities, to both mis- and disinformation. Even where social cohesion is relatively strong, no group is impervious to the effects of ongoing disinformation campaigns.

At Reos Partners, we have been thinking about how important it is to keep this feedback loop in view as societies around the globe grapple with the impacts of mis- and disinformation, and develop and advocate for solutions. Focusing on one side or the other can train our attention on what Dr. Claire Wardle has called the “atoms of information,” and cause us to miss the systemic dynamics and drivers that animate this feedback loop, exacerbate its negative effects, and keep it in place. We think our future depends on our collective capacity to shift this loop, which is so central to conflict dynamics, from one that erodes trust and deepens divisiveness to one that contributes to conflict resilience and the pursuit of healthy and just societies.

To this end, we have developed the Shared Realities Project, an ambitious initiative to work with stakeholders around the world on the future of social cohesion in an age of disinformation, through the application of transformative scenarios.

2. Supporting collaboration to influence the disinformation economy 

It is often assumed that mis- and disinformation are driven by ideology and politics, but they also fuel, and are fueled by, a lucrative industry. In this poorly regulated space, platform users face many incentives to monetize information. More importantly, the advertising-driven business models animating social and legacy media companies incentivize practices that privilege profit over people and encourage the dissemination and amplification of misinformation.

Innovative strategies to address these powerful financial drivers include campaigns to reduce the financial incentives for publishing inflammatory and divisive stories (Stop Funding Hate); litigation to establish the right of shareholders to insist on responsible business conduct (Shareholder Commons); and regulatory frameworks such as the EU’s Code of Practice on Disinformation and the recent Digital Services Act package.

Each of these is a powerful approach that shows promise. However, the disinformation convenings mentioned above demonstrated that, in this complex and continually evolving situation, better collaboration and knowledge exchange among stakeholders working to shift the disinformation economy, across geographies, thematic silos, and tactical approaches, can help enhance, amplify, and accelerate impact. We are therefore developing a Disinformation Economy Lab to help a diverse and complementary group of actors harness the potential of better collaboration for effectively influencing the financial and commercial incentives of the disinformation economy.

3. Bridging knowledge and application gaps

In the first session of the disinformation convenings, Dr. Claire Wardle observed that the field of mis- and disinformation studies is both very new and rapidly evolving, as are the technologies, tactics, and broader dynamics implicated. Many knowledge gaps exist, including, but not limited to, how mis- and disinformation flow through information ecosystems (complex systems of dynamic sociotechnological relationships through which information moves and transforms) in different languages and contexts around the world; how systemic drivers influence such dynamics; and what the most effective responses are to the challenges, events, and harms being faced at scale.

In this context, there is an “application gap” between current research and its use in improving approaches and responses to emergent events and challenges. Bridging this gap requires: collaborative infrastructure for sharing research and practice; dedicated processes, approaches, and spaces for translating complex findings into usable insights, and for explicitly mobilizing emerging knowledge in the design of both immediate responses and long-term solutions; and collaborative capacities and expertise to facilitate such work across disciplines, institutions, sectors, and silos.

Effective responses to mis- and disinformation depend on our ability to bridge these gaps. But we must do so in a way that moves us beyond a singular focus on tracking and detecting misinformation events and patterns, to building a more holistic understanding of information ecosystems. Promising and useful approaches are being applied by several organisations working on elections (Sentinel Project), health communication (Meedan), and humanitarian contexts (Internews), but challenges remain when it comes to translating research for use in response and intervention, and learning about impacts and challenges.

We aim to collaborate with a group of thought leaders to support a research, learning, and application ecosystem that is fit for the purpose of addressing disinformation dynamics in all their dimensions.

Rather than searching for a target to destroy with a silver bullet in a war on disinformation, we choose to look more deeply at the underlying drivers of mis- and disinformation, so that we might build resilience, shift the system, and avoid the mistakes made by previous “wars.”


¹ We use the term misinformation to refer to false or decontextualized information that is presented as fact without the intent to manipulate or deceive. It is a product of error, misinterpretation, or misunderstanding, and is an inherent part of human communication.

² We use the term weaponisation of information to refer to the practices and strategies of using information to cause harm. 

³ We use the term disinformation to refer to information that is strategically created and shared with the intent to deceive, manipulate, and sow confusion and doubt. Disinformation may be entirely fabricated, or may include factual information that has been altered, recontextualized, or combined with false information.

⁴ The 2021–2022 disinformation convenings mentioned above were initiated and convened by CIFF and the Oak Foundation and facilitated by Reos Partners.
