USAID 'Disinformation Primer' targets gamers, advertisers, memes, 'right not to be disinformed'
State Department component acknowledges the "inherent contradiction between the democratic ideal of free speech and the regulation of online content," and risk of giving authoritarians another excuse to crack down.
Going further than the Treasury Department's tacit support for debanking alleged "hate groups" after the Jan. 6, 2021, Capitol riot, a State Department component that "strengthen[s] resilient democratic societies" developed a more sweeping plan for financially choking off disfavored narratives a month later.
The U.S. Agency for International Development wrote a "Disinformation Primer" that appears to have been started in "late 2020," judging by a reference to the most up-to-date "social media initiatives" by Facebook and Twitter "to address disinformation and misinformation."
It was still being written at least 10 days into the Biden administration, referring twice to the Jan. 31, 2021, Myanmar military coup, which the primer says "underlined issues of internet freedom and disinformation."
State turned over the 88-page document, dated February 2021 and marked "internal use only," as part of a 102-page production in response to Freedom of Information Act litigation by America First Legal.
The primer was "conceived and developed by Joshua Machleder and Shannon Maguire at USAID, together with" the University of Chicago organization formerly known as the National Opinion Research Center, the introduction says.
It thanks dozens of individuals at universities and government institutions including the National Endowment for Democracy and U.S. Institute of Peace "for sharing their time, opinions, and expertise," and current and former USAID staff for "opinions and feedback."
USAID didn't respond to requests for comment on its view of the primer now.
"Just as human rights advocates have argued that internet access is a human right and that there is a fundamental right to access of information, there is also a right not to be disinformed," the primer says in a section laying out 10 steps for USAID and partners to take in "countering and preventing disinformation."
These include partnering with other governments, supporting "media monitoring and fact-checking initiatives," and "form[ing] synergies" with "civil society, media, social media platforms, internet governance forums, and other donors" to prevent disinformation, especially working with "data scientists, social scientists, and journalists."
AFL gave the FOIA production to the Foundation for Freedom Online, which published an analysis Thursday by executive director and former State communications official Mike Benz, managing director Allum Bokhari and graduate student Oscar Buynevich.
The primer mentions "advertising" 19 times and advertisers or advertisements a dozen times, showing USAID support for the "well-established tactic of the censorship industry – financial blacklisting," FFO's analysis says.
While much of the report promotes the more-speech-is-better approach, such as funding local journalism, a section on "supply-side responses," written in the passive voice, implies that both governments and activists should pressure advertisers to flee alleged sources of disinformation.
Most advertisers are "unaware of the disinformation risk of the domains featuring their ads due to the automated process of ad placement," the "Advertiser Outreach" subsection says.
"Thus, cutting this financial support found in the ad-tech space would obstruct disinformation actors from spreading messaging online," it says. "Efforts have been made to inform advertisers of their risks, such as the threat to brand safety by being placed next to objectionable content, through conducting research and assessments of online media content."
While the primer says it provides examples of advertiser outreach in an annex on "emerging solutions," none of the examples explicitly mentions advertising. The annex does include the Cybersecurity and Infrastructure Security Agency's public awareness campaign about pineapple on pizza.
FFO highlighted the primer's support for the supply-side "Redirect Method," piloted by Moonshot and Google's Jigsaw in 2016, "an open-source methodology that uses targeted advertising" on platforms such as Google AdWords "to connect people searching online for harmful content with constructive alternative messages."
It was used to "target individuals susceptible to ISIS radicalization via recruiting measures" and "is being adapted in several countries to combat vaccine hesitancy and hate speech," the primer says.
The Redirect Method is a "precursor" to Google's "prebunking" initiative, which the Alphabet-owned company said last year it was expanding into Germany and India, FFO said. Prebunking is based on inoculation theory, which treats disfavored narratives as a virus.
FFO notes the primer touts the work of former Harvard misinformation researcher and Hunter Biden laptop truther Joan Donovan, who advocates prebunking.
Comedian Jon Stewart publicly challenged Donovan's work and former Facebook policy chief Eliot Schrage reportedly did so privately as a Harvard Kennedy School adviser, both questioning who gets to decide what counts as "disinformation."
USAID separately thanked State's Global Engagement Center staff for "their review and feedback."
GEC has funded and promoted internet games that teach children to view populism as disinformation – a form of prebunking – and promote "social justice."
It also funded the Global Disinformation Index, which pressures advertisers to cut off alleged sources of disinformation, predominantly conservative publishers – drawing the ire of congressional Republicans and a lawsuit by two publishers.
The primer calls for targeting "gaming sites," including Amazon-owned Twitch, because "problematic information more regularly originates from networks of alternative sites and anonymous individuals" than from state actors.
USAID put scare quotes around "mainstream" twice to describe sources outside of gaming communities. While these "conspiracy theories may seem farcical or preposterous to an outsider," the primer warns, mainstream sources covering gaming communities "may unknowingly provide wide coverage of misleading information."
The primer also puts scare quotes around "research" to describe gamers taking the initiative to educate themselves, "collectively reviewing and validating each other to create a populist expertise that justifies, shapes and supports their alternative beliefs."
Malinformation, a term sometimes applied to information that is true but inconvenient, also makes an appearance in the primer. It's defined as "deliberate publication of private information for personal or private interest" that is "taken out of context," as well as the "deliberate manipulation of genuine content" and "hate speech."
Memes are "a common means" for malinformation to spread, the primer warns. "[T]he most effective are humorous or critical of society ... but researchers note they also can be dangerous."
The supply-side section acknowledges the "inherent contradiction between the democratic ideal of free speech and the regulation of online content" and warns that democracies that pass anti-disinformation laws can embolden "authoritarians" to do the same.
One way to have their cake and eat it too is for democracies to adopt "co-regulatory models" such as the "industry-led Global Internet Forum to Counter Terrorism (GIFCT) and the government-led Information Sharing and Analysis Centers (ISACs)," the primer says.
The Facts Inside Our Reporter's Notebook
Links
- Treasury Department's tacit support for debanking alleged "hate groups"
- "strengthen[s] resilient democratic societies
- Jan. 31, 2021, Myanmar military coup
- 102-page production
- published an analysis
- public awareness campaign about pineapple on pizza
- piloted by Moonshot and Google's Jigsaw
- Google's "prebunking" initiative
- Comedian Jon Stewart publicly challenged Donovan's work
- Eliot Schrage reportedly did so privately
- funded and promoted internet games
- drawing the ire of congressional Republicans
- lawsuit by two publishers