With courts skeptical, Biden outsources misinfo policing to researchers, reelection campaign

Anti-misinformation efforts show up in unexpected National Science Foundation grants. Former White House aide who cursed out Big Tech now leads campaign's social media monitoring army.

Published: September 25, 2023 11:03pm

Stung by court decisions limiting its ability to pressure and coerce social media companies to remove purported misinformation and facing an uncertain response at the Supreme Court, the Biden administration is finding workarounds.

The State Department and National Science Foundation together awarded more than a dozen grants for misinformation-related programs with start dates this month or next, totaling about $4.4 million. 

The Department of Defense awarded a no-bid $2.5 million contract last week to an artificial intelligence company that scours social media, the "dark web" and audio transmissions, whose software will be used by a DOD component that neighbors the National Security Agency.

Parallel to his agencies' efforts, the Biden reelection campaign is creating an army to identify and correct purported misinformation about the octogenarian president, expected to come from rivals including Donald Trump and almost certain to include his family's overseas business dealings.

Its leaders include deputy campaign manager Rob Flaherty, whose profane marching orders to social media companies as White House director of digital strategy were exposed by Republican attorneys general's litigation against the social media pressure campaign. 

Anti-censorship group Reclaim the Net flagged the State and NSF grants. One surprising recipient is Cornell University postdoctoral fellow Darryl Seligman, who received an NSF astronomy and astrophysics fellowship worth $330,000.

While focused on revealing "new insights into how the planet formation process varies throughout the galaxy," Seligman will develop "educational materials to help identify misinformation in media, using the case study of artificial origins of interstellar objects to motivate more pressing matters such as climate change and the global pandemic," the award says.

Citing the "spread of misinformation about hazards, risks, and how to manage them," a $1.5 million grant will fund an Arizona State University project that "combines gamification of the rules of life with narrative storytelling to develop new strategies for collectively managing risk of natural disasters, infrastructure challenges, pandemics, and other shocks."

Florida International University received more than half a million dollars to study, detect and contain "influence campaigns" that rely on "social networks to distribute and amplify misinformation and hate speech with significant societal impact."

The State University of New York's Research Foundation and Boston University together received about $730,000 to develop a "platform for measuring and understanding information manipulation" that "active practitioners" will use to identify and mitigate misinformation and disinformation.

The University of Florida and University of North Carolina Charlotte received nearly $550,000 to collaborate on a project that identifies "tipping points in public dialogue on controversial issues" and tests "user-centric interventions that integrates [sic] psychological and socio-cultural constructs." It involves UF's Poynter Institute, whose International Fact-Checking Network received $1.3 million from grant-making organizations founded by progressive megadonors Pierre Omidyar and George Soros.

Small five-figure State Department grants went to thinly described programs in Albania ("whole-of-society response to cyber incidents and misinformation"), Paraguay ("disinformation and media literacy" workshops for high schoolers) and Indonesia ("digital literacy" for journalists and "social media influencers" to combat misinformation before the 2024 general election).

The department also gave $50,000 grants each to New York University's Abu Dhabi campus for a speaker series across the United Arab Emirates "to support countering misinformation," and Digital Rights Nepal to create a youth network with "collective resilience towards misinformation and disinformation."

DOD's contract award went to New York-based Dataminr for its First Alert V2 and Event Detection software. The company claims its social media surveillance detected China's COVID-19 outbreak and threats to "storm" the U.S. Capitol a day before the Jan. 6, 2021, riot.

The software will be used by DOD's Defense Information Systems Agency, which runs the White House Communications Agency and is based at Fort Meade, Maryland, like the NSA. It's a one-year contract with "three, one-year option periods."

Though sparse on details, the contract strongly resembles a redacted no-bid Dataminr/DISA contract revealed nearly a year ago, also by Reclaim the Net.

That document is missing dates, signatures and estimated value, but discloses DISA is purchasing 30 software licenses for non-DOD civilian personnel on an undisclosed "Watch floor" with a "12-month period of performance."

The software "monitors over 200,000 public information sources" including Twitter, Reddit and Internet of Things sensor data, "evaluate[s] content to detect emerging events as they are developing and push[es] alerts to users based on user-defined criteria," the document says. It notes military and DOD civilian personnel already use the same Dataminr software through an Air Force contract.

Former White House aide Flaherty, who received a glowing farewell from Biden in June and joined the reelection campaign in August, confirmed the broad outlines of the private social media monitoring operation to Politico.

“The campaign is going to have to be more aggressive pushing back on misinformation from a communications perspective and filling some of the gaps these companies are leaving behind,” Flaherty said, alluding to less content moderation on X, formerly Twitter, under Elon Musk's ownership as well as YouTube's summer decision to stop policing 2020 election claims.

Hundreds of staff and volunteers will "create and push out paid ads targeted at susceptible voters to counter any disinformation against Biden as a candidate … refute false claims from opponents that are intended to suppress voter turnout" and "work with media outlets to fact-check untruths," Politico reported, citing five campaign officials.