Drowning in disinformation


A new white paper co-authored by Nadya Bliss, executive director of the Global Security Initiative at Arizona State University, outlines a clear agenda for research on disinformation that could help inform a national response. Illustration by Shireen Dooling/Arizona State University


The use and spread of disinformation — false or misleading information intended to deceive people — is being amplified and accelerated at an alarming rate on the internet via social media.

Within the U.S., this has quickly eroded trust in institutions that serve as the bedrock of our society, such as science, the media, and government, to the point that we can’t even agree on basic facts.

In a white paper for the Computing Research Association’s Computing Community Consortium, a group of researchers from Arizona State University, Columbia University, the Santa Fe Institute and the University of Colorado outlines steps to begin dealing with the disinformation problem.

Disinformation is often used to create confusion and dismantle trust in traditionally trustworthy organizations. One obvious example today is the way COVID-19 has been called a “hoax,” which led many people not to view it as a real threat to their health or to take the necessary precautions to prevent and contain its spread.

“Disinformation and the poisoned information environment we’re all swimming in needs to be a national priority,” said Nadya Bliss, executive director of the Global Security Initiative at ASU. “Our white paper outlines a clear agenda for research on the topic that could help inform a national response driven by the public and private sectors together.”

One of the CRA’s main goals is to explore how computing research can help support national priorities.

“Within the past few months, we’ve seen other large-scale disinformation about elections and the democratic process in terms of the validity, legality and security of mail-in ballots, fraudulent voting, rigged elections, dead people voting, supercomputers changing votes, etc.,” said co-author Joshua Garland, an Applied Complexity Fellow at the Santa Fe Institute. “And there are many other examples surrounding migrants, vaccines and climate change.”

Disinformation is an existential threat to democracy and society, points out Elizabeth Bradley, a professor of computer science at the University of Colorado.

“We technologists created many of the tools being used by disinformation creators and circulators — the internet, social media, etc. — and it’s incumbent upon us to think about solutions,” Bradley said.

To counter disinformation, the researchers emphasize that both its supply and its demand must be addressed.

“On the supply side, we need to develop better methods for detecting and isolating or at least mitigating disinformation before it spreads,” Bliss said. “On the demand side, we need improved efforts to educate the citizenry so people are less susceptible to believing and spreading disinformation.”

Purveyors of disinformation are excellent at manipulating human emotions — they create content that is meant to seem believable while triggering an emotional response. As an individual, the best thing you can do to stop the spread of disinformation is to be sure you aren’t part of the problem. If you’re online and see a post that outrages you, Bliss cautions you to take a moment to think before sharing it.

The researchers say the challenge of combating disinformation requires a comprehensive response that goes far beyond computing research and includes education, psychology, journalism and other disciplines.

“There's a tremendous need to understand how data empowered algorithms are impacting our reality and the offline world,” said co-author Chris Wiggins, an associate professor of applied mathematics at Columbia University’s School of Engineering and Applied Science and the chief data scientist at The New York Times. “Just like for any other complex system, addressing this will require interacting with the system — here the information ecosystem — in a way that respects ethical concerns for rights, harms and justice.”

Story by Arizona State University, the Computing Research Association’s Computing Community Consortium, the Santa Fe Institute, the University of Colorado and Columbia University.
