Tanushree Mitra, DAC faculty member and assistant professor of CS

Tanushree (Tanu) Mitra, an assistant professor of computer science and a DAC faculty member, has received a grant from the National Science Foundation's Division of Information and Intelligent Systems to lead a study that will use social computing and human-centered approaches to better understand the relationship between people and technology in the context of online news.

“The aim is to provide new perspectives that address digital misinformation by focusing on how we can establish differences between mainstream sources and misleading sources of online news and how we can nudge people to be more careful and conscious consumers of online news,” said Mitra.

The study, entitled “Empirical and Design Investigations to Address Misleading Online News in Social Media,” will be conducted along two symbiotic lines of inquiry.

Using a professionally curated list of online news sources, credibility labels from expert fact-checkers, and tweets sent out by these sources over a period of at least a year, the researchers will empirically investigate misleading online news sources.

“We will look at how the topics and writing style of these misleading online sources differ from those of mainstream sources, how users distinguish between them, and any corresponding changes over time,” said Mitra.

The second thrust of the study will explore design interventions to increase people’s awareness while they read news on social media sites. Specifically, it will investigate two classes of design nudges on Twitter.

The first intervention, “emphasize,” will nudge users to reflect on the ambiguity and uncertainty present in certain news posts. It will automatically detect whether a social media news post from a mainstream source has been questioned and highlight those questioned tweets for the reader. For example, several users questioned a report from the Associated Press that the United Arab Emirates orchestrated the hacking of a Qatari government news site, asking how the AP knew this.

The second intervention, “de-emphasize,” will be triggered whenever a news post originates from a misleading source, making the post less visible to minimize the user’s exposure to it.
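As a rough illustration only, and not the study’s actual design, the sketch below shows how these two nudge classes might be expressed as a simple decision rule in a hypothetical Twitter client: posts from sources labeled misleading are de-emphasized, while posts from mainstream sources that many readers have questioned are emphasized. All names, labels, and thresholds here (NewsPost, SOURCE_LABELS, questioned_replies, and so on) are assumptions made for the example.

```python
# Illustrative sketch only: a toy decision rule for the "emphasize" and
# "de-emphasize" nudges described above. Data structures and thresholds
# are hypothetical, not drawn from the study itself.

from dataclasses import dataclass
from enum import Enum


class Nudge(Enum):
    EMPHASIZE = "emphasize"        # highlight a questioned mainstream post
    DE_EMPHASIZE = "de-emphasize"  # reduce visibility of a misleading-source post
    NONE = "none"


@dataclass
class NewsPost:
    source: str               # e.g., the Twitter handle of the news outlet
    text: str
    questioned_replies: int   # replies questioning the claim (hypothetical signal)


# Hypothetical credibility labels, e.g., from expert fact-checkers.
SOURCE_LABELS = {
    "@mainstream_outlet": "mainstream",
    "@misleading_outlet": "misleading",
}


def choose_nudge(post: NewsPost, questioned_threshold: int = 5) -> Nudge:
    """Pick a design nudge based on the source's label and whether
    readers have questioned the post."""
    label = SOURCE_LABELS.get(post.source, "unknown")
    if label == "misleading":
        return Nudge.DE_EMPHASIZE
    if label == "mainstream" and post.questioned_replies >= questioned_threshold:
        return Nudge.EMPHASIZE
    return Nudge.NONE


if __name__ == "__main__":
    post = NewsPost("@mainstream_outlet", "Report: hacking of government news site", 12)
    print(choose_nudge(post))  # Nudge.EMPHASIZE
```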

“The human-centered evaluations accompanying these interventions will provide qualitative and quantitative evidence about user experiences, as well as measurements of their efficacy,” Mitra said.

Mitra joined Virginia Tech in 2017 after earning a Ph.D. in computer science from the Georgia Institute of Technology, where the GVU Center named her a Foley Scholar, the highest award for student excellence in research contributions to computing.