Fighting the Spread of Misinformation

It was 2012. Tim Weninger was a Ph.D. student surfing Facebook and Reddit. His procrastination took a turn when, as a computer scientist, he noticed some alarming trends on the sites. The Islamic State group, commonly known as ISIS, was on the rise in Syria and was using social media to convert and recruit young people from the West. How, Weninger wondered, did the group have such sway?

One year later, Weninger was an assistant professor at Notre Dame and continued to study Islamic State activity on social networks. He identified a set of individuals within the organization who regularly pushed pro-IS material online by “liking” and retweeting posts, and buried anti-IS reports by downvoting them. By working as a unified group, they significantly altered what the public saw and believed, helping draw tens of thousands of recruits to their terrorist ideology.

Tim Weninger

Perhaps more ominous was a new discovery: Foreign actors could apply the same tactics to influence world events and elections. Weninger wrote a letter to the U.S. Air Force detailing his assessments, including a prediction that Russia would try to influence world events, such as the 2016 U.S. election, through propaganda and known vulnerabilities in social media systems. The Air Force responded quickly, directing him to continue studying the issue.

Today, Weninger’s resume details projects with the Air Force Office of Scientific Research, the Army Research Office, the Defense Advanced Research Projects Agency (DARPA) and the Pacific Northwest National Laboratory. These projects touch on questions related to how humans consume information online, how people interact in social environments and how digital information shapes beliefs and actions online and offline.

Weninger was optimistic when he began — certainly people could identify and avoid deliberately false or misleading information. But what he uncovered was surprising.

“Unfortunately, what we ended up finding was that people on social media don’t inspect past the headlines as much as we would like,” he says, noting that one of his studies concluded a full 75 percent of votes and shares on Reddit occur without users reading, or even clicking to open, the linked content.

“It’s often the case that when we’re rating the news that other people will see, we don’t inspect it deeply enough,” he says. “We don’t critically analyze the information, yet we vote on it the same. A vote without reading the article counts the same as a vote from someone who critically analyzed the article.”

Working with graduate student Maria Glenski, he also discovered that just a handful of votes — sometimes a mere 100 or 200 “likes” — can launch a story to the top of a social media platform, regardless of its legitimacy. With a handful of accounts, or a few hundred dollars to purchase votes, nearly any bit of disinformation can be spread easily. He points to clear examples such as false Brexit claims, the Pizzagate conspiracy theory, false claims about hurricane recovery in Puerto Rico and the 2016 U.S. election. These votes, in a sense, give power to anyone on social media: Rather than media outlets determining what’s “fit to print,” anyone with an account can influence what appears in people’s newsfeeds as trending stories.
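Weninger’s and Glenski’s papers don’t include ranking code, but the arithmetic behind that discovery is easy to see in the “hot” ranking formula that Reddit itself once open-sourced, sketched below in Python. The 150- and 1,500-vote posts in the final lines are invented numbers for illustration, not data from their studies.

```python
from datetime import datetime, timedelta, timezone
from math import log10

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def hot(ups: int, downs: int, date: datetime) -> float:
    """Reddit's open-sourced 'hot' rank: net votes count logarithmically,
    while recency counts linearly (45,000 seconds is about 12.5 hours)."""
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (date - EPOCH).total_seconds() - 1134028003  # Reddit's epoch offset
    return round(sign * order + seconds / 45000, 7)

now = datetime.now(timezone.utc)
print(hot(150, 0, now))                           # fresh post, 150 purchased votes
print(hot(1500, 0, now - timedelta(hours=12.5)))  # older post, 1,500 organic votes
```

Because votes enter the score through a base-10 logarithm, the two print statements produce nearly identical ranks: 150 bought votes on a brand-new post carry the same weight as 1,500 organic votes did half a day earlier, which is exactly why a mere 100 or 200 purchased “likes” can put a fabricated story in front of millions.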

Weninger, working closely with students, discovered that just a handful of votes can launch a story to the top of a social media platform, regardless of its legitimacy.

“Based on that information, at Notre Dame what we’re doing is we’re trying to assess the vulnerabilities of social media systems and we’re trying to come up with deterrent systems to make Facebook, Reddit, Twitter and social media systems more resilient to this type of propaganda spread, and to make it to where quality information reaches the right individuals at the right time so that propaganda and disinformation are not spread as easily,” he says.

That’s easier said than done, he explains, because his team doesn’t want to infringe on the right to free speech in any way. Instead, they are helping social networks design algorithms that better assess the legitimacy of a vote or a “like,” along the lines of the hypothetical sketch below.
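The article doesn’t describe those algorithms, so this sketch is purely hypothetical: a toy scheme in which a vote’s weight is discounted when it looks automated or unconsidered. Every signal and threshold here is an invented assumption for illustration, not Weninger’s actual method.

```python
from dataclasses import dataclass

@dataclass
class Vote:
    account_age_days: float  # age of the voting account
    dwell_seconds: float     # time spent on the article before voting
    votes_last_hour: int     # the account's recent voting tempo

def vote_weight(v: Vote) -> float:
    """Hypothetical heuristic: discount suspicious votes rather than
    deleting them, so no one's speech is removed outright."""
    weight = 1.0
    if v.account_age_days < 7:    # throwaway accounts count less
        weight *= 0.3
    if v.dwell_seconds < 10:      # voted without plausibly reading
        weight *= 0.5
    if v.votes_last_hour > 60:    # bot-like voting volume
        weight *= 0.2
    return weight

def weighted_score(votes: list[Vote]) -> float:
    """A story's score under the toy scheme: the sum of discounted votes."""
    return sum(vote_weight(v) for v in votes)
```

The design choice mirrors the free-speech constraint Weninger describes: no vote is blocked or removed, but a hundred ten-second votes from week-old accounts no longer count the same as a hundred considered ones.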

Weninger is also working with researchers at the University of Southern California and Indiana University on a DARPA project titled “COSINE: Cognitive Online Simulation of Information Network Environments,” which aims to simulate online social behavior so researchers can better understand and predict the spread of information. The project is part of DARPA’s Computational Simulation of Online Social Behavior program, also referred to as SocialSim.
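The article doesn’t detail how COSINE’s simulations work; to give a concrete sense of what “simulating the spread of information” can mean, here is a standard independent-cascade model from network science run on a made-up 1,000-user graph. The network, seed accounts and transmission probability are all invented for illustration and have no connection to the project’s actual models.

```python
import random

def independent_cascade(graph: dict[int, list[int]],
                        seeds: set[int],
                        p: float,
                        rng: random.Random) -> set[int]:
    """Independent-cascade diffusion: each newly activated user gets one
    chance to pass the story to each contact with probability p."""
    active, frontier = set(seeds), set(seeds)
    while frontier:
        newly_active = set()
        for user in frontier:
            for contact in graph.get(user, []):
                if contact not in active and rng.random() < p:
                    newly_active.add(contact)
        active |= newly_active
        frontier = newly_active
    return active

# Toy network: 1,000 users, each following 8 random others.
rng = random.Random(42)
graph = {u: rng.sample(range(1000), 8) for u in range(1000)}
reached = independent_cascade(graph, seeds={0, 1, 2}, p=0.15, rng=rng)
print(f"3 seed accounts reached {len(reached)} of 1,000 users")
```

Running many such simulations while varying the seeds, the network and the transmission probability is, in spirit, how researchers can test which interventions make a cascade fizzle out instead of going viral.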

At the helm at DARPA is Steve Walker ’87, ’97 Ph.D. Since graduating from Air Force ROTC, Walker has served in the Reserves and has worked at the Air Force Research Laboratory, at the Department of Defense and in various roles at DARPA. DARPA, he explains, is where the foundation of the internet was laid. It’s where stealth aircraft were invented. And it’s where the modern field of materials science was developed.

Steve Walker '87, '97 Ph.D., discusses how his Notre Dame education has influenced his career in national security and current role as director of Defense Advanced Research Projects Agency (DARPA).

“The DARPA mission is to develop breakthrough technologies and capabilities for national security. We do that, I think, better than any agency in the federal government in that our projects are focused on high impact,” he says. “If it’s incremental innovation, we’re not interested. If it changes the world, that is the bar we look to when starting programs here at DARPA.”

The work at DARPA, he says, ranges across traditional defense domains like land, air, sea and space, but has more recently entered electronic, information and social media arenas as well. Progress in these fields is meant to help stabilize regions where U.S. fighters are on the ground. Truth, he says, is one way to increase stability.

“We’ve seen state actors as well as non-state actors — think terrorist groups like ISIS — use social media to misinform people about what’s really going on in the world,” he says. “The Russians used it in Crimea to their advantage. ISIS has used it not only to pass disinformation but also to recruit online. We want to understand how this information gets passed online, how it affects people online, especially in these areas where we’re trying to combat insurgency and terrorism overseas. And SocialSim is one of the programs directed at that.”

Jonathan Pfautz, Ph.D., is the program manager leading the SocialSim program at DARPA. He says the current estimate is that nearly 4 billion people use the internet, which makes understanding how and why they network online of critical consequence.

“It becomes very important to understand how people interact with each other,” Pfautz says. “How is society changing as a function of these interactions? And how can we defend the kinds of values that we have here in America against those who try to influence our online environment?”

The SocialSim team, including Weninger, is searching for answers to those questions, with the dual goals of better understanding human online behavior and making social systems more resilient.

Weninger sees the role as a responsibility. In his estimation, computer scientists and engineers built these digital social systems, so they must play a part in identifying the problems and creating the solutions. And, he adds, they must do it quickly before values like democracy, truth and access to accurate information are put in danger.

“I would say that this is one of the most important topics in all of science right now,” Weninger says. “The truth is critical to our daily lives because it informs how we see the world. …”

“Through my research at Notre Dame, we are trying to come to the truth, to understand what is fact, what is falsehood, and to get the truth and facts to the people who need them.”

This isn’t a new problem, Weninger says, brushing off any sentiment of doom or hopelessness. It’s a new society, with new tools and technologies, trying to answer the same question that has been around since the origin of the printing press: How do we ensure the distribution of truthful information? For his part, Weninger plans to continue working closely with DARPA, Facebook, Reddit and other key partners in the fight for greater digital resiliency and for truth in the information we receive.


Many of the visuals seen in this video were shot in the Newseum in Washington, D.C. The museum educates about the importance of a free press and the First Amendment.