For a moment, imagine yourself back in the Arab Spring of the early 2010s, when pro-democracy protests organized on cell phones toppled repressive governments across the Arab world.

Sparked by the self-immolation of a Tunisian street vendor, ordinary people used social media platforms such as Twitter and Facebook to organize and demand freedom and justice. The old media gatekeepers were swept aside. Uprisings at least temporarily toppled dictators from Cairo to Tripoli. The future seemed bright.

Then it all fell apart.

Authoritarians from Syria to Russia to China learned how to use social media to spread disinformation and impose surveillance on their own citizens. Terrorist organizations mastered the web to radicalize potential recruits and raise funds. Social media became a powerful tool for both maintaining control and spreading chaos. A decade later, a consensus had grown around calling social media a toxic cesspool.


Lisa Schirch

Richard G. Starmann Sr. Professor of the Practice of Peace Studies at the University of Notre Dame's Keough School of Global Affairs

Enter Lisa Schirch, the Richard G. Starmann Sr. Professor of the Practice of Peace Studies at the University of Notre Dame's Keough School of Global Affairs. Schirch has accepted the challenge of studying how online technology can be transformed into a positive influence that promotes constructive and democratic ideals. She edited a 2021 book on the topic titled Social Media Impacts on Conflict and Democracy: The Techtonic Shift.

Her research is not about content moderation, which leads to accusations of playing politics from both sides of the aisle. Instead, Schirch focuses on altering the underlying design choices and algorithms that reward polarization simply because it increases engagement and profits. And that was before artificial intelligence (AI) took off.

“I think everything I outlined in that book is exactly true today, but now we have AI and it's making it even worse,” Schirch said. “So it's made the problems bigger and more daunting.

“But then I started realizing we're just studying the negative side of technology, and we really need to also see how democracy activists could use technology in a positive way to amplify their work.”

Schirch is not Pollyannaish in this pursuit. She knows the scale of the challenge, but she also thinks the public is tiring of the divisiveness driven by platforms like Facebook, whose ad-based business models encourage the endless scrolling and social comparison that undermine mental health and society.

“I'm terribly critical of Facebook,” she said. “They have known how devastating their platform is for societies and they reject making changes to their platform design. They clearly care more about their profits than public safety. So you're going to have to have some serious competition and people leaving Facebook and going somewhere else for them to be open to these ideas.”

In an online op-ed essay in July, Schirch noted that every design choice that social media platforms make “nudges users toward certain actions, values and emotional states.”

“Platforms routinely claim they merely reflect user behavior, yet internal documents and whistleblower accounts have shown that toxic content often gets a boost because it captures people's attention,” Schirch wrote. “Platform design is a silent pilot steering human behavior.”

Rethinking algorithms to reduce polarization

Rather than focus on government regulation of social media or content, she points to other countries that have begun to fund new platform designs. “Governments that are still committed to democracy should be investing in social media and AI as public infrastructure that needs to be safe,” she said. “That's where a lot of my research is today—while content moderation may still be necessary at times, the bigger picture is having platforms be designed to support better healthy conflict, healthy expression of different points of view.”

Lisa Schirch gives a 2022 lecture titled “Mapping Technologies for Peace and Conflict: From the Weaponization of Social Media to Digital Peacebuilding and Peacetech.”

Rather than reward extremism, social media algorithms could be designed to find and promote comments that get liked by people from both sides of any political divide. Then the technology would be a tool to find common ground through productive conversations. Countries such as Finland and Taiwan have already begun to solve thorny societal problems using an open-source technology called Polis.

“These deliberative platforms help people listen to each other, not get into toxic personal attacks, and their algorithms are showing where there's common ground,” Schirch said. “They're a really exciting new way of doing democracy, of having the public participate in building solutions and finding consensus where there's just been a lot of polarization.”
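The bridging idea behind deliberative tools like Polis can be illustrated with a minimal sketch. The data and the `bridging_score` function below are hypothetical, purely to show the principle: rank comments by their weakest approval rate across groups, so only content with cross-partisan appeal rises to the top.

```python
# Illustrative sketch of "bridging-based" ranking: promote comments that
# draw approval from both sides of a divide, not from one camp alone.
# All names and numbers here are synthetic, not from any real platform.

def bridging_score(likes_by_group):
    """Score a comment by its *lowest* approval rate across groups,
    so a comment ranks highly only if every group tends to like it."""
    rates = [liked / total for liked, total in likes_by_group.values() if total]
    return min(rates) if rates else 0.0

# (likes, views) per political group for each comment
comments = {
    "partisan zinger":    {"left": (90, 100), "right": (5, 100)},
    "common-ground idea": {"left": (60, 100), "right": (55, 100)},
}

ranked = sorted(comments, key=lambda c: bridging_score(comments[c]), reverse=True)
print(ranked)  # the common-ground idea outranks the one-sided zinger
```

The key design choice is taking the minimum rather than the sum: a comment adored by one side and despised by the other scores 0.05 here, while a comment mildly liked by both scores 0.55, inverting the engagement-maximizing ranking Schirch criticizes.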

Another solution Schirch recommends is separating family news and personal photos from political opinions, as well as real journalism from unverified claims. Political opinions often find better reception when they are considered for their merits, rather than attached to a profile that leads to personal attacks.

“Right now, Facebook is just a mix of a bunch of different kinds of information,” she said. “Here's my birthday party pictures, here's who I'm dating and these are my political views all in your feed, all mixed up; that's really maybe entertaining, but it's also crazy making.”

Instead, the public could just post family news on a family channel. And political views could be posted separately—possibly anonymously, without a reply button, and with an algorithm that promotes common ground.

Finally, some new models are experimenting with interoperable platforms, where people can communicate across platforms and get outside their partisan information bubbles. Schirch said the current Facebook model of a “walled garden” is already losing favor among younger users. The company's business model has been to buy up competitors, but its user base is aging and it may be forced to change.

“There are a variety of platforms out there looking at completely different profit models and how to incentivize a different way people would interact,” she said.


Peacebuilding strategies for social media reform

Schirch grew up in Ohio in the Mennonite church, where religious values shaped her goals and career. While studying at Goshen College, she said, a formative experience was working in a refugee camp for Nicaraguans displaced by civil war.

“When I graduated, I decided to study diplomacy, conflict resolution, and how to involve people in peace processes so that they're not imposed on people, but people really buy in that coexistence is the best way,” she said.

A political scientist by training, Schirch earned her doctorate in 1989 from George Mason University's Carter School for Peace and Conflict Resolution. The practitioner-scholar was working in Afghanistan with Notre Dame faculty to design a peace process when she decided in 2021 to join the Notre Dame faculty alongside her mentor, fellow Mennonite peace scholar John Paul Lederach, a professor emeritus well known in the field.

Her interest in technology grew from the realization that social media was dramatically changing politics and democratic institutions. She published a blueprint for prosocial tech design in May and launched it with the Democracy Initiative, a University-wide effort that has increased her collaboration with scholars ranging from economists to computer scientists. This effort included a dozen workshops with more than 450 experts that Schirch convened over two years to analyze the root causes of harmful online content.

Francis and Kathleen Rooney recently made a $55 million gift to the University to endow an institute committed to the preservation of American democracy through research, teaching, and public engagement. Schirch said Notre Dame is uniquely situated to lead this fight, citing efforts such as introducing a framework for AI ethics at a September summit on “AI, Faith, and Human Flourishing.” Schirch is also affiliated with the Keough School's Kroc Institute for International Peace Studies and Kellogg Institute.

“Something coming from the Catholic Church would actually have weight because Notre Dame is not left or right, it's Catholic,” she said. “I think that religious actors these days actually have an interesting role to play in bridging some of these ethical questions.” For instance, she said a ratings system, modeled on the way food ingredients are clearly listed on a package, could help the public make more informed choices about social media options.

“We need really easy labels for the public to be able to see that some independent entity has rated this AI system as safe and this social media platform as an A-plus and these others as a C or a D, and then let the public have the choice,” she said.


Digital literacy to inspire change

Teaching digital literacy from youth through adulthood is also critical, Schirch said, for creating conditions so that the public “will want to choose social media and AI that is safe and verifiable.” Pointing to how Twitter changed after it became X, she said, “We need governments understanding that these are political machines and they can be hijacked.”

The consolidation of social media platforms by tech billionaires over the last decade creates even more challenges, Schirch said. While Facebook simply removed fact-checking and misinformation disclaimers, Elon Musk changed X's algorithm to highlight right-wing content, including Russian propaganda.

The sale of TikTok to allies of President Donald Trump could create even more leverage on a platform that young people rely on for information. California Gov. Gavin Newsom has ordered an investigation into whether TikTok’s algorithm is already suppressing criticism of the president and posts related to the Jeffrey Epstein files.

Schirch acknowledged that the current social media climate looks bleak and that her students seem depressed and hopeless about changing it. But she believes that a focus on underlying tech design rather than content outcomes can make a difference.

At a June conference on campus called “National Convening on Social Media and Democracy,” Schirch presented her blueprint on designing prosocial technology that prioritizes trust, cooperation, and problem solving.

“And now that's what I do all day long,” she said. “I'm on Zoom with different people in different parts of the world who are trying to operationalize that blueprint and make it come true.”