With Elon Musk's recent purchase of Twitter and the rising profiles of platforms like TikTok and Truth Social, social media's influence on the public sphere is in a tenuous state.

As the U.S. tries to keep up with disinformation, misinformation, and hate speech on these platforms, there have been several recent legislative proposals aimed at standardizing social media regulation. The Digital Platform Commission Act (DPCA) and the Open App Markets Act (OAMA) are two of the ways in which Congress is looking to regulate social media platforms.

To more closely examine these issues, Duke faculty member Phil Napoli led a discussion with Nora Benavidez, Senior Counsel and Director of Digital Justice and Civil Rights at Free Press, and Harold Feld, Senior Vice President at Public Knowledge. Together they explored the good, bad, and ugly ways social media companies have changed societal communication. In this webinar hosted by Duke in DC, the three experts discussed how big tech has tried to self-regulate and what solutions might address the problems users currently face on these platforms.

Excerpts

On Elon Musk's recent takeover of Twitter

Nora Benavidez: "Elon Musk took over Twitter about a week before the midterm elections here in the U.S., after many months of will he, won't he; should he, shouldn't he. He is, on the one hand, a perfect villain for some. He is also a staunch, self-proclaimed free speech advocate, but where is the reality?

The number of tweets using slurs has risen since he took over. There is a content problem.

For policymakers, the most important finding is that the steps companies take actually make a difference in their ability to mitigate and blunt the spread of problematic, violative content. There has been a long arc in trying to convince people on the Hill, researchers, advocates, allies, and other sectors that the relationship between our online world and the experiences that people have in the real world is something worth discussing.

The number one issue that we keep coming back to is that the steps that leaders take on social media have very real consequences. These issues can also extend into elections, voting, and democracy. It’s worth monitoring these issues long term because those are the moments for intervention from policymakers and regulatory agencies.”

Harold Feld: "There are good points to these social media platforms, and on Twitter we are now seeing them violently destroyed. We have a mass exodus of real sources of information: reporters, experts, and others who have made Twitter worthwhile are exiting in droves.

The unchecked personal power that is invested in the individuals who run these companies is scary. We ought to be very concerned.

We are seeing communities that people have invested years in building now being destroyed, with no obvious place to go and no way to remedy this on their own. We need to recognize that there are enormous costs to people and communities: social costs not only in losing existing connections, but in seeing all the social capital people have built through these platforms over the years torched by one irresponsible individual."

Key developments on social media platforms concerning the midterm elections

Benavidez: “There is at least some evidence that voters have been resilient to disinformation narratives. There were dozens of election deniers around the country running on campaigns with rhetoric about the Big Lie, that the election was stolen in 2020. Most of those candidates lost, and voters have made their voices heard.

What are new tactics? A lot of non-English language disinformation is targeting populations on social media, particularly in places like WhatsApp, where conversations are encrypted. Large groups get together, and if they are all in a single chat in another language, it often comes with a sense of trust and community.”

Feld: “We have seen some improvement over the last six years in which the most obvious pluggable holes have been plugged. It’s not 2017 anymore; we now understand the basics of bot armies and how to manage recommendation algorithms in sophisticated ways. That said, the biggest development this year has been the rise of TikTok, which now has become a substantial platform for the exchange of information for promoting candidates, but also has the least developed policies for addressing deliberate disinformation.

When we talk about how harassment is being organized, it is less common to see a famous person put out something that targets an individual on major platforms (unless you are Elon Musk). Where you really see that now is on platforms like 8chan, Truth Social, and Gab, which have no content moderation policies. As a result, it is much easier for people to organize harassment campaigns that affect people in the real world."

What policymakers can do in the future to solve content moderation problems

Feld: “There are good things and bad things that come from social media. Without social media, we would not have witnessed something like Ferguson and the injustices that are happening within communities. In Iran, even though the government has shut down social media, we are still seeing information about the protests come out. On the other side, we see some countries where governments use social media to commit genocide.

The history of electronic media regulation has some lessons to teach us, but it cannot be definitive. We can’t try to make existing regulations from other sectors work with social media. Social media needs a sector-specific regulator.”

Benavidez: "The role of government should not be to penalize these private companies. If anything, the goal should be to make their mechanisms more transparent, encourage auditing of algorithmic decision-making, and increase data privacy for users. There are a host of policy recommendations and decisions we can make. It is not through the lens of the government regulating private speech. For Congress, real people matter, and we need to understand the ways in which social media has created (and destroyed) communities."

PANELISTS

Nora Benavidez, Senior Counsel and Director of Digital Justice and Civil Rights at Free Press 

Nora Benavidez manages the organization's efforts around platform and media accountability to defend against digital threats to democracy. She previously served as the director of PEN America's U.S. Free Expression Programs, guiding the organization's national advocacy agenda on First Amendment and free-expression issues, including press freedom, disinformation defense and protest rights. Nora launched and led PEN America's media literacy and disinformation-defense program. Nora is a civil rights and constitutional lawyer who previously worked in private practice and at the ACLU of Georgia, litigating significant cases representing victims of voting-rights violations, unconstitutional police practices, First Amendment infringements and more. Nora graduated from Emory University School of Law and received her B.A. from New York University's Gallatin School.

Harold Feld, Senior Vice President at Public Knowledge 

Harold Feld is Public Knowledge's Senior Vice President and author of "The Case for the Digital Platform Act," a guide to what the government can do to preserve competition and empower individual users in the vast swath of our economy now referred to as "Big Tech." Former FCC Chairman Tom Wheeler described this book as "[…] a tour de force of the issues raised by the digital economy and internet capitalism." For more than 20 years, Feld has practiced law at the intersection of technology, broadband and media policy in both the private sector and the public interest community. Feld also writes "Tales of the Sausage Factory," a progressive blog on media and telecom policy. In 2007, Illinois Senator Dick Durbin praised him and his blog for "[doing] a lot of great work helping people understand how FCC decisions affect people and communities on the ground." Feld has an undergraduate degree from Princeton University, a law degree from Boston University and clerked for the D.C. Court of Appeals.

Philip Napoli, James R. Shepley Professor of Public Policy at Duke University 

Philip M. Napoli is the James R. Shepley Professor of Public Policy, Director of the DeWitt Wallace Center for Media & Democracy, and Senior Associate Dean for Faculty and Research for the Sanford School. He also serves as a Docent at the University of Helsinki. Professor Napoli's research focuses on media institutions and media regulation and policy. He has provided formal and informal expert testimony to government bodies such as the U.S. Senate, the Federal Communications Commission, the Federal Trade Commission, and the Congressional Research Service. Professor Napoli's research has received awards from the National Business and Economics Society, the Broadcast Education Association, the International Communication Association and the National Communication Association and has been cited in several government proceedings and reports. Napoli received a Ph.D. from Northwestern University, an M.S. from Boston University, and a B.A. from the University of California – Berkeley.