Friday, May 8, 2020
1:00 pm – 2:00 pm EDT
In this week’s episode of BIG, If True, our host Joan Donovan, PhD asks: should we trust our search engines? Have joint industry efforts – led by Facebook, Google, Microsoft, YouTube, Twitter, Reddit, and LinkedIn – to limit misinformation been successful? How are new content policies specific to COVID-19 being enforced? And at what cost?
As we clumsily shift our lives online, the cracks in the information infrastructure are bursting open. While there has been an uptick in boosting trusted content from credible sources, like the Centers for Disease Control and Prevention and the World Health Organization, there have simultaneously been sweeping purges of suspicious accounts and of advertisements seeking to capitalize on the crisis, leaving us to wonder who’s heard and who’s harmed in the current infodemic. Amidst this sliding scale of uncertainty, we turn to leading voices in the field – UCLA professors Safiya Umoja Noble, PhD and Sarah T. Roberts, PhD, and Washington Post reporter Elizabeth Dwoskin – who have been taking stock of how commercial content is being moderated during the pandemic.
Safiya Umoja Noble is an Associate Professor at the University of California, Los Angeles, in the Department of Information Studies and serves as the Co-Director of the UCLA Center for Critical Internet Inquiry. She is the author of a best-selling book on racist and sexist algorithmic bias in search engines, Algorithms of Oppression: How Search Engines Reinforce Racism.
Sarah T. Roberts serves as an Assistant Professor of Information Studies at UCLA’s School of Education and Information Studies. Roberts is a leading authority on “commercial content moderation,” the term she coined to describe the work of those responsible for making sure the photos, videos, and stories posted to commercial websites fit within legal and ethical boundaries as well as the sites’ own guidelines and standards. Her book, Behind the Screen: Content Moderation in the Shadows of Social Media, was published by Yale University Press in 2019.
Elizabeth Dwoskin, a Silicon Valley correspondent at The Washington Post, covers the rise of data mining, machine learning and AI throughout the tech industry and in the economy at large. Dwoskin’s recent articles – from smartphone apps that map infection pathways to new trends in consumer habits that give way to greater market monopolization – offer readers around the world fresh insight on what’s at play amid the coronavirus pandemic.
Registration for this event has closed; you can view a live stream of the webinar here.
Hosted by Joan Donovan, PhD, BIG, If True is a seminar series presented by the Technology and Social Change Research Project (TaSC) at the Shorenstein Center.
Want to keep up with what TaSC is seeing week to week? Sign up for their newsletter, Meme War Weekly, and get fresh insights from the team straight to your inbox.
Dr. Donovan’s research specializes in Critical Internet Studies, Science and Technology Studies, and the Sociology of Social Movements. Her research and expertise have been showcased in a wide array of media outlets including NPR, The Washington Post, The New York Times, Rolling Stone, ABC News, NBC News, Columbia Journalism Review, The Atlantic, Nature, and more.
The TaSC Project researches media manipulation, disinformation, political communication, and technology’s relationship to society. The research team is composed of subject matter experts: Brian Friedberg, an investigative ethnographer of online social worlds; Gabrielle Lim, a researcher of sociotechnical systems and information controls; and Rob Faris, co-author of Network Propaganda and researcher of large-scale media ecosystems. The TaSC Project aims to understand how media manipulation is used to control public conversation, derail democracy, and disrupt society. The project conducts research, develops methods, and facilitates workshops for journalists, policymakers, technologists, and civil society organizations on how to detect, document, and debunk media manipulation campaigns. The project is creating a research platform called the Media Manipulation Case Book, which will include 100 case studies to advance our knowledge of how misinformation travels across the web and platforms.