The Human Costs of Content Moderation
What, if anything, should be banned from online media? And who should review violent and explicit content to decide whether it's acceptable for the public? Thousands of people around the world work long, difficult hours as content moderators in support of sites like Facebook, Twitter, and YouTube. They are guided by complex and shifting guidelines, and their work can sometimes lead to psychological trauma. The practice of content moderation also raises questions about censorship and free expression online.
In this IRL episode, host Manoush Zomorodi talks with a forensic investigator who compares the work she does solving disturbing crimes with the work done by content moderators. We hear the stories of content moderators working in the Philippines, as told by the directors of a new documentary called The Cleaners. Ellen Silver from Facebook joins us to outline Facebook's content moderation policies. Kalev Leetaru flags the risks that come from relying on artificial intelligence to clean the web. And Kat Lo explains why this work is impossible to get exactly right.
Some of the content in this episode is sensitive and may be difficult for some listeners to hear.
IRL is an original podcast from Mozilla, maker of Firefox and always fighting for you. For more on the series, go to irlpodcast.org.
Read the New York Times article on Facebook's content moderation policies, as well as Facebook's response.
Want more? Mozilla has teamed up with 826 Valencia to bring you perspectives written by students on IRL topics this season. Nicole M. from De Marillac Academy wrote this piece on inappropriate content online.
And, check out this article from Common Sense Media, on disturbing YouTube videos that are supposed to be for kids.
And finally, this episode underscores the importance of supporting companies committed to ethical tech and humane practices. Thank you for supporting Mozilla by choosing Firefox.
Leave a rating or review in Apple Podcasts so we know what you think.