Content Moderation

by Seth Erickson

Sarah Roberts spoke at the Information Studies department last week. Because her discussion of online content moderation relates to our reading for the class, I’ll summarize some of the highlights from her talk here.

Content moderators are people who review and perhaps remove user-generated content on social media sites and discussion forums. Typically these workers are paid either directly or indirectly (i.e., the work is outsourced) by sites with large amounts of user-generated content (e.g., YouTube, Facebook) or by companies that want to carefully manage their corporate image. The general goal of content moderation is to “ensure that that content complies with legal and regulatory exigencies, site/community guidelines, user agreements, and that it falls within norms of taste and acceptability for that site and its cultural context” (Roberts).

Content moderation is a necessary part of digital media production on the web, but it is largely ignored or hidden from view. Most of us don’t imagine that our contributions to social media sites are reviewed and vetted by an actual person (not to mention someone whose full-time job is to do that vetting). In other words, content moderation goes against the popular conception of the Internet as a place of direct, unmediated, and seamless communication. Roberts relates content moderation to other types of hidden digital labor — she refers to Andrew Wilson’s video essay, “Workers Leaving the Googleplex”, on Google’s tight-lipped handling of an investigation of book scanner operators, which we looked at in this class a few weeks ago.

For me, probably the most powerful part of Roberts’ discussion was her account of the psychological and emotional stresses that this kind of work entails. Content moderation involves a sinister double bind: on the one hand, workers are subjected to depictions of the very worst kinds of human behavior (Roberts recounted one moderator’s trouble with videos of atrocities from the war in Syria); on the other hand, being “good at your job” in this context means being able to “handle it”. Moderators sometimes work in call-center-like environments with psychologists on hand, but they are reluctant to seek counseling, as this would entail standing up and walking to the counselor’s office in plain view of their coworkers — an open admission of not being able to handle it. Roberts also reported high burnout rates and even PTSD among her informants. Clearly, content moderation is an important new type of emotional labor in the “digital economy”.