By Scott Shigeoka, Community & Design Lead at OpenIDEO
“At a major conference on content moderation in February at the Santa Clara University law school, where executives from Facebook, Google and Reddit detailed their operations, Caplan said nearly everyone agreed that human moderators were still the reigning gatekeepers for information on the Internet’s most popular sites.”
— ‘AI will solve Facebook’s most vexing problems, Mark Zuckerberg says. Just don’t ask when or how’ by Drew Harwell, The Washington Post, 4/11/18
The moderator is one of the most overlooked yet vital roles in modern technology. They protect people from harassment, foil terrorist recruiters, combat spam, and remove offensive content, bringing a human touch to work that many assume is done by a robot.
Sometimes they participate in debates, keep people in line, and answer questions. Moderators also hold critical knowledge about the audiences and conversations that occur online. When designing products for better interaction, there’s a lot we can learn from moderators and how they work.
A few months ago, I teamed up with Sam Hankins, the Lead Designer at The Coral Project. Together, we met with many of the people responsible for moderating the comments at some of the world’s leading publications that use Talk. Through several weeks of human-centered design interviews, we uncovered insights into the different approaches that moderators take to improve conversation online.
The goal of this research isn’t to be prescriptive about what good and bad human moderation looks like. We also understand that this isn’t an exhaustive summary of the space. This is an ongoing dialogue, and we intend to continually design with and alongside those who use our tools, to make them better for everyone.
As well as giving us vital insights to streamline the user experience of our products, our interviews revealed four distinct roles that moderators adopt as part of their work: the Superhero, the Screener, the Referee, and the Community Organizer. Some moderators take on a single role; others move between roles as their job requires.
We hope that by sharing these roles, we can encourage greater understanding of the work of the moderator. And if you moderate comments, which of these do you adopt most in your work? Or maybe you take a different approach? Let us know in the comments. We’ll be looking for your answers.
Four Faces of Moderation
The Superhero
“Talk has given me the power to push back against the darkness that has crept into the online community.”
— Moderator at a large news outlet
The Superhero describes their job as “keeping the bad out.” Their approach to moderation is reactive. When the villains come out, they respond to the call. They believe it’s their job to create a space for people to have conversations without name-calling, trolling or violence. They want commenters to feel safe.
Efficiency is a principle that came up frequently in our interviews with Superheroes. “I was told not to read all the comments, and [only] find bad words or messages that feel like attacks,” one moderator told us about their training. Most of their day is spent reviewing reported comments.
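To make that workflow concrete, here is a minimal sketch of what a reactive review queue like the one these moderators describe might look like. It is purely illustrative: the `Comment` shape, the `bannedWords` list and the function names are our own assumptions, not Talk’s actual data model.

```typescript
// A minimal sketch of the reactive triage the Superhero describes:
// surface only reported comments, and flag likely attacks by keyword.
// The Comment shape and bannedWords list are hypothetical, not Talk's API.
interface Comment {
  id: string;
  body: string;
  reportCount: number; // times readers have flagged this comment
}

// Placeholder list, maintained per newsroom.
const bannedWords: string[] = ["badword1", "badword2"];

// Only reported comments enter the review queue; everything else is skipped,
// matching the "don't read all the comments" training described above.
function buildReviewQueue(comments: Comment[]): Comment[] {
  return comments
    .filter((c) => c.reportCount > 0)
    .sort((a, b) => b.reportCount - a.reportCount); // most-reported first
}

// Cheap first pass: comments containing a banned word can be removed quickly;
// everything else is ambiguous and needs a human read.
function needsHumanRead(comment: Comment): boolean {
  const body = comment.body.toLowerCase();
  return !bannedWords.some((word) => body.includes(word));
}
```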
One moderator from a prominent international news outlet said, “Some commenters think I’m just a bot.”
Generally, human moderators are trusted with removal more than technology is, because they can read context. However, they often want to bring more humanity into their work, such as being more proactive about building a positive culture by participating in conversations.
Their biggest obstacle here is a lack of time – many of the moderators we spoke to are remote, part-time shift workers who are contracted or work for a third-party firm. They are encouraged to be as efficient as possible in their work.
The Screener
“Our community sends us emails asking us about the status of their comments which haven’t posted yet.”
— Moderator at national news outlet
The Screener works within publications that use pre-moderation – that is, they read every comment before it is published. This means their workload is usually larger than that of other types of moderators. Some of these publications don’t have moderators working around the clock – like a shop, they often make their moderators’ working hours public, so that readers know when the comments are open for business.
When asked why their publications use pre-moderation, moderators said they have been told it’s to avert risk and liability. To get through the volume of stories and comments, moderators go story by story, approving a few comments on each article to let readers know there’s been activity.
“The first few comments are the most important for me, then I stop moderating after that,” one moderator said.
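As a rough sketch, the pre-moderation gate might look something like the following, where every comment starts hidden until a moderator approves it. The statuses and helper functions here are hypothetical assumptions, not Talk’s actual schema.

```typescript
// A rough sketch of the pre-moderation gate the Screener works behind.
// Statuses and function names here are illustrative, not Talk's schema.
type CommentStatus = "pending" | "approved" | "rejected";

interface PendingComment {
  id: string;
  storyId: string;
  status: CommentStatus;
}

// With pre-moderation on, every comment starts as "pending" and stays
// invisible to readers until a moderator approves it.
function submitComment(id: string, storyId: string): PendingComment {
  return { id, storyId, status: "pending" };
}

// The story-by-story pass described above: approve the first few pending
// comments on each article so readers can see there's been activity.
function approveFirstFew(
  queue: PendingComment[],
  perStory = 3
): PendingComment[] {
  const approvedPerStory = new Map<string, number>();
  return queue.map((comment) => {
    const count = approvedPerStory.get(comment.storyId) ?? 0;
    if (comment.status === "pending" && count < perStory) {
      approvedPerStory.set(comment.storyId, count + 1);
      return { ...comment, status: "approved" };
    }
    return comment;
  });
}
```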
The Referee
“I try to be fair and even handed. Let different voices speak. I read carefully to see what somebody is actually trying to say.”
— Moderator at international news outlet
Every publication has a set of community rules or guidelines that help inform commenters on what is and isn’t acceptable.
This is intended to set clear lines for commenters, and to keep trolling, harassment and other forms of hate from appearing in the comments. However, except in the most obvious of instances, it usually falls to moderators to assess whether a commenter has broken the rules. As reported in a recent Radiolab story, social media and news outlets take a more black-and-white approach to interpreting community guidelines, and have internal memos that help human moderators understand what is and isn’t allowed. Like common law, rules are added regularly as new scenarios play out in the comments.
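One way to picture that common-law rulebook is as a dated, append-only list of precedents that moderators can cite back to commenters. The shapes below are entirely our own illustrative assumptions, not how any newsroom’s internal memo actually works.

```typescript
// A sketch of the "common law" rulebook the Referee consults: a dated list
// of precedents that grows as new scenarios appear. Entirely illustrative.
interface Rule {
  id: number;
  added: string;       // ISO date the rule was introduced
  description: string; // the scenario it covers, for the internal memo
  violates: (body: string) => boolean;
}

const rulebook: Rule[] = [
  {
    id: 1,
    added: "2018-01-15",
    description: "No personal attacks on other commenters",
    violates: (body) => /\byou people\b/i.test(body), // crude placeholder check
  },
  // New precedents are appended, never silently rewritten, so moderators
  // can cite the specific rule when emailing a commenter about a removal.
];

// Collect every rule a comment breaks, to justify the moderation decision.
function findViolations(body: string): Rule[] {
  return rulebook.filter((rule) => rule.violates(body));
}
```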
Referees spend the majority of their time responding to commenters – usually through email – linking back to their rules to justify content removal, user suspensions or bans.
The Community Organizer
“I learned early that the moderator doesn’t just look like a policeman slapping people’s hands.”
— Moderator at a regional news outlet
The Community Organizer builds a strong culture of belonging with their community members. They usually go above and beyond removing bad comments, and are often involved in in-person gatherings too. They’re either highly experienced community builders or have a deep relationship with the newsroom (e.g. they were active commenters themselves before being promoted to moderator).
Their goal is to build a strong, resilient and meaningful community of commenters who learn and connect alongside each other, and to be a clear and present community member themselves.
What’s Ahead
We believe that moderators should be a lot more involved in every part of the news community process. And as product designers, we should work more closely with moderators when designing our tools.
Newsroom leaders should consult closely with their community moderators to gain knowledge and insights about their most loyal community members. Journalists should talk to moderators to identify potential leads and sources, as well as corrections and story ideas that emerge in the community.
Finally, moderators would benefit from being more closely connected with other moderators in the industry using communities such as Gather, to help create common best practices and learn from each other.
We believe that the goal of online community moderation is not to build technology that removes the need for moderators, but to give moderators technology that makes removing clearly offensive comments faster and easier, freeing them to focus on more nuanced decisions and on cultivating more positive, engaged communities.
Moderators, we salute the work you do. Online communities and meaningful conversations couldn’t exist without you. Thank you, and let’s keep talking.
Illustrations from Humaaans by Pablo Stanley, CC-BY 4.0