by Adrian Chen
In this month’s Wired, Adrian Chen visits the Philippines to speak with professional content moderators — the people who scrub all the dick pics and beheadings from the world’s biggest sites before they reach users’ eyes. It’s a job that, he says, “might very well comprise as much as half the total workforce for social media sites.” Sarah Roberts, a media studies scholar at the University of Western Ontario focusing on commercial content moderation, is quoted in the piece. They caught up over chat.
AC: One thing I would have liked to include in my piece was how you got interested in studying content moderation.
SR: Well, it’s a pretty simple story. I was perusing the NYT one day and there was a very small story in the Tech section about workers in rural Iowa who were doing this content screening job. They were doing it for low wages, essentially as contractors in a call center in a place that, a couple generations ago, was populated by family farms. I call it “Farm Aid Country.” I say this as a born and raised Wisconsinite, from right next door.
So this was a pretty small piece, but it really hit me. The workers at this call center, and others like it, were looking at very troubling user-generated content (UGC) day in and day out. It was taking a toll on them psychologically, in some cases. I should say that I’ve been online for a long time (over twenty years) and, at the time I read this, was working on my Ph.D. in digital information studies. I was surrounded at all times by really smart internet geeks and scholars. So I started asking my peers and professors, “Hey, have you ever heard of this practice?” To my surprise, no one — no one — had.
This was in the summer of 2010. Right there, I knew that it wasn’t simple coincidence that no one had heard of it. It was clear to me that this was a very unglamorous and unpleasant aspect of the social media industries, and no one involved was likely in a rush to discuss it. As I interrogated my colleagues, I realized that many of them, once they stopped to think about it at all, immediately assumed that moderation of UGC must be automated. In other words, “Don’t computers/machines/robots do that?”
Right. Before doing this story, I actually thought that at least some of it would be done that way. That was one of the most surprising things: how little is actually automated.
So that got me wondering about our propensity to collectively believe (I’d say it’s more aspirational, actually — wishful thinking) that unpleasant work tasks are done by machines when so many of them are done by humans. As I’m sure you learned, and I did, too, content moderation of video and images is computationally very difficult. Making content decisions calls for an extremely sophisticated series of judgments.
“A list of categories, scrawled on a whiteboard, reminds the workers of what they’re hunting for: pornography, gore, minors, sexual solicitation, sexual body parts/images, racism. When Baybayan sees a potential violation, he drills in on it to confirm, then sends it away — erasing it from the user’s account and the service altogether — and moves back to the grid. Within 25 minutes, Baybayan has eliminated an impressive variety of dick pics, thong shots, exotic objects inserted into bodies, hateful taunts, and requests for oral sex.
More difficult is a post that features a stock image of a man’s chiseled torso, overlaid with the text ‘I want to have a gay experience, M18 here.’ Is this the confession of a hidden desire (allowed) or a hookup request (forbidden)? Baybayan — who, like most employees of TaskUs, has a college degree — spoke thoughtfully about how to judge this distinction.”
It’s interesting that this still seems so undiscovered because content moderation has been going on since the beginning of the commercial internet, really.
It certainly has. Of course, a lot of early moderation was organized as volunteer labor: people doing it for fun, for status or prestige, or for perks. (Think people who like to edit Wikipedia nowadays.)
Right, AOL used to run completely on volunteer moderators.
That’s right.
Then they had that scandal where the mods sued for payment, right?
That’s right, too! Hector Postigo has a piece on this that I cite in my own work. As the internet went from a niche and rarified space filled with people mostly accessing from universities, R&D facilities, and the like, to a graphical and commercial medium, the need to control content more tightly grew immediately.
I can’t help thinking of the development of commercial content moderation as a sort of loss of innocence, like going from idyllic farm communities where everyone looked after each other to this industrial hellhole where people work in factories as fast as possible.
Whereas, when I started hanging out online, I was accessing systems that were, quite literally, in somebody’s closet in an apartment in Iowa City. I think there’s a real tendency to look at it that way. I certainly deal with a certain level of nostalgia for “the way it was” before the World Wide Web took off in a big way, around 1994.
That having been said, I think that kind of hindsight is a little anachronistic. For example, consider Usenet. This was a place/site/medium that was self-governed, for the most part, broken down by each newsgroup’s own norms, interests and users. Back in these pre-Web days, an errant click in there could take you to content you might never, ever be able to unsee. Trust me, it happened to me. And so the CCM workers I’ve talked to in my own research have been quick to point this fact out. They’ve told me things like, “Look, you wouldn’t want an internet without us. In fact, you wouldn’t be able to handle it.”
I don’t know if you had this experience with the Manila-based folks, but the people I talked to in Silicon Valley and scattered elsewhere in North America had a real sense of altruism about the work they did. One of my participants told me she used to refer to herself, in her CCM days, as a “sin-eater.” She’d eat other people’s spew, garbage, verbal vomit, racist invective — their sins, in other words. So that other people wouldn’t have to.
I think to some extent all of this craziness around internet harassment has been the bubble bursting on a long illusion that the problem of nastiness on the internet had been sort of solved by Facebook.
So much of the hand-wringing online (and let me say that I do not condone online bullying or harassment in any way; witness this nightmare called #gamergate as the latest in many gross examples) is focused on the consumers of the content. In the mid-90s, it was very much a “think of the children!” kind of thing. But, my God, if just being exposed, as a viewer/recipient/consumer, to such content can be so profoundly damaging, what happens when you are embedded in that cycle of production, like CCM workers are?
And let’s talk, for a moment, about the labor organization situation for CCM workers, and how that plays into their invisibility. The kind of labor organization that we tend to see among people who participate in this practice — and you can corroborate or speak to what you’ve seen that might differ — is almost always not a full-time, full-benefits, full-status kind of job.
Instead, people are often contractors, limited term, hourly/no benefits, or some other kind of arrangement that is decidedly “less than,” all the way down to digital piecework kinds of arrangements.
Well, in the Philippines it is actually full time, with benefits, at least if you’re working for one of the big outsourcing firms. I was actually surprised at how little content moderation is done through crowdsourcing. I talked to an exec at CrowdFlower. They had a photo moderation tool but they ended up shutting it down because it wasn’t effective enough. And there was another app that was supposed to allow you to do photo moderation on your smartphone, but they didn’t even launch before pivoting to some other kind of crowdsourcing. It seems like you need a certain amount of training and sustained labor to make it work.
“‘I get really affected by bestiality with children,’ she says. ‘I have to stop. I have to stop for a moment and loosen up, maybe go to Starbucks and have a coffee.’ She laughs at the absurd juxtaposition of a horrific sex crime and an overpriced latte.
Constant exposure to videos like this has turned some of Maria’s coworkers intensely paranoid. Every day they see proof of the infinite variety of human depravity. They begin to suspect the worst of people they meet in real life, wondering what secrets their hard drives might hold. Two of Maria’s female coworkers have become so suspicious that they no longer leave their children with babysitters. They sometimes miss work because they can’t find someone they trust to take care of their kids.
Maria is especially haunted by one video that came across her queue soon after she started the job. ‘There’s this lady,’ she says, dropping her voice. ‘Probably in the age of 15 to 18, I don’t know. She looks like a minor. There’s this bald guy putting his head to the lady’s vagina. The lady is blindfolded, handcuffed, screaming and crying.’”
In your dissertation you talked to some guy who lives in Mexico and seems to be doing pretty well moderating content.
Yes, he was a principal in that company, a boutique firm, and he came into that situation in a position of wealth from a previous career. So, in essence, he was management, not one of the lay CCM workers. He appreciated the flexibility of his work life. And some of the workers I’ve talked to who have the ability to work in that way do like it. But the other CCMers I talked to worked in a major Silicon Valley tech firm, and they were on-site there, even though they were contractors (i.e., not full-time, full-status employees of the company).
So where are things headed in the future? It seems to me that moderation work, or at least a similar kind of assembly-line content processing, is only growing. When I was in the Philippines I saw one outsourcing firm where dozens of workers were transcribing 19th-century census records for a genealogy firm.
One interesting trend I’ve seen is to treat CCM like a feature, and to suggest, as Whisper has done, and I’ve seen in some job postings, that CCM is more of a curatorial or selection practice. In this way, platforms or sites advertise that they have moderators selecting only the very best content (“best” being a mutable sort of term that could mean: most relevant, most accurate, most interesting, etc.) for the users of that platform or site. While I don’t necessarily think this will lead to immediate improvements in the work lives of CCM workers, that kind of up-front acknowledgement that CCM practices are happening on a site, and that those workers are performing a service, at least lifts the curtain on workers who typically remain hidden. But much is to be done on this front, and I think long-term improvements and solutions will require a collective effort among workers, activists and academics (and journalists, too). This is why I’m so looking forward to the Digital Labor (#dl14) Conference happening at the New School next month. It will be a great place for these conversations to start/continue/take shape.
Adrian Chen is a freelance writer in New York. He is the leader of Gamergate.