In 2020, Artistic Director Steve Cosson and I were awarded a commission from Ensemble Studio Theatre and the Sloan Foundation to develop my new play The Moderate. This project is about the world of internet content moderators. Inspired by the work of scholars like Sarah T. Roberts and documentaries like The Cleaners, the play follows Frank during his lockdown year. Recently unemployed and separated from his wife and son during a global pandemic, Frank accepts a job as a content moderator for a large social media company. He must watch and read at least 2,000 flagged videos and posts a day, “accepting” or “rejecting” them. Watching this material day in and day out forces Frank to make a choice about his own life and take a chance on saving a teenage girl he has never met.
Steve suggested that I join this year’s R&D Group in order to structure our work together on the play. In the fall, I conducted interviews with scientists, researchers, and policymakers working in the fields of AI and internet moderation. Many were incredibly generous, one or two were a little suspicious, and a few flat-out refused any request. I was also fortunate to speak with people currently working as internet content moderators. They spoke to me at great risk, since they are made to sign NDAs (non-disclosure agreements) as part of their jobs. These conversations changed me. They were not only invaluable for writing the play; they also altered my relationship to the internet.
Here are some selected highlights from those conversations:
Interview with Sarah T. Roberts
Sarah is an assistant professor of information studies in the Graduate School of Education and Information Studies at UCLA. She is an expert in the areas of internet culture, social media, digital labor, and the intersections of media and technology. She coined the term “commercial content moderation” (CCM) to describe the work paid content moderators do to enforce platforms’ guidelines and legal standards. Roberts wrote the book Behind the Screen: Content Moderation in the Shadows of Social Media.
Ken: From the interviews you’ve done with content moderators, what did the contractors look for when hiring them? Why did the people that you interviewed end up getting these jobs?
Sarah: I will preface what I say by saying that I think, at this point, the need for bodies has superseded the desire to be picky. So I think it’s probably not quite as discriminating anymore for these kinds of cattle-call jobs at call centers.
But when I talked to the workers, particularly the workers that worked at MegaTech [a pseudonym for a company we cannot name], there was a bit of a vetting process for them. Each one of them came through a different contracting agency, which already was funny and weird, and strategic on some level, I imagine. And so, they were being hired. Their whole hiring process was handled by the third-party contractors, but MegaTech of course ultimately sets the terms of what kind of employee they want. And so they had said things like, we need four-year university graduates. That was part of the mandate. MegaTech’s preference is for graduates of schools like the ones where we teach now [Ken teaches at MIT]. So, graduates from more elite institutions, that’s what they were looking for. So the people that I talked to had graduated from Berkeley, USC. One person graduated from a small liberal arts college with a good reputation.
All of them graduated with significant debt. So I think that’s a huge piece. They have to pay their student loans.
Another person I know who worked down in Austin has a PhD in English literature too. And the job market was so bad. So bad that it’s actually… We might question the ethics of continuing to produce those students. Knowing something about the inner workings of that department, how can they completely not support their PhD students and turn them out with debt? They are now overeducated for just about every other job.
At that time content moderators were highly competent and highly educated, but the propensity of Silicon Valley is to devalue anything that isn’t in the STEM fields. I don’t need to tell you that, working at MIT. So there was no way that they really saw bringing in a four-year grad from Cal State who had a degree in economics as having any value in any other domain within their company.
Of course, now these companies hire vast swaths of people. When I have interacted with them, Facebook, for example, or other big companies, I’m usually interacting with mid-level people who are in their late twenties; they have master’s degrees from places like Georgetown, in international relations or peace studies. They’re really do-gooder types. They love to hire people like that for the policy arm, but for the implementation of the actual moderation, it was a lot of new grads for whom there was no clear path in the world, outside of what was being touted as America’s new economic hope, which was the tech industry.
And of course for them – the Facebooks and the companies like them – the internet content moderators they hire… some of them graduated with English degrees or history degrees, so it was all the wrong degree. The right schools and the wrong degrees. As if the humanities and social science understandings of the world are not key to doing a good job at content moderation. Of course, they are. Because you know something, you’re not an ignoramus. You know what you’re looking at when you’re looking at symbols. I mean, you’re looking at signifiers, right? You’re looking at things that have meaning, and you are called on to use your cultural capital and your intellect to decide. And so, in fact, they knew on some level that that education was of value, but they devalue it at the cultural level, the ideological level, of Silicon Valley. They wanted that, but they knew they could get it at fire-sale prices.
This is my interpretation. If it were this thought out, I’d be surprised, but this is how this kind of economy got generated. And so you have this cauldron of people who are highly educated, who are very smart, who are achievers, and you put them in a job that is rote, boring, no upside, no room for growth; instead of a trajectory up, you’re in a revolving door that’s going to spit you out.
It’s almost like factory work. Of course we know it doesn’t do the physical things to one’s body that being on an assembly line or being in manufacturing might do; you’re not going to lose a finger. You’re not doing repetitive-motion stuff, but it has these other damages, these other harms that, frankly, are invisible. It’s even worse in a way, because you can’t even make an informed decision when you go into the job. Maybe a little bit more so now, but certainly not at that time.
Interview with Mary L. Gray
Mary L. Gray is Senior Principal Researcher at Microsoft Research and Faculty Associate at Harvard University’s Berkman Klein Center for Internet and Society. She maintains a faculty position in the Luddy School of Informatics, Computing, and Engineering with affiliations in Anthropology and Gender Studies at Indiana University. Mary, an anthropologist and media scholar by training, focuses on how people’s everyday uses of technologies transform labor, identity, and human rights.
Ken: So I wanted to ask–
Mary: Look, I’m on a mission to help give your approach depth, because my biggest fear is that most people are assuming it’s traumatic to look at content, and I think we should be looking very closely at what theory is operating there.
The most basic way of putting it is… media have strong effects, and we as humans are unwittingly unable to deal with that. When you meet people doing this challenging work, it changes how you talk about what it is that we need to repair and address. Simply put, it’s their work conditions.
I think of the job like this. It’s the intersection of a librarian, a 911 dispatcher, a schoolyard monitor, and an editor who curates. So, I hope that your play draws out how mundane this work can be and what it means to have mundane work interrupted by something that presents difficulty. This mom I met in India, she was looking at what’s called adult content, and that container of adult content can include everything from fairly graphic depictions of sex… of many different kinds. You might call it pornography. But she’s looking at a range of materials that sometimes feature acts of violence. It’s in that bucket, and for her, it’s like she’s cleaning up the internet for her kids and other people’s kids.
And when you talk with men who do that kind of review work, in many ways, we should all note how it doesn’t look that different from any kind of porn you can find on the internet… I’m going to stop reacting in a second… When we start talking about this material as traumatic, we should look at the ways in which we’re bringing a very anti-sex approach to how we look at material, how we imagine it, how we engage it. And in fact, sexual material is not traumatizing. It’s the broader context of misogyny that’s traumatizing. The broader context of violence against women is more disturbing than any representation of sex that somebody might be reviewing. And that’s what people respond to.
These workers, yes, they see the underbelly of humanity. […] But it’s not inherently traumatic. What is traumatizing is the way that businesses pretend that they’re just algorithms, right? That they don’t have any expertise or knowledge that allows them to have any engagement with the material.
People are aware now that there’s a thing called content moderation and that a person does it. My experience interviewing people at these companies, though, is that the engineers working on these problems see it as a technical challenge. Look, most engineers and computer scientists are not trained to see content moderation as something they can’t fix. They believe there’s a way of improving how tools would capture this information, filter this information. It’s an undying belief that algorithms can get this right. But no, no, no, they can’t. You are gonna need people, and what matters… is their work environment.
Are they taking breaks? Are they paid to take breaks? Do they have access to health resources? What kinds of workplace safety measures have to be there?
You can probably hear it in my voice. I’m so worried you’re going to have a character who is unable to manage this through their own means, when I know lots of people do.
Interview with Andrew Marantz
Andrew Marantz is an American author and journalist. He has contributed to The New Yorker since 2011. He has written extensively for the magazine about technology, social media, the alt-right, and the press, as well as about comedy and pop culture. He is the author of Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation.
Ken: How much does cultural context influence the work of a content moderator? Yes, there is a rule book and as a moderator, your job is to uphold the policies of the corporation that you’re working for. But what about what the moderator thinks or understands about the image? There have been debates about moderating images and videos of blackface, for instance–
Andrew: Even a beheading video can be a totally ambiguous case if what’s at issue is needing to document war crimes. Or you need to show that the Sinaloa drug cartel is beheading people. That’s a newsworthy thing that people are trying to share, to document their own experiences of horror within their communities.
So really anything can be an ambiguous case. There was a good Radiolab episode about these ambiguous cases. There’s a policy [on social media] that you’re not allowed to show women’s nipples. However, what about breastfeeding? What about breastfeeding where you can see a little bit of nipple? What about breastfeeding but you’re breastfeeding a goat? Actually breastfeeding a goat is a thing that people do in Kenya when there’s a drought. So that’s a cultural practice that I didn’t know about.
Moderation is always influenced by context. But these companies wish there could be one set of rules that can be applied consistently across the world. It might sound like libertarian universalism, but it actually feels like libertarian imperialism.
That is just the way these people think: they’re engineers. This is how their brains work. You have to have a rule book that is consistently applied no matter where you are. But what ends up happening is… thinking of your blackface example… you can’t have a rule that says no blackface, because then you can’t show Bamboozled by Spike Lee, or you can’t show Sarah Silverman using blackface in order to condemn the use of blackface. So do you add a rule [that] you can only use it in a condemnatory or mocking or satirical context? What if it’s Black people doing blackface? What if it’s historical documentation of the practice?
What ends up happening is this. These companies are so committed to this idea of a neutral rule book that is comprehensive and applied throughout the world that they end up with an insanely byzantine rule book that is basically absurd.
Ken: It’s almost as if these companies wish there was a way that there could just be algorithms to interpret these images and videos. But that’s not how you interpret images and text. It’s not simple logic: if x, then y.
Andrew: Right. And that is really what they’re trying to do. There are moderators I’ve talked to who are convinced that their current role is to train the algorithms that will eventually replace them. Because, obviously, if you’re one of these companies, you would rather have software doing this job than having people suing you for PTSD. You don’t have to have shifts. You don’t have to monitor people and pay for their psychological counseling. You don’t have them leaking stuff to the press. But it’s an impossible goal.
Interview with “G”
This person is a senior content moderator. His specialty is child pornography and terrorism. This is not verbatim but taken from my notes, since he asked me not to record our three-hour conversation.
G: When you are looking at child pornography, read the markers, not the face, not the expression, not the tears. You gotta do your best to ignore that. Those don’t matter.
You look for body hair, you look at the genitals, the breasts, the size of the nipples, the length of the limbs.
White people get a bridge in their noses as they age. Blacks and Asians don’t. Asians are the hardest to tell if they are legal. The lack of pubic hair.
Americans hit puberty earlier than anyone else, which can cause trouble.
Most people shooting child porn are smart enough to edit out any tell-tale details.
So you focus on the three landmarks so you can tell if it is a child.
I had a hard experience recently. I watched this ISIS video – they are really well-produced actually, multiple camera angles and music. I watched a four-year-old boy shoot a group of people point blank in the head. I have a four-year-old boy, and so it was hard to shake that video when I was looking at him.
My wife thought he was doing something cute. He was using his toy truck to shovel up his action figures and all I could see was…
When I was working at home, ’cause of COVID, my son, I didn’t hear him, he came into my study to bring me a bagel, and on the screen was… [crying]
Ken: Do you wanna stop?
G: I don’t want you to think I cry all the time… How do I describe what happens sometimes? It’s like a cramp in the brain. You can’t comprehend what you are seeing. That’s a ten-month-old. That’s a dog. And you watch and you think… You can’t ever let it not affect you though. If it stops affecting you, you gotta get out of this line of work.
Ken: It’s like to do this work you need empathy but you can’t let what you see paralyze you.
G: The hardest thing about right now [during COVID] is that we aren’t all together in the [name of company redacted] offices. Your co-workers can see when you want to take a break. There’s schedule spontaneity. You take a walk. All the time. You get up from your desk. You walk. You see a therapist. My company is better than the rest. I feel so bad for the people who work for Accenture [they hire the content moderators for Facebook] or whatever they’re calling themselves now. My company pays for us to see a therapist. It costs these companies next-to-nothing to do something like that. But most refuse. I had to fight like hell to get [name of company redacted] to do it. They call me Father “G” [his real first name] because I’m like a religious zealot about making sure we have a proper workplace.
Ken: When you confirm something is child porn, how do you report it? How do you get the authorities involved?
G: Look, these companies are motivated by business interests; they aren’t motivated to save children. Ultimately it’s bad for their brand to have this stuff on their platforms. What I do is collect the facts, file a report. It’s reported to the state. Here’s a story for you. I’ve been doing this thirteen years, if you can believe it. There was a girl. I watched her be abused… since she was a kid. Her father. His friends… But we got him. We put that bastard father of hers in jail. I got to meet her actually. At a conference. There’s a conference where we all get to be together. That’s my tribe. Other moderators are the only ones who really know what we go through. And I met her. This woman. She’s now married and happy. It was hard to not see her… as she was… in those videos. But to see her as she was now. That was hard. But it’s good she’s happy. And that fucking father of hers will stay locked up for the rest of his life.
Ken thanks Steve, Ilana, and the members of the 20-21 R&D Group, the EST/Sloan Foundation, all of the people who spoke to him this past year, and his Undergraduate Research Assistant at MIT, Chen Xu, for her hard work transcribing all of these interviews.
Author
Ken Urban is a playwright and screenwriter. Recent plays include A Guide For The Homesick (Huntington Theatre Company, West End), The Remains (Studio Theatre), Sense of an Ending (59E59 Theatres, London’s Theatre503), and Nibbler (The Amoralists). Awards include the Weissberger Playwriting Award, a New York Foundation for the Arts Fellowship, a Dramatists Guild Fellowship, and MacDowell Colony Fellowships. He is a resident playwright at New Dramatists and an affiliated writer at the Playwrights’ Center.