Content Moderators Can’t Moderate Their Jobs or Their Minds

From WIRED: A contractor at the Manila office of TaskUs, a firm that provides content moderation services to U.S. tech companies. Photo: Moises Saman/Magnum

Sometimes there are stories about work that I want to write about but can’t find the words to describe my feelings about. It’s a mixed experience because on one hand I want to deliver quality content, and compelling material often gives me a lot to work with. But sometimes the content is so compelling on its own, independent of anything I could possibly add, that it seems like it would be an insult to even give my two cents on it.

This is one example.

Baybayan is part of a massive labor force that handles “content moderation”—the removal of offensive material—for US social-networking sites. As social media connects more people more intimately than ever before, companies have been confronted with the Grandma Problem: Now that grandparents routinely use services like Facebook to connect with their kids and grandkids, they are potentially exposed to the Internet’s panoply of jerks, racists, creeps, criminals, and bullies. They won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video. Social media’s growth into a multibillion-dollar industry, and its lasting mainstream appeal, has depended in large part on companies’ ability to police the borders of their user-generated content—to ensure that Grandma never has to see images like the one Baybayan just nuked.

Watching Baybayan’s work makes terrifyingly clear the amount of labor that goes into keeping Whisper’s toothpaste in the tube. …  He begins with a grid of posts, each of which is a rectangular photo, many with bold text overlays—the same rough format as old-school Internet memes. In its freewheeling anonymity, Whisper functions for its users as a sort of externalized id, an outlet for confessions, rants, and secret desires that might be too sensitive (or too boring) for Facebook or Twitter. Moderators here view a raw feed of Whisper posts in real time. Shorn from context, the posts read like the collected tics of a Tourette’s sufferer. Any bisexual women in NYC wanna chat? Or: I hate Irish accents! Or: I fucked my stepdad then blackmailed him into buying me a car.

I gave it some thought and pretty much concluded that the article speaks for itself, so instead of using this story as a springboard into something else, I’m just going to quote some of the most relevant parts in case you don’t have time to read it right now. Though if you do get the time, I definitely recommend giving it a read and sharing it around.

Here’s another bad part about this situation:

This work is increasingly done in the Philippines. A former US colony, the Philippines has maintained close cultural ties to the United States, which content moderation companies say helps Filipinos determine what Americans find offensive. And moderators in the Philippines can be hired for a fraction of American wages. Ryan Cardeno, a former contractor for Microsoft in the Philippines, told me that he made $500 per month by the end of his three-and-a-half-year tenure with outsourcing firm Sykes. Last year, Cardeno was offered $312 per month by another firm to moderate content for Facebook, paltry even by industry standards.

The wages are pitiful for work that Americans are paid much more to do, though even the American pay isn’t much. For Filipinos this work may just be something they have to do to get by, but for Americans it’s a desperate attempt to get by when they have no other way:

Rob became a content moderator in 2010. He’d graduated from college and followed his girlfriend to the Bay Area, where he found his history degree had approximately the same effect on employers as a face tattoo. Months went by, and Rob grew increasingly desperate. Then came the cold call from CDI, a contracting firm. The recruiter wanted him to interview for a position with Google, moderating videos on YouTube. Google! Sure, he would just be a contractor, but he was told there was a chance of turning the job into a real career there. The pay, at roughly $20 an hour, was far superior to a fast-food salary. He interviewed and was given a one-year contract. “I was pretty stoked,” Rob said. “It paid well, and I figured YouTube would look good on a résumé.”

But this job comes at a price, which may include social withdrawal and a darker view of humanity than you would otherwise have had.

Rob, from the excerpt above, is one example:

But as months dragged on, the rough stuff began to take a toll. The worst was the gore: brutal street fights, animal torture, suicide bombings, decapitations, and horrific traffic accidents. The Arab Spring was in full swing, and activists were using YouTube to show the world the government crackdowns that resulted. Moderators were instructed to leave such “newsworthy” videos up with a warning, even if they violated the content guidelines. But the close-ups of protesters’ corpses and street battles were tough for Rob and his coworkers to handle. So were the videos that documented misery just for the sick thrill of it.

Rob began to dwell on the videos outside of work. He became withdrawn and testy. YouTube employs counselors whom moderators can theoretically talk to, but Rob had no idea how to access them. He didn’t know anyone who had. Instead, he self-medicated. He began drinking more and gained weight.

And just in case all of this wasn’t awful enough, psychologists who consult for these firms say the workers can develop PTSD-like symptoms:

In Manila, I meet Denise (not her real name), a psychologist who consults for two content-moderation firms in the Philippines. “It’s like PTSD,” she tells me as we sit in her office above one of the city’s perpetually snarled freeways. “There is a memory trace in their mind.” Denise and her team set up extensive monitoring systems for their clients. Employees are given a battery of psychological tests to determine their mental baseline, then interviewed and counseled regularly to minimize the effect of disturbing images. But even with the best counseling, staring into the heart of human darkness exacts a toll. Workers quit because they feel desensitized by the hours of pornography they watch each day and no longer want to be intimate with their spouses. Others report a supercharged sex drive. “How would you feel watching pornography for eight hours a day, every day?” Denise says. “How long can you take that?”

This story has no happy ending. In all likelihood there are no government regulations that would solve these issues, and there are no radical unions being formed, no black market affinity groups organizing, and no savior coming in and “rescuing” these workers from their fucked-up jobs.

There’s only the work: the awful and, in some cases, mind-breaking work.

If our vision of the future means anything, it must be a world where people don’t need to do this type of labor and face these sorts of consequences on a daily basis.
