Content moderation is little discussed, yet roughly half the employees of social media sites work mainly on it. How do these people cope with spending their working days holding back a tide of extreme depravity?
If you report a post on a social media site as offensive or inappropriate, what happens next? An article by Adrian Chen in Wired magazine explores this question, and the answer is equal parts fascinating and disturbing. He suggests that moderation is crucial for the future of social media because users “won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video.” However, it’s not a job that can really be done by machine. Therefore, a small army of content moderators has arisen. While many of these moderators are still based in or around Silicon Valley, the work is increasingly being outsourced to the Philippines. Chen estimates that moderators account for half the people employed by social media companies. Their work involves looking at thousands of disturbing images every day:
Rob became a content moderator in 2010. He’d graduated from college and followed his girlfriend to the Bay Area, where he found his history degree had approximately the same effect on employers as a face tattoo. Months went by, and Rob grew increasingly desperate. Then came the cold call from CDI, a contracting firm. The recruiter wanted him to interview for a position with Google, moderating videos on YouTube. Google! Sure, he would just be a contractor, but he was told there was a chance of turning the job into a real career there. The pay, at roughly $20 an hour, was far superior to a fast-food salary. He interviewed and was given a one-year contract. “I was pretty stoked,” Rob said. “It paid well, and I figured YouTube would look good on a résumé.”
For the first few months, Rob didn’t mind his job moderating videos at YouTube’s headquarters in San Bruno. His coworkers were mostly new graduates like himself, many of them liberal arts majors just happy to have found employment that didn’t require a hairnet. His supervisor was great, and there were even a few perks, like free lunch at the cafeteria. During his eight-hour shifts, Rob sat at a desk in YouTube’s open office with two monitors. On one he flicked through batches of 10 videos at a time. On the other monitor, he could do whatever he wanted. He watched the entire Battlestar Galactica series with one eye while nuking torture videos and hate speech with the other. He also got a fascinating glimpse into the inner workings of YouTube. For instance, in late 2010, Google’s legal team gave moderators the urgent task of deleting the violent sermons of American radical Islamist preacher Anwar al-Awlaki, after a British woman said she was inspired by them to stab a politician.
But as months dragged on, the rough stuff began to take a toll. The worst was the gore: brutal street fights, animal torture, suicide bombings, decapitations, and horrific traffic accidents. The Arab Spring was in full swing, and activists were using YouTube to show the world the government crackdowns that resulted. Moderators were instructed to leave such “newsworthy” videos up with a warning, even if they violated the content guidelines. But the close-ups of protesters’ corpses and street battles were tough for Rob and his coworkers to handle. So were the videos that documented misery just for the sick thrill of it.
“If someone was uploading animal abuse, a lot of the time it was the person who did it. He was proud of that,” Rob says. “And seeing it from the eyes of someone who was proud to do the fucked-up thing, rather than news reporting on the fucked-up thing—it just hurts you so much harder, for some reason. It just gives you a much darker view of humanity.”
Rob began to dwell on the videos outside of work. He became withdrawn and testy. YouTube employs counselors whom moderators can theoretically talk to, but Rob had no idea how to access them. He didn’t know anyone who had. Instead, he self-medicated. He began drinking more and gained weight.
It became clear to Rob that he would likely never become a real Google employee. A few months into his contract, he applied for a job with Google but says he was turned down for an interview because his GPA didn’t meet the requirement. (Google denies that GPA alone would be a deciding factor in its hiring.) Even if it had, Rob says, he’s heard of only a few contractors who ended up with staff positions at Google.
A couple of months before the end of his contract, he found another job and quit. When Rob’s last shift ended at 7 pm, he left feeling elated. He jumped into his car, drove to his parents’ house in Orange County, and slept for three days straight.
Stories like this lead Chen and a number of the experts he interviews to conclude that moderators should receive the same kind of psychological support given to police officers who examine abusive images.