“It’s mostly pornography,” says Sarah Katz, recalling her eight-month stint working as a Facebook moderator.
“The agency was very upfront about what type of content we would be seeing, in terms of how graphic it was, so we weren’t left in the dark.”
In 2016, Sarah was one of hundreds of human moderators working for a third-party agency in California.
Her job was to review complaints of inappropriate content, as flagged by Facebook’s users.
She shared her experience with BBC Radio 5 live’s Emma Barnett.
“They capped us on spending about one minute per post to decide whether it was spam and whether to remove the content,” she said.
“Sometimes we would also remove the associated account.
“Management liked us not to work any more than eight hours per day, and we would review an average of about 8,000 posts per day, so roughly about 1,000 posts per hour.
“You pretty much learn on the job, specifically on day one. If I had to describe the job in one word, it would be ‘strenuous’.
“You definitely have to be prepared to see anything after just one click. You can be hit with things really fast without a warning.
“The piece of content that sticks with me was a piece of child pornography.
“Two children – the boy was maybe about 12 and the girl about eight or nine – standing facing each other.
“They weren’t wearing pants and they were touching each other. It really seemed like an adult was probably off camera telling them what to do. It was very disturbing, mostly because you could tell that it was real.
“A lot of these explicit posts circulate. We would often see them pop up from about six different users in one day, so that made it pretty challenging to find the original source.
“At the time there was nothing in the way of counselling services. There might be today, I’m not sure.”
Sarah says she would probably have taken up counselling if it had been offered.
“They definitely warn you, but warning you and actually seeing it are different.”