New Delhi: When the protests against the Citizenship Amendment Act (CAA) first began in India in December 2019, moderators at Bigo Live were asked to “reduce visibility” of videos involving protest. Bigo Live is a Chinese social media platform that allows people to share live videos. It has over 200 million users in the country. Eventually, the platform told its moderators to ban all content around the protests altogether.
It instantly changed what 200 million Indians could see or perceive. And a handful of Indians effected that change – Bigo’s content moderators, many of whom sit inside the company’s Gurugram office. A lot of the moderation work is also outsourced to a slew of companies that originally cropped up during the business process outsourcing (BPO) boom of the late 1990s. Content moderation is arguably the most important task that BPOs in India perform today, fulfilling outsourced contracts for social media giants such as Facebook, TikTok, LIKEE and Bigo Live, among others.
The outcome: a process-driven BPO industry has become the refuge for quick-fix content moderation, a job that often involves subjective judgment. Add to that the fact that many of the mods are young (the average age is under 30), with some joining even before they finish college, and the problems begin to add up. They decide what you get to see. Or not see. As the number of Indians with access to the internet surges, the mods will decide the reality that everyone gets to witness.
But despite this, there has been very little attention on the world of content moderation in the Indian context. Several teams operate out of back offices, teeming with rows of computers, in cities across the country – from Hyderabad and Bengaluru to Gurugram.
Some of the mods view as many as 6,000 images over the course of a workday. The average time to make a judgment call: 4.5 seconds. Nearly all of them follow an internal, company-specific guideline document, which some insiders snidely describe as a “policy Wikipedia” for its seeming vastness. All of it needs to be memorised, to be able to make split-second decisions. And those guidelines, which are beyond the scrutiny of those outside a company, carry immense power.
Protests against a government policy can get blocked, for instance. There have also been allegations that TikTok hides the content of “ugly, poor or disabled users” (the company has said these reports are based on internal guidelines that have since changed). This week, news reports have also poured in about how Facebook is seemingly blocking posts about Covid-19, even from legitimate news outlets like Buzzfeed and USA Today. The initial suspicion is that an algorithm might have gone rogue.
With so much at stake, even the Indian government has waded in, filing an affidavit in the Supreme Court recently in which it claimed regulation was necessary because social media can cause “unimaginable disruption” to democracy.
Nearly everyone agrees that the status quo is not ideal, and that includes even the content moderators who sit in front of a terminal for 9 hours a day, or longer, mindlessly clearing images and video. “I don’t know what brain made these policies (guidelines). You can’t question them,” said Sania (name changed), who moderates content for Facebook, through its India partner Accenture.
But nearly everyone also agrees on one other thing: the government mandating how moderation gets done can quickly turn into surveillance, silencing of speech, or worse. Caught between the social media giants, obscure guideline documents, and the government are the users themselves. Who will prioritize and protect their interests?
Striking that balance won’t be easy, said Saravjeet Singh, a research fellow at the National Law University, Delhi. To begin with, social media platforms could at least have rules for who is hired for moderation and what their qualifications are, he said. How moderators are trained, and the support systems available to those who witness and filter out violence, may also need serious review. Because an average day in the life of a content moderator is often harder than it seems.
Robotic work
“I don’t know what I’m doing here,” said Sania, the moderator quoted above, who acts as a gate-keeper on Facebook. She works in the “image queue”, which involves looking at hundreds, even thousands, of images and deciding whether they are appropriate for Facebook. She has been given a set of policies that she is supposed to follow without question, applying as little of her own judgment as possible.
That is the case with almost every social media firm’s mod unit, according to multiple moderators Mint interviewed over the past month. Nearly all of them are mandated to sign a non-disclosure agreement, which means they can’t reveal their identities.
During a recent workday, Sania landed on an image of a person showing off a freshly made, still-bloody tattoo. This must be tagged as a violation of Facebook’s guidelines, even though she knows it shouldn’t offend anyone. “Some policies are very stupid. In many cases, we’re supposed to tag something as a violation even when it’s evidently not offending,” she said. For instance, any edible item that resembles human genitalia must be tagged as adult content, according to her.
She likened her work to school life: it’s all about what percentage you score. She has to keep up her accuracy or she will lose her job. Her seniors tell her she has to maintain an accuracy rate of at least 90%, but she hasn’t been able to cross 85% yet.
Sania says that’s the bare minimum she needs to make sure she doesn’t get fired. “Hardly 10 people out of 70 will get that score (of 90%),” she says. Sania reviews 5,700-6,000 images per day. She can take longer than 4.5 seconds if required, but at the risk of not meeting her targets, which in turn reduces her “efficiency”. Those who don’t make the cut are fired. Attrition is high.
Team leaders (the moderators call them TLs) have scores that depend on the average score of the people under them. According to Sania, while they may be slightly better paid, they are in the same boat. “If you’re doing it wrong, then everyone do (sic) it wrong,” is what Sania’s TL tells his team.
But how difficult can it be to check an image against a bunch of guidelines? Well, there aren’t just 10 short points to follow. The document is thousands of pages long, and the faster a moderator gets accustomed to all of it, the better for them. Even then, they still have to watch out for changes in those guidelines, which often happen on the fly. Missing a change will hurt their accuracy.
“Earlier, we were told to ban all videos where one could see cleavage. Now, we only ban extreme cases,” said another moderator who only identified her client as a “well established TikTok-like platform”.
Like many in this industry, Sania does moderation simply as a means to an end, until she can find another job. But she’s stuck here because her team leader refuses to grant her any leave. “You can’t even fall sick for more than 7 days in a year,” she said.
Stress and trauma
But Sania’s life is perhaps one of the happier ones in the industry. Her biggest problem with the job is that it is boring and she learns nothing from it. There are those in the industry who suffer major trauma as a result of what they are forced to watch at work, which affects their lives, relationships and daily interactions. Post-traumatic stress disorder and clinical depression are quite common, said a counsellor appointed by Facebook to help moderators deal with the stress of their jobs.
“Normal people do not go back to their trauma every day or they can avoid it after a therapy session. People who work in content moderation do not have that luxury,” one company-appointed counsellor said. “Sometimes, they may already be suffering from depression and this triggers it further.”
Many moderators feel worthless and hopeless about life and about the world, she explained. It affects their daily life drastically, according to the counsellor. They aren’t allowed to carry a phone inside or look at anything other than the content they’re moderating during their shift, which makes their condition worse. Sania and three other moderators confirmed this, saying they aren’t even allowed a pen and paper at their desks.
Job-induced trauma is a global problem. India, however, poses a unique challenge. Getting therapy is a taboo in the country, even among many urban elites. Another counsellor said that many working in this industry locally aren’t even aware of its impact on their mental health.
Many refuse to open up to the therapist, or don’t take interest in the sessions because they don’t believe they need it. In fact, therapists are called “wellness coaches” to work around the taboo. But a wellness coach can only do so much, with 300 people on the floor and a 9-hour shift. When people watch content around self-harm all day, they eventually try it on themselves, one therapist said.
But despite that, the level of awareness is low. “Indians don’t feel trauma that easily. We take it as part of our job,” a Facebook moderator said, having just admitted that he used to keep imagining car accidents whenever he was driving. “I got over it. It’s all good now,” he said.
“I don’t think this job should exist,” the second counsellor cited above said. “It affects these children’s mental health badly,” she added. Most moderators have taken the job to add experience to their resumes, so they can apply for other jobs. Some have degrees in economics, some in biotechnology, and many in journalism.
Currently, there is no entry bar. A moderator from TikTok’s Gurgaon office said he joined before finishing college. Another said she worked while preparing for entrance exams, and failed them because her managers refused to give her leave to study.
How much stress and trauma a mod faces depends entirely on the kind of process, or “queue”, they are on. A queue is basically the kind of content one has to look at. Particularly disturbing are the queues that bring graphic violence, pornography, terrorism, and suicides, said multiple moderators. These are the ones that leave one scarred and cause trauma.
“Sometimes, you can see a body part being mutilated quite clearly, in which case you can ban a video without spending much time on it. But, at times, you have to find a finger lying on the road in the video of an accident. You have to watch the video till you do, irrespective of how long the video is,” said a Facebook moderator from Genpact, who used to moderate approximately two to three hundred videos a day while he worked there.
The floor is akin to a sweatshop, he said. Managers tell employees that they’re doing the “most important job in the world” because they protect people from all the content that could harm them, but in the daily grind, the focus is on maintaining 90%-plus accuracy rather than protecting free speech or avoiding graphic content.
Guideline is king
When asked how mistakes happen, or why posts or images sometimes get banned or removed when they shouldn’t have been, every moderator says they are simply following the guidelines. “It doesn’t matter what people think. It matters what the guidelines say,” said a moderator with LIKEE.
Neither LIKEE nor Bigo Live responded to an email seeking comment.
On TikTok, LIKEE and Bigo Live, the newer short-video and live-streaming apps going viral these days, luxuries like counsellors aren’t available. Four moderators, two from TikTok and one each from the other two firms, said these platforms do not provide counsellors. Nor are there any provisions for recreation of the kind Facebook and YouTube offer.
When 21-year-old Vaibhav (name changed), who works on TikTok’s moderation team, spoke to his seniors about the need for counsellors, he wasn’t provided one. He was instead asked to moderate 250 videos per hour, down from his usual target of 600. “At the very least, people become irritable because of the job,” he said. TikTok did not answer questions about whether counsellors are provided.
Vaibhav works on the “keyframes” team, which reviews still frames from videos instead of the full audio-visual clip. He moderates 4,000 videos per day, while a moderator from LIKEE said his team does 5,000 videos per day per moderator.
Vaibhav and another moderator who used to lead teams of moderators at Bytedance’s Gurgaon office said that they can take one of three actions on a video: approve it, ban it, or choose a third option that makes the video visible only to the uploader. This, according to moderators, is what many users, like TikTok influencer Ajay Burman, have referred to as a “shadow ban” – something TikTok has denied doing for months now. The company didn’t respond to a question about this either, but instead said: “we deploy a combination of technology and human moderation to review content and take appropriate actions if they violate our community guidelines”.
Such shadow bans happen on Bigo Live and LIKEE too, moderators say. They also have the power to ban a user for a limited period or stop them from uploading videos to the platforms.
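For illustration only: the three-way decision the moderators describe could be modelled roughly like the sketch below, a hypothetical Python snippet written for this piece. The names and structure are invented and are not drawn from TikTok’s, LIKEE’s or Bigo Live’s actual systems.

from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    APPROVE = auto()      # video stays publicly visible
    BAN = auto()          # video is removed from the platform
    SHADOW_BAN = auto()   # video stays up, but only the uploader can see it

@dataclass
class Video:
    uploader_id: str
    removed: bool = False
    uploader_only: bool = False

def apply_action(video: Video, action: Action) -> Video:
    # Record a moderator's decision on a (hypothetical) video object.
    if action is Action.BAN:
        video.removed = True
    elif action is Action.SHADOW_BAN:
        # The uploader still sees their own video; nobody else ever will.
        video.uploader_only = True
    return video

def can_view(video: Video, viewer_id: str) -> bool:
    # Would this viewer see the video in their feed?
    if video.removed:
        return False
    if video.uploader_only:
        return viewer_id == video.uploader_id
    return True

In this sketch, a “shadow ban” is nothing more than a visibility flag: the uploader’s own feed looks unchanged, which is why many users don’t realise anything has happened.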
A moderator for Bigo Live said they don’t allow content which includes “topless men”, “people smoking on the platform”, etc. For political content, the tolerance is even lower. “If the government says a Shaheen Bagh protest has to be banned, it will be banned,” he said.
Update: Facebook did not acknowledge two emails asking for comment on this story, but got in touch after it was published to share the following statement:
“Facebook’s global team of 15,000 moderators do vital work to keep our community safe and we take our responsibility to ensure their well-being incredibly seriously. We have standard, global agreements in place with all our partners, and work closely with them to provide the support people need. This includes training, psychological support and technology and filters to limit exposure to graphic content.” — a Facebook company spokesperson