It's no secret that Meta, the parent company of Facebook, Instagram, and WhatsApp, relies on contract workers to do much of the heavy lifting in enforcing its content moderation policies.
Even though these workers are attached to some of the most valuable companies in the world, they often complain that their compensation is not commensurate with the demands of the job and the toll it takes on their mental health.
Some workers now also say they are treated worse than their colleagues.
According to BuzzFeed News, Meta subcontractor Genpact, previously accused of fostering poor working conditions, has required Spanish-speaking moderators at its Richardson, Texas office to report to work in person since April 2021.
These workers had to risk exposure to the delta and omicron variants of the coronavirus, while their English-speaking colleagues were allowed to cycle in and out of the office on three-month rotations.
News of the situation at Genpact comes just one week after workers at Accenture, another Meta subcontractor, successfully protested a requirement that would have forced hundreds of Facebook moderators to return to full-time in-office work on January 24.
The contractors who spoke to BuzzFeed News claim Genpact also imposes unreasonable performance standards: they say they are expected to make each moderation decision in about a minute while maintaining an 85 percent accuracy rate.
Complicating matters, Meta reportedly does not distribute guidelines on how to apply Facebook's Community Standards in languages other than English, leaving these workers to translate the guidelines themselves before they can apply them.
The scale of the problem the team faces is considerable. Genpact's Spanish-language moderation team is based in Mexico, but beyond moderating content posted by people in that country, it is also responsible for Facebook and Instagram posts from Spanish-speaking users across most of Latin America.
In Mexico alone, Facebook has more than 84 million users. By contrast, Genpact's team covering the Mexico market consists of about 50 people.
"We use a combination of technology and people to keep content that violates our rules off our platform, and while AI has made progress in this area, people are an important part of our safety efforts," a Meta spokesperson told Engadget on Saturday (1/15/2022).
"We know this job can be difficult, which is why we are working closely with our partners to continuously evaluate how best to support this team," the spokesperson added.