Facebook has reportedly agreed to pay $52 million to its content moderators to settle a lawsuit filed by workers who suffered mental health issues as a result of their jobs.
The social media giant will pay a minimum of $1,000 each to 11,250 content moderators who developed conditions such as post-traumatic stress disorder from the strain of reviewing graphic and disturbing content on Facebook, according to The Verge.
Content moderators who work or have worked in offices in California, Arizona, Texas, and Florida from 2015 onward are eligible for the payout.
Moderators who have been diagnosed with a mental health condition can reportedly get up to $1,500, and those who have multiple diagnoses can receive up to $6,000.
On top of the cash payout, content moderators will also get access to personal sessions with a licensed mental health professional or counselor, as well as monthly group therapy sessions, The Verge reported.
As part of the settlement, Facebook also agreed to change its content moderation tools, including muting audio and displaying videos in black and white.
Facebook did not immediately respond to a request for comment from Digital Trends, but told The Verge that it is committed to helping these workers.
“We are grateful to the people who do this important work to make Facebook a safe environment for everyone,” Facebook said in a statement to The Verge. “We’re committed to providing them additional support through this settlement and in the future.”
Facebook’s content moderators are the platform’s unsung heroes, working long hours to ensure graphic content is flagged or deleted from users’ timelines.
Last year, a report from The Verge detailed how one moderator collapsed at his desk while on the clock and died of a heart attack.
According to the report, moderators at the time received only two 15-minute breaks, a 30-minute lunch break, and a 9-minute “wellness” break.
Day in and day out, content moderators are forced to view graphic violence, child pornography, hate speech, conspiracy theories, and even murder, all so they can delete that content before it reaches Facebook’s billions of users.