
Facebook will reportedly pay $52 million to employees who suffered PTSD

Facebook has reportedly agreed to pay its content moderators $52 million as part of a settlement in a lawsuit filed by workers who suffered mental health issues as a result of their jobs.

The social media giant will pay a minimum of $1,000 to 11,250 content moderators who developed conditions such as post-traumatic stress disorder from the stressful job of reviewing graphic and disturbing content on Facebook, according to The Verge.


Content moderators who work or have worked in the California, Arizona, Texas, and Florida offices from 2015 onward are eligible for the payout. 


Moderators who have been diagnosed with a mental health condition can reportedly get up to $1,500, and those who have multiple diagnoses can receive up to $6,000. 


On top of a cash payout, content moderators will also get access to one-on-one sessions with a licensed mental health professional or counselor, as well as monthly group therapy sessions, The Verge reported.

As part of the settlement, Facebook also agreed to make changes to its content moderation tools, including options to mute audio and view videos in black and white.

Facebook did not immediately respond to a request for comment from Digital Trends, but told The Verge it is committed to supporting these workers.

“We are grateful to the people who do this important work to make Facebook a safe environment for everyone,” Facebook said in a statement to The Verge. “We’re committed to providing them additional support through this settlement and in the future.”

Facebook’s content moderators are the unsung heroes of the platform, working long hours to ensure graphic content is policed or deleted before it reaches users’ timelines.

Last year, a report from The Verge detailed how one moderator collapsed at his desk while on the clock and died of a heart attack. 

According to the report, moderators at the time received only two 15-minute breaks, a 30-minute lunch break, and a nine-minute “wellness” break.

Content moderators are forced to see images or videos of graphic violence, child pornography, hate speech, conspiracy theories, and even murder day in and day out, all so they can delete awful content before it reaches Facebook’s billions of users. 

Allison Matyus
Former Digital Trends Contributor