An effort by WhatsApp to crack down on the spread of misinformation circulating on its platform appears to be bearing fruit.
The company said that following action taken earlier this month, it has seen a 70% reduction in the number of what it calls “highly forwarded” messages, in other words, viral content that may carry misinformation.
The action comes as social media companies focus more closely on keeping a lid on misleading and erroneous content linked to the coronavirus. While it’s easier to monitor content on more open platforms such as Facebook, Twitter, and YouTube, WhatsApp faces a greater challenge because its end-to-end encryption prevents the company from reading the messages sent on its service.
To reduce forwarding, WhatsApp recently made a change so that a message that’s already been shared at least five times can only be forwarded to one person or group at a time. The tweak has significantly reduced forwarding and, presumably, the misleading viral content that travels with it.
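The rule itself is simple to model. Here’s a minimal sketch in Python of the forwarding limit as described above; the Message class, the forward_count field, and the five-chat limit for ordinary messages are illustrative assumptions rather than anything WhatsApp has published. Only the five-share threshold and the one-chat-at-a-time restriction come from the company’s announcement.

```python
# A minimal sketch of the forwarding-limit rule described above, not WhatsApp's
# actual implementation. The Message class, the forward_count field, and the
# five-chat limit for ordinary messages are illustrative assumptions; only the
# "shared at least five times" threshold and the one-chat-at-a-time restriction
# come from the article.

from dataclasses import dataclass

HIGHLY_FORWARDED_THRESHOLD = 5   # prior shares before the stricter limit applies
ORDINARY_FORWARD_LIMIT = 5       # assumed multi-chat limit for other messages


@dataclass
class Message:
    text: str
    forward_count: int = 0  # how many times this message has already been shared

    @property
    def is_highly_forwarded(self) -> bool:
        return self.forward_count >= HIGHLY_FORWARDED_THRESHOLD

    def max_forward_targets(self) -> int:
        # Highly forwarded messages may be passed on to only one chat at a time.
        return 1 if self.is_highly_forwarded else ORDINARY_FORWARD_LIMIT


def forward(message: Message, target_chats: list[str]) -> list[str]:
    """Return the chats this forward would actually be allowed to reach."""
    allowed = target_chats[: message.max_forward_targets()]
    message.forward_count += 1
    return allowed


if __name__ == "__main__":
    viral = Message("Have you seen this?!", forward_count=6)
    print(viral.is_highly_forwarded)                         # True
    print(forward(viral, ["Family", "Work", "Neighbours"]))  # ['Family'] only
```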
“WhatsApp is committed to doing our part to tackle viral messages,” the company said in a statement this week, adding, “We recently introduced a limit to sharing ‘highly forwarded messages’ to just one chat. Since putting into place this new limit, globally there has been a 70% reduction in the number of highly forwarded messages sent on WhatsApp. This change is helping keep WhatsApp a place for personal and private conversations.”
While WhatsApp acknowledges that forwarding can be useful as well as fun, it said that before it added the limits, it had noticed a “significant” increase in the amount of sharing, which, it pointed out, “can feel overwhelming and can contribute to the spread of misinformation.”
You can tell when a message has been forwarded multiple times, as WhatsApp marks it with a double-arrow icon.
In a similar effort, Facebook said it will soon begin alerting users who viewed COVID-19-related content that was later found to be false.
The alerts will show up in your News Feed if you’ve liked, reacted to, or commented on a post that Facebook subsequently flagged as harmful and removed. Affected users will also be directed to a World Health Organization “myth busters” website.
“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” the social networking giant said.
Digital Trends has some useful tips on how to spot misinformation online.