How WhatsApp is outshining Facebook in the fight against fake news
WhatsApp is mixing things up in a bid to tackle fake news, limiting the forwarding of messages to a maximum of 20 people to curb the spread of false information. The move comes hot on the heels of Facebook CEO – and owner of WhatsApp’s parent company – Mark Zuckerberg’s comments about not silencing Holocaust deniers and faux information purveyors on the platform.
WhatsApp was bought by Facebook for $19 billion (£14.6 billion) back in 2014, and WhatsApp’s co-founder has since decried Facebook’s bad behaviour.
Both companies have become embroiled in a battle against fake news, with Zuckerberg’s congressional grilling earlier this year spurring on efforts. However, it seems that WhatsApp is taking the lead, with news that it will limit message forwarding to 20 people, making it far harder to disseminate information widely at speed.
The rules in India will be particularly stringent, with users limited to forwarding messages to just five people, after the spread of misinformation led to some 30 lynchings of people falsely accused of kidnapping in the past year.
That’s not all WhatsApp is shaking up. It now flags messages that have been forwarded, letting users know that the information they receive does not originate with the sender. Tweaks like this make users warier of taking information as gospel; sources may be dubious, and content falsified.
WhatsApp has also invested in newspaper ads worldwide detailing how users can be more alert to the presence of false information on the platform. Pared-down infographics suggest tactics like “questioning information that upsets you”, “checking photos in messages carefully” and “using other sources”.
Hear, hear. It’s been a long time coming, and the delay was a morally dubious one at that. We’re lauding WhatsApp for its proactivity. Now if only its unwieldy parent company could be quite so vigilant…