Our awareness of fake news may be higher, but there’s little sign of it losing its disruptive power. Is your organization prepared to deal with a fake news storm?
2018 saw a concerted effort by social media companies and news organizations to try to combat the rise of fake news. Facebook, reeling from criticism of its alleged role in the 2016 US presidential election, announced a slew of measures to root out false stories. It also launched a PR offensive about its efforts, proclaiming “fake news is not our friend”.
The good news is that techniques to spot fake news are evolving fast, particularly through the use of artificial intelligence (AI). Several sites, such as Snopes and Hoaxy, now offer ways to check the veracity of stories. The not-so-good news is that AI is also enabling more sophisticated fake news generation. Take the AI technology used to create convincing fake videos of Barack Obama delivering speeches he never gave. Put that into the hands of a determined and powerful rogue player, and the potential to disrupt democracy or cause mass panic or unrest is obvious.
Emotional appeal
The power and potential spread of fake news comes not so much from the tech used to create and disseminate it (although bots that generate huge volumes of likes, comments and shares undoubtedly play a vital role) as from the visceral response it can stir in us as humans.
A study by researchers at MIT shows that fake news travels faster and wider than truth because of the emotion it triggers. Fake stories often appear more novel than truth, prompting surprise, fear and disgust – and with it the urge to comment and share.
Unchecked, the consequences can be deadly. False news stories that have taken hold on social media have, in extreme cases, led to vigilante-style attacks: in Nigeria, on two men falsely accused of child abduction in Mexico, and at a pizza restaurant in Washington DC wrongly suspected of harboring a child sex ring.
Tackling fake news
For crisis communications, these disturbing examples are a reminder that baseless accusations can appear suddenly and often for no obvious reason. During the crisis exercises we run with clients, we’ll sometimes throw in fake content, say from a spoof account, to test whether, and how quickly, the team tries to douse that particular flame.
While every scenario is different, there are some best practices your organization can follow to prepare for and deal with a fake news event (and stop it from turning into a full-blown, real-life crisis).
- Have a crisis plan in place and share it with key colleagues: Your plan should cover your crisis team’s roles and responsibilities.
- Practice your plan: Running through the plan, ideally in an exercise, is the best way to prepare for the real thing. If possible, include fake news scenarios.
- Monitor mentions of your organization on social: This is your early warning system that a fake story may be circulating (see the monitoring sketch after this list).
- Establish your own channels as the authoritative, single source of truth: This takes time and effort, but the more you can be open and transparent about your own activities in ‘peacetime’, the more evidence you provide to those debunking fake news should a crisis strike.
- Publish your truth on all your channels: If a fake story drops, use every available outlet that you control: social, website, intranet, print media (if the falsehood runs for long enough).
- Use fast facts: Prepare these in advance so you can publish and share them quickly. Answer the key questions that stakeholders are likely to ask.
- Call on your allies to share your truth: If you’ve won their trust in peacetime, you can call on stakeholders, partners, journalists, employees and loyal customers to fight your corner now.
- Don’t comment on fake news posts or mention spoof account names: Social networks’ algorithms mean your mentions could inadvertently promote the fake story.
- Instead, report fake content to publishers: Social media channels, news organizations, and search engines have made it easier to report dubious stories or posts.
- Ask social media networks to shut down fake accounts: Worth a shot, but often easier said than done – for example, if the account openly describes itself as a spoof.
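On the monitoring point above, the sketch below shows one way automated social listening could work in practice. It is a minimal illustration, not any particular vendor’s API: the endpoint, response shape, query terms and alert threshold are all assumptions you would replace with those of your own listening tool. The useful part is the logic: poll for mentions of your organization, compare the volume against a recent baseline, and alert the crisis team when it spikes.

```python
"""
Minimal social-listening sketch: poll a mentions feed and flag unusual spikes.
The endpoint, response fields, query terms and thresholds are illustrative
assumptions, not a specific platform's API.
"""
import time
from collections import deque

import requests

MENTIONS_URL = "https://listening-tool.example.com/api/mentions"  # hypothetical endpoint
BRAND_TERMS = ["Acme Corp", "#acmecorp"]                          # terms to watch (assumed)
WINDOW = deque(maxlen=12)      # counts from the last 12 polls (~1 hour at 5-minute intervals)
SPIKE_FACTOR = 3               # alert when volume triples versus the recent average
POLL_SECONDS = 300


def fetch_mention_count() -> int:
    """Return how many recent posts mention the brand (response shape is assumed)."""
    resp = requests.get(
        MENTIONS_URL,
        params={"q": " OR ".join(BRAND_TERMS), "since": "15m"},
        timeout=10,
    )
    resp.raise_for_status()
    return len(resp.json().get("posts", []))


def main() -> None:
    while True:
        count = fetch_mention_count()
        baseline = sum(WINDOW) / len(WINDOW) if WINDOW else count
        if WINDOW and count > SPIKE_FACTOR * max(baseline, 1):
            # In a real setup, route this to the crisis team (email, chat, pager).
            print(f"ALERT: mention volume spiked to {count} (recent baseline ~{baseline:.0f})")
        WINDOW.append(count)
        time.sleep(POLL_SECONDS)


if __name__ == "__main__":
    main()
```

Dedicated monitoring tools will do most of this for you; the point is that spike detection against a baseline, wired into your crisis team’s alerting, is what turns social listening into a genuine early warning system.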