Visual misinformation is widespread on Facebook – and often undercounted by researchers
- Several studies have found that the amount of misinformation on Facebook is low or that the problem has declined over time.
- The biggest source of misinformation on Facebook is not links to fake news sites but something more basic: images.
- For instance, on the eve of the 2020 election, nearly one out of every four political image posts on Facebook contained misinformation.
Visual misinformation by the numbers
- Overall, our findings are grim: 23% of image posts in our data contained misinformation.
- Some previous research had found that misinformation posts generated more engagement than true posts.
- Controlling for page subscribers and group size, we found no relationship between engagement and the presence of misinformation.
- Misinformation didn’t guarantee virality – but it also didn’t diminish the chances that a post would go viral.
- But image posts on Facebook were toxic in ways that went beyond simple misinformation.
Yawning gap in knowledge
- While Facebook remains the most used social media platform, more than a billion images a day are posted on Facebook’s sister platform Instagram, and billions more on rival Snapchat.
- Videos posted on YouTube, or on the more recent arrival TikTok, may also be an important vector of political misinformation about which researchers still know too little.
- Perhaps the most disturbing takeaway from our study, then, is how much researchers collectively still don't know about misinformation on social media.