The Prevalence of Visual Misinformation on Facebook
Category: Technology · Sunday, July 16, 2023, 15:47 UTC

Our study found that 23% of political image posts on Facebook contained misinformation, and that it was distributed unequally across the political spectrum: 5% of left-leaning posts contained misinformation, compared with 39% of right-leaning posts. Images have long been used to manipulate and distort information, particularly by state-sponsored disinformation campaigns.
How much misinformation is on Facebook? Several studies have found that the amount of misinformation on Facebook is low or that the problem has declined over time. That previous work, though, missed most of the story. We are a communications researcher, a media and public affairs researcher, and a founder of a digital intelligence company. We conducted a study showing that massive amounts of misinformation have been overlooked by other studies. The biggest source of misinformation on Facebook is not links to fake news sites but something more basic: images. And a large portion of posted pictures are misleading.
For instance, on the eve of the 2020 election, nearly one out of every four political image posts on Facebook contained misinformation. Widely shared falsehoods included QAnon conspiracy theories, misleading statements about the Black Lives Matter movement and unfounded claims about Joe Biden’s son Hunter Biden.
Visual misinformation by the numbers
Our study is the first large-scale effort, on any social media platform, to measure the prevalence of image-based misinformation about U.S. politics. Image posts are important to study, in part because they are the most common type of post on Facebook, at roughly 40% of all posts.

Previous research suggests that images may be especially potent. Adding images to news stories can shift attitudes, and posts with images are more likely to be reshared. Images have also been a longtime component of state-sponsored disinformation campaigns, like those of Russia’s Internet Research Agency.
We went big, collecting more than 13 million Facebook image posts from August through October 2020, from 25,000 pages and public groups. Audiences on Facebook are so concentrated that these pages and groups account for at least 94% of all engagement – likes, shares, reactions – for political image posts. We used facial recognition to identify public figures, and we tracked reposted images. We then classified large, random draws of images in our sample, as well as the most frequently reposted images.
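Tracking reposted images across millions of posts requires matching near-duplicate pictures at scale. The study does not describe its matching method, but a common technique is perceptual hashing. The following toy sketch illustrates the idea, assuming images have already been reduced to small grayscale grids; the function names and threshold are illustrative, not the authors' pipeline.

```python
# Hypothetical sketch: grouping reposted images with a difference hash
# (dHash). Each image is assumed to be pre-resized to a 9x8 grayscale
# grid; the real study's matching method is not specified.

def dhash(pixels):
    """Compute a 64-bit difference hash from a 9x8 grayscale grid.

    `pixels` is a list of 8 rows, each with 9 intensity values (0-255).
    Each bit records whether a pixel is brighter than its right neighbor,
    which is robust to recompression and small edits.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def group_reposts(images, threshold=5):
    """Cluster images whose hashes differ by at most `threshold` bits.

    `images` is a list of (image_id, pixel_grid) pairs; returns a list of
    (representative_hash, [image_ids]) groups.
    """
    groups = []
    for image_id, grid in images:
        h = dhash(grid)
        for rep, members in groups:
            if hamming(rep, h) <= threshold:
                members.append(image_id)
                break
        else:
            groups.append((h, [image_id]))
    return groups
```

In a real pipeline, the pairwise scan would be replaced by an index over hash prefixes so that 13 million images can be clustered efficiently.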
Overall, our findings are grim: 23% of image posts in our data contained misinformation. Consistent with previous work, we found that misinformation was unequally distributed along partisan lines. While only 5% of left-leaning posts contained misinformation, 39% of right-leaning posts did.
The misinformation we found on Facebook was highly repetitive and often simple. While there were plenty of images doctored in a misleading way, these were outnumbered by memes with misleading text, screenshots of fake posts from other platforms, or posts that took unaltered images and misrepresented them.
For example, a picture was repeatedly posted as "proof" that now-former Fox News anchor Chris Wallace was a close associate of sexual predator Jeffrey Epstein. In reality, the gray-haired man in the image is not Epstein but actor George Clooney.

There was one piece of good news. Some previous research had found that misinformation posts generated more engagement than true posts. We did not find that. Controlling for page subscribers and group size, we found no relationship between engagement and whether a post contained misinformation or not.
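"Controlling for page subscribers and group size" means comparing posts only against posts from similarly sized audiences. The study's actual statistical model is not described here; a minimal stand-in for the idea is to stratify posts by audience size and compare mean engagement within each stratum. The data fields and cutoffs below are invented for illustration.

```python
# Hypothetical sketch: compare engagement of misinformation vs. other
# posts within audience-size strata. This is a simplified stand-in for
# the study's statistical controls, with invented field names and bins.

from statistics import mean

def engagement_gap_by_stratum(posts, bins=(10_000, 100_000)):
    """Return {stratum: (mean misinfo engagement, mean other engagement)}.

    `posts` is a list of dicts with keys: subscribers, engagement, misinfo.
    Strata are defined by subscriber-count cutoffs, so posts are only
    compared against posts from similarly sized pages or groups.
    """
    def stratum(subs):
        for i, cut in enumerate(bins):
            if subs < cut:
                return i
        return len(bins)

    grouped = {}
    for p in posts:
        bucket = grouped.setdefault(stratum(p["subscribers"]),
                                    {True: [], False: []})
        bucket[p["misinfo"]].append(p["engagement"])

    return {
        s: (mean(g[True]) if g[True] else None,
            mean(g[False]) if g[False] else None)
        for s, g in grouped.items()
    }
```

A finding of "no relationship" corresponds to the two means being similar within every stratum, even if large pages get more raw engagement overall.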