Rewards for Social Media: Could Financial Incentives Reduce Misinformation?

Category Technology

tldr #

Our research, presented at the 2023 Nobel Prize Summit, found that financial rewards for accuracy can encourage social media users to form habits of sharing high-quality content. We also found that social media platforms tend to promote the sharing of attention-grabbing, controversial content regardless of its accuracy. These findings have broad implications for changing user behaviour on social media by restructuring how platforms reward sharing.


content #

Is social media designed to reward people for acting badly? The answer is clearly yes, given that the reward structure on social media platforms relies on popularity, as indicated by the number of responses – likes and comments – a post receives from other users. Black-box algorithms then further amplify the spread of posts that have attracted attention. Sharing widely read content is not a problem in itself. It becomes a problem when attention-grabbing, controversial content is prioritized by design. Given this design, users form habits of automatically sharing the most engaging information regardless of its accuracy and potential harm. Offensive statements, attacks on out-groups and false news are amplified, and misinformation often spreads further and faster than the truth.
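
To make that reward dynamic concrete, here is a minimal sketch in Python. The article does not describe any platform's actual ranking formula, so the `Post` fields, the two scoring functions and the numbers below are illustrative assumptions only; the point is simply that a score built from reactions alone surfaces the most engaging post, while folding in an accuracy signal can change which post gets amplified.

```python
# Hypothetical illustration -- not a real platform's algorithm. A score built
# only from reactions amplifies whatever grabs attention; weighting the same
# score by an (assumed) accuracy signal changes which post rises to the top.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    accuracy: float  # 0.0 (false) to 1.0 (verified); assumed known for this sketch

def engagement_score(post: Post) -> float:
    """Popularity-only reward: reactions are all that count."""
    return post.likes + 2 * post.comments

def accuracy_weighted_score(post: Post) -> float:
    """Same engagement signal, discounted by the accuracy estimate."""
    return engagement_score(post) * post.accuracy

posts = [
    Post("Outrageous false claim", likes=900, comments=400, accuracy=0.1),
    Post("Careful factual report", likes=300, comments=80, accuracy=0.95),
]

top_by_engagement = max(posts, key=engagement_score)
top_by_accuracy = max(posts, key=accuracy_weighted_score)
print(top_by_engagement.text)  # "Outrageous false claim"
print(top_by_accuracy.text)    # "Careful factual report"
```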

The study found that when financial rewards for accuracy were introduced, participants shared significantly more accurate content than before, and they continued to do so even after the rewards had been removed.

We are two social psychologists and a marketing scholar. Our research, presented at the 2023 Nobel Prize Summit, shows that social media can instead be used to build user habits of sharing high-quality content. With a few tweaks to a platform's reward structure, users begin to share information that is accurate and fact-based.

The problem with habit-driven misinformation-sharing is significant. Facebook’s own research shows that being able to share already shared content with a single click drives misinformation. Thirty-eight percent of views of text misinformation and 65% of views of photographic misinformation come from content that has been reshared twice, meaning a share of a share of a share of an original post. The biggest sources of misinformation, such as Steve Bannon’s War Room, exploit social media’s popularity optimization to promote controversy and misinformation beyond their immediate audience.

At the same time, financial incentives for sharing misinformation had no effect on participants' accuracy: those with such incentives shared information just as accurately as those without any incentives.

To investigate the effect of a new reward structure, we gave financial rewards to some users for sharing accurate content and not sharing misinformation. These financial rewards simulated the positive social feedback, such as likes, that users typically receive when they share content on platforms. In essence, we created a new reward structure based on accuracy instead of attention.

As on popular social media platforms, participants in our research learned what got rewarded by sharing information and observing the outcome, without being explicitly informed of the rewards beforehand. This means that the intervention did not change the users' goals, just their online experiences. After the change in reward structure, participants shared significantly more content that was accurate. More remarkably, users continued to share accurate content even after we removed rewards for accuracy in a subsequent round of testing. These results show that, given the right incentives, users can form a habit of sharing accurate information.
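
The learning process described here – acting, observing the reward and gradually repeating whatever pays off – can be sketched as a simple trial-and-error learner. This is a hypothetical toy model, not the study's actual paradigm or data; the reward values, learning rate and exploration rate are arbitrary assumptions chosen only to show how the reward structure, rather than explicit instruction, shapes what gets shared.

```python
# A minimal, hypothetical sketch of reward-driven habit learning -- not the
# authors' experiment or data. The simulated user is never told the rule: it
# simply shares, observes the reward, and drifts toward whatever paid off.

import random

def simulate(reward_for_accurate: float, reward_for_clickbait: float,
             rounds: int = 500, lr: float = 0.1, explore: float = 0.1) -> float:
    """Return the learner's final tendency to share accurate content (0 to 1)."""
    value = {"accurate": 0.0, "clickbait": 0.0}   # learned action values
    for _ in range(rounds):
        # Occasionally explore; otherwise repeat whichever action has paid off more.
        if random.random() < explore:
            action = random.choice(["accurate", "clickbait"])
        else:
            action = max(value, key=value.get)
        reward = reward_for_accurate if action == "accurate" else reward_for_clickbait
        # Simple incremental update toward the observed reward.
        value[action] += lr * (reward - value[action])
    # Convert learned values into a crude share-accurate tendency.
    total = sum(max(v, 0.0) for v in value.values()) or 1.0
    return max(value["accurate"], 0.0) / total

random.seed(0)
print("accuracy rewarded:  ", round(simulate(reward_for_accurate=1.0, reward_for_clickbait=0.0), 2))
print("engagement rewarded:", round(simulate(reward_for_accurate=0.0, reward_for_clickbait=1.0), 2))
```

Under these toy assumptions the simulated sharer ends up almost always choosing whichever option the reward structure favours, which is the gist of the habit-formation argument above.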

Voluntary human behaviour is likely to be more effective at curbing the spread of misinformation than automated algorithms are.

A different group of users received rewards for sharing misinformation and for not sharing accurate content. Surprisingly, their sharing most resembled that of users who shared news as they normally would, without any financial reward. The striking similarity between these groups reveals that social media platforms encourage users to share attention-grabbing information, regardless of accuracy.

Our research has broad implications for the incentives and disincentives that could encourage more accurate information sharing on social media. Changing how content is rewarded may be relatively low-hanging fruit for shifting people's habits away from sharing misinformation.

Research by MIT scholars, published in Science, found that false stories on Twitter were 70% more likely to be retweeted than true stories.
