The rate of misinformation online has skyrocketed, and many people no longer seem to care whether what they post will affect others. Take a look at this article by John Timmer, science editor at Ars Technica. Here is an extract:
Piecing together why so many people are willing to share misinformation online is a major focus among behavioral scientists. It’s easy to think partisanship is driving it all—people will simply share things that make their side look good or their opponents look bad. But the reality is a bit more complicated. Studies have indicated that many people don’t seem to carefully evaluate links for accuracy and that partisanship may be secondary to the rush of getting a lot of likes on social media. Given those results, it’s not clear what induces users to stop sharing things that a small bit of checking would show to be untrue.
So, a team of researchers tried the obvious: We’ll give you money if you stop and evaluate a story’s accuracy. The work shows that small payments and even minimal rewards boost people’s ability to evaluate whether stories are accurate or not. Nearly all of that effect is due to people recognizing stories as factually accurate that don’t favor their political stance. While the cash boosted the accuracy of conservatives more, they were so far behind liberals in judging accuracy that the gap remains substantial.
Money for accuracy
The basic outline of the new experiment is pretty simple: get a bunch of people, ask them about their political leanings, and then show them a bunch of headlines as they would appear on a social media site such as Facebook. The headlines were rated based on their accuracy (i.e., whether they were true or misinformation) and whether they would be more favorable to liberals or conservatives.
Read full article here.