Study finds why people think they’re right, even when they’re wrong

New research reveals why people often believe they have enough information to make decisions, even when they don’t.

A new study explores the illusion of information adequacy. (CREDIT: CC BY-SA 3.0)

A new study sheds light on why you might be overconfident in disagreements with friends or colleagues, even when you lack all the necessary information. Researchers found that people often assume they have enough knowledge to make an informed decision or stand by their position, even when this isn't the case. This phenomenon is called the “illusion of information adequacy.”

Angus Fletcher, a professor of English at The Ohio State University and co-author of the study, explains, "We found that, in general, people don’t stop to think whether there might be more information that would help them make a more informed decision."

Fletcher worked on the study with co-authors Hunter Gehlbach, an educational psychologist at Johns Hopkins University, and Carly Robinson, a senior researcher at Stanford University. The study, published in PLOS ONE, reveals how easily people fall into this illusion.

Mean proportions of participants from each condition recommending merging, with 95% CIs. (CREDIT: PLOS ONE)

The research involved 1,261 American participants who took part in an online experiment. They were divided into three groups and asked to read an article about a fictional school facing a water shortage.

The first group read an article that presented arguments for merging the school with another that had adequate water. The second group read arguments for keeping the school separate and looking for other solutions. The third, the control group, received both sides of the argument.

The results were striking. Participants who only read one side of the issue—either just the pro-merger or anti-merger arguments—believed they had enough information to make a well-informed decision. Even though they only had half the story, most were confident enough to follow the recommendations of the article they read.

“Those with only half the information were actually more confident in their decision to merge or remain separate than those who had the complete story,” Fletcher noted. "They were quite sure that their decision was the right one, even though they didn’t have all the information."

Interestingly, participants who were exposed to only one side of the argument also thought that most other people would agree with their decision. This highlights how the illusion of information adequacy can not only affect individual decisions but can also make people believe that their viewpoint is the majority opinion.

However, there is a silver lining. Some participants, after reading just one side of the story, were later presented with arguments from the opposing perspective. Many of them were willing to change their minds once they had access to all the information. This suggests that providing people with the complete picture can help counteract the illusion and lead to more informed decisions.

Treatment groups’ mean endorsements of the merging recommendation before (T1) and after (T2) reading additional information, with 95% CIs. (CREDIT: PLOS ONE)

Fletcher acknowledged that this approach may not work in every situation, especially when it comes to deeply ingrained ideological beliefs. In such cases, new information might be distrusted, or people may find ways to reinterpret the facts to fit their preexisting views. “But most interpersonal conflicts aren’t about ideology,” Fletcher said. “They are just misunderstandings in the course of daily life.”

These findings complement previous research on naïve realism, a psychological concept that refers to the tendency for people to believe their subjective understanding of a situation is the objective truth. While naïve realism often focuses on how people have different perceptions of the same situation, the illusion of information adequacy points out that people can share the same incomplete understanding—and believe it's enough to make a sound judgment.

Fletcher, who studies how narratives influence human behavior, emphasizes the importance of making sure you have all the relevant facts before taking a firm stance or making a decision. "As we found in this study, there’s this default mode in which people think they know all the relevant facts, even if they don’t," he said.

Descriptive statistics and correlations. (CREDIT: PLOS ONE)

So what can you do the next time you find yourself in a disagreement? According to Fletcher, the key is to ask yourself, "Is there something that I’m missing that would help me see their perspective and understand their position better?" This simple step could help you overcome the illusion of information adequacy and foster better, more informed decision-making.

Note: Materials provided above by The Brighter Side of News. Content may be edited for style and length.

Joshua Shavit, Science and Good News Writer
Joshua Shavit is a bright and enthusiastic 18-year-old student with a passion for sharing positive stories that uplift and inspire. With a flair for writing and a deep appreciation for the beauty of human kindness, Joshua has embarked on a journey to spotlight the good news that happens around the world daily. His youthful perspective and genuine interest in spreading positivity make him a promising writer and co-founder at The Brighter Side of News.