The human brain naturally tends toward a positive outlook, and that tendency is the scientific explanation behind seeing the silver lining in every dark cloud.

While this understanding leaves many questions unanswered, it sheds light on the concept of "prediction error" — the gap between what a person expects and what actually happens — and its role in a person's optimism.

Researchers have studied human optimism for decades, examining how people form expectations about the likelihood of good things happening to them.

Tali Sharot, PhD, a professor at University College London, wanted to find out why so many people develop what seems like pathological optimism even when statistics show that bad things are generally likely to happen to them.

In Dr. Sharot's study, 19 volunteers were presented with a series of hypothetical disasters and asked to estimate the odds of each one happening to them.

After the volunteers had given their estimates, the researchers showed them the actual statistics for each disaster. Some participants had overestimated the likelihood of the bad scenarios, while others had underestimated it.

After seeing the actual statistics, the participants were asked to assess the same scenarios again, this time making informed predictions.

The researchers observed that volunteers updated their initial estimates when the true figures were less gloomy, but they largely ignored gloomier statistics and kept a positive outlook even when a disaster was more likely to happen than they had assumed.
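A minimal sketch of this kind of asymmetric updating, assuming a simple learning-rate model (the function name, learning rates, and numbers below are illustrative, not the study's actual parameters):

```python
def update_estimate(current_estimate, actual_statistic,
                    lr_good_news=0.7, lr_bad_news=0.2):
    """Move a risk estimate toward the actual statistic.

    Illustrative model: the learning rate is larger when the news is
    "good" (the true risk is lower than feared) than when it is "bad"
    (the true risk is higher than hoped).
    """
    prediction_error = actual_statistic - current_estimate
    if prediction_error < 0:          # good news: risk lower than the estimate
        learning_rate = lr_good_news
    else:                             # bad news: risk higher than the estimate
        learning_rate = lr_bad_news
    return current_estimate + learning_rate * prediction_error


# Example: a participant thinks their risk of some bad event is 10%,
# then learns the actual base rate.
print(update_estimate(0.10, 0.05))  # good news -> estimate drops to 0.065
print(update_estimate(0.10, 0.30))  # bad news  -> estimate rises only to 0.14
```

Under this toy model, good news pulls the estimate most of the way toward the truth, while bad news barely moves it, which is the pattern the study describes.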

In this research, the participants' prediction error signals were also studied in relation to their brain activity.

Prediction error signals were linked to brain activity commonly involved in forecasting how a situation will be experienced.

"Our study suggests that we pick and choose the information that we listen to," said Sharot, adding, "The more optimistic we are, the less likely we are to be influenced by negative information about the future."

Timothy Behrens and colleagues from Oxford University used prediction errors to model how humans incorporate advice from social partners into their decisions. Participants repeatedly had to choose which of two options would work better. Before they made their decision, they saw which option another person advised them to choose. Participants therefore had to form prediction errors for two types of information: non-social (how rewarding the two options are) and social (how good the other person's advice is), as in the sketch below.
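A rough sketch of this two-source learning, assuming a simple delta-rule update for both the reward history and the advisor's track record (the parameter values, variable names, and the rule for scoring the advice are illustrative and not the paper's actual model):

```python
def delta_update(estimate, outcome, learning_rate=0.3):
    """Standard delta-rule update: move the estimate toward the
    observed outcome by a fraction of the prediction error."""
    return estimate + learning_rate * (outcome - estimate)


# Two quantities are learned in parallel, each from its own prediction error:
reward_value = {"A": 0.5, "B": 0.5}   # non-social: how rewarding each option is
advisor_fidelity = 0.5                # social: how reliable the advice tends to be

def observe_trial(chosen_option, was_rewarded, advised_option):
    """Update both estimates after one trial. Assumes exactly one of
    the two options pays off, so the advice can be scored even when
    the participant ignored it."""
    global advisor_fidelity

    # Non-social prediction error: outcome vs. expected value of the chosen option
    reward_value[chosen_option] = delta_update(
        reward_value[chosen_option], 1.0 if was_rewarded else 0.0)

    # Social prediction error: was the advisor's tip right on this trial?
    advice_was_correct = (advised_option == chosen_option) == was_rewarded
    advisor_fidelity = delta_update(
        advisor_fidelity, 1.0 if advice_was_correct else 0.0)


# Example: the participant follows the advice to pick "A" and wins,
# so both the value of "A" and the advisor's fidelity move upward.
observe_trial("A", True, "A")
print(reward_value, advisor_fidelity)  # {'A': 0.65, 'B': 0.5} 0.65
```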

In other studies, prediction error was used to examine how the brain keeps track of how well it is forecasting future events.

Researchers identified brain regions involved in calculating prediction errors.

In these studies, participants' brains were monitored in functional magnetic resonance imaging (fMRI) scanners.

The more optimistic a participant was, the less efficiently one of the prediction error regions coded undesirable information. This bias in how errors are processed in the brain can account for the tendency to maintain rose-colored views, the scientists explained in Nature Neuroscience.