Here is an example of how the average person might make a prediction. Someone buys an old, used car. A friend says, "You just wasted your money. It’ll break down within the year (prediction). You would have done better buying a new one." We are not only good at making a lot of predictions; we are also good at covering our tracks when a prediction doesn’t pan out. If the car is purring along after a year, the friend might say, "Of course it lasted longer than a year. You took better care of it than most people." After two years, the friend might be saying, "Well, I guess you just got lucky and got a better-than-average car." Very few people would say, "I was flat wrong about my prediction because your car is still running."
Of course, we mortals act like this. That’s because we are not professional prognosticators. We all know that the pros do a much better job at making predictions. That is why they are interviewed for the knowledge they have to share.
Oops, maybe they aren’t all that good. Several years ago, a psychologist decided to check how well professional forecasters actually do. Philip Tetlock is a professor at the Haas School of Business at the University of California, Berkeley. He set out to learn whether experts who actually made their living in politics and economics could accurately predict events in their own fields of study. He found people who regularly offered advice and made public comments on a variety of trends in their specific areas of expertise, and he chose 284 of them to study. Over the course of the study, these experts made an amazing total of 82,361 forecasts.
Let’s see how these experts did. Would you pass the envelope, please? Okay, let’s look at the results of these super mortals. Oh — something must be wrong here. The results are unimpressive. Did they do only slightly better than you or I would have done? No. Did they do about as well as Jane and John Doe might have done? Nope. Omigosh. They actually did worse than the average person would have done by just guessing.
The study also found that the more the experts knew, the less reliable they were at guessing what would happen to the world in the future. Dr. Tetlock explained this by saying, "We reach the point of diminishing marginal predictive returns for knowledge disconcertingly quickly. . . . In this age of academic hyperspecialization, there is no reason for supposing that contributors to top journals—distinguished political scientists, area study specialists, economists, and so on—are any better than journalists or attentive readers of the New York Times in ‘reading’ emerging situations." The study also showed that well-known forecasters had exaggerated confidence in their forecasts. He concluded that "Experts in demand were more overconfident than their colleagues who eked out existences far from the limelight."
This is terrible. Why do our experts do so poorly? The best guess is based on human nature. People love to be right but hate being wrong. So, when we make a guess (a forecast), we "fall in love" with our choice. No matter how it turns out, we stick with that choice, justifying it right to the bitter end. This seems to be exactly what plagued the experts.
You would think that really smart people would learn from their mistakes. Even though experts are smarter than most of us, they are also human and subject to the thinking errors that affect all humans. One (of the many) thinking errors that might explain why smart people don’t learn from their failed predictions is called confirmation bias. This error describes how most people tend to dismiss new information that doesn’t fit with what they already believe. So if experts really believe their predictions actually come true most of the time, then they will automatically refuse to accept the failures.
Other psychologists think that too much information may also be a handicap for accurate predictions. By having more information than the rest of us, an expert can marshal more facts to support her predictions. Although these predictions may be more appealing to the average person, they are still subject to the same failure rate.
Even when a prediction does come true, an expert will believe he made the prediction with a great amount of certainty. Dr. Tetlock found that this belief did not match the data he had collected. In other words, the experts were actually more tentative before the predicted event than they remembered being afterward.
This also happens to people who believe they have special powers to see into the future, like psychics. Years ago, a young woman contacted me to tell me that she had precognitive dreams. For example, she would dream about a terrible tragedy like an airplane crash prior to it actually happening. After she gave me some very impressive examples, I suggested she write down the details of her next tragedy dream immediately upon waking in the morning. After writing down the details, she was to seal the paper in an envelope and mail it to me immediately. She called me a few weeks later and excitedly told me about a dream she had about an accident that had killed many people. When she read about it a few days later, she was sure all the details in her dream matched the newspaper account. I had her come to my office, and we opened her letter together. She was surprised, shocked, and disappointed with what she had written. The information was not even close to the event in the newspaper. Her mind had tricked her with something called retrograde memory. As she read about the accident in the newspaper, the information went into the part of her brain that held long-term memory. When she then tried to recall what she had dreamed, the dream details were replaced with the actual details.
How about those of us who buy into the predictions made by other people? Why would we be so gullible? There are many factors that make it more likely we will do this. One reason is that people tend to be more willing to believe information that has a lot of detail. One study asked people which of two health policies they might choose. The first policy covered their hospital expenses for any reason. The second policy also covered their hospitalization expenses for all diseases and accidents. Even though the coverage on these policies was identical, most people said they would be willing to pay a higher insurance premium for the second policy. More detail. This might mean we are more likely to accept a prediction that has a lot of detail over one that is less descriptive.
All of this is to say that people should be cautious about making predictions. A few years ago, a really smart person made the mistake of making his prediction public. William Dembski is a Ph.D. mathematician who did not understand the pitfalls of predictions. Oddly enough, even though he teaches in an academic setting, he is a vocal critic of Darwin and evolution. He believes that evolution will never be able to answer how life evolved on our planet, and he is a strong supporter of something called Intelligent Design. He was so confident in his beliefs that he wagered a bottle of single-malt scotch on the truth of his beliefs. He publicly said that if the issue of teaching evolution or Intelligent Design in schools should ever go to trial, Intelligent Design would win hands down. Then the Dover trial took place. Intelligent Design was thoroughly dismissed (by a conservative judge) as an acceptable subject for school biology classes. Like many prognosticators, Dr. Dembski will not admit defeat. Evidently he has never given away that bottle of single-malt scotch.
Another smart person is more humble. Paul Krugman, the 2008 Nobel Prize winner in economics and a professor at Princeton University, says that "making predictions is hard … especially about the future, and sometimes about the recent past." Even that great philosopher about life, Yogi Berra, agreed with Dr. Krugman when he said, "It’s tough to make predictions, especially about the future."
If you are interested in the book from which much of this material was taken, it is called "Expert Political Judgment: How Good Is It? How Can We Know?" (Princeton University Press, 2005).