Any scientists with some unused grant money lying around would do well to buy some posters for their labs with the Richard Feynman quote: “The first principle is that you must not fool yourself – and you are the easiest person to fool.”
We have just seen the Michael LaCour affair blow up in the national media. His study, which appeared to find “that opponents of gay marriage could be convinced to support it after a single conversation with someone who identifies as gay,” was retracted by Science after statistical irregularities were found and the original data could not be produced. Among the reasons this retraction is receiving so much attention is that the original findings were broadcast so widely in the first place. For proponents of gay marriage, such news seemed too good to be true – that their opponents could be won over so easily. It also ran against previous findings on political opinion, and probably should have been greeted with more skepticism.
Part of why LaCour’s results were so noteworthy was that they flew in the face of just about every established tenet of political persuasion. While past research had shown canvassing can be effective at influencing people in certain ways, the sheer magnitude of effect LaCour had found in his study simply doesn’t happen — piles of previous research had shown that, all else being equal, human beings cling dearly to their political beliefs, and even when you can nudge them an inch to the left or right, people’s views are likely to snap back into place shortly after they hear whatever message you’ve so carefully and cleverly crafted. Not so in this case: When LaCour compared the before-and-after views on gay marriage in his study, he found that opinions had shifted about the distance between the average Georgian and the average Massachusettsian, and this effect appeared to have persisted for months.
Also in the news, a scientist purposely designed an experiment with some of the common flaws of many dietary experiments. By using too few subjects and testing for too many outcomes, he made it very likely that at least one measurement would clear a conventional significance test by chance alone. In this case, what emerged from the noise was the conclusion that eating chocolate helps you lose weight. Many media outlets jumped on this sweet headline without asking too many pesky questions.
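The arithmetic behind that trick is easy to reproduce. A minimal sketch of the multiple-comparisons problem, assuming 18 independent outcomes are tested (reports of the chocolate hoax mention roughly that many measurements; treat the number as illustrative):

```python
import random

random.seed(0)

ALPHA = 0.05        # conventional significance threshold
N_OUTCOMES = 18     # illustrative: roughly the number of outcomes the hoax tracked

# Analytic: chance that at least one of k independent tests of a
# true null hypothesis comes out "significant" at level alpha.
p_false_positive = 1 - (1 - ALPHA) ** N_OUTCOMES
print(f"P(at least one spurious 'finding') = {p_false_positive:.2f}")  # ~0.60

# Monte Carlo check: simulate many null studies, each testing
# N_OUTCOMES outcomes where no real effect exists, so each test
# is "significant" with probability ALPHA.
trials = 100_000
hits = sum(
    any(random.random() < ALPHA for _ in range(N_OUTCOMES))
    for _ in range(trials)
)
print(f"Simulated frequency                = {hits / trials:.2f}")
```

In other words, even with no real effect at all, a study like this has better than even odds of producing a publishable-looking headline.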
There is a very pernicious mechanism by which confirmation bias can operate, called “satisficing.” When we find a conclusion that agrees with what we already believe (or wish were true), we can easily declare the analysis “good enough” and stop thinking, saving valuable brain power for other tasks. Conversely, if we get the “wrong answer,” our intellect goes into overdrive to find the flaw. Think of all the convoluted leaps of logic required in some legal rulings just to reach a specific conclusion.
While not directly related to confirmation bias, there were also a couple of recent examples of bad science reporting involving viruses. Most laypeople (and scientists) use the same mental picture for viruses as for living organisms. However, viruses are not “alive” in the sense that they neither metabolize nor replicate without the help of a host cell. This can lead to confusion.
First, TIME reported on the use of a herpes virus as a way of treating melanoma, but the original headline mentioned “herpes cells.” Another report, about containing HIV by interfering with a specific pathway, spoke of “starving HIV to death.” Viruses are not cells, nor do they need food to live (though the infected host cell does need raw materials to assemble new virus particles).