Bad Data

One of my favorite podcasts is EconTalk, hosted by Stanford economics professor Russ Roberts. In the latest episode, he interviews researcher Morten Jerven, who wrote a book about the difficulties of obtaining good economic data for countries in Sub-Saharan Africa. They are both of the opinion that the large uncertainty in the data renders a great many complicated regression analyses virtually meaningless.

I think this is a problem in science in general. We might know that some of the data we collect is suspect, for whatever reason, be it experimental error, inherent limitations of the method, or simple variability in results. However, there is a strong psychological pull to set aside these caveats and just go with the numbers, perhaps reasoning that, while not perfect, they are “better than nothing” or “the best we have to go on.” We tend to give too much credence to data even when we know it is faulty. This is a special case of the well-documented mental bias called anchoring. To combat this, we should first think of the computer aphorism GIGO (“Garbage in, garbage out”). Compounding the problem is the special credence lent to anything with equations in it, regardless of either the quality of the mathematical model OR the data put into it.
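To make the point concrete, here is a small simulated sketch (my own illustration, not anything from the episode or the book) of one way noisy inputs quietly corrupt a regression: classical measurement error in a predictor pulls the estimated slope toward zero, and the analysis still runs happily and spits out a confident-looking number.

```python
# Illustrative sketch: measurement error in a predictor attenuates
# the estimated regression slope, no matter how many observations we have.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
true_slope = 2.0

x_true = rng.normal(size=n)                      # the quantity we wish we had measured
y = true_slope * x_true + rng.normal(scale=0.5, size=n)

for noise_sd in (0.0, 0.5, 1.0, 2.0):            # increasing measurement error in x
    x_obs = x_true + rng.normal(scale=noise_sd, size=n)
    slope = np.polyfit(x_obs, y, deg=1)[0]       # ordinary least squares slope
    print(f"measurement error sd = {noise_sd}: estimated slope = {slope:.2f}")

# Expected pattern: the estimate shrinks from ~2.0 toward zero as the noise
# grows. The regression never complains; the output is simply misleading.
```

Garbage in, garbage out, even when the garbage comes wrapped in an equation.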

Author: lnemzer

Assistant Professor, Nova Southeastern University
