The Everydata Interview: Dean Karlan, PhD

Given our post earlier this week on randomized controlled trials, it seemed only appropriate that this week's Everydata interview features Professor Dean Karlan from Yale University. Dean Karlan is an American development economist and a Research Fellow at the Abdul Latif Jameel Poverty Action Lab (J-PAL) at the Massachusetts Institute of Technology. Karlan is also the President and Founder of Innovations for Poverty Action (IPA), a New Haven, Connecticut-based research outfit dedicated to creating and evaluating solutions to social and international development problems. Along with economists Jonathan Morduch and Sendhil Mullainathan, Karlan serves as Director of the Financial Access Initiative (FAI), a consortium of researchers focused on substantially expanding access to quality financial services for low-income individuals. He is also a co-founder of stickK.com.

How do you use data in your work?

Hope and intuition are great, but they can only get you so far. We need data to make the critical leap from ideas to evidence. I’m interested in why some people in some countries are rich while others in poor countries are struggling, and what we can do to help. To do this, I set up randomized controlled trials to test whether a program works, or how to make a program work best. To make it easier for researchers to do this, I founded the non-profit Innovations for Poverty Action, which pairs academics with trained research staff in the countries where we work to carry out high-quality studies. Much of what we’ve learned focuses on human behavior, for example, how to design programs that help people save more money. In arenas like that, we often find that most people’s intuitions about human behavior are way off from what the data say. To be fair, most economists’ views of how people behave are far from accurate as well, and my intuition isn’t always great either. That’s why I think as a field we’ve come to trust what the data say rather than someone who confidently proclaims their product/charity/investment will be superior.

What is the most common mistake you see in terms of people misrepresenting or misinterpreting data?

I’ll highlight perhaps the most important one (in my view): correlation is not causation! My thirteen-year-old daughter actually has a sticker on her laptop that says this, so it’s a message I try to deliver not just as a professor and researcher, but as a father too. This is the heart of why we do randomized trials. And it is also where understanding why a pattern exists in the data is critical for establishing it as a genuine, reliable, and replicable statistical relationship.
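To make the point concrete, here is a minimal Python sketch with invented numbers (our illustration, not Prof. Karlan's): a confounder, family wealth, drives both tutoring and test scores, so tutoring looks effective in observational data even though it does nothing, while random assignment recovers the true (zero) effect.

```python
# A minimal sketch, with made-up numbers, of how a confounder can produce
# correlation without causation, and how randomization removes it.
import random

random.seed(0)
n = 100_000

def test_score(wealth):
    # Scores depend on family wealth plus noise; tutoring itself has
    # NO effect in this simulated world.
    return 50 + 10 * wealth + random.gauss(0, 5)

# Observational world: wealthier families buy tutoring (confounding).
tutored, untutored = [], []
for _ in range(n):
    wealth = random.gauss(0, 1)
    (tutored if wealth > 0 else untutored).append(test_score(wealth))

naive_gap = sum(tutored) / len(tutored) - sum(untutored) / len(untutored)
print(f"Observational 'effect' of tutoring: {naive_gap:+.1f} points")  # large, spurious

# Randomized world: a coin flip assigns tutoring, independent of wealth.
treated, control = [], []
for _ in range(n):
    wealth = random.gauss(0, 1)
    (treated if random.random() < 0.5 else control).append(test_score(wealth))

rct_gap = sum(treated) / len(treated) - sum(control) / len(control)
print(f"Randomized estimate of tutoring's effect: {rct_gap:+.1f} points")  # near zero
```

In expectation, the observational gap here is about 16 points of pure confounding, while the randomized comparison hovers near zero.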

How do you think the media affects the ways in which people consume data? 

There’s a feedback loop between the supply and demand sides of media. I think readers would be reluctant to click on a headline that says “Some people, some of the time, do X, possibly because of Y; wait for studies to accumulate over the next few years to be sure.” Yet the truth is that’s the main finding of most social science studies. There’s probably a mismatch between the way science progresses and the way the media frames stories.

On the other hand, Vox.com’s Ezra Klein recently told a group of economists that if people aren’t reading their findings, it’s their fault for not working harder to get them out there and communicate them in a way lay people can understand. If I could, I’d start a graduate seminar where social science Ph.D. students and journalism students had to take a course together and each learn the constraints and language of the other’s field. Their assignments would be co-authored news articles that the scientist agreed accurately reflected the research and its limitations, but that the journalist felt were engaging for an average reader.

Of course bias exists; there is no escaping that. One of my favorite examples of bias comes from background data we gathered for an experiment on the effect of newspapers on public opinion (side note: we found that subscribing people to the Washington Post made them more likely to vote for the Democratic gubernatorial candidate in the 2005 election). In November 2005, after a military report about an increase in enlistment, the Washington Post ran the headline “Youths in Rural U.S. Are Drawn to Military - Worries About Jobs Outweigh War Fears,” while the Washington Times ran “Recruits Join Armed Forces Seeking War - A Sort of Vendetta Spurs Youth to Enlist After 9/11.” Same underlying news “facts”!

Why should people care about understanding data? What are the consequences?

I’d say we should think more about “numeracy” and numerical reasoning. A great example is the Powerball lottery, which was deliberately changed to exploit people’s innate difficulties with probability reasoning. As a result, the lottery was able to offer even tinier odds of winning a huge jackpot, and customers flocked to it as predicted. But examples abound: health studies often report “risk halved” or “risk doubled” but not what the actual risk was. Doubling sounds like a lot, but doubling a very small number is still a small number. Is it worth paying for a more expensive drug to shift the risk of something that will probably never happen by a further tiny bit?
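To put numbers on both points, here is a short Python sketch. The Powerball figures reflect the game's published 2015 rule change; the risk numbers in the second half are invented purely for illustration.

```python
from math import comb

# Powerball's 2015 redesign moved from picking 5 of 59 white balls plus
# 1 of 35 red to 5 of 69 plus 1 of 26, lengthening the jackpot odds.
old_odds = comb(59, 5) * 35   # 175,223,510
new_odds = comb(69, 5) * 26   # 292,201,338
print(f"Pre-2015 jackpot odds:  1 in {old_odds:,}")
print(f"Post-2015 jackpot odds: 1 in {new_odds:,}")

# "Risk doubled" can still be tiny in absolute terms (made-up baseline).
baseline_risk = 0.0001                # 1 in 10,000
doubled_risk = 2 * baseline_risk      # the scary headline number
absolute_increase = doubled_risk - baseline_risk
print(f"Relative increase: {doubled_risk / baseline_risk:.0f}x")
print(f"Absolute increase: {absolute_increase:.2%} "
      f"({absolute_increase * 100_000:.0f} extra case per 100,000 people)")
```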

What is one thing people could do to become a better consumer of data in their everyday lives?

I’d love to see people precommit to what they think an answer will be in the data. So much of policy debate is policy-based evidence-making rather than evidence-based policy-making: data are twisted and distorted (or ignored) to reinforce a view. Imagine advocates agreeing up front, “yes, I think this is a good test, and I am confident the answer will be 12.” Opponents state they think the answer will be -7. And they agree to a test. The answer comes out somewhere in between, say 8. Now maybe the -7 person shifts and says “well, OK, maybe it’s -3,” and the 12 person shifts down to 11. That isn’t necessarily consensus, but it is progress.

Ultimately this is about making people pay attention to the incentives of the people who produce the data. Did that new national study showing people don’t get enough sleep come from a bed company? Researchers have their own incentives too: our careers can depend on finding publishable results, and our own beliefs can color how we ask questions or what we conclude. Transparent processes for data production and analysis can be key for breaking gridlock and finding the best way forward.


Please note that guest interviews are informational only, and do not necessarily represent the views of John H. Johnson, PhD.