Election 2016: How Polls Can Mislead You

State Polls May Matter More Than National Polls

In the first half of 2016, most of what we've seen has been national polls. But - as we've seen four times before* - a president can lose the national popular vote and still win the election. With our Electoral College system, a few "swing states" typically determine which candidate will head to the White House.

Takeaway: Focus on the data that matters. Swing states will likely determine the election, so don't get distracted by national polls.

*John Quincy Adams, Rutherford B. Hayes, Benjamin Harrison and George W. Bush

Watch Out for Independents and Undecideds

In some polls, many respondents are classified as independent — that is, neither Republican nor Democrat. While these independents may truly not identify with either party, many of them will likely still vote for either the Republican or Democratic nominee. One Gallup poll found that more than 40 percent of people identified as independents, but when Gallup asked them which way they leaned, the vast majority chose a side.

Takeaway: Even though people say they're independent or undecided, they may not act that way at the polls.

Beware of a “Poll of Polls”

RealClearPolitics and others offer a national average based on multiple polls. But a poll of polls combines data that may look the same on the surface yet is collected, interpreted, and aggregated in ways that can affect the results. Are all polls treated equally in the aggregation, or are certain ones given more weight? How are the margins of error from each individual poll accounted for? What are the differences among the polls in terms of sample size, survey questions, and so on?

Takeaway: Poll averages may at times be more accurate than individual polls, but one very high-quality survey can tell you more about where candidates stand than an average of surveys of variable quality.
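One concrete way aggregators differ is in how they weight individual polls. Here is a hypothetical sketch (the poll numbers are invented for illustration) comparing a simple unweighted average with one weighted by sample size; real aggregators factor in much more, such as pollster track record and recency:

```python
# Hypothetical poll results: (candidate's share, sample size)
polls = [(0.48, 1200), (0.51, 600), (0.46, 2000)]

# An unweighted average treats every poll equally.
unweighted = sum(share for share, _ in polls) / len(polls)

# Weighting by sample size gives larger polls more influence.
total_n = sum(n for _, n in polls)
weighted = sum(share * n for share, n in polls) / total_n

print(f"unweighted: {unweighted:.3f}, weighted: {weighted:.3f}")
# The two averages differ, even though the underlying polls are the same.
```

Even this toy example shows how two aggregators looking at identical polls can publish different "averages."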

Can 254 People Represent Millions?

Polls are based on samples of people. Pollsters ask a few hundred (or a few thousand) people what they think, and then extrapolate those results (ideally using weighting and other sophisticated methods). But any time you're looking at a sample of people, you need to ask whether the sample is representative of the full population. In one poll, just 254 Republican respondents stood in for tens of millions of Republicans nationwide.

Takeaway: Look at the sample size for the poll - not just for the entire poll, but also for key subgroups. For example, if the poll surveyed 10,000 people, but only a dozen of them were female Republicans ages 18-24, it may be difficult to draw accurate conclusions about this group.
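The subgroup problem can be quantified: under the standard normal approximation for a simple random sample, the margin of error grows sharply as the group shrinks. A minimal sketch, with illustrative sample sizes matching the takeaway above:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion.

    p: observed proportion (0.50 is the worst case), n: sample size,
    z: z-score for the confidence level (1.96 for 95 percent).
    """
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative sizes: the full poll vs. a tiny subgroup, both at 50%.
for label, n in [("full sample", 10_000), ("subgroup", 12)]:
    moe = margin_of_error(0.50, n)
    print(f"{label:12s} n={n:6d} -> about ±{moe * 100:.0f} points")
```

A 10,000-person poll has a margin of error of about one percentage point, but a 12-person subgroup's margin is roughly 28 points — far too wide to say anything useful about that group.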

The Questions Affect the Results

Trump and Clinton often have different polling numbers when voters are asked to choose between the two of them, versus when voters are also asked about a third-party candidate. Similarly, a US Presidential election poll found Trump and Clinton closer than previous polls, perhaps in part because the questions were less open-ended, as the Washington Post pointed out.

Takeaway: Consider the exact language of the questions—and even the order they are asked in—when looking at poll results.

Look at the Margin of Error

Margin of error is one common way to measure statistical uncertainty in polling. It’s a way of answering the question, “How sure are you?” For example, if a candidate is polling at 52 percent of the vote with a margin of error of 4 percentage points, the pollsters are only “sure” that the candidate has somewhere between 48 percent (52 minus 4) and 56 percent (52 plus 4) of the vote. Typically, it also means that if the survey were repeated 100 times, about 95 of those times the results would fall within the margin of error — a 95 percent confidence interval, as statisticians would call it.

Takeaway: The margin of error tells you how far off the poll's numbers could plausibly be.
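The margin of error for a simple random sample can be approximated from the reported share and the sample size; a minimal sketch (the 1.96 factor comes from the normal approximation for a 95 percent confidence level, and the poll figures are invented for illustration):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion.

    p: observed proportion (e.g. 0.52 for 52 percent)
    n: sample size
    z: z-score for the confidence level (1.96 for 95 percent)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A candidate at 52% in a hypothetical poll of 600 respondents:
moe = margin_of_error(0.52, 600)
print(f"±{moe * 100:.1f} percentage points")  # roughly ±4 points
```

So a 600-person poll showing 52 percent really means "somewhere around 48 to 56 percent" — exactly the kind of range described above.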

Online vs. Telephone vs. In-Person Polls

In some cases, telephone-administered polls and internet-only polls produce conflicting results. Why? People may feel more anonymous answering questions on the internet than they do talking to a live interviewer over the phone. Also consider that not everyone has internet access — which means an online poll draws a different sample than a telephone or in-person poll. Even for telephone polls, look at whether the poll includes landlines, cell phones or both — and consider how that might affect the results.

Takeaway: Using a different medium can affect the sample as well as the respondents' answers, both of which impact the results.

Is Self-Reported Data Reliable?

When you ask people for information about themselves, you run the risk of getting flawed data. People aren’t always honest. If you ask someone who they're going to vote for, they may tell you the truth - or they may lie. This is one reason why polls may be inaccurate.  

Takeaway: Whether you're looking at pre-election polls or exit polls, know that people may not be telling the truth - which can skew the results.