
We can answer that.

The podcast where we sit down with UCalgary professors, researchers and experts to get the answers to five questions submitted by you.

 

Episode 7: Margin of error

October 29, 2020

Polls are often used to predict how elections will turn out. But are they accurate? In 2016, most polls predicted Hillary Clinton would become president of the U.S. In 2012, polls predicted the Wildrose Party winning the Alberta provincial election. In 2017, polls had Bill Smith becoming mayor of Calgary. Do a few high-profile misses mean that polls are unreliable?

In this episode, we talk to Corey Hogan, former chief communications officer for the government of Alberta, who now leads the communications team for UCalgary's Office of Advancement, about some of the factors that can skew polls, how to make sure they're as unbiased as possible, and how to look at them with a critical eye.

  

Mike MacKinnon (MM): Welcome to We Can Answer That. I'm your host, Mike MacKinnon. Each week, I sit down with an expert from the UCalgary community to ask five questions related to their field of study and to shed light on topics that matter most in the world. This week, we're talking about opinion polls. During election time, polls dominate headlines and news broadcasts. This is especially true just days ahead of the U.S. presidential election. Polls are often used to predict how elections will turn out, but are they accurate? In 2016, most polls predicted Hillary Clinton would become president of the U.S. In 2012, polls predicted the Wildrose Party winning the Alberta provincial election. In 2017, polls had Bill Smith becoming mayor of Calgary. Do a few high-profile misses mean that polls are unreliable? Our guest today is Corey Hogan, senior associate vice president of communications in UCalgary's Office of Advancement. Corey leads the university's institutional communications team. He also has a background in politics, having been the chief communications officer for the government of Alberta under both the NDP and the UCP, so he's got a bit of experience with reading polls. Corey, thanks for joining us.

 

Corey Hogan (CH): Hey, happy to be here.

 

MM: So there's a lot of mistrust of polls lately, especially in light of certain high-profile elections, but would you say that, in general, opinion polls are reliable?

 

CH: Yeah, I would say good opinion polls are in general reliable. And I think, just like anything else in life, if you have enough of it, if there's enough volume, you're going to find examples where they're not particularly good. Think about it almost like bottles of wine. Occasionally, you're going to get a bad cork and the whole thing is going to be spoiled, but that doesn't mean wine itself is bad, can't be trusted, and is normally going to let you down. Polls are very much the same way. There are so many ways that they can be constructed, and there aren't huge barriers to entry for non-professionals to come into the field, and that as a result can lead to some of those misses you were talking about. There are also some things that happen in, I guess, an industry-wide sense: the changing nature of polling over time. Sometimes pollsters get a step behind and there needs to be a bit of a course correction before polling can get back on track.

 

MM: Now, polls always include what's called a margin of error. Can you define that and explain how it affects polling results?

 

CH: Yeah, I mean, in the simplest sense, it's normally stated in a way like, "19 times out of 20, this poll has a margin of error of about 3%." So that's based on the math of it all saying that if your sample, and this is a pretty big caveat that I suspect we'll unpack, but if your sample is perfectly representative, if you're just randomly picking people from your audience, then you would expect by the math that 19 times out of 20, it would be within this band of error. Now people see that and they tend to read it two different ways, neither of which is particularly correct. One is that it's just as likely to be 3% off as it is to be right on. That's not really correct, because it's really talking about what scientists would call a normal distribution. It's much more likely to be right there in the center than it is at that 3% extremity. And then the other way that people tend to see it is that it means the poll is representative and therefore is going to be correct most of the time. But in reality, pollsters run into an awful lot of trouble creating a purely representative sample. You find it's very hard to get certain people on the phone. Try getting an 18-to-24-year-old to answer a telephone poll. It can be incredibly difficult to hit certain demographics just based on economic status. If you're paying for minutes on your mobile phone, for example, you might be less inclined to answer a call from a number you don't know. So as a result, pollsters rarely have a perfectly probabilistic sample and they have to do a number of things to adjust for that, and then you'll hear things like weighting and whatnot that come into it. But the bottom line for margin of error is it tells you that in a perfect world, you would expect this poll to be within this band most of the time, and that's not necessarily how people read it.
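To see where a figure like "3%, 19 times out of 20" comes from, here is a minimal sketch in Python of the standard margin-of-error formula for a polled proportion. The sample size of 1,000 is an illustrative choice, not a number from the episode, and the formula assumes the perfectly random, representative sample Corey flags above as the big caveat.

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # p = 0.5 is the worst case (widest margin); z = 1.96 corresponds
        # to 95% confidence, i.e. "19 times out of 20". Assumes a purely
        # random, representative sample -- the caveat discussed above.
        return z * math.sqrt(p * (1 - p) / n)

    print(f"{margin_of_error(1000):.1%}")  # ~3.1% for a 1,000-person poll

Note how the sample size sits under a square root: quadrupling the sample only halves the margin of error, which is why most media polls stop around 1,000 to 2,000 respondents.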

 

MM: And what are some of the other ways that polls can be misinterpreted or biased?

 

CH: Oh, I mean, there's just a big list of biases that can creep into the average poll. And in fact, I think in general, if we can do one thing, as people who purchase polls and talk about polls, and even in a podcast like this, it's to make people aware that error is less likely to come from margin of error and more likely to come from all of those other biases that we should unpack a bit here. There's the idea of sample bias, which I already introduced. It's the idea that you're not actually representing the population perfectly. Maybe, like I said, you're not hitting certain age demographics; maybe you're not hitting certain income demographics. There's also what's called an order effect bias. And you can think of this as a poll that's deliberately designed to skew, where you ask questions ahead of time that shape the results down the road. And the example I always give on this is, if you're talking about a political poll and your first question is, "How do you feel about the economy collapsing?" and your second question is, "How do you intend to vote?" you're going to get a very different answer than if your first question is, "How do you feel about the strongest economy that America has ever had?" and then you ask how they intend to vote. What you're doing is essentially getting people into a frame where they're going to be more or less charitable to the political candidate in the next question, so order effect bias is something that you should always ask about. When a poll looks too good to be true or is too different for you to take at face value, you want to ask yourself, "What questions came before?" And of course, how were the questions phrased, because question bias is another one of these biases that is worth talking about. And the famous one in the Canadian context is the 1995 Quebec referendum, right? The referendum question was not, "Do you want Quebec to separate, yes or no?" It was a question that was almost designed to get more people to yes than you would otherwise naturally get. The question was, "Do you agree that Quebec should become sovereign after having made a formal offer to Canada for a new economic and political partnership within the scope of the bill respecting the future of Quebec and of the agreement signed on June 12, 1995?" Now, what are you agreeing to when you say yes to a question like that? Question bias is another thing you need to be very, very mindful of in polling. And then finally, there are the response and non-response biases. Response bias is just that people tend to exaggerate. People tend to lie. They don't like to share socially unpopular opinions with live callers. So if you ask a question like, "Do you use drugs?" you're going to have people say yes at far lower rates than if you were able to reach into their brain and pull out the actual honest answer. Non-response bias is just the idea that some people don't want to participate, and not participating can happen on a question-by-question basis or it can happen for the poll as a whole.
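To make the sample-bias point, and the weighting fix mentioned earlier, concrete, here is a toy Python illustration. All of the numbers are made up for the example, not drawn from the episode: suppose 18-to-24-year-olds are 10% of the population but only 2% of the people a phone poll actually reaches, and they answer differently from everyone else.

    # Hypothetical shares and response rates, chosen only for illustration.
    pop_share = {"18-24": 0.10, "25+": 0.90}     # true population shares
    sample_share = {"18-24": 0.02, "25+": 0.98}  # shares the poll reaches
    yes_rate = {"18-24": 0.80, "25+": 0.40}      # how each group answers

    # Raw estimate: young respondents are badly under-represented.
    raw = sum(sample_share[g] * yes_rate[g] for g in yes_rate)

    # Weighted estimate: scale each group by population share / sample share.
    weights = {g: pop_share[g] / sample_share[g] for g in pop_share}
    weighted = sum(sample_share[g] * weights[g] * yes_rate[g] for g in yes_rate)

    print(f"raw: {raw:.1%}, weighted: {weighted:.1%}")
    # raw: 40.8%, weighted: 44.0% -- the weighted figure matches the true
    # population rate (0.10 * 0.80 + 0.90 * 0.40 = 44.0%)

The catch, as the discussion above suggests, is that weighting only works if the few young people you did reach are typical of young people generally; a weight of 5x also multiplies any quirk in that small subsample by five.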

 

MM: The last time we spoke about this, you talked about lying about polls and letting polls lie to you. So what about manipulation or intentional misuse of polling data?

 

CH: Well, this happens both as pollsters and as users of polling data, I guess. So, yeah, this is based on my two rules for polling. Those are, number one, don't lie about polls. And number two, don't let polls lie to you. Now, number one is entirely in your control. You have the ability as a pollster to take your craft seriously. It doesn't mean you won't accidentally make a mistake. Maybe you introduce one of these biases I've talked about out of ignorance, if you're not particularly seasoned in this area, or maybe you introduce it simply by accident. There's a drafting error, all of a sudden something occurs in the wrong order, and you don't get the answer that you were supposed to get. But suppose you go out there as a pollster and you actively try to use these biases I've introduced. Rather than protecting against them, which is what a reputable pollster will do, going through and asking, "Have I dealt with any order effects? Have I dealt with my response bias problems?", you decide instead to game them. So as a pollster you call only at a certain time of night, or during the day is probably a better example, when people who work shift work are not going to be available. Or you go out and decide that you are going to weight up the youth demographics and say they count for much more, knowing that your youth sample is not even remotely representative, but those first questions have given you the answer you want. That's lying about polls, and that's just dishonest if you're a pollster. And certainly there are examples of pollsters who have been caught doing things less than reputably because they were interested in an outcome. Sometimes it's just to make their poll salacious and to draw media attention and, by extension, business. Sometimes it's political. That's obviously the one we're more familiar with, but I really want to stress that it is the exception, not the rule. You don't see a lot of pollsters doing it because it's a good way to go out of business pretty quickly. And that's because it's very hard to lie about polls these days: election laws require you to make information about your methodology available. The way you can lie about polls if you've received a poll is pretty simple. You get polling information and you have to go brief your boss about it. Your boss says, "Is it good or is it bad?" You don't want to tell them it's bad, so you sugarcoat it. You pick the answers that maybe you know don't tell the whole story, but allow you to get through that conversation with less awkwardness. That's dangerous because it breaks confidence in polling overall. But not letting polls lie to you, that's a little trickier than those other two, because the other two just require moral fortitude, I suppose. Not letting polls lie to you means understanding polls. And I think this is one that we all have a bit of trouble with. I can't even count how many biases I've thrown out in the past three minutes here. Good luck if you're not keeping track with a notepad at home. It's pretty difficult for you to then go throughout your daily life and read the Calgary Herald or the Edmonton Journal or anything like that and say, "Okay, I get it. I want to be aware that there might be a possible response bias here and I should look into this." And the reality is, reporters... no criticism really meant of reporters, but they don't have a ton of background in this either.
And so they can sometimes uncritically pass on some of these polls, and they're doing that by mistake. They've let the polls lie to them. And so really it's about informing yourself about the way these polls can be used or misused and making sure you know that this is all just information and it's all about probability.

 

MM: And so speaking of using or misusing the data, what are some of the ways that politicians use polls to adjust their campaigns or their strategies?

 

CH: Well, politicians that in the long run want to win are using polls to get outside of their own bubble, right? There's something called the false consensus effect, to which we are all subject. Basically what it means, in simpler terms, is that I tend to think my habits, my customs, my opinions, my beliefs are more common than they are. And that is really true when you get together a group of like-minded people. You can convince yourself you're in the majority very quickly. And obviously political campaigns, almost by definition, are like-minded people getting together. You share the same basic philosophy and worldview. So smart politicians will use the information that is available to them through polling to pop that bubble and say, "Okay, well, this is my opinion. This is how I feel about the issue, but this data is telling me something different." And if you're not letting the polls lie to you, and if you are just being very honest and forthright with yourself about what the polling says, that allows you as a politician to adjust your tactics. Maybe you're not saying something entirely different. I'm not suggesting that politicians just lose their moral core the minute they get new polling. But maybe they want to emphasize something differently. Maybe they need to talk about it in a different frame. Maybe instead of saying it's about jobs, it becomes it's about healthcare. You're talking about the same issue; you're just wrapping it in a different bit of paper. The way politicians misuse polls is a little more troubling. And certainly I think people have realized over time that nothing wins like a winner. This is particularly true in municipal elections, where maybe you haven't spent your whole life thinking, "I'm a New Democrat," or, "I'm a Conservative," right? Maybe you are reacting to an individual for the first time, somebody you'd never even heard of two weeks prior, and you don't know a lot about their worldview, and you can really be persuaded to go with one candidate or the other. In particular, in those low-information elections, as we call them, you might decide, "I don't want to split my vote. I don't want to vote for the person who has no chance. I want to vote for the person that does have a chance." And so politicians have misused polls by saying, "Look at me. I am in the lead, or maybe I'm in second, ahead of the person who is in third. Ergo, everybody supporting third, give your votes to me if you want to stop the person who is in first." Now, maybe they've introduced an order effect. Maybe they have gotten responses from people that they know are not representative of the population as a whole. But they release this poll and they create this sense of momentum, and they effectively torpedo anybody else who's there by saying, "That would be a wasted vote. Come to me. This is the real fight. It's between the two of us juggernauts at the top." So that's a very common way that politicians misuse polling during elections. The other way that politicians tend to, or, I shouldn't say tend to, but have been known to misuse polling during elections is simply to show that they are in the lead to get themselves a temporary advantage in things like fundraising or volunteers. So you give yourself a bit of a bump that, you think as a politician, allows you to make this fictional bump real by bringing in the resources you need to actually be competitive going forward.
Now, both ultimately, in my opinion, are self-defeating, because they are both based on a fiction. You can be caught very quickly. And if somebody has given you their vote just based on a poll, they're likely to take their vote from you based on a different poll. So you've got to be very careful of that. And then there's the idea of turning that support into fundraising because the support's there. Well, the reality is, that support wasn't real. And so any fundraising bump, anything like that, is likely to be equally fictional, shall we say. It's a good way to get through the moment, people tend to think. It is not a good way to get through an election.

 

MM: Now, at the top I said five questions, but I'm going to throw another one in here. What's the most important thing for people to know about polls?

 

CH: If you know nothing else about polls, know this: it's all about probability. You should neither immediately discount polls nor take them as gospel. You've got to look at what went into their construction. If you have the ability and the time, see if the pollster has been relatively correct in the past. Certainly when we talk about polling on political matters, there's usually a track record. You can see how they've done in previous elections, and you can take that information and factor it in. But be aware that viewing something with a critical eye should not overwhelm your judgment of the value of polling overall. It's just one input among many, but it's more likely to be in the center of that margin of error than not, and it's more likely to be directionally correct than directionally wrong. So when you get into a tight race, is it five points up, five points down? Yeah, okay. Those biases matter a lot, and it might not tell you who's in the lead. But if you're seeing an issue which is 60% supported and 20% opposed, with 20% in the middle, you know that's probably a supported issue. So don't become a polling nihilist, please. Take it, know that it has its limitations and that there are biases in every poll, but also know that polling is, on the whole, more often right, at least directionally, than it is wrong. And if you do that, you can make better decisions. You aren't then just falling subject to your own false consensus, your own idea that your worldview is more correct. And if you can do that, it allows you to be a more informed citizen and a better reader of polling, and to make better decisions. When you think about this in a corporate sense, instead of relying on nothing, you're relying on something, even if you know that something isn't going to be right all of the time.
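That "more likely to be in the center of that margin of error" claim can be checked with a quick back-of-envelope calculation, ours rather than Corey's: if 95% of results land within the margin of error under a normal distribution, then about two-thirds land within just the inner half of that band.

    import math

    def prob_within(fraction_of_moe):
        # If 95% of outcomes fall within +/- MOE, the estimate is roughly
        # normal with sigma = MOE / 1.96; convert the fraction of the MOE
        # to a z-score and return P(|Z| <= z) for a standard normal Z.
        z = 1.96 * fraction_of_moe
        return math.erf(z / math.sqrt(2))

    print(f"{prob_within(0.5):.0%}")  # ~67%: the inner half of the band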

 

MM: This has been We Can Answer That. We've been talking to Corey Hogan, senior associate vice president of communications in UCalgary's Office of Advancement, about opinion polling and margins of error. You can subscribe to We Can Answer That on Apple, Google, or Spotify, or by visiting ucalgary.ca/podcasts. Follow our social channels to see which one of our experts will be featured in our next episode, and to send us questions you'd like them to answer. We Can Answer That is a production of the University of Calgary. Thanks for listening.

 

 

Other ways to listen and subscribe
Apple Podcasts  |  Spotify  |  Google Podcasts
