The Late-Week Mop-Up: Talking Polls With Kristen Soltis Anderson

Registered voters vs. likely voters, polling samples, and whether we'll see any shifts.

I decided it was time to pull back the curtain on pollster world. So this week, we are talking to Kristen Soltis Anderson about what she sees when she’s looking for trends in the polls in these last few weeks. 

And for those who are not familiar, Kristen is the co-founder of Echelon Insights; you’ve probably seen her on Fox News Sunday, and she’s written a book and hosts a radio show. I am also contractually obligated to tell you that she beat me this week in our fantasy football league 158-98 due to Christian McCaffrey’s good life choices (and her wise choice to draft McCaffrey). But her most important job is as the Brisket’s godmother, and he is working hard to comprehend the book on Bayesian probability she gave him. 

Let’s dive in.

Sarah: Democratic pollster Mark Mellman wrote an interesting piece this week about whether it’s really possible or even a good idea to try to separate out registered voters from likely voters in polls. What is your take on how important that distinction is?

Kristen: As you get closer and closer to Election Day, there is more and more value in trying to determine who is and is not going to turn out. So if you’re just looking at registered voters, you’re talking to a lot of people who won’t participate. And if they are disproportionately favoring one candidate over another, it could give you an inaccurate view of how the electorate itself might go. 

But I am sympathetic to Mellman’s comment that pollsters can get themselves in a lot of trouble with how they determine who is and is not likely to vote. If you are asking people to assess their own likelihood of voting, people are not great at that. They usually overestimate their own likelihood to vote. Some pollsters will look at people’s past voting history and determine that a likely voter is someone who has voted in three of the past four elections. But that can be problematic because then you might be missing new voters systematically. 

At Echelon, what we have tried to do differently is—instead of framing it as registered voters or likely voters—aim for the likely electorate. Even if you are an unlikely voter, some unlikely voters are nonetheless voters on Election Day; there are just fewer of them as a proportion. We can go back and look at past elections and see what percentage of voters were unlikely voters—what percentage of votes were cast by people who had really bad track records of turning out. That way you can make sure you’re including some unlikely voters in your sample, but you’re not letting them throw things out of whack. And that’s the approach we are trying this year, which I think differs from the way many polling outlets look at likely voters.
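To make the "likely electorate" idea concrete, here is a minimal sketch—not Echelon's actual model—assuming each respondent has an estimated turnout probability (from vote history or self-reporting). Instead of dropping unlikely voters, everyone is weighted by that probability, so unlikely voters still count, just proportionally less.

```python
# Minimal sketch of a "likely electorate" weighting scheme.
# NOT any pollster's actual model -- respondents and turnout
# probabilities below are hypothetical.

respondents = [
    # (candidate preference, estimated probability of voting)
    ("Biden", 0.95), ("Trump", 0.90), ("Biden", 0.40),
    ("Trump", 0.85), ("Biden", 0.15), ("Trump", 0.30),
]

def weighted_share(sample, candidate):
    """Share of the expected electorate supporting `candidate`,
    weighting each respondent by their turnout probability."""
    total = sum(p for _, p in sample)
    support = sum(p for cand, p in sample if cand == candidate)
    return support / total

# Registered-voter view: every respondent counts equally.
rv_share = sum(1 for c, _ in respondents if c == "Biden") / len(respondents)

# Likely-electorate view: unlikely voters are included but down-weighted.
lv_share = weighted_share(respondents, "Biden")

print(f"Biden share among registered voters:  {rv_share:.1%}")
print(f"Biden share of the likely electorate: {lv_share:.1%}")
```

With these made-up numbers the registered-voter figure is 50 percent while the likely-electorate figure is about 42 percent—the gap Kristen describes between who answers a poll and who actually turns out.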

Sarah: I know you’ve been asked this question to death—what went wrong in the 2016 polls and what has been fixed in 2020. My readers know that the national polls were right, yada yada, and that it was really the state polls that were wrong. But what should someone look at in a state poll right now to determine whether it has fixed the problems from 2016?

Kristen: So one big thing to look for is what percentage of people in a poll’s sample have college degrees. Because that’s the key dividing line. It used not to be the case that Republicans and Democrats performed very differently among college-educated voters versus those without a bachelor’s degree, but that divide has become an increasing issue. And if you have too many people with college degrees in your poll, which was a problem in many of the state polls in 2016, you are going to overstate the Democratic candidate’s vote share. 

And so pollsters who are doing the right thing should be disclosing what demographics they are using to weight their data. And they ought to be releasing complete enough results that you can go and look and see what percentage of their sample is what. 

Now there’s controversy in pollster world because there are different estimates from different sources about what percentage of your sample should have a college degree. There’s Census data, there are the exit polls, which a lot of people have problems with. There are lots of different options. But you know if a poll is coming back and 70 percent of people in the sample have bachelor’s degrees, that’s a red flag. Because that’s not what America looks like. You’re gonna be looking at a poll that has way too many Democratic voters in it if that’s what you’ve got.
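The fix Kristen is describing is, roughly, post-stratification weighting on education. Here's an illustrative sketch—not any particular pollster's method. The 35 percent bachelor's-degree target, the sample composition, and the support numbers are all assumptions invented for the example.

```python
# Illustrative post-stratification weighting on education.
# The 35% degree target and all sample figures are made up for this example.

sample = {
    # education group: (number of respondents, Biden support in that group)
    "college":    (700, 0.60),
    "no_college": (300, 0.45),
}

targets = {"college": 0.35, "no_college": 0.65}  # assumed population shares

n_total = sum(n for n, _ in sample.values())

# Weight each group so its weighted share matches the target share.
weights = {g: targets[g] / (n / n_total) for g, (n, _) in sample.items()}

unweighted = sum(n * share for n, share in sample.values()) / n_total
weighted = sum(n * weights[g] * share for g, (n, share) in sample.items()) / \
           sum(n * weights[g] for g, (n, _) in sample.items())

print(f"Unweighted Biden support: {unweighted:.1%}")  # inflated by too many grads
print(f"Weighted Biden support:   {weighted:.1%}")    # closer to the electorate
```

In this toy sample, 70 percent of respondents have degrees—Kristen's red flag—and the unweighted topline overstates Biden's support by about five points relative to the education-weighted figure.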

Sarah: What about the SMAGAs, as we’ve been calling them?

Kristen: So there are two different ways to look at the shy Trump voter theory. One is that Trump voters are systematically lying to pollsters. They are picking up the phone, and they are trolling people like us. And they are saying, “no, actually, I’m voting for Joe Biden.” And there is vanishingly little evidence of something like that occurring in part because—especially if you are doing a survey where you’re calling off of a voter list—you know if somebody is a registered Republican or not. If all of a sudden you are getting this weird spike in the Joe Biden vote among Republicans, maybe that’s something you’d look into. But typically Trump does extraordinarily well among Republicans. So that’s not likely to be the case. 

What makes me more nervous is what we would call non-response bias. That means more Trump voters are simply not picking up the phone. So it’s not that they’re lying to pollsters or giving false information. It’s that they’re just saying, “I’m not even answering a number I don’t recognize.” 

But Trump has spent years talking about how much he dislikes the polls and how much they’re fake news. If that has led Republicans to have an extra special disdain for pollsters above and beyond what Democrats would have, and it means they’re sitting out polls systematically and in higher numbers, that could be an issue. It should still be something that pollsters can account for, because they should be making sure that things like the partisan identification balance aren’t totally out of whack. But if it’s happening among independents more than Republicans, it might be hard to pick up. 
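That last point—differential non-response that weighting on party ID can't catch—can be illustrated with a toy calculation. Every number below is invented: suppose independents are truly split 50/50, but Trump-leaning independents answer the phone at half the rate of Biden-leaning ones.

```python
# Toy illustration of non-response bias that party-ID weighting can't fix.
# All numbers are hypothetical.

true_independents = {"Trump": 0.50, "Biden": 0.50}   # true 50/50 split
response_rate = {"Trump": 0.05, "Biden": 0.10}       # assumed differential response

# Expected composition of the independents who actually end up in the sample.
responders = {c: true_independents[c] * response_rate[c] for c in true_independents}
total = sum(responders.values())
sampled_split = {c: v / total for c, v in responders.items()}

print(sampled_split)
# {'Trump': 0.333..., 'Biden': 0.666...}
```

The overall share of independents in the sample can still be weighted to look right, so the party-ID check passes—yet Trump's support within that group is understated by roughly 17 points, which is exactly why this kind of skew is hard to detect.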

So this is where I say take polls with a healthy dose of cautious skepticism. I think most pollsters out there are trying to do the best they can. I think, by and large, you do not have widespread systematic polling problems this time. Even in 2016, the errors were confined to those Blue Wall states. Those just happened to be so critical to every prediction model out there that it blew everything up. I feel reasonably confident that the polls are pointing us in the right direction. But that does not mean that when a new poll comes out showing Joe Biden up by 7 in Wisconsin, I think, ‘Oh, yeah, Joe Biden is up by 7 in Wisconsin.’ The range of possibilities that ‘Biden up by 7 in Wisconsin’ could lead to, in my mind, is a fairly wide range. That does not preclude the idea of Trump winning Wisconsin.

Sarah: A lot of this newsletter is from the campaign operative perspective. Last week, I talked about yard signs and the big divide between candidates’ feelings about yard signs and campaign managers’ feelings about yard signs. What is the equivalent of that for pollsters and candidates in your experience? What is your yard sign?

Kristen: Two things. I actually had an incident occur on Twitter where a Republican former member of Congress who is running to take her seat back after having lost in 2018 saw me on Fox. I had said something to the effect of “Trump needs a clearer message in order to improve his standing in the election,” which I don’t think is a terribly controversial statement. But nonetheless, she tweeted at me out of the blue, saying, “Well, your analysis is very different from what I hear when I knock on doors and talk to people in my district.” And I have no doubt that that’s the case. Because the people she is talking to and the doors she is knocking on are not a representative sample of American voters.

Sarah: For so, so many reasons.

Kristen: So that, I think, is the biggest frustration I have. People who are running for office are taking the time to talk to their constituents; they’re doing town halls or knocking on doors. That’s great. That’s democracy in action. But that is also not a statistically representative sample of the people in your district or your state. Those are the people who are most motivated to come out to a town hall. Those are the folks who were on your campaign’s list as highly likely voters who may need an extra bit of motivation or something like that. But polls are a different method for a reason. And so it just frustrates me from time to time when someone will discount your poll because they talked to three people yesterday who held a different view than what your poll shows 70 percent of people in the district believe.

Sarah: And just to highlight what you said about canvassing: Those canvassing lists are not only unrepresentative—it would be malpractice for a campaign to send its candidate to a representative sample of the district, because that’s not who you’re looking to turn out. We’ve covered this in the newsletter before, but it’s worth underlining for our newest readers.

Kristen: What is even more frustrating in a post-2016 environment is that even though that is a deeply flawed kind of analysis, it was that very kind of analysis that led people to say, ‘Well, you know, the polls say Clinton is up by 4 in Pennsylvania, but I saw a lot of yard signs in, you know, the Pittsburgh suburbs, and so I’ll bet you he’s gonna win.’ 

Sarah: Exactly. They were correct. But it doesn’t mean their yard-sign logic was correct.

Kristen: 2016 validated this faulty type of analysis. And it did so in a way that has made it even harder to uproot now.

Sarah: You’re speaking my yard sign language. Next up: What are you looking for heading into the first debate and immediately after it? What data and numbers should we be watching?

Kristen: I am interested to see if anything moves this race at all. 

Sarah: Ha! The pollster says she’s looking for the polls to change.

Kristen: Yeah exactly. I do a radio show on Sirius XM, which I guess I’m plugging right now. POTUS Channel 124 on Saturday mornings. Anyway, every week I start off the show with a segment where I go through the big numbers. I look at the RealClearPolitics average and I say 27 percent of the country says we’re on the right track. Biden is at 49 percent, Trump’s 43 percent. Trump’s job approval is 45 percent. And his job approval on COVID is 43 percent. And Trump’s approval on the economy is 51 percent. 

And those numbers don’t change that much week to week at all. They did a bit early on in COVID, but they didn’t really move after the conventions. People were really squinting hard to see who was getting a bounce, and the answer is that the race is basically where it was before the conventions. So I’m just interested in whether there is anything the candidates can do to disrupt and move these numbers, because they are barely budging in the aggregate. 

I also think, though, you have this pattern where people who are presumed to be ahead, or who are incumbents, sometimes show up at the first debate a little rusty—taking it for granted a little bit. And you’ve seen how much the Trump campaign has downplayed expectations for Biden. They’ve basically tried to make the case that if he can walk up to the podium and form sentences, that will be somehow miraculous.

Sarah: Though in fairness, if you lower expectations enough to form a narrative like they have and then Biden does slip up—somehow doesn’t manage to walk to the podium or doesn’t remember his own name—that will really, really drive home the point. 

Kristen: Yep. And then I would expect the polls to move. But I’m just so skeptical. People have such strong opinions about the candidates in this race. And unless one candidate or the other does something to absolutely scare off all of their wobbly people, I’m not of the mind that a lot is going to change.

Sarah: Okay, for someone who wants to be a pollster when they grow up: What is the best piece of advice that you got along the way?

Kristen: Being a pollster requires both math and verbal skills. So you need to build up a solid base in statistics and the quantitative world, but don’t forget about the qualitative side as well. You need to be able to think creatively to write a questionnaire that’s actually going to get at the heart of what you need to learn. You need to be able to think about the best way to talk about an issue, which requires being able to tell stories and distill complicated ideas into simple concepts. And that’s not something you’re going to learn in a statistics class. 

And it’s also a field where I think you just have to work in it for a little while, because then you begin to learn: Is 43 percent job approval good or bad? If you’ve been looking at job approval numbers for 10 years, you know what that means in context. So I would encourage folks to try to get experience on some other part of a campaign, or at a think tank, or somewhere else, so that you can check off that box and know what it’s like to be on the client side a little bit. Don’t just focus on statistics, because the job is so much more than that.

Sarah: And this is coming from someone who won their statewide debate tournament, which I know because I have the trophy in my garage because it is also our fantasy football league trophy. And who is a huge numbers nerd because we also play a lot of probability-based board games together. Thanks, KSA!

Sarah Isgur is a senior editor at The Dispatch and is based in northern Virginia. Prior to joining the company in 2019, she had worked in every branch of the federal government and on three presidential campaigns. When Sarah is not hosting podcasts or writing newsletters, she’s probably sending uplifting stories about spiders to Jonah, who only pretends to love all animals.
