As Election Day 2024 draws near, more Americans are looking to the polls to predict what will happen. With so many surveys churned out in a seemingly constant stream, making sense of them all becomes an arduous task. Most folks just throw up their hands and settle for an average of polls. However, as many realized from the debacle of 2016, an average of garbage is still garbage. So, how can that trap be avoided this time around? Recently, Time published an article about how to read a poll. It had many excellent suggestions, but it missed some key elements of getting the most out of an election survey. Here are five tried-and-true pointers for those who want to be in the know:
Get the Most From the Polls
Go on a national poll diet: Make sure you know what kind of poll you’re reading. Is it a national poll or a swing/battleground state survey? Most pollsters admit that national polls are far less reliable than statewide polls. Kellyanne Conway, considered an elite pollster, calls national polls “phony” and insists they are “not relevant.” National polls tell you trends, according to Conway, but they don’t dig into the electoral map, and that is where presidential elections are won and lost. So, if you love reading the polls, try to wean yourself off the national ones because they can be misleading.
How much candy is in the box: Sample size matters. Did the pollster interview 400 people or 4,000? The smaller the sample, the less reliable the result. It’s best to check the sample size even before you start reading the poll. A general rule of thumb is to make sure the poll samples at least 1,000 likely voters. Likely voters are more dependable than registered voters because they have been asked about their voting patterns and have participated in recent elections.
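A rough rule of thumb shows why the headcount matters. For a simple random sample – something real polls only approximate – the margin of error at the standard 95% confidence level works out to roughly 1.96 × √(0.25 ÷ sample size). Run the numbers: a 400-person survey carries a margin of error of about plus or minus five points, while a 1,000-person survey tightens that to about plus or minus three. In a race separated by two points, the smaller poll simply cannot tell you who is ahead.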
What kind of candy is in the box: This is one poll trick worth scoping out. The question here is who is being sampled and how many of them there are. Does the poll oversample Democrats or women? A poll should reflect reality. Let’s say a survey is being conducted in Arizona, and no Hispanics are included – that would be misleading because over 30% of the people who live in the Grand Canyon State are Latinos. This may be an extreme example, but what if only 10% of the respondents are Latino? That still doesn’t fix things. The same rubric applies across an entire poll: How many men have college degrees, how many black voters, how many black women, and on and on. If a survey is worth its salt, the sampling will reflect reality.
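Pollsters try to correct for lopsided samples by weighting: If Latinos make up 30% of a state’s voters but only 10% of a survey’s respondents, each Latino answer gets counted roughly three times over to bring the sample back in line with the census. That fix works only up to a point – a handful of heavily weighted respondents can swing a result – which is one more reason to check the sample’s composition before trusting the top line.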
Take your time: You’ll experience more satisfaction if you treat a poll as a five-course meal rather than drive-by junk food. Polls are best digested slowly, so take time to chew them over. Don’t be afraid to click through to the actual survey instead of just glancing at the averages, because plenty of interesting information can be found in the crosstabs. (Hint: Do this on your desktop, not your phone.) So many surprising tidbits can be found in those tabs, like how many white men between the ages of 18 and 34 say they are voting for Trump and, conversely, how many black women in that age category say they are not. If you notice a high number and see the same outcome in a couple of other polls, that’s a trend. Revealing trends is what polls do best.
Name that tune: One of the interesting things about taking time to open the actual poll (not just the so-called “top lines”) is the opportunity to hear the pollster’s voice. The way a question is worded matters. If it didn’t, pollsters wouldn’t get paid so much money. As you read the questions and the order in which they are asked, you’ll get a better sense of what the pollster is after. Let’s say they have an entire section about former President Trump’s legal troubles. That’s a signal that the surveyor believes it’s a critical issue for voters. The political leanings of a pollster can be spotted when you listen to that voice.
There are other considerations for political junkies, such as who the pollster is and the historical accuracy of that organization. Still, with so many survey organizations springing up, it becomes harder to assess individual track records. Some might also point to methodology – how the pollster gathered the information (landline, smartphone text, etc.).
One Final Story
In the 2016 election, the polls were notably mistaken, but one had it right: the USC/LA Times Daybreak Tracking Poll. The methodology and weighting of the survey were admittedly complicated – suffice it to say its differences from run-of-the-mill polls caused it to come up aces. One of the glories of the tracking poll was that it was conducted online, so “shy voters” who didn’t feel comfortable talking about their vote on the phone could be counted. Sadly, despite the experiment’s success, the Daybreak Tracking Poll was abandoned for unknown reasons. However, it did prove one thing: There is more than one way to conduct an accurate election survey, and technology will likely be at the root of it. But in the meantime, these five tips will provide a way to sift through the garbage, and you may even come across a nugget of gold or two in the process.
~
Liberty Nation does not endorse candidates, campaigns, or legislation, and this presentation is no endorsement.