Don’t cry for Eric Cantor. The former House Majority Leader, who suffered a “surprising” June primary defeat at the hands of upstart Dave Brat in Virginia’s 7th Congressional District, has landed on his feet.
Cantor is the latest member of Congress to benefit from his experience on Capitol Hill, joining the Wall Street investment bank Moelis & Co. as vice chairman and managing director. He’s also getting a pay increase, from his $174,000 Congressional salary to $1.2 million guaranteed his first year, topped off by a tidy little $3.4 million bonus in cash and stock. That buys a lot of monogrammed handkerchiefs to dry those non-existent tears.
So the age-old lesson that every dark cloud has a silver lining applies.
But not all candidates who lose the way Cantor did will be so fortunate. Assuming a Wall Street job isn’t waiting when you lose an election, this is what you should learn from Cantor’s story.
Cantor’s pollster, John McLaughlin, was so stunningly far off the mark before the primary that his results didn’t have even a passing acquaintance with the actual outcome. McLaughlin & Associates’ May 29 survey projected Cantor winning with 69% of the vote. Twelve days later it was Cantor’s opponent, economics professor Dave Brat, who took 55.6%.
At least McLaughlin did the right thing by conducting a post-election survey in an effort to figure out why his results had no relationship with reality. In Competitive Edge Research’s 28 years we’ve had a couple of misses (although nothing close to McLaughlin’s). In those rare circumstances, we conduct follow-up surveys to try to understand what caused the inaccurate prediction and learn from it. Usually we find some aspect of the research was faulty; in other words, the culprit is more than just bad luck or sampling error.
McLaughlin now admits in his memo on the topic that his pre-election sample excluded far too many people who ended up voting in the June primary. Because McLaughlin only included voters with a history of voting in Republican primaries in his sample, he did not include nearly as many independents and Democrats as he should have. That was not the only technical problem with his poll, but it is the most glaring.
So if we believe McLaughlin’s post-election results, one big reason for his bad prediction is that non-Republicans came into a Republican primary and strategically voted for the conservative Brat. As hard as that is to swallow, McLaughlin’s post-election survey points to Obama lovers combining with Tea Party members to oust Cantor.
To avoid this situation, McLaughlin needed to take a fresh look at what was happening in the District (well before the election, obviously) and then create a sampling frame to suit it. I’ll add that McLaughlin is not alone among GOP pollsters in this kind of failure. It should push Republican candidates and consultants to look outside their echo chamber for their research.
But the crosstabs from the post-election survey contain another clue as to why McLaughlin was way off. Only 52% of the voters who normally cast Republican primary ballots – the ones McLaughlin did have in his pre-election poll – actually voted for Cantor. That’s 17 percentage points below what the poll just twelve days earlier predicted, and far outside the margin of error. Voter opinion does not change that much that quickly. So, in addition to the problems with the sample, something in McLaughlin’s questions, data collection or tabulation in his pre-election poll was definitely wrong.
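To see why a 17-point gap can’t be waved away as sampling error, it helps to run the standard 95% margin-of-error calculation for a proportion. The memo’s actual sample size isn’t stated here, so this is a minimal sketch assuming a hypothetical poll of 400 likely primary voters:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sample proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical n = 400; the article does not give McLaughlin's actual sample size.
n = 400
moe = margin_of_error(0.69, n)
print(f"MOE for 69% support, n={n}: +/- {moe * 100:.1f} points")
```

With these assumptions the margin of error comes out to roughly ±4.5 points, so even a generously sized poll leaves a 17-point miss several times outside the range sampling error alone could explain.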
Why? It’s difficult to tell from my vantage point. But one thing is clear: survey research in the age of caller-ID, mobile phones and declining cooperation rates isn’t easy. Meticulous attention to detail is a must. Control of the interviewing process — which tends to be farmed out to faraway, sometimes offshore, firms — must be maintained. Corners cannot be cut.
All of Competitive Edge’s interviewing is conducted in-house, no exceptions. Never subcontracting any aspect of the research process allows us to completely control the most important part of our job.
Although Eric Cantor enjoyed a cushy landing, you might not be so lucky. When hiring a research firm, be sure to ask who is actually collecting your data and make your decision accordingly.