Monday, January 20, 2014

The curious silence on Rasmussen's supposed bias

During the last two major election cycles--the 2010 midterms and the 2012 presidential election--one would have been hard-pressed to enter any discussion of current polling data without being smacked in the face with the "Rasmussen is biased in favor of Republicans" canard. Indeed, this idea was given seemingly permanent justification (in the minds of those who spread it) by none other than Nate Silver, in a 2010 article where he argued that Rasmussen's bias could be explicitly measured and that it stood--in that election cycle--at around 3.9 percentage points (meaning Rasmussen polls wrongly leaned almost four points toward Republican candidates, by and large):
Moreover, Rasmussen’s polls were quite biased, overestimating the standing of the Republican candidate by almost 4 points on average. In just 12 cases, Rasmussen’s polls overestimated the margin for the Democrat by 3 or more points. But it did so for the Republican candidate in 55 cases — that is, in more than half of the polls that it issued.

If one focused solely on the final poll issued by Rasmussen Reports or Pulse Opinion Research in each state — rather than including all polls within the three-week interval — it would not have made much difference. Their average error would be 5.7 points rather than 5.8, and their average bias 3.8 points rather than 3.9.

Nor did it make much difference whether the polls were branded as Rasmussen Reports surveys, or instead, were commissioned for Fox News by its subsidiary Pulse Opinion Research. (Both sets of surveys used an essentially identical methodology.) Polls branded as Rasmussen Reports missed by an average of 5.9 points and had a 3.9 point bias. The polls it commissioned on behalf of Fox News had a 5.1 point error, and a 3.6 point bias.
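To be clear about what those figures measure, here is a minimal sketch in Python, with invented numbers (nothing below comes from Silver's dataset), of how an average error and an average bias would each be computed from poll margins versus actual results:

    # Minimal sketch, with invented numbers (not Silver's data), of the two
    # quantities being described: average error is how far polls miss
    # regardless of direction; average bias is how far they miss in one
    # particular direction. Margins are Republican-minus-Democrat, in points.
    poll_margins   = [5.0, -2.0, 8.0, 1.0, -4.0]   # hypothetical final poll margins
    actual_margins = [-1.0, 0.0, 3.0, -3.0, -6.0]  # hypothetical election results

    misses = [p - a for p, a in zip(poll_margins, actual_margins)]

    avg_error = sum(abs(m) for m in misses) / len(misses)  # unsigned average miss
    avg_bias  = sum(misses) / len(misses)                  # signed average miss

    print(f"average error: {avg_error:.1f} points")   # 3.8
    print(f"average bias:  {avg_bias:+.1f} points")   # +3.0 (toward the Republican)
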
Now I respect Silver as a thinker, as someone with an exceedingly strong grasp of numbers and how to work with them, whether the subject is polling data or something else. But the idea that there is a firm basis for establishing bias, as distinct from error, is laughable in my opinion. Note how, in this same article, Silver singles out CNN for a high error rate--comparable to that of Rasmussen--yet says nothing about bias:
Other polling firms that joined Rasmussen toward the bottom of the chart were Marist College, whose polls also had a notable Republican bias, and CNN/Opinion Research, whose polls missed by almost 5 points on average. Their scores are less statistically meaningful than that for Rasmussen Reports, however, because they had only released surveys in 14 and 17 races, respectively, as compared to Rasmussen’s 105 polls.
Polling is, of course, far from an exact science. And sometimes polls do err badly, because of the way they are built or because of pollster assumptions. Silver's FiveThirtyEight blog has been dedicated to analyzing all of the available polling data in order, supposedly, to get a better picture of things. And in that regard, his predictions on political races have never really been all that impressive. It's not that they're bad; it's just that they're nothing special, in my humble opinion.

Still, he has a lot of fans and a lot of frequent readers (oodles more than I have, to be sure). And in my experience, a huge chunk of such folks are fans primarily because FiveThirtyEight provides them with what they believe to be strong ammunition for political arguments. Again, the built-in bias of Rasmussen polls is one of the top pieces of ammo in this regard. Or at least it was. It's not being bandied about very much these days. Why is that? Well, there is the fact that there aren't many election polls out there right now, since the 2014 midterms are still too far off. But what about other polls, like Presidential job approval? Is Rasmussen biased there, too? Anyone who accepts Silver's analysis would say "yes" in a heartbeat, I think. Silver, in fact, noted this bias as well:
The discrepancies between Rasmussen Reports polls and those issued by other companies were apparent from virtually the first day that Barack Obama took office. Rasmussen showed Barack Obama’s disapproval rating at 36 percent, for instance, just a week after his inauguration, at a point when no other pollster had that figure higher than 20 percent.
With Obama's approval numbers having dipped considerably over the last year or so, one would think--one would expect--that the numbers issued by Rasmussen would not only be the harshest in the pack (because of the bias) but would also be getting hammered by the Left (because of the bias). And yet...

Do a Google search for 'Rasmussen "presidential approval" bias' and see what you find. Here's a hint: you won't find much of the old bias-mongering, at least not from the present or recent past. What you will find from the recent past are dimwitted posts from the Left--like this one--trumpeting Rasmussen's polling numbers, on the assumption that the "real" numbers must be even better for Obama than the ones Rasmussen reports.

The problem with such silliness: Rasmussen is no longer an outlier tilting toward the Right at all (if it ever was). Take a look at the RCP aggregate for Presidential approval. In late May of 2013, Obama's aggregate approval and disapproval numbers converged, with the latter rising and the former falling. And those trends have continued since then, much to the Administration's chagrin. Now take a look at Rasmussen's daily tracking numbers over that same period. See it? Rasmussen's numbers are consistently better for the President than the RCP average for this period.* Right now, that average has the President at 42.5% approval to 52.2% disapproval, a spread of 9.7 points against Obama. The latest Rasmussen poll? A spread of just 1 point against him. And it's been this way for a long while now. To put it another way: Rasmussen's polls are the only thing keeping Obama from being consistently down by double digits in aggregates like RCP's.
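
To make the arithmetic concrete, here is a sketch with hypothetical spreads (these are not the actual RCP inputs) showing how a single near-even poll can keep a simple average out of double-digit territory:

    # Hypothetical illustration (not the actual RCP inputs): how one
    # near-even poll keeps a simple average of approval spreads out of
    # double-digit territory.
    other_polls = [-11, -12, -13, -10]   # made-up spreads (approve minus disapprove)
    rasmussen   = -1                     # made-up near-even spread

    without_it = sum(other_polls) / len(other_polls)
    with_it    = sum(other_polls + [rasmussen]) / (len(other_polls) + 1)

    print(f"average without the near-even poll: {without_it:+.1f}")  # -11.5
    print(f"average including it:               {with_it:+.1f}")     # -9.4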

As long as this is the case, bias in Rasmussen is forgivable, even laudable, for those same leftists who were so vehemently criticizing Rasmussen not all that long ago. Savvy intellectualism is only hip when it works in their favor, it would seem.

Cheers, all.




* There is another argument to be had about Rasmussen's numbers: that they've changed significantly since Scott Rasmussen left the organization in July of last year. The company insists, however, that it hasn't changed its methodology at all. Either way, this doesn't really affect what I am arguing here.

1 comment:

  1. "But the idea that there is a firm basis for establishing bias, as distinct from error, is laughable in my opinion. Note how, in this same article, Silver singles out CNN for a high error rate--comparable to that of Rasmussen--yet says nothing about bias"

    You're confused. "Bias" here means how far the estimates miss in one particular direction, on average, whereas "error" measures how far they miss regardless of direction. Error can be in any direction; bias points one way. It's roughly the difference between precision and accuracy.

    The value that Silver adds with his models is that he can correct for both. Error can be reduced by aggregating more data, whereas bias can be corrected by estimating it and factoring that estimate into the calculation.
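
    (In concrete terms, a crude version of such a correction -- with invented numbers, and not a claim about Silver's actual model -- might look like this:)

        # Crude, purely illustrative house-effect adjustment (invented numbers,
        # not Silver's actual model): subtract a pollster's estimated lean from
        # its raw margin before averaging it with other polls.
        estimated_house_effect = 3.9   # hypothetical pro-Republican lean, in points
        raw_margin = 2.0               # hypothetical reported R-minus-D margin

        adjusted_margin = raw_margin - estimated_house_effect
        print(f"adjusted margin: {adjusted_margin:+.1f}")  # -1.9, i.e. D +1.9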

    It's also a little unfair to quote Silver talking about Rasmussen in 2010 in an article from 2014. Rasmussen poll data doesn't have a bias against Democrats or the president anymore, and it's generally had the opposite problem since 2013. But that doesn't mean that it didn't have biased poll data for the prior years.
