Thursday, December 11, 2008

Auditor's Survey flawed

Editor's Note

The Pierce County Auditor's survey of voters on Ranked Choice Voting was flawed and biased in several ways. Read below for an academic analysis of the survey.


By Professor Richard Anderson-Connolly
University of Puget Sound

Given the very low response rate for the opinion poll on Ranked Choice Voting conducted by the auditor, it is unwarranted to claim, as the News Tribune headline of December 6 did, that a majority opposes RCV. We would need much better data to assess this claim.

Every voter received a poll about RCV along with his or her ballot, so the total number of possible respondents was 333,824. Of those, 90,738 polls were returned, a response rate of only about 27%. As a rule of thumb, the sociologist and methodologist Earl Babbie (2005) suggests that “a response rate of 50% is adequate for analysis and reporting. A response of 60% is good; a response rate of 70% is very good.” The auditor’s rate was barely half the minimum standard, and thus the findings do not legitimately deserve analysis and reporting.
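
For readers who want to check the arithmetic, here is a minimal sketch in Python using the counts quoted above; the cutoffs are Babbie's rule-of-thumb figures, not exact statistical standards.

# Check the response-rate arithmetic using the counts cited above.
polls_mailed   = 333_824   # one poll per voter
polls_returned = 90_738

response_rate = polls_returned / polls_mailed
print(f"Response rate: {response_rate:.1%}")          # about 27.2%

# Babbie's rule-of-thumb cutoffs for survey response rates
for label, cutoff in [("adequate", 0.50), ("good", 0.60), ("very good", 0.70)]:
    status = "met" if response_rate >= cutoff else "not met"
    print(f"{label:>9} ({cutoff:.0%}): {status}")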

The problem with such a low response rate is the strong possibility of nonresponse bias. “If the response and nonresponse strata were randomly formed, the respondent and nonrespondent means would be equal in expectation, and there would be no nonresponse bias. In practice, however, it is dangerous to assume that the missing responses are missing at random; indeed there are often good grounds for believing otherwise” (Kalton, 1983). In other words, if those who returned their RCV poll differ on average from those who did not respond, then the data are biased.

This situation seems very likely. It is plausible that those who did not return their surveys feel less strongly about the topic, at least on average. Or maybe they were busier and had less time to fill it out. Or maybe they had the time but were just lazier. Or maybe they were more suspicious about government polls. Who knows? The groups could differ in many ways, including ways we might not even imagine, but the standard scientific approach is to put the burden of proof on those reporting the data to demonstrate that there are not likely to be any systematic differences when the response rate is this low. In science, skepticism is a virtue. While I do not expect the auditor or the newspaper to follow this convention, some recognition of the limitations resulting from the very low response rate would have been proper.
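
To make the danger concrete, here is a toy simulation. Every number in it is invented for illustration (the true level of support and the return probabilities are not estimates of actual Pierce County opinion); the only point is that if one side is even modestly more likely to return the poll, the respondent average drifts well away from the true population average.

# Hypothetical illustration of nonresponse bias; all inputs are made up.
import random

random.seed(1)

N = 333_824                 # electorate size, from the article
true_support = 0.50         # assumed true share favoring RCV (invented)
p_return_support = 0.22     # assumed return rate among supporters (invented)
p_return_oppose  = 0.32     # assumed return rate among opponents (invented)

returned = []
for _ in range(N):
    supports = random.random() < true_support
    p = p_return_support if supports else p_return_oppose
    if random.random() < p:
        returned.append(supports)

print(f"Response rate: {len(returned) / N:.1%}")                          # about 27%
print(f"True support in the population: {true_support:.0%}")
print(f"Support among respondents: {sum(returned) / len(returned):.1%}")  # about 41%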

A more accurate reporting of the data would be the following:
72.8% No response
17.0% Negative view toward RCV
8.7% Positive view toward RCV
1.4% Undecided on RCV

Given that almost three-quarters of voters did not return the survey, we clearly cannot claim that a majority opposes RCV. And since we have no comparable data on the popularity of the Top-2, the survey findings offer almost no guidance on future policy. More time and better data are needed.
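
The gap between the headline claim and the figures above is simple arithmetic. A short sketch, using the counts and percentages quoted in this piece, shows how the roughly 63 percent figure reported among respondents (see the comments below) corresponds to only 17 percent of all voters:

# Relate the respondent-based figure to the whole electorate.
total_voters   = 333_824
polls_returned = 90_738

negative_all = 0.170                           # negative view, as a share of ALL voters
respondents  = polls_returned / total_voters   # about 27.2% of voters responded

print(f"Negative, among respondents: {negative_all / respondents:.0%}")  # roughly 63%
print(f"Negative, among all voters:  {negative_all:.0%}")                # 17%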

Beyond the extremely low response rate, the auditor’s poll on RCV has a second problem involving sponsorship. The auditor almost certainly biased the data by including her name on the survey, which drew even more attention when it created something of a controversy in the news as a possible ethical violation. The percentage of respondents who had a negative view of RCV was close to the percentage of voters who had someone other than Pat McCarthy as their first choice for executive (and the percentage who liked RCV was close to the percentage who put Pat first).

Groves and Peycheva (2008) observed: “Sponsors of the survey are often policy-makers or advocates for the topics of the surveys they sponsor (e.g., companies conduct customer satisfaction surveys and manage the service delivery with customers). When the sample persons judge that the sponsor has an identifiable ‘point of view’ on the survey topic, that viewpoint can influence the person’s decision. Sample persons who have prior connection with the sponsor are most likely to experience these influences. For survey variables that are related to that point of view, nonresponse bias can result.”

By putting her name on the survey, did the auditor make the RCV poll into something of a referendum on herself? It is impossible to estimate the strength of the bias, but this was certainly bad polling practice (regardless of the ethics) and compounds the nonresponse bias.

Given the poor quality of the data, we do not know how many voters in Pierce County were satisfied, dissatisfied, or didn’t feel strongly either way about RCV. But as an advocate of RCV, I am willing to admit that much more should have been done to explain to voters the advantages of the new system. Although it would be possible to dismiss this as anecdotal evidence, I have heard a few people say that they didn’t know what the “point” of RCV was. In the absence of good reasons to change, I can understand why many voters would say they want to go back to something more familiar.

To give RCV a fair chance, advocates, the media, and the auditor’s office should communicate to voters not merely the mechanics of ranking or the counting algorithm but also the “whole point” of the change: more choices for voters, one election instead of two, and no spoilers, wasted votes, or vote-splitting. Once voters get the point, it would make sense to talk about the relative advantages and disadvantages of RCV versus the Top-2 or some other system. The calls for immediate repeal are occurring in an environment filled with ulterior political scheming by powerful interests and devoid of reliable information regarding the true wishes of the public.
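
For readers unfamiliar with the mechanics mentioned above, here is a bare-bones sketch of an instant-runoff count. It is illustrative only: the ballots are invented, and it omits details (tie-breaking, batch elimination, exhausted-ballot rules) that actual election law specifies.

# Minimal illustrative sketch of an instant-runoff (RCV) tally loop.
from collections import Counter

def irv_winner(ballots):
    """ballots: list of ranked candidate lists, highest preference first."""
    remaining = {c for b in ballots for c in b}
    while True:
        # Count each ballot for its highest-ranked remaining candidate.
        tally = Counter(next((c for c in b if c in remaining), None) for b in ballots)
        tally.pop(None, None)                    # ignore exhausted ballots
        total = sum(tally.values())
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > total:                    # majority of continuing ballots
            return leader
        remaining.discard(min(tally, key=tally.get))  # eliminate last place

print(irv_winner([["A", "B"], ["B", "A"], ["C", "B"], ["B"], ["A", "C"]]))  # prints B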

2 Comments:

At 11:34 AM, Anonymous Anonymous said...

I would have pushed further on some of your key points:

1) It is easy to construct plausible scenarios in which respondents disproportionately hold unfavorable views of IRV. That group would have been more motivated to fill out the survey: its core lost the referendum, and the survey presented a low-cost opportunity to register a protest.

2) Because the report is not widely available, the methods are not transparent. I cannot find a report of the survey anywhere on the auditor's website. Accordingly, I cannot assess the questionnaire design, sampling frame, sampling method, and so on. All we have are anecdotes from the newspaper. (And the only report I can find is this one from the Tribune, via a Google News search.)

3) The newspaper was highly irresponsible in headlining that "63 percent disliked RCV" while recognizing in fine print that this 63 percent was only among "voters who responded."

4) It seems to me that the appropriate tool for assessing IRV is an exit poll, which would largely avoid the self-selection problem and reduce sample bias.

 
At 1:41 PM, Blogger Bob Richard said...

What, exactly, was the question that got a 63% negative response? What other questions were on the questionnaire? How were they worded? What were the responses to them? How was the purpose of the study described to voters?

As Jack has already said, there is little information available about the design, data collection, or results. Just a number, 63%. Why, exactly, is that?

 
