The MyDD Poll: Methodology Statement

OK, folks, your MyDD Poll is out of the data collection phase. We'll be posting findings and question wording starting today, so this post is to lay out the methodological details of the research for public consumption.

Before I do that, however, I'd like to express my deep appreciation and gratitude to you, the members of the MyDD community. This truly is your poll. And it truly is a groundbreaking effort. You made history developing the content and direction of questions collaboratively, with people all across the country, and you made history by funding the work yourselves. This also is the first poll I'm aware of that is sponsored by a blog. So, big hearts from me to all of you with such big hearts. You are the best.

I also want to thank Chris Bowers profusely. He thought this up, got the ball rolling and is nothing short of fabulous to work with. Big thanks also to Matt Stoller for his tireless efforts behind the scenes and, especially, to Marc Laitin of StartChange, without whom this wouldn't have worked so smoothly. And a special thanks to KiSquared Research, who did the field work. Big kudos and huge thanks to Katherine, Richard and the crew. With that said, let's put on our lab coats and take a look under the microscope at The MyDD Poll methodology.

This study is sponsored by MyDD.com and its community members. Members across the country contributed personal funds to pay the entire cost of the research. The study itself consists of interviews with 1,004 registered voters from every state in the union. Data were collected between January 16, 2006 and January 25, 2006. The interview content areas were developed by MyDD.com members. Joel Wright of Wright Consulting Services worked that content into clear and concise research measurements and a logical question ordering. Chris Bowers signed off on the final interview before it went into the field.

Data collection was accomplished by KiSquared Research of Winnipeg, Manitoba, Canada (web site: here). All data were collected from their central-location data collection facility, and interviewing was conducted during afternoon and early evening hours on each of the ten days the project was in the field. Only professional interviewers were allowed to work on this project, and each was personally briefed and trained on administration of the interview protocol. All data collection was supervised by professional staff 100% of the time. KiSquared developed the initial coding scheme for the open-ended question in this protocol (reasons for support of or opposition to the Iraq invasion), and the codes were reviewed, expanded and then approved for application by Joel Wright. Respondent comments were then reduced to numerical codes and input into the final dataset. KiSquared performed its final quality control and logic checks before transferring the dataset to Joel Wright.

Once Joel Wright received the dataset, it was further analyzed for logic and consistency and to determine and apply key sample balancing parameters. The results of initial weighting by voter registration and region of the country proved unsatisfactory, and thus conventional methods of balancing by region and ethnicity were applied. Sample quotas prior to data collection included gender, so the final sample is balanced across region, ethnicity and gender. Parameters for balancing were obtained from the US Census Bureau's Current Population Survey and its analysis of 2004 voter registration and turnout by state. Table 4a from their website (here) was used.

The following table shows the key sample parameters.

                          Total       Within Region        Within Region
Region                    Sample      Male     Female      Anglo    Minority

Northeast                  22.1%       46%       54%         83%       17%
South                      30.4%       45%       55%         73%       27%
Midwest                    26.3%       49%       51%         88%       12%
Rockies                     6.6%       48%       52%         85%       15%
West Coast, AK, HI         14.6%       50%       50%         70%       30%

Total Sample              100.0%       47%       53%         80%       20%
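
For readers curious about the mechanics, here is a minimal sketch of cell-based balancing (post-stratification) in Python. The respondent records, cell structure and helper names are hypothetical and shown only for a single region dimension; the region targets simply echo the table above, and the actual weighting cells and parameters used for the poll were determined by the analyst.

    # Minimal sketch of cell weighting: each respondent gets the weight
    # target share / observed share for their cell. Hypothetical records.
    from collections import Counter

    respondents = [
        {"region": "South", "gender": "Female"},
        {"region": "Northeast", "gender": "Male"},
        {"region": "Midwest", "gender": "Female"},
        {"region": "South", "gender": "Male"},
        # ... the remaining records would follow in a real dataset
    ]

    # Target population shares, e.g. drawn from the CPS 2004 registration data.
    region_targets = {
        "Northeast": 0.221, "South": 0.304, "Midwest": 0.263,
        "Rockies": 0.066, "West Coast, AK, HI": 0.146,
    }

    def region_weights(sample, targets):
        """Weight for each region cell = target share / observed share."""
        counts = Counter(r["region"] for r in sample)
        n = len(sample)
        return {region: targets[region] / (counts[region] / n)
                for region in counts}

    weights = region_weights(respondents, region_targets)
    for r in respondents:
        r["weight"] = weights[r["region"]]

Balancing on several dimensions at once (region, ethnicity and gender here) is typically done either with a fully cross-classified cell table or with iterative raking on the marginals; the sketch above shows only the single-dimension case.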

Once balanced, party registration data from the sample were compared to the results of the Pew Research Center for the People and the Press's 2004 analysis of party identification nationally. Their analysis can be found here. The MyDD Poll sample reflects their findings precisely: 33% Democrat, 29% Republican and 38% Independent/Other Party/No Party Preference.

Given the above, this sample is found to be representative, within the margin of error, of the universe under study: registered voters in the United States. The margin of error for these data is +/- 3.1 percentage points at a 95% level of confidence. This means that, 19 times out of 20, these results would not differ by more than the margin of error from the results that would be obtained if every registered voter in the country were interviewed.
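
For anyone who wants to check the arithmetic, the +/- 3.1-point figure follows from the standard maximum-variance formula for a simple random sample of 1,004 interviews. This is a back-of-the-envelope check, not necessarily the pollster's exact computation, which may also account for design effects from weighting.

    # Back-of-the-envelope check of the stated margin of error.
    import math

    n = 1004          # completed interviews
    z = 1.96          # critical value for a 95% confidence level
    p = 0.5           # maximum-variance assumption

    moe = z * math.sqrt(p * (1 - p) / n)
    print(f"+/- {moe * 100:.1f} percentage points")   # -> +/- 3.1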

This statement conforms to the principles of disclosure regarding public polls as identified by the National Council on Public Polls. Their website is found here.

Tags: MyDD Poll, research methods

Comments


Re: The MyDD Poll: Methodology Statement

The results of initial weighting by voter registration and region of the country proved unsatisfactory and, thus, conventional methods of balancing by region and ethnicity were applied.

Explain please? Why didn't a weight by party reg work? What were the results? Haven't we spent the better part of the last two years screaming that pollsters should weight for party registration? My head's exploding ...

by ColoDem 2006-01-27 10:53AM | 0 recs
Re: The MyDD Poll: Methodology Statement
Here's exactly what it boiled down to. The weights by party reg generated 40% Dems in the sample. There's a decent case to make that this is correct, from the data I compiled and also a few other sources. However, most other polls don't show that figure for Dems. That would have caused a controversy. It would have misdirected discussion about the poll into a discussion about proper proportions, etc. We would have been talking about weights instead of data and findings, and accused of partisanship when it's not so in the least. All in the first time out for the poll. So I made a command decision, a very hard decision to make, late last night to go with the conventional method in the best interest of the poll. This time.

I'm not done with this issue by any means. I'm actually more motivated now than before because we were close. And I've got a dataset I compiled where I can run tests and model weights. I can show serious weaknesses in the conventional method now; I've already done it here. I'll be working behind the scenes on a more rigorous and strongly defensible method. This wasn't the time to fight over it, though. We have to have our ducks in a row with this stuff or we all get roasted.

Thanks very much for caring about this, asking and commenting. Keep the faith, it's going to happen.

by Sun Tzu 2006-01-27 01:43PM | 0 recs
