Friday, January 20, 2006

bogus surveys, all

Kevin Drum gets it about right: survey research should be taken with a shaker of salt. I speak from experience as a phone surveyor and data coder in the survey research industry. The first response to my calls, and certainly the only rational one, was almost invariably to hang up after offering a mumbled excuse (or a lucid and hostile retort). The people most responsive to my calls were those who were bored, lonely, too polite to hang up (elderly respondents often seemed to be some combination of the three, and were generally a good bet), or drunk or stoned (these last a rare treat). People were a bit more likely to answer political surveys than market surveys, but not by much.

Once I convinced whoever didn't hang up in the first 5 seconds that I wasn't trying to sell them something, I could often whiz through a survey, as long as the respondents didn't realize there was absolutely nothing in it for them, other than the "satisfaction of expressing their opinions."

Calls at dinnertime were particularly ill-received. Some states allowed us to call until 10:00 PM, well after many people had put their children to bed (or gone to bed themselves). For that reason, the last hour of the shift was my least favorite.

In short, few people who value their time will donate twenty minutes or more of it so that politicians or companies can get a better grasp of the public's preferences. The sample for nearly every survey is self-selected in this way, and that is a serious flaw. I'm not exactly sure how you translate "major aversion to answering the survey" into a margin of error; the standard formula measures only sampling noise, not who refused to pick up (a rough sketch below illustrates this). Someone has probably studied the issue, but most people simply accept that survey research more or less accurately reflects public opinion on the topic at hand. I'm unconvinced, and I haven't even gotten into other problems:

- Many cell phone users' numbers are unlisted.
- Many surveyors are still in high school and don't have the slightest clue what they are asking, which produces questioning and coding errors.
- Different surveyors ask the same questions in different ways, which produces different responses.
- Some surveyors fabricate completed surveys to improve their numbers.
- The people who commission a survey often have an agenda, which skews the questioning and calls the results into question.
- Respondents often lie (especially the drunk or stoned ones).

The list goes on.
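To make that concrete, here is a minimal sketch (in Python, with entirely made-up numbers) of why nonresponse never shows up in the published margin of error. Assume, hypothetically, that supporters of some measure are three times as likely to take the call as opponents; the true support level, response rates, and variable names here are all invented for illustration.

import random

random.seed(0)

POPULATION = 100_000
TRUE_SUPPORT = 0.50          # hypothetical: half the population actually supports the measure
ANSWER_RATE_SUPPORT = 0.15   # hypothetical: supporters take the call 15% of the time
ANSWER_RATE_OPPOSE = 0.05    # hypothetical: opponents take the call 5% of the time

answers = []
for _ in range(POPULATION):
    supports = random.random() < TRUE_SUPPORT
    answer_rate = ANSWER_RATE_SUPPORT if supports else ANSWER_RATE_OPPOSE
    if random.random() < answer_rate:   # most people hang up; only these respond
        answers.append(supports)

n = len(answers)
p = sum(answers) / n
margin = 1.96 * (p * (1 - p) / n) ** 0.5   # the usual 95% margin of error

print(f"respondents:      {n}")
print(f"measured support: {p:.3f} +/- {margin:.3f}")
print(f"true support:     {TRUE_SUPPORT:.3f}")

With these invented rates the poll would report roughly 75% support with a margin of error under one point, while true support is 50%. The stated precision is real; it's just precision about the wrong group of people.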

It has always seemed to me that the whole survey research industry is built on a faulty premise: it assumes the good faith of the people being surveyed, who have little incentive to respond accurately, or to respond at all. Yet I've rarely seen this pointed out. People seem willing to forget their methodological concerns as soon as the "research" tells them something they want to hear.

Kevin's disclaimer should be a standard feature of every survey from here on out.
