Bad surveys

Yesterday, a grad student asked me whether I could answer a survey he was doing for his research. I’ve struggled to get participants in the past, and seeing that the survey would only take a couple of minutes, I accepted.

It was a survey about the urban design of a particular place at the University of Toronto, which was fine with me. Halfway through the survey, though, I realized he was trying to put answers in my mouth. He asked me whether I liked that place, and when I said I did, he replied: “Really? There’s nothing to like there.” I insisted that I liked the landscape around it; he objected, pointing out that it was just a grass field. I kept insisting, and he grudgingly wrote down my answer.

We went through this process several times. The last straw came when I said the lighting at that place was good and he responded that “it’s pretty dark there right now,” circling the “bad lighting” answer. He only erased that answer after I lost my patience and told him that these were my answers and if he wanted others he should ask someone else.

I know what this survey’s results will look like. Some large percentage of users of this space, it will say, are terribly dissatisfied with it, thereby providing support for whatever project this student is designing. What gets me is that the results of a survey as poorly and dishonestly executed as this one will carry more weight than any non-quantitative argument, simply because they produce a percentage at the end. We’re in love with quantitative evidence, no matter how poorly it is constructed.

As I left the place that evening I looked around with a critical eye. There were definitely some areas that could be improved. Come to think of it, I thought, it was plain to see that the lighting was actually pretty bad, and no survey results will convince me otherwise.

 

About Jorge Aranda

I'm currently a Postdoctoral Fellow at the SEGAL and CHISEL labs in the Department of Computer Science of the University of Victoria.

6 Responses to Bad surveys

  1. George says:

    That seems like a really poorly done survey. I really hope it isn’t used to justify anything. Was it possible to find out who the person was and somehow report their unscientific practices?

    • Jorge says:

      I think I have enough information (that I omitted here) to track him down, but I haven’t decided whether I should do it, and what to do once I identify him.

  2. I’m told that political think tanks and PR outfits do this all the time – the survey is worded deliberately to get the desired result, which can then be presented as “this is what people said”. Whatever it is, it’s definitely not science.

    I’d be strongly tempted to let the ethics board know about it.

    • Jorge says:

      The alternative is to get back to him and tell him to stop, though I know whatever I tell him will not have the same weight (or consequences) as an intervention from the ethics board…

  3. Neil says:

    Ok now I’m dying to know what place this was. I guess you are not mentioning it for obvious reasons. ‘Grass field’ is not narrowing it down much!

    I wonder what different biases exist in person-to-person surveys vs. online or textual surveys. E.g., government agents interviewing people about homelessness.
