Finding the “Truth”: Using surveys to find out what people really think

December 16, 2009

In a world where people feel surveyed to death, it is becoming much harder to get people to give organizations answers that will help them make good decisions.  But until somebody invents a way to get information directly from people’s brains (and deals with the ethical issues related to such an activity), surveys are one of the main tools we have. So how can we maximize our chances of getting good information using survey technology? The answer is to do a good survey.

A good survey is one in which questions are inviting, clear and related to what we want to find out. You would be amazed at the number of pitfalls related to each of these criteria. An inviting survey encourages people to take time out of their busy lives to answer your questions thoughtfully. A clear survey encourages people to interpret each question the same way as other respondents and, in particular, the same way we did when we created it. Surveys whose questions tell us what we need to know to make good decisions are worth the money they cost to produce. Let’s look at some of the principles to follow in order to achieve these gold standards of surveying.

Getting to Yes

The first step is to get people to agree to answer our questions, regardless of whether we are conducting the survey face-to-face, over the phone, by mail or online. People are more likely to give us their time if they

  • Trust in the integrity of the request (i.e., there is no hidden sales pitch)
  • Care about the subject and feel that related decisions have an impact on them
  • Believe that their input will make a difference in our decisions

Before the first question is asked, the introduction to the survey must satisfy the potential respondent that the survey is not simply a marketing gimmick. Generations X and Y are particularly cynical about such possibilities. Lately I have noticed a number of phone surveyors opening the conversation by saying that they are not trying to sell anything. As long as marketers do not start making the same claim, this will remain a helpful second statement (made right after the survey topic is introduced).

I have noticed a trend in phone surveys of describing the purpose of the survey in fairly general terms and then introducing the “hot topic” that the survey really wants to know about 10 minutes into the conversation. In many cases, I had been hoping someone would ask my opinion on that very topic, and they were lucky that I am a survey junkie who said yes to the more generic pitch. However, I recognize that the purpose of describing the survey topic more generically is to ensure that those who respond are representative of the full range of the population, not just those rabidly interested in the key issue. The survey designer must walk the fine line between describing the survey so generically that potential respondents do not care enough to complete it and describing the key issue so specifically as to lose representativeness.

Cynicism also comes into play in whether potential respondents feel their input will make a difference. Surveys whose results will ultimately be read by governments or high-profile industries will achieve good or poor response rates depending on whether people trust those bodies to listen and act accordingly. In short, history makes a difference even if the surveyor is a third party.

The other main factors affecting whether people respond are time and timing. A short survey (a single sheet of paper, regardless of size, or about 10 minutes to complete) is more likely to be answered than one that looks time-consuming. Time of year also matters: it is hard to collect information from people in December or during the summer months. People already feel overcommitted in the lead-up to the holidays and are simply not available during vacation time.

The Truth is Clear

There is an art to writing clear and unbiased questions. And the penalty for failure to do so is public mistrust and unwillingness to respond in the future. Although it is impossible to cover all the principles of good design, here are a few key points:

  • Avoid wording that suggests that one answer is better or more socially acceptable than another.
  • Rating scale descriptions should be balanced between positive and negative options, whether or not a “neutral” midpoint is included. (For example, do not use 1=Very Dissatisfied 2=Somewhat Satisfied 3=Satisfied 4=Very Satisfied.)
  • Use the simplest wording possible, in particular in phone surveys where memory constraints are greater.
  • Put demographic questions at the end unless using them to ensure a proper balance of gender, age, etc.

With respect to this last point, we want information from as many people as possible even if we do not have responses to sensitive questions such as income; it just means we will have more limited data for analysis of whether responses differ by income categories. However, if such sensitive questions appear first, some people may simply stop responding or fail to submit their survey because of privacy concerns, so all of their opinions are lost to us.
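The rating-scale balance rule above can be made concrete with a quick check. Here is a minimal Python sketch that counts how many labels fall on each side of a scale; the labels and the simple keyword test are illustrative assumptions, not a standard survey tool:

```python
# Quick sanity check for rating-scale balance: count how many labels
# sit on each side of the satisfied/dissatisfied divide.
# The keyword matching below is an illustrative assumption.

def count_poles(labels):
    """Return (negative, positive) counts for a list of scale labels."""
    neg = sum(1 for label in labels if "dissatisfied" in label.lower())
    pos = sum(1 for label in labels
              if "satisfied" in label.lower() and "dissatisfied" not in label.lower())
    return neg, pos

# The unbalanced scale from the example above: one negative option, three positive.
unbalanced = ["Very Dissatisfied", "Somewhat Satisfied", "Satisfied", "Very Satisfied"]
# A balanced alternative gives each pole the same number of options.
balanced = ["Very Dissatisfied", "Somewhat Dissatisfied",
            "Somewhat Satisfied", "Very Satisfied"]

print(count_poles(unbalanced))  # (1, 3) - lopsided toward the positive side
print(count_poles(balanced))    # (2, 2) - symmetric
```

An unbalanced scale nudges respondents toward the side with more options, so the answers overstate satisfaction; the counts make the asymmetry easy to spot before the survey goes out.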

The best strategy for ensuring that survey questions are clear and unbiased is to run them by others. Ask people to reword the questions; this will flag misinterpretations. Have them identify biased questions and suggest alternatives.

While We Have Your Attention…

There is something about getting people to answer questions that unleashes surveyors’ desire to collect as much information as possible, relevant to the research question or not. Often, in the flurry of devising questions whose answers would be interesting to know, they lose track of the decisions the survey was supposed to inform. As a result, the survey gets longer and less useful. Because it is less useful, the resulting recommendations are irrelevant and unlikely to get the buy-in needed for implementation. Respondents feel less “heard” and become less likely to answer future surveys.

Make every question count. Review each proposed survey question and identify what information the answer will provide and how it will answer the research question or inform decisions that the organization plans to make. Make sure that every research question has survey questions that address it adequately. Finally, although survey techniques are useful, they cannot answer all our questions and we may need to draw on other sources of information (e.g., cost analysis, outcome measures) to inform our decisions.
