Recently Filed in Questionnaires

I recently completed a detailed customer survey for a software program I subscribe to. It was a conjoint analysis, which presents different feature profiles and price points to the respondent, asking them to pick the one they prefer.

What they'll know from my responses? I'm cheap.

What they won’t know? Why.

One of the best ways to shorten a complex survey is to use skips, also sometimes called branching. This makes the survey a more relevant, lower-effort experience for respondents—both factors that can improve your completion rate.
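To make the mechanics concrete, here is a minimal sketch of skip logic in Python; the question IDs and answer values are hypothetical, not taken from any particular survey tool:

    # A minimal sketch of skip (branching) logic. Question IDs and
    # answer values are hypothetical.
    def next_question(current_id, answer):
        """Return the ID of the next question to display."""
        if current_id == "Q1_owns_car":
            # Non-owners skip the car-maintenance block entirely.
            return "Q2_maintenance" if answer == "yes" else "Q8_demographics"
        if current_id == "Q2_maintenance":
            return "Q3_service_frequency"
        return "END"

The payoff: a respondent without a car never sees the maintenance questions, which is exactly the relevance and effort savings described above.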

I've been helping a client develop an assessment which they will deploy across many companies. It's relatively easy to run a survey today for one particular firm, but when you think ahead to how you'll want to slice the data two years from now, it gets a bit more interesting.

Usually when I'm asked about random or shuffled questions and answer choices, it's because someone came across the feature touted as The Right Way to do research, or spotted it in a software feature list. Rarely does it come up because of a genuine problem with order bias. My general advice is to first look at all the other tweaks you can make to your survey, from sampling to usability to scale phrasing to analysis—then decide whether randomization makes your survey's must-have list.

For when it is an issue—or anyone thinking “Why not use it if I have it?”—here we go...

As much as I'm a general advocate for respondents, I know it can be a challenge when they're an amorphous group and your manager or client is an immediate voice. But, there are two very pragmatic reasons for putting yourself in the respondent's shoes long enough to make sure your survey will be a good fit.

Because we're not binary beings, we have text domain names such as:
   practicalsurveys.com
to reach Websites, but what really directs traffic around the Internet are the corresponding numeric identities, IP (Internet Protocol) addresses:
   209.197.67.60
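If you're curious, you can watch this mapping happen yourself with a few lines of Python (standard library only):

    # Resolve a domain name to its numeric IP address.
    import socket

    print(socket.gethostbyname("practicalsurveys.com"))
    # e.g. 209.197.67.60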

In addition to entering IP addresses as destinations, whenever you (or your e-mails) travel the Web you're also leaving little IP address footprints wherever you go. Because of this, survey managers sometimes want to use the respondent's IP address in one of two ways:

  1. To prevent ballot box stuffing by only allowing one response per IP address
  2. As a supplemental source of information

Before you join them, let's look a bit more closely at what the respondent's IP may or may not tell you.
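As context for option 1, the naive filter is only a few lines of code, which is part of its appeal. A sketch (the field names are hypothetical):

    # Naive ballot-box-stuffing filter: keep only the first response
    # seen from each IP address. Field names are hypothetical.
    # Caution: households, offices, and mobile carriers often share
    # a single IP, so this can discard legitimate respondents too.
    def dedupe_by_ip(responses):
        seen = set()
        kept = []
        for response in responses:
            if response["ip_address"] not in seen:
                seen.add(response["ip_address"])
                kept.append(response)
        return kept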

Recently I decided it was.

People generally agree investing in usability for Websites and Web applications is a “good idea” when it comes to retaining visitors and users—or in my specialty, survey respondents. The challenge is assigning a value to that investment because product managers, graphic designers, user interface specialists, technical support managers, trainers, and executives will often have wildly differing opinions.

"A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines. With consistency a great soul has simply nothing to do."

Apart from the common misquote which drops "foolish," most people are unaware of how Ralph Waldo Emerson closed that paragraph:

"To be great is to be misunderstood."

Cover image: Questions and Answers in Attitude Surveys by Howard Schuman & Stanley Presser

Despite being in a quantitative industry, we surveyors rarely run tests to measure what happens when we rearrange questions, add a neutral point to a scale, or make other adjustments. If you're ready to absorb some more advanced issues, complete with footnotes, this is a great book to pick up. Note that the 1996 copyright is simply a reprint of the 1981 text.


Recently I completed a telephone survey, and in the course of the 22-minute conversation (estimated at 12-15 minutes) I was asked:

Would you recommend a friend or family member attend University of California Davis?

Recommend for what? I'm aware of the school's solid reputation in engineering and veterinary medicine, but have no notion where their other programs rank.

Recommend for whom? I'd have to think of a specific individual to judge whether the programs, lifestyle, location, and tuition (in-state resident vs. full non-resident) would be a fit.

Most everyone wants to measure (and improve) customer satisfaction, but how?

First and foremost, if your organization is new to surveys and doing this in-house, start simple! The goal of any survey is better information for decision-making, and a modest quantity of information that you actually use is far more valuable than a complex picture that may be flawed or too troublesome to maintain.

A client working on a project for a non-profit recently sent me two questionnaires:

Version 1
Written by my client, and primarily driven by their contact, the CEO. The survey focused on evaluating the organization as a whole, though a significant emphasis was on communication and fundraising.
Version 2
Written by one of the non-profit's board members. The questionnaire was far more granular, focusing on specific programs offered.

I recently had lunch with a fellow consultant who focuses on employee surveys. Contrary to what you might expect in the grand scheme of employer/staff relations, he spends much of his time getting executives to pay attention to "small" problems. In one case, it was a departmental laser printer long overdue for replacement—very like the machine which met a violent end in Office Space.

"Piping" at its most generic refers to moving information in, out, and around a survey. This makes it a very useful technology for any dynamic survey (Web, telephone, kiosk), potentially enriching the respondent experience or increasing data quality.

It's also one of the many survey terms that can mean different things to different people, so before you assume your survey tool does what you want, make sure you're using the same definitions as your vendor. This article covers four main approaches to piping.
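To make the idea concrete before the definitions, the simplest flavor pipes an earlier answer into later question text. A sketch (the {placeholder} notation is illustrative; every platform has its own syntax):

    # Simplest piping: substitute an earlier answer into question text.
    answers = {"brand": "Acme"}
    template = "How satisfied are you with your {brand} purchase?"
    print(template.format(**answers))
    # -> How satisfied are you with your Acme purchase?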

Answer every one you can.

When I spot a "Take our survey" link on a Web site or feedback URL on a store receipt, I generally answer the questionnaire. Sometimes it's a smooth, well orchestrated experience. Occasionally I find myself wondering "What were they thinking?"

Just as an editor reading a manuscript watches for grammar, clarity, flow, and other issues, there are a number of aspects you'll want to watch for when taking a survey. While some of the issues I list here are an abstract analysis of the survey, the most important thing to pay attention to is your personal reactions to the questionnaire. After all, you are a respondent, and what you feel while completing the survey will be happening for other people as well. When you get started you'll only notice a few issues with each questionnaire, but over time the evaluation will become second nature—just like an advertising executive dissecting Super Bowl commercials.

When we write surveys, we often face a choice with dollar and frequency questions. We can ask for a precise number, such as:

How many times have you visited any of our stores in the past twelve months prior to your most recent visit? (Please enter a number from 0-365)
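One argument for capturing the precise number is that you can always collapse it into ranges at analysis time, while the reverse is impossible. A sketch (the bucket boundaries are illustrative):

    # Collapse precise visit counts into reporting ranges afterward.
    def visit_bucket(visits):
        if visits == 0:
            return "0"
        if visits <= 3:
            return "1-3"
        if visits <= 12:
            return "4-12"
        return "13+"

    print(visit_bucket(7))  # -> 4-12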

A common design for surveys is to make them like a PowerPoint slide show, with one question on each page. In practice, this makes the survey longer for respondents, which is a good way to irritate them and increase your abandonment rate. For illustration, let's consider a modest size survey:

  • Five individual questions
  • Two five-question grids
  • Five demographics
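A quick tally shows the cost of the one-question-per-page template, assuming each grid row and each demographic also gets its own page:

    # Page counts for the survey above, under two layouts.
    one_per_page = 5 + (2 * 5) + 5   # every item on its own page
    grouped      = 5 + 2 + 5         # each grid kept on one page
    print(one_per_page, grouped)     # -> 20 12

Twenty page loads instead of twelve (or fewer still, if related questions share pages) is a lot of extra clicking for the same answers.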

I love my new little laptop, so when the manufacturer popped up a survey invitation today (time-delayed 3 months from purchase) I was happy to provide feedback. The survey was primarily about how I'd been using the system, which should have been easy questions to answer. However, as I went through the survey, many of the questions weren't easy for me, because 9 of the 19 questions were missing my preferred answer.

Two representative questions are:

Which best describes your game-playing habits?
     Playing online PC games
     I do not play games

Which of the following music-related activities have you performed on this PC? (Select all that apply)
     Transferred music to a portable player device
     Purchased and downloaded music from an online Service
     Streaming music through the Internet (Internet Radio)
     Created a music CD
     None of the above

One of the recurring debates in the research community is about using "even" or "odd" scales. This refers to scales such as:

   Strongly Disagree   Disagree   Neither Agree nor Disagree   Agree   Strongly Agree

where the labels go from one extreme to the other (this Likert scale is a common label set). An odd number of scale points has a neutral point in the middle, while an even scale requires the respondent to go to one side or the other.
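In code terms, the distinction is simply whether the label list has a middle element. A sketch using the label set above:

    # Odd scale: has a true midpoint. Even scale: forces a lean.
    odd_scale = ["Strongly Disagree", "Disagree",
                 "Neither Agree nor Disagree", "Agree", "Strongly Agree"]
    even_scale = ["Strongly Disagree", "Disagree",
                  "Agree", "Strongly Agree"]

    def has_midpoint(scale):
        return len(scale) % 2 == 1

    print(has_midpoint(odd_scale), has_midpoint(even_scale))  # True False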

Survey designers of all experience levels make mistakes, and even when we have multiple proofreaders it's amazing what can be missed. I've seen errors ranging from typos to—more than once—overlooking an entire division of employees. These things happen.

This article is about one error that I hate to see, because it has a disastrous effect on data yet is fairly easy to catch: coding a single-answer scale as multiple-answer, or vice versa.
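One cheap way to catch it is a sanity check on a test export: a single-answer question should never store more than one selection per respondent. A sketch (the column layout is hypothetical; adapt it to your tool's export format):

    # Flag respondents with multiple selections on a single-answer item.
    def check_single_answer(rows, question_cols):
        bad = []
        for i, row in enumerate(rows):
            picked = sum(1 for col in question_cols if row.get(col))
            if picked > 1:
                bad.append(i)
        return bad  # row indexes that violate single-answer coding

    rows = [{"q5_yes": 1, "q5_no": 1}, {"q5_yes": 1}]
    print(check_single_answer(rows, ["q5_yes", "q5_no"]))  # -> [0]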

One of the strengths of Web, CATI and CAPI surveys is their adaptability. We can create a questionnaire which tailors itself in obvious and hidden ways to a respondent's answers. While in a sense the respondent is "driving" this process, they're doing so after a researcher carefully lays out the cones, sets up hay bales on dangerous curves, and straps the respondent into their go-cart with a safety harness and helmet.

I recently completed a survey where they forgot some of the safety precautions. It was an interesting experience, weaving around the debris from prior respondents while simultaneously dropping cones for my own course.

I'm not talking about current events polls whose results too often depend on which online community pounces first. I'm talking about real, useful feedback achieved with just one question.

While we always like to study best practices, sometimes it's the failures which are most illuminating—in this case an unusual password design. The plan was hatched by a client's client before we came in on the rush project, so I can only report on the results and infer motivations.

Need a Hand?

A little help can add a lot of polish—or just save hours and headaches:

(206) 399-2344
info@querygroup.com

I used one of the tips from class in working with data for [my] seminar and it saved me four hours! You were not a good instructor. You were great.

Steve Bottfeld
Executive Vice President
Marketing Solutions