Want to write better surveys?
Answer every one you can.
When I spot a "Take our survey" link on a Web site or a feedback URL on a store receipt, I generally answer the questionnaire. Sometimes it's a smooth, well-orchestrated experience. Occasionally I find myself wondering, "What were they thinking?"
Just as an editor reading a manuscript watches for grammar, clarity, flow, and other issues, there are a number of aspects to watch for when taking a survey. While some of the items I list here call for abstract analysis of the survey, the most important thing to pay attention to is your personal reaction to the questionnaire. After all, you are a respondent, and whatever you feel while completing the survey is likely being felt by other respondents as well. At first you'll notice only a few issues with each questionnaire, but over time the evaluation becomes second nature, just like an advertising executive dissecting Super Bowl commercials.
Don't submit an empty form or attempt to game the questionnaire; be courteous to your fellow researchers. Likewise, if a friend or colleague forwards you a survey invitation that was clearly meant for a limited audience you're not part of, don't take the survey. Not every survey gets a data-cleaning pass, so even if you put "TEST" in the text fields, your responses may end up in the final report.
Speculate on the purpose
While a survey may be obviously about customer satisfaction, can you find signs of the specific issues they're concerned about or what directions they're considering for changes?
Consider the invitation
How are you invited to take the survey? Who is exposed to the invitation? Does that match your guess at the survey's target population, or is there a misalignment? Are there moments when you ask yourself, "Why do they want me to take this survey?" Are there incentives, and do they make sense?
Watch the time
Do they provide an estimate? How close is it to your pace? Does the survey feel longer than the actual duration? Is there a progress bar or "Page x of y" notice?
Note the friction
Do you reach for a control, such as a Back or Pause button, that isn't there? Are you uncertain exactly what a question is asking? When you go to answer, is the response you want missing? Are there intrusive questions you resist answering?
Spot the skips
Can you see places where the survey is likely branching to detailed questions based on your earlier answers? Are there sections that don't apply to you which the survey skips you past? On very polished questionnaires these are hard to spot, since all the questions simply make sense as you go.
Find the holes
Do you finish a section, or the survey itself, thinking, "I really wanted to be asked ____"? If there is no comment field, do you miss it?
Notice the flow
Do sections transition smoothly and progress in a logical order or is it more discordant? Do you want to back-track to see a prior answer before completing the current question?
Evaluate error messages
How do you feel when one appears? Is it a technical issue that could be handled more gracefully? Is it for a required question that doesn't seem necessary?
Gauge your irritation
Are you considering abandoning the survey? Is it because of one particular problem or an accumulation of small issues?
Scope out the host or software
If the technology is particularly good or bad, look for a notice in the survey's footer naming who's hosting it. Often "good" is the absence of anything exceptional: it all just works exactly how you'd like. If there's no name in the footer or on the thank-you page, try looking for a comment in the page's HTML source or going to the root of the URL.
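That HTML-source check can be partly automated. Here's a minimal Python sketch that pulls out HTML comments and "generator" meta tags from a page's source, since vendors often leave their name in one or the other; the vendor name and page fragment below are hypothetical, and real pages may hide the clue elsewhere:

```python
import re

def find_vendor_hints(html):
    """Collect clues about the survey platform from raw page HTML.

    Returns the text of HTML comments plus any <meta name="generator">
    content, which often name the hosting software even when the
    visible page doesn't.
    """
    comments = re.findall(r"<!--(.*?)-->", html, flags=re.DOTALL)
    generators = re.findall(
        r'<meta\s+name=["\']generator["\']\s+content=["\'](.*?)["\']',
        html,
        flags=re.IGNORECASE,
    )
    return [c.strip() for c in comments] + generators

# Hypothetical thank-you page fragment with a vendor comment:
sample = '<html><!-- Powered by ExampleSurveyTool v2 --><body>Thanks!</body></html>'
print(find_vendor_hints(sample))
```

Paired with a fetch of the survey or thank-you page, this saves digging through the source by hand; when it returns nothing, fall back to visiting the root of the survey's URL.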