Recently Filed in Management


Are your product development surveys all about new bells and whistles? That’s critical information—and fun—but it’s never the whole picture.

On the less fun side is asking what irritates your customers. This is also more expensive to research, both because collecting the information is more involved (verbatims, interviews, forum mining), and because digging through both measured and SHOUTED complaints about your products is exhausting.

Anyone else feel like their head is spinning this election? Here's some info and resources to evaluate all the headlines.

A couple months ago, I saw a presentation about agile research by Zach Simmons of Discuss.IO. He comes at it from the qualitative side, with a great platform for on-demand remote interviews, but it reminded me of some survey ideas I’d been mulling.

Most market research mirrors classic product development (the waterfall model), with a steady progression through needs analysis, research & development, and delivery. It assumes big releases which stay stable for an extended period, which is a good fit for physical projects like manufacturing and construction.

Agile development works in smaller chunks, with products continuously evolving. The idea is to release a good start and keep improving it, rather than making one giant push to create the be-all, end-all specification. You may have noticed many of your smartphone apps update frequently, sometimes as often as every two weeks—a reflection of agile methodology.

I've been helping a client develop an assessment which they will deploy across many companies. It's relatively easy to run a survey today for one particular firm, but when you want to slice the data you'll have accumulated two years from now, it gets a bit more interesting.
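To make the slicing problem concrete, here's a minimal sketch of how each answer might be stored so it can be compared across companies and over time. The field names (company_id, wave, metric_code, and so on) are my own invention for illustration, not from the client project; the point is that every answer carries the context needed for cross-tabs you haven't thought of yet.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AssessmentResponse:
    """One answer, tagged with everything needed for later slicing."""
    company_id: str      # which firm the respondent belongs to
    wave: str            # administration period, e.g. "2027-H1"
    respondent_id: str   # anonymized, stable within a wave
    metric_code: str     # stable code for the question, e.g. "COMM-03"
    metric_version: int  # bumped when wording changes, so trends stay honest
    value: int           # the scale answer itself
    collected_on: date

def slice_by_metric(responses, metric_code):
    """Two years from now, a cross-company cut is just a filter-and-group."""
    by_company = {}
    for r in responses:
        if r.metric_code == metric_code:
            by_company.setdefault(r.company_id, []).append(r.value)
    return by_company
```

The metric_version field is the one most people skip, and it ties directly into the next point: metrics age.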

You study, you brainstorm, you have endless meetings to find the best metrics—and then spend years accumulating, trending, and applying data. But while metrics can be long-lived, they're unlikely to be immortal.

If you're lucky, something dramatic happens to highlight the need for a new measure—such as when Lufthansa developed thinner seats which provided more usable space in less “seat pitch,” throwing a wrench in one of the most common cabin metrics.

But more often, it’s simply an accumulation of technical, marketplace, and fashion shifts which might have you coming up with a slightly different set of metrics—if you thought about it today.

So how long has it been since yours had a check-up?

As much as I'm a general advocate for respondents, I know it can be a challenge when they're an amorphous group and your manager or client is an immediate voice. But there are two very pragmatic reasons for putting yourself in the respondent's shoes long enough to make sure your survey will be a good fit.

Recently I decided it was.

People generally agree investing in usability for Websites and Web applications is a “good idea” when it comes to retaining visitors and users—or in my specialty, survey respondents. The challenge is assigning a value to that investment because product managers, graphic designers, user interface specialists, technical support managers, trainers, and executives will often have wildly differing opinions.

There are three possibilities when you have a theory (or better yet, your boss or client has a theory) about survey results and start reviewing it in the data:

  • You were right! All is well in the universe, the sun continues to shine.
    They like your feature best, frequent buyers have higher satisfaction levels, and last year’s hybrid matrix re-org was the best thing since sliced bread.

Book by Joel Best

While a fascinating read for all of us, this is most applicable if you're combining secondary research with your surveys. You'll never look at "facts" the same way again.


Book by Harry Beckwith

When you want to broaden your perspective, this will help you understand how customer and employee satisfaction mixes with and reinforces other marketing efforts. While the author focuses on services, it's useful in any industry—we're all competing on intangibles these days.


Most everyone wants to measure (and improve) customer satisfaction, but how?

First and foremost, if your organization is new to surveys and doing this in-house, start simple! The goal of any survey is better information for decision-making, and a modest quantity of information that you actually use is far more valuable than a complex picture that may be flawed or too troublesome to maintain.

A client working on a project for a non-profit recently sent me two questionnaires:

Version 1
Written by my client, and primarily driven by their contact, the CEO. The survey focused on evaluating the organization as a whole, though a significant emphasis was on communication and fundraising.
Version 2
Written by one of the non-profit's board members. The questionnaire was far more granular, focusing on specific programs offered.

One of the great things about the Web is that almost any functionality is possible—it's just a small matter of programming (and budget and time and compromises). Sometimes you can imagine a widget which will make your respondent's or visitor's experience smoother or richer. Sometimes it's a function which will make your site easier to manage.

For Web surveys, there's a huge range of tools and services, so someone may already offer your dream feature. However, there are times when you just need something custom.

I recently had lunch with a fellow consultant who focuses on employee surveys. Contrary to what you might expect in the grand scheme of employer/staff relations, he spends much of his time getting executives to pay attention to "small" problems. In one case, it was a departmental laser printer long overdue for replacement—very like the machine which met a violent end in Office Space.

Just because you have a Web server doesn't mean it's the best choice for running your Web surveys. While your own box has the advantage of full control, it also has some drawbacks. This doesn't mean you have to go with an Application Service Provider (ASP) survey vendor for your projects. Many Web survey software vendors also provide "self-service" hosting for their software users.

Answer every one you can.

When I spot a "Take our survey" link on a Web site or a feedback URL on a store receipt, I generally answer the questionnaire. Sometimes it's a smooth, well-orchestrated experience. Occasionally I find myself wondering "What were they thinking?"

Just as an editor reading a manuscript watches for grammar, clarity, flow, and other issues, there are a number of aspects you'll want to watch for when taking a survey. While some of the issues I list here are an abstract analysis of the survey, the most important thing to pay attention to is your personal reactions to the questionnaire. After all, you are a respondent, and what you feel while completing the survey will be happening for other people as well. When you get started you'll only notice a few issues with each questionnaire, but over time the evaluation will become second nature—just like an advertising executive dissecting Super Bowl commercials.

It's become accepted knowledge that people talk more about bad service experiences than good ones. That extra energy people put into spreading the word was bad enough when they talked to friends and family. Now that energy is going elsewhere: seriously bad experiences are no longer just someone complaining; they're news and entertainment, thanks to technologies like blogging and YouTube. In other words, they're mass-media bad PR.

In addition to my work on surveys, I'm also Chapter Leader of DigitalEve Seattle (http://digitaleveseattle.org/), an association which supports women in technology.

We have an e-mail discussion list, and it's been quite lively the past few days with talk about gender assumptions in the workplace. The thread started when a member asked how to deal with a job applicant who became defensive when she suggested "Gentlemen" may not be the best salutation in this day and age.

On a similar note, I have a recommendation for anyone conducting telephone or in-person interviews. Train your interviewers to address women as "Ms." instead of "Mrs." unless your list explicitly identifies the marital status of the person being called. You may also spot other places in your scripts or through monitoring where assumptions are being made about your respondents.

After all, why open on an awkward note if you don't need to?

One of the strengths of Web, CATI and CAPI surveys is their adaptability. We can create a questionnaire which tailors itself in obvious and hidden ways to a respondent's answers. While in a sense the respondent is "driving" this process, they're doing so after a researcher carefully lays out the cones, sets up hay bales on dangerous curves, and straps the respondent into their go-cart with a safety harness and helmet.

I recently completed a survey where they forgot some of the safety precautions. It was an interesting experience, weaving around the debris from prior respondents while simultaneously dropping cones for my own course.
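To make the cone-laying concrete, here's a minimal sketch of what those guardrails look like under the hood. Real survey platforms express this as skip logic or branching rules; the question IDs and routes below are invented for illustration.

```python
# Minimal skip-logic sketch: each rule routes the respondent based on a
# prior answer, so they only see questions that apply to them.
# Question IDs (Q1_used_product, Q2_frequency, ...) are hypothetical.

def next_question(current, answer):
    """Return the next question ID given the current one and its answer."""
    routes = {
        # Never used the product? Skip the usage details entirely.
        ("Q1_used_product", "no"):  "Q5_why_not",
        ("Q1_used_product", "yes"): "Q2_frequency",
        # Light users skip the power-user feature battery.
        ("Q2_frequency", "rarely"): "Q4_overall_sat",
        ("Q2_frequency", "daily"):  "Q3_feature_detail",
    }
    # Default: fall through to the shared closing section.
    return routes.get((current, answer), "Q6_demographics")

# A respondent who answers "no" never sees Q2 through Q4 -- the hay
# bales keep them off curves they couldn't meaningfully navigate.
assert next_question("Q1_used_product", "no") == "Q5_why_not"
```

Every route a researcher forgets is a pothole some respondent will eventually hit.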

In our sound-bite culture we love condensed statistics, magic values which will let us know how many customers will buy again or whether employees are engaged in their work—all at an easily compared glance. While summary information such as means (averages) is very useful, it can also obscure some critical details and trends within your data. When many factors are condensed into a single index value for executive dashboards, even more can be hidden.
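As a quick illustration of how a mean can hide the story (toy numbers of my own, not client data): two products can share an identical average rating while one is merely lukewarm and the other is dangerously polarizing.

```python
from statistics import mean, stdev
from collections import Counter

# Hypothetical 1-to-5 satisfaction ratings for two products.
lukewarm   = [3, 3, 3, 3, 3, 3, 3, 3, 3, 3]
polarizing = [1, 1, 1, 1, 1, 5, 5, 5, 5, 5]

for name, ratings in [("lukewarm", lukewarm), ("polarizing", polarizing)]:
    print(name, "mean:", mean(ratings),
          "stdev:", round(stdev(ratings), 2),
          "counts:", dict(Counter(ratings)))

# Both means are 3.0, but the distributions tell opposite stories:
#   lukewarm   mean: 3 stdev: 0.0  counts: {3: 10}
#   polarizing mean: 3 stdev: 2.11 counts: {1: 5, 5: 5}
```

A dashboard index built only on the means would score these two identically; the frequency breakdown is what flags the product with a real problem.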

I was in my car dealership the other day, and posted behind my service representative was a copy of their satisfaction survey. Naturally I couldn't make out every line from 6 feet away, but their point came across loud and clear. On the survey, the "Outstanding" column had been highlighted in green and "PASS" marked above it. The other four levels of the scale—including one positive answer and the mid-point—had been highlighted in pink and marked "FAIL." Likewise, the Yes/No responses had been helpfully highlighted. And then of course, the form had been laminated for display.

Apart from being such lovely sound bites, numbers have an apparent precision, which is why we often give them more weight than they deserve. And these days, we're all getting hit with one alarming statistic after another, so it's a good time to dissect exactly what those two- or three-digit bites actually represent.

Here are six questions to ask any statistic, whether it's one you're generating via a survey, contemplating over your morning latte, or incorporating in a marketing plan:
