Typical abandonment rates

One of my most popular articles via Web searches is about typical response rates. What many researchers forget to look at is the abandonment or completion rate. If you extend a survey invitation to someone (mail, e-mail, banner ad, phone, dancing monkey, etc.) and they begin the questionnaire, do they finish it?

For each of your surveys, look at the number of people who:

  • Received the invitation
  • Showed interest by clicking through the banner or link to see the survey's purpose, length, sponsor, etc. (for e-mail and paper invitations you won't know this)
  • Began the survey—one presumes with the intention of finishing
  • Completed the survey
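Track those four counts and the key rates fall out directly. Here's a minimal sketch in Python; the function name and the sample numbers are mine for illustration, not from any real survey:

```python
def survey_funnel(invited, clicked, started, completed):
    """Rates at each stage of the survey funnel.

    Pass clicked=None for e-mail or paper invitations, where
    click-throughs can't be observed.
    """
    rates = {"start rate": started / (clicked if clicked else invited)}
    if clicked is not None:
        rates["click-through rate"] = clicked / invited
    rates["completion rate"] = completed / started
    rates["abandonment rate"] = 1 - completed / started
    rates["overall response rate"] = completed / invited
    return rates

# Hypothetical counts, for illustration only
for name, rate in survey_funnel(5000, 900, 600, 420).items():
    print(f"{name}: {rate:.1%}")
```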

The number of people who begin your survey should be pretty close to the number who finish it, with the few abandoned sessions due to random interruptions on the respondents' end. However, whenever I teach workshops, I ask participants who has abandoned a survey because of its length, and every single hand goes up.

Yesterday I completed a survey and am embarrassed to admit I lied on one question—the only way I could get to the end and finish the screen captures for my example file. Well, technically I could have answered the question, but I was out of patience and it was easier to click "I have not attended an event in the past 2 years" (logically inconsistent with my earlier answers) than to read the list of 21 events, try to remember which I'd attended, and rank my top three.

Why was I out of patience? This survey managed to hit all the common problems:

  • The survey invite claimed it "will take approximately 10 minutes to complete." While I can never time myself since I'm capturing screens, that's unrealistic for 88 questions on 41 pages; a good estimate is 2.5-3.5 questions per minute (the arithmetic is sketched after this list)
  • It had no progress bar
  • It suffered from one-page-per-question syndrome
  • Every response was required, including a comment field (into which I typed "none") and numerous questions where I simply didn't have much of an opinion
  • Difficult questions, such as the event grid where I lied to avoid answering
  • While the survey could be paused and finished later, that information was buried in the e-mail invitation, with no instructions in the survey itself and a counter-intuitive design (you had to close the browser, then click the invitation link again to resume)
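A quick sanity check on that time claim, using the 2.5-3.5 questions-per-minute rule of thumb and the question count from the survey described above:

```python
# Rough completion-time estimate from question count, using the
# 2.5-3.5 questions-per-minute rule of thumb
questions = 88
fast, slow = questions / 3.5, questions / 2.5
print(f"{questions} questions: roughly {fast:.0f}-{slow:.0f} minutes")
# -> 88 questions: roughly 25-35 minutes, not the promised 10
```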

On your surveys, are people dropping off? For telephone and other interviewer-conducted surveys you can pinpoint where people are leaving, and with Web surveys you should be able to find out the last page each respondent submitted. You may have a specific question that's a problem, but more often it will be an accumulation of smaller inconveniences that cause people to leave at different points.
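With a Web survey export, tallying the last page each respondent submitted makes the drop-off points visible. A rough sketch, assuming a CSV with one row per respondent and a last_page_submitted column (both the file name and the column name are made up for this example):

```python
import csv
from collections import Counter

# Hypothetical export: one row per respondent, with the number of the
# last page they submitted recorded in a "last_page_submitted" column
with open("survey_export.csv", newline="") as f:
    last_pages = [int(row["last_page_submitted"]) for row in csv.DictReader(f)]

drop_offs = Counter(last_pages)
for page in sorted(drop_offs):
    print(f"page {page:2}: {drop_offs[page]} respondents stopped here")

# The count on the final page is mostly completers. A spike at an
# earlier page flags a problem question; a steady trickle across pages
# suggests an accumulation of smaller inconveniences.
```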

Even if you're happy with the total number of responses you're getting, remember surveys are all about hearing from a representative sample of your population. The people who drop off are more likely to be in your middle majority than the passionate ends, so you really want them to finish.

1 Comment


I believe abandonment rates are critical in analysing the form factor of a survey. In your experience, what are average/acceptable abandonment rates for B2B email/web-based surveys?

It would be great to understand abandonment rate by number of questions.
