Posts tagged “likert”

Son of Survey Madness

We’ve posted any number of survey design critiques over the years, and here’s the latest, a close read of a question and the cues associated with different responses.

In response to the prompt How closely do you agree or disagree with this statement: “We saw business strengthening in the Spring, but it seems to be stagnant or falling off again. We thought we had seen the bottom, but now we are not sure.” we’re asked to move a slider between Agree Completely and Disagree Completely.
[Screenshots: the slider widget, with the little green character shown smiling and frowning]

As we move the slider, the expression on the little green character changes, supposedly to provide an additional cue to ensure that our response is accurate.

But when we agree (a positive emotion), the guy is frowning. Because we are agreeing with a negative, in which case we are making a negative observation? So we feel negative? But the green dude isn't mapping our feeling about the situation; he's mapped to our response – our degree of agreement. We can feel positive about agreeing, even if the thing we are agreeing about is negative (haven't you ever exclaimed enthusiastically at someone who expresses a frustration similar to yours? That's being positive about a negative). The mapping here is wrong.
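To make the two mappings concrete, here's a toy sketch (all names and values hypothetical, not the survey's actual code). The widget's observed behavior looks like it combines your agreement with the statement's sentiment, while a cue for the *response* would track agreement alone:

```python
def widget_face(agreement: float, statement_sentiment: int) -> str:
    """What the widget appears to do: agree with a negative
    statement (sentiment = -1) and the face frowns.
    agreement runs from -1.0 (disagree completely) to +1.0 (agree completely)."""
    combined = agreement * statement_sentiment
    if combined > 0:
        return "smile"
    if combined < 0:
        return "frown"
    return "neutral"

def response_face(agreement: float) -> str:
    """The mapping argued for here: the face cues your response
    (degree of agreement), regardless of the statement's sentiment."""
    if agreement > 0:
        return "smile"
    if agreement < 0:
        return "frown"
    return "neutral"

# Agreeing completely with the (negative) business-is-stagnating statement:
print(widget_face(1.0, -1))   # frown – the behavior observed in the survey
print(response_face(1.0))     # smile – positive about agreeing
```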

It’s further complicated by the indirectness of the prompt – the situation you are agreeing or disagreeing with describes things going from positive to uncertain. How much do you agree or disagree with: something was positive but now it’s negative? Besides being indirect and somewhat abstract, it’s also a compound question. You might agree that things were positive, or you might not. You might agree that things have gone downhill, or you might not. The question asks you to agree only to the case where i) things were positive and ii) things have gone downhill. If you don’t agree with both of those, then what do you do? And since you can indicate the strength of agreement/disagreement, how will people interpret the question? I would suggest not very reliably!

Ironically, this is a survey aimed at providers of market research services, who should absolutely know better.

AT&T Email Support Survey

Here’s an interesting way to ensure the feedback from customers makes you look good: ask the right questions! After a frustrating experience with AT&T (short version: I switched to automatic bill payment, where they just suck the funds out of your account instead of having you actively make a payment…but when you switch over to that service, it takes some time to kick in, so your next bill won’t get paid – they don’t tell you that; in fact the website indicates that your next bill will be paid automatically, and meanwhile they remove all the one-click “make a payment” functions from your online account, so you are in limbo where you need to write a check or something once they start nagging you for the missing payment you thought you’d already made), they sent a customer satisfaction survey (“AT&T Email Support Survey”) that only asked me to rate the service I received against my expectations. It was the familiar Likert-scale survey, where the rankings were:


  1. Much Better than Expected
  2. Better than Expected
  3. Just as Expected
  4. Worse than Expected
  5. Much Worse than Expected

Nicely done! Who expects good support from a phone company? Not me. But “just as expected” sounds more contented than pessimistic. They could deliver consistently crappy service, but as long as they are within their brand perception of crappy service, everything is A-OK.
