Posts tagged “survey”

Badly written survey invite

To: [steve]
Subject: Give Your Feedback to Sundance Channel

Dear Sundance Channel viewer,

We’d like to hear about your opinions and experiences! Sundance Channel is conducting an online survey to collect information about your opinions and attitudes regarding Sundance Channel and its various programs. You must be 21 or older to participate and due to video content within the survey, the link below can only be accessed using Internet Explorer, so we encourage you to use a PC. We apologize in advance for any inconvenience this may cause.

IE =/= PC, however. PC users can use other browsers (as in my case), and Mac users can run IE. But the survey doesn’t (as they say) work with Firefox. Ah well.

AT&T Email Support Survey

Here’s an interesting way to ensure the feedback from customers makes you look good: ask the right questions! After a frustrating experience with AT&T (short version: I switched to automatic bill payment, where they simply pull the funds from your account instead of having you actively make a payment. But when you switch to that service, it takes some time to kick in, so your next bill won’t get paid. They don’t tell you that – in fact, the website indicates that your next bill will be paid automatically – and meanwhile they remove all the one-click “make a payment” functions from your online account, so you’re in limbo, needing to write a check or something once they start nagging you about the missing payment you thought you’d already made), they sent a customer satisfaction survey (“AT&T Email Support Survey”) that asked me only to rate the service I received against my expectations. It was the familiar Likert scale survey, where the rankings were


  1. Much Better than Expected
  2. Better than Expected
  3. Just as Expected
  4. Worse than Expected
  5. Much Worse than Expected

Nicely done! Who expects good support from a phone company? Not me. But “just as expected” sounds more contented than pessimistic. They could deliver consistently crappy service, but as long as they are within their brand perception of crappy service, everything is A-OK.

user centered design – sort of – in NYC subways

full story

Mr. Malave was one of dozens of curious riders who attended an ‘open house’ sponsored yesterday afternoon by New York City Transit to show off and receive feedback on a five-car test train, a prototype of the R160, the newest generation of subway cars.

Riders yesterday, told to focus on the FIND panel, were asked questions like, ‘Do you feel reassured that the train is going to your station?’ and ‘How easy or hard is it to read the words and letters on the sign?’

But riders seemed to be paying less attention to the sign than the rest of the car. Some of them said they did not regularly take the Nos. 2, 4, 5 and 6 lines (which use R142 cars, similar in design to the R143) or the L line and so were not familiar with the latest design.

Asked to compare the new car with the F train that she normally rides, María Romero, 72, a retired nurse’s aide from Gravesend, Brooklyn, said, ‘This is three times more advanced!’ Jared M. Skolnick, 34, an Internet marketer from the Upper West Side, said he admired the bright fluorescent lights, since he often took photographs in the subway.

James V. Sears, the agency’s senior director of marketing research, said the results of the surveys – along with comments from focus groups convened in 2003 – could be incorporated into the final design of the FIND panel.

Right. Because in order to understand the reassurance of a design feature, you simply ask people if they find a certain feature to be reassuring? Sigh!

Anonymous Responses Are Useless

From Garrick Van Buren’s Work Better Weblog

One of my current projects has a major survey component. The survey ends with:

If you’d be open to follow up questions, enter your email address below.

There’s about a 60/40 split between responses with emails and those without. The responses without email addresses have skipped questions and irrelevant answers, and are generally unusable. This is so much the case that I’ve found it a better use of time to check for an email address first – then read the response.
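As a toy illustration (not from Garrick’s post – the field names and data are hypothetical), the triage he describes could be expressed as a simple split on whether a response includes an email address:

```python
# Hypothetical sketch: each survey response is a dict with an optional
# "email" field. Per the heuristic above, responses with an email address
# get read first, since those tend to be the usable ones.
def triage(responses):
    """Split responses into (with_email, without_email), preserving order."""
    with_email = [r for r in responses if r.get("email")]
    without_email = [r for r in responses if not r.get("email")]
    return with_email, without_email

# Made-up example data illustrating the 60/40-ish split described above.
responses = [
    {"email": "a@example.com", "answer": "Detailed, relevant feedback."},
    {"email": "", "answer": "asdf"},
    {"answer": "(question skipped)"},
]

usable_first, read_later = triage(responses)
```

This just mechanizes the reading order; it doesn’t change the underlying observation that accountability and answer quality seem to travel together.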

It’s interesting that people comfortable with being contacted give useful answers, while those providing non-useful responses don’t provide a way to be contacted.

Conventional wisdom on requiring accountability has it backward. Accountable people want to be responsible for their actions. Those that aren’t don’t. Forcing it doesn’t change anything.

This is purely anecdotal of course, and may not generalize to other surveys about other topics, but nonetheless seemed an interesting data point.

Intuit Customer Survey

I’m doing an online chat with a customer support agent at Intuit about a problem with getting a refund for buggy software. At the end of the session, I get this

550 Ernie : You will be asked to complete a survey after this chat. One of the questions asks if I have completely resolved your issue today. Can we agree that the solution I’m providing will accomplish this for you?

In fact, all they’ve done is have me tell them AGAIN about my problem (after all the software problems, they agreed to issue a refund, but then didn’t, so I followed up by email, and they told me to call or chat, and I had to go through the story again; now they’ve opened a case with a case number, and presumably in 8 weeks I should have my refund. Who knows?).

550 Ernie : I am very sorry to interrupt you. I am awaiting your response, Steve.

Obviously they need to game the system and try some social engineering to get me to agree to fill out the survey properly. I’m sure there’s documented evidence that if I agree to say something there’s a higher likelihood I’ll grade them higher. From some customer research into this sort of metric, I realized that the score is more important than the actual problem solving. As long as numbers can demonstrate adequate performance, people keep their jobs. I don’t mean Ernie, I mean someone who manages 1000 Ernies.

Check-out, opt-out, crap-out


You’ll probably need to click on this picture to make it large enough to read it. It’s a detail of the invoice from my recent stay at a Hilton. As usual, they encourage the rapid check-out where you leave the keys in the room, take this document with you, and don’t even bother to stop at the front desk.
In this case, however, they’ve added a “violator” – a gold sticker with a bunch of extra info. Looks like they are planning to send out mail surveys, and it’s opt-out, not opt-in. To opt-out, I’d have to stop by the front desk on my way out, exactly what the Zip-Out Check-Out (R) is designed to avoid.
I did not bother, and I guess maybe I’ll actually complete the survey, since that will be my chance to tell them i) how crappy the room was (the desk lamp was broken – I mean badly broken, with the bulb assembly bent over at 90 degrees – and the power plug didn’t work),
ii) how crappy the food was (my chicken wrap was made with chicken that was grilled, then frozen, then thawed to assemble the sandwich – partially thawed – nothing like chicken icicles in your dinner), and
iii) how crappy the service was (what kind of business hotel – and this place was in an office park; business accommodation is the only reason it exists – doesn’t offer a breakfast-room-service-hang-tag deal where you can order your breakfast before you go to bed and have it arrive at the time you specify?).

As far as i) goes, I guess I get some lame points myself for not telling them about it, so the next visitor will make the same discovery. When you arrive at 9:15 pm and you have to eat and get work done, you don’t want to be dealing with repair workers in your room or the frustration of the whole repair-request process. Clearly they don’t check for stuff that is broken that badly, and they (ineffectively) rely on the guests to take care of the notification.
