Seen along highway 92 in Half Moon Bay. Subsequent screens indicate it’s an online survey about commuting. Still, really?
Earlier this week I stayed at a Marriott hotel. When I checked out, they were unable to get me a bill. My room service from 2 hours earlier was not in the computer. The clerk tried to raise someone on the walkie-talkie but it was to no avail. They offered to email it to me, but 36 hours later as I prepared to submit my travel invoice to my client, I still didn’t have the bill. I explored the website, dealt with several different types of support, and it still took another 12 hours to get the bill!
Today comes the inevitable customer-satisfaction survey. With the audacious subject line Help us innovate your experience at Marriott hotels.
Besides the horribly ugly phrasing (“innovate your experience”?), who do they think they’re kidding here?
Someone has hypothesized that by escalating the language of the invite they can increase their response rate, but outright lying is really not the way to start the dialog.
Customer satisfaction surveys are not a way to innovate. Sure, it’s possible that this type of tool could uncover unmet needs, but those are going to be the needs that they already know about, right? Honestly, when have you ever taken a corporate customer satisfaction survey that has done anything but treat you like an idiot? This sort of tool is only used for ass-covering, at best, and at worst for one group to preempt any negative feedback that might go to another group that oversees or funds them.
The word innovation has become a meaningless catch-all for any sort of improvement and here Marriott stoops even lower, using it as a proxy for any sort of customer interaction, despite the low likelihood of any change or improvement resulting.
Today, I am proud to carry on the lively tradition of eviscerating… I mean, learning valuable lessons from other folks’ attempts at research. I will be examining a Braun-fielded poll that appeared on my Facebook page. (Recent notable additions to the oeuvre include Jared Spool’s 19 Lessons from United Airlines on How To Build a Crappy Survey and Steve’s imagined reaction to a Netflix survey, Effective Concept Testing: Getting the Answers You Want to Hear!)
OK, here it is:
I have a few questions.
1) Who? The pollsters don’t seem to care that I am neither a fan nor a consumer of Braun shavers specifically, or of electric shavers in general. They’ve got the audience all wrong. To respond would merely be to sabotage their data set in response to the absurdity. Which of course I wouldn’t dream of doing!
2) What is the purpose of this (Part I)? What is the marketing or social media team going to do with this information? What is the business question behind this? Knowing, as they must, that the data will be terribly corrupted, they can’t possibly believe that they’re actually getting useful information. So maybe it’s just one of these crazy social media ploys that appears to be important research but is really a bit of marketing designed at the level of a made-you-look joke?
3) What is the purpose of this (Part II)? If they just want me to look, then what did they want me to think upon having seen this survey/ad/poll? Is it supposed to intrigue me into thinking, “GOSH now I do wonder how new and different Braun shavers actually are! Let me look into that and then get back to them on this relevant question.” (If so, where’s the link?) Or is it, “Wow – Braun makes electric shavers!” Or is it merely an unconscious, Pavlovian Braaaaauuuuunn they’re trying to get? “Oh yeah, Braun is a brand. I need a shave.” Or do they really just want me to answer their ridiculous question?
4) Can it make sense, please? Don’t ask me to compare Braun, a brand responsible for a wide variety of consumer products, to the more specific but still questionable category of “other electric shavers.” I can’t compare things that are not comparable.
5) “New and Different?” Really? New and different are not necessarily positives, especially as those attributes relate to whirring blades that come into contact with your body parts. Is this the most important consumer response that the marketing team is really hoping to understand, if, in fact, they are hoping to understand anything at all?
Even though I’ve no doubt that this is an insignificant throw-away in the overall universe of Braun marketing, it definitely made an impression. If you’re going to bother to ask people questions, know who you’re asking and make it seem, even just a little bit, like the whole exercise matters to you.
We’ve learned quite a bit from other people’s surveys many times before:
Bad Survey Design. Please Stop!
Son of Survey
Son of Survey Madness
Thank You For Voting
The Space Between Yes and No
Does Calling it a Report Card Make it Not a Survey?
I was quite amused to see two topics near to my heart appear on The Simpsons last weekend. In this episode, the Simpsons travel with Ned Flanders and other Springfieldians to Israel. Ned gets very fed up with Homer and explodes: “You come all the way to Jerusalem, the happiest place on earth, and all the photos in your camera are of funny soda pops!” Yes! My Museum of Foreign Groceries (including Israeli beverages)! Here are Homer’s pictures:
The episode also hits on another favorite topic – bad surveys – when Marge is asked to evaluate her tour guide:
We were intrigued to see that Netflix is soliciting customer feedback about a new product concept. It’s great to see them incorporating users into the development process, but we figure if they are going to be asking these sorts of questions, they might want to take the next logical step. Check out our re-enactment:
After embarking on a customer research process (see Focus grouping the future), Devo (yes, the band) is now running a color survey. Surveys? What’s not to love! While we encourage you to check it out (if for no other reason than the satisfying UI, one of the best we’ve ever seen in an online survey), we’ve picked a few choice questions as a teaser. My Devo color is red. What’s yours?
Also see some fave survey posts from the past
1. I see you reading.
2. I remember what page you’re on in the book.
3. I head to the bookstore, and make a note of the text.
4. I let my imagination rip.
5. Readers become celebrities.
6. People get giddy and buy more books.
Why do you do this?
Readers are cool. Authors work hard. Publishers take chances. And you all deserve to be seen!
(Thanks Suzanne Long!)
You will find us near major subway stations on the first Tuesday of each month. The idea is that once someone is finished with a book, they either drop it off in one of our conveniently located drop boxes or return it to us at a station. Unlike a library, there will be no due dates, penalties, fees or registrations. We only ask that you return it once you are done so that the same book can be enjoyed by another commuter.
There are those whose commutes are carefully timed to the length of a Talk of the Town section of The New Yorker, those who methodically page their way through the classics, and those who always carry a second trash novel in case they unexpectedly make it to the end of the first on a glacial F train.
(thanks Avi and Anne)
While there are no reliable statistics on summer reading programs, a recent survey of more than 100 programs by a student researcher at Gustavus Adolphus College in St. Peter, Minn., found that most had started in the last four years, although a few go back decades.
The range of books colleges use is enormous, covering fiction and nonfiction. Classics are largely absent, with most of the works chosen falling closer to Oprah than academic.
Still, a certain canon of summer reading is emerging: books that are readable, short, engaging, cheap. Often, it helps if the book is a best seller dealing with some aspect of diversity, some multicultural encounter — and if the author will come to speak on campus.
Our main objectives are to determine why and how people come together to share reading through a comparative study of selected mass reading events.
The mass reading event is a new, proliferating literary phenomenon. Events typically focus on a work of literary fiction and employ the mass media as a means of promoting participation in the themed activities and discussions that take place around the selected book. Beyond the Book uses research methodologies drawn from both the humanities and social sciences to investigate whether mass reading events attract new readers and marginalized communities. We also wish to determine whether this contemporary version of shared reading fosters new reading practices and even whether it is capable of initiating social change.
The Big Read gives communities the opportunity to come together to read, discuss, and celebrate one of 30 selections from American and world literature. This initiative supports innovative reading programs in selected communities, providing engaging educational resources for discussing outstanding literature and conducting expansive outreach and publicity campaigns, and a Web site offering comprehensive information about the authors and their works.
"Smart people, major players that are sophisticated in the ways of publishing, are still at loggerheads," said Ted Weinstein, a San Francisco literary agent. He said they're not just arguing whether the deal is good or bad, "but still expressing disagreement about what exactly it will do. That's a problem."
We’ve posted any number of survey design critiques over the years, and here’s the latest, a close read of a question and the cues associated with different responses.
In response to the prompt How closely do you agree or disagree with this statement: “We saw business strengthening in the Spring, but it seems to be stagnant or falling off again. We thought we had seen the bottom, but now we are not sure.” we’re asked to move a slider between Agree Completely and Disagree Completely.
As we move the slider, the expression on the little green character changes, supposedly to provide an additional cue to ensure that our response is accurate.
But when we agree (a positive emotion), the guy is frowning. Because we are agreeing with a negative, in which case we are making a negative observation? So we feel negative? But the green dude isn’t mapping our feeling about the situation; he’s mapped to our response – our degree of agreement. We can feel positive about agreeing, even if the thing we are agreeing about is negative (haven’t you ever exclaimed enthusiastically with someone who expresses a similar frustration to yours? That’s being positive about a negative). The mapping here is wrong.
It’s further complicated by the indirectness of the prompt – the situation you are agreeing or disagreeing with describes things going from positive to uncertain. How much do you agree or disagree with: something was positive but now it’s negative? In fact, besides being indirect and somewhat abstract, it’s also a compound question. You might agree that things were positive, or you might not. You might agree that things have gone downhill, or you might not. The question is asking you to agree ONLY to the case where i) things were positive and ii) things have gone downhill. If you don’t agree with both of those, then what do you do? And since you can indicate the strength of agreement/disagreement, how will people interpret the question? I would suggest not very reliably!
Ironically, this is a survey aimed at providers of market research services, who should absolutely know better.
In Jimmyjane’s Sex Change Operation I described how Jimmyjane had used design to normalize and shift the meaning of sex and sexuality away from dirty. Although Jimmyjane isn’t mentioned specifically, this article further illuminates this cultural moment:
“What this tells us is we’ve reached a tipping point,” said Debby Herbenick, an author of the studies along with her Indiana University colleague Michael Reece. “Something once regarded as exotic has become commonplace.”
The surveys, conducted in April 2008 and paid for by Church & Dwight, which makes Trojan condoms and a line of vibrators, document vibrator use and the related sexual practices of 2,056 women and 1,047 men; 93 percent of those surveyed said they are heterosexual.
The researchers attribute the widespread use to easier availability and a cultural shift away from the bad ol’ boy, Triple-X-rated sex toy industry. Vibrators are now sold at Wal-Mart, 7-Eleven and CVS; new Internet sites for sex products feature middle-aged models and aim at mainstream couples. Several companies market sex toys to women as young as sorority sisters and as old as postmenopausal golden girls through Tupperware-style home parties.
“You can now buy your toothpaste, shampoo and vibrator at the local convenience store,” Dr. Herbenick said. “They’re not hidden in a dark corner of some adult store.”
This is the first vibrator research based on a sampling reflective of the nation’s demographic mix, so there is no means of authoritatively measuring changing use over time.
(Thanks to CPT!)