I’m at Microsoft Tech-Ed in Berlin, where 7000-odd IT admins and developers (though mostly admins) are looking at Microsoft technology.
I was browsing round the stands in the Technical Learning Centre here when I came to one where the technical documentation team at Microsoft was handing out a survey. Fill in the survey, get a plastic rocket. I looked through the survey, in which you had to rate innumerable aspects of the documentation on Microsoft’s technical resource sites (MSDN, TechNet etc).
I refused to complete it, on the grounds that it would not yield anything of value. I can put numbers in boxes as well as anyone else, but they tend to be arbitrary, and all too often the real answers cannot be easily condensed into a 1 to 5 rating. I said that the way to find out what people think of the documentation is to ask them, not to get them putting numbers on a form.
Inevitably, the guys asked me that question, and we had a discussion of the issues I’ve found with the sites, including:
- Broken links. I don’t think Microsoft should ever delete a knowledge base entry. Mark entries obsolete or even wrong, but don’t remove them.
- Too many locations with overlapping content – MSDN, TechNet, specialist sites, team blogs etc.
- Documentation that states the obvious – e.g. how to enable or disable a feature – but neglects the interesting stuff, like why you would want to enable or disable it and what the implications are.
- Documentation that is excessively verbose or makes you drill down into link after link before finding the real content.
- Documentation that is not clearly dated, so you might be reading obsolete advice without knowing it.
Anyway, I felt I had a worthwhile discussion and was listened to, whereas completing the survey would not have brought out these points effectively.
Most importantly, did you get the rocket?
I couldn’t agree more.
Surveys are relied upon too much these days and, as you say, they never really get to the nub of the matter.
Take eBay feedback, for example. How can you condense an accurate account of your transaction into a few words and a five-star rating?
Seller communication: is that 5, 3 or 1 stars if the transaction went smoothly but I never had any communication with the seller? Do I deduct points because they did not confirm my order, e-mail me when it shipped, etc?
Delivery time: here’s a good example. My mum has a low delivery time rating on eBay because she is housebound and I have to do the post-outs. I do them once a week, as I do not pass the post office on a daily basis; I go especially for that purpose.
However, should she be rated low? She always clearly explains on her listings that it will not be next-day delivery and that items are only posted out once a week. Yet she still gets the low rating. Since the rating should reflect whether you delivered within your specified time period, and people plainly do not rate on that basis, it makes the rating null and void entirely.
Everyone has their own idea of what a five-star service is, so everyone will interpret the results differently.
There is no replacement for a good, long explanation, but people do not like having to use words, and that leaves surveys as pretty much pointless, full stop.
The biggest issue with surveys? You are not rewarded enough for your time. If I have to spend *my* time filling out a survey that benefits someone else, then I’d better get proper recompense – otherwise, why would I do it?
It’s the old ‘what’s in it for me’ (WIIFM) that smart marketers understand but the dummies who create surveys simply don’t.
Gary