This year's HOW Interactive Design Conference was held in two cities: Washington, D.C. in September, and San Francisco in October. I presented a session on getting website feedback at both events. Below is a transcript of my presentation, pretty close to what I actually said, though I won't claim there aren't a few edits here and there to make me look good.
Thank you very much. Good morning! I hope you're all well rested, caffeinated, and otherwise ready to put your minds to work today. There are a lot of great things coming your way. In fact, I'm grateful to be going first so I can join you all in soaking it in!
This morning, I want to share what I've learned about feedback, specifically, the ways we think about and then gather feedback about our websites.
The best place to begin, I think, is by looking at some pictures. ... And they all breathed a sigh of relief! Seriously, though, we do need to ease our minds into this sort of thing first thing in the morning. So I've brought some images with me that I'd like to run through. If you can identify these images as I reveal them, feel free to say so aloud for everyone to hear.
Here we go:
What do these things have in common?
They are all forms of feedback! The interesting thing about feedback is that it's a signal that only makes sense once you know what it's responding to.
For instance, the Scantron, which we all remember from high school, is a matrix of answers to test questions. The ballot records your answer to the question, "who are you voting for?" And the scale displays the answer to your question, "how much do I weigh today?"
What about this form of feedback?
This is a blood test analysis, similar, in fact, to one I received just a few days after my last physical. It took me a few minutes of staring at mine to realize that I had no idea what to make of it. The problem with this kind of feedback is that it doesn't answer the simple question I imagine anyone without medical training would have upon examining it: "Am I healthy or not?" I quickly went from being impressed with the service of receiving this report by mail to frustration at not being able to extract any meaning from it.
If I can't make sense of a report, what good is it?
Let me show you one more form of feedback, one you may be more familiar with.
This is a KPI report. In fact, a KPI report that one of my firm's clients received a few years back.
For those who don't know, KPI stands for key performance indicators. This is how web feedback is typically captured: a giant matrix of answers with no questions to provide context. Surprising as it may seem, if you're interested in making any sense of what's going on with your website, the KPI is probably not going to be much help. This sort of report shows that we've fallen victim to a pretty common fallacy: that the volume of data validates the measurement itself. Now, this particular KPI report goes on for 90 pages ...
... so there must be something interesting and important in there, right? Riiiight. Probably not. But even if there were, most people will simply tune out a few pages in. The conclusion that "measurement is being done" is good enough for them.
The good news is that we can swap in some new words for KPI that are much more straightforward about what it does. How about: Keeping People Ignorant!
Now, repeat-after-me is an old standby for teachers to get sleepy students engaged, so let's all say it together:
KPI reports are: KEEPING PEOPLE IGNORANT.
They say that repeating something out loud is one of the best ways of imprinting it on your brain. I've always felt that writing things down is even better, so I hope some of you did that too.
The failures of the Keeping People Ignorant report make a fine example of broken feedback, which tends to have three core attributes. It's:

outsourced.
gathered in crisis.
purely quantitative.
Outsourcing this sort of thing isn't objectively bad: there are plenty of great analytics experts out there who are worth their fees. But it's not generally in your best interest.
A report like the one we just looked at is the product of metric regurgitation, not inquiry. Think about it: they don't have the questions that you have. If they aren't asking the right questions, they won't be able to supply the right answers. They can give you 90 pages of data and wish you the best of luck.
If you're not gathering your own feedback, then you're probably paying somebody else to do it. And if you're paying somebody else to do it, it's probably not getting done very often. And if it's not getting done very often, it's probably not getting done until you're desperate for answers. And here's the thing: data gathered in crisis is guaranteed to be misinterpreted.
As we've already seen, broken feedback also tends to be overwhelmingly quantitative. It's all answers, and no questions.
How do we ever expect that to work?
That's a good question, isn't it?
Now that we know what kind of feedback isn't working, what I'd like to do with the time we have left is go a bit further in establishing a standard for good feedback (that's the theoretical side of all this) and then build on that foundation a repeatable method for gathering feedback (that's the HOW).
The Foundations of Useful Feedback
Feedback that is useful also has three attributes. It's:
done by you.
In other words, it brings meaning to numbers.
But what exactly do I mean by that? How do we make sure that our feedback gathering brings meaning? Well, for one thing, we make sure it answers the specific questions we need to be asking about our websites. There are five questions that anyone involved in website design, development, or marketing should be asking all the time:
Who is coming to my website?
Where are they coming from?
What content are they consuming?
How are they engaging with that content?
What can I do to improve their experience?
Beyond these questions, I'm not sure what else you would really need to know.
These five questions have something very important in common. Can you guess what?
The answers to these questions are not numbers! We use numbers to answer them. Numbers aren't the answers themselves; they're a means to an end. Which means that we can conclude two exciting things about feedback ...
There are no individually meaningful metrics. It's not just about page views or downloads, and it's never, ever about hits. Nobody says hits anymore, right? It's about the connections between metrics that provide answers to our five questions.
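To make that concrete: a raw pageview count answers nothing by itself, but connecting it to other metrics starts to address questions like "how are people engaging with my content?" Here's a minimal sketch of the idea; the numbers and metric names are hypothetical, purely for illustration:

```python
# Hypothetical raw metrics, the kind an analytics export might hand you.
raw = {
    "pageviews": 12000,
    "visits": 3000,
    "returning_visits": 1200,
}

# On its own, "12,000 pageviews" is just a number. Connecting metrics
# turns them into answers.
pages_per_visit = raw["pageviews"] / raw["visits"]      # depth of engagement
return_rate = raw["returning_visits"] / raw["visits"]   # loyalty signal

print(f"Pages per visit: {pages_per_visit:.1f}")    # 4.0
print(f"Returning-visit rate: {return_rate:.0%}")   # 40%
```

The point isn't the arithmetic; it's that each derived figure only means something because it connects two metrics in service of one of our five questions.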
Anything can be a source of information.
Which brings us, naturally, to robots and people.
As far as robots are concerned, the ones I'm going to talk about work for Google (of course they do!), and one of the things they do is make possible a wonderful feedback tool called Google Analytics.
If you were expecting something more sci-fi, we can talk about robopocalypses and such at tonight's cocktail hour, I promise.
After we look at what the analytics bots have to offer us, I'd like to introduce you to a few techniques for gathering extremely useful feedback from living, breathing humans.