If I’d had more time, I’d have written a shorter blog post
I thought about trying to write this post as a poem, “’Twas the week before Christmas”-style, but realized that would probably be more difficult and time-consuming than just writing a normal post.
Last week was fun! I was very busy taking notes for the two tests we ran with TryMyUI. I did my best to work through one and reflect on the findings before switching to the other, but one of my colleagues was off on break as of Thursday, so we tried to squeeze in all our discussions before she left. That didn’t end up being quite feasible, so we have another discussion scheduled for those of us who are still working.
One thing that goes unstated is the time it takes to analyze and synthesize after performing tests. Sure, there are some straightforward quantitative measures you can get from the platform, but it takes time to sift through the more nuanced information and discover things.
For this project (Task Performance Indicators, to use Gerry McGovern’s vernacular), we are reporting on task success, time-on-task, and reported ease-of-use.
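For concreteness, here’s a minimal sketch (in Python, with made-up session data and field names, not any real platform’s export format) of how those three numbers might be tallied from a batch of test sessions:

```python
# Hypothetical example: summarizing usability-test sessions into the three
# Task Performance Indicator metrics above. Field names are illustrative.
from statistics import median

sessions = [
    {"completed": True,  "seconds": 95,  "ease": 6},  # "ease" on a 1-7 scale
    {"completed": True,  "seconds": 140, "ease": 5},
    {"completed": False, "seconds": 210, "ease": 2},
    {"completed": True,  "seconds": 80,  "ease": 7},
]

success_rate = sum(s["completed"] for s in sessions) / len(sessions)
median_time = median(s["seconds"] for s in sessions)
mean_ease = sum(s["ease"] for s in sessions) / len(sessions)

print(f"Task success: {success_rate:.0%}")        # → Task success: 75%
print(f"Median time-on-task: {median_time}s")     # → Median time-on-task: 117.5s
print(f"Mean reported ease: {mean_ease:.1f}/7")   # → Mean reported ease: 5.0/7
```

The quantitative part really is this simple; as the rest of this post argues, it’s the qualitative review of the recordings that takes the time.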
But more interesting than the “what” is the “why” (and the “how”). Our teams spent time reviewing the recordings of our participants, keeping an eye out for anything surprising or interesting (I don’t want to use the word “unusual” because do we really know what “usual” is? And just because something is “usual” doesn’t mean it’s as good as it could be).
I also took the time to map out the paths of pages on the site that the participants took to complete the task. It was fascinating to look at the different ways people approached problems, and all the unintentional false paths we may have led them down.
Back when I was still a web developer, I worked on the checkout flow for an e-commerce site. We all know that when the point is sales conversion, you want to minimize distractions to the site visitor. You want to drive them through the checkout process as quickly and easily as possible. But that’s relatively straightforward because that flow can be pretty self-contained. The person has put things in a cart and clicked “checkout”.
When we’re talking about uncovering answers on a content-heavy site, things aren’t so clear cut. What’s that trigger that lets us know what the site visitor’s trying to get done?
On an e-commerce checkout flow, the items to act upon are part of the system. They’re the book or the gift listed on the website that I want to engage with. Sure, I have to provide my mailing and payment information, but these are all common elements.
Contrast that with trying to find answers to a ‘real world’ question. There may be so many unknowns! This is why so many of those “answer bots” can be so frustrating: their body of knowledge is limited.
As designers, there are two general approaches we can take to this imbalance of information. We can ask people for a lot of detail up-front, to attempt to be more precise in the answer we provide, or we can provide more information that may or may not be relevant, leaving site visitors to process and interpret it themselves. Which approach we take should account for how invested the site visitor is in performing the task (too much up-front effort may turn them away), and how confident we are in our ability to accurately assess their situation and provide a complete, accurate answer.
I just realized this has very little to do with my work this week. :shrug: Sometimes that happens with these posts! We’re still at the research stage, not the design stage.
Suffice it to say that this was a fun week of observing user videos, and I look forward to continuing to dig through what we saw to try to identify opportunities to help site visitors find the answers they’re looking for on our site.