Prioritize Usability Testing and Web Analytics
If you've performed usability tests and tried to reconcile those results with your current site metrics, you've probably been left scratching your head. Usability respondents find something wrong on a particular page, yet the same "problem" isn't evident in the site analytics.
This leaves you with a rather big question: How do you reconcile Web analytics with usability testing, and what role does each play in the conversion equation?
Measuring Oranges and Apples
Although both attempt to measure a site's ability to convert, Web analytics and usability testing actually measure two entirely distinct aspects of a site.
Web analytics measure visitor intent and persuasive momentum, as well as the site's ability to move visitors through a conversion scenario. Usability examines the site's interface and process barriers that keep visitors from accomplishing a conversion task. Usability is:
The ability to effectively implement knowledge concerning the human-computer interface to remove any obstacles impeding the experience and process of online interactions.
Usability Tests Do Only So Much
Well-known online usability expert Jakob Nielsen points out in his latest newsletter:
One downside of users' [participants'] tendency to engage strongly is that they sometimes work harder on tasks in a test session than they would at home. If the user says, "I would stop here," you can bet that they'd probably stop a few screens earlier if they weren't being tested.
A usability test can't measure two key factors in the conversion process: persuasive momentum and individual motivation. A visitor's willingness to click through to a site and participate in its conversion processes is directly tied to her intent and motivations and the relevance of the product or service to her needs.
We've all endured poorly executed sites or shopping experiences because we felt it was the only way to meet our needs. We've also bailed out of sites because we just didn't have a high enough intent to purchase or were aware of an easier-to-use site.
In usability tests, respondents can't fake a visitor's real-time persuasive momentum, but they can indicate how well the site may be doing in allowing visitors to complete a task.
Combine Apples and Oranges for a Clearer Picture
According to Nielsen, "In usability studies, participants easily pretend that the scenario is real and that they're really using the design." However, it's much harder for participants to fake a need they don't have. If you disliked pungent cheese and were asked to shop for the best Roquefort, could you simulate the actions a true cheese lover would take?
Web analytics, on the other hand, track actual actions taken on your site from very large sample groups. They provide a true measure of activity and persuasive momentum.
Couple usability testing with Web analytics for a more holistic picture of what is (or isn't) happening on your site.
Web analytics provide the most accurate and objective measure of how individuals interact with a site. Usability studies provide insight into what's happening in particular instances.
Who, What, Where?
Generally speaking, use Web analytics to determine where to make site changes and usability tests to determine what to test.
In one case, a usability test revealed users had a problem on page 2 of a checkout process, while analytics revealed a much higher drop-off on page 3. On further investigation, we determined the following (a short funnel sketch follows these findings):
On this site, there was a high intent to purchase (indicated by a very short path from the home page to the shopping cart page). Even though visitors weren't completing checkout, they were willing to go one step further than usability testers.
Usability testers discovered some unmet customer expectations in the process. Users were looking for certain elements, possibly clicking as far as page 3 to find them. When they didn't find them, they abandoned the site.
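To make the analytics half of that concrete, here is a minimal Python sketch of the step-to-step drop-off arithmetic an analytics package performs for a checkout funnel. The step names and session counts are entirely hypothetical, not data from the case above:

```python
# Hypothetical per-step session counts exported from a web
# analytics tool for a checkout funnel.
funnel = [
    ("cart",    10_000),
    ("page 1",   7_400),
    ("page 2",   6_900),
    ("page 3",   2_100),  # the big drop-off shows up here
    ("confirm",  1_800),
]

# Drop-off rate between each consecutive pair of steps.
for (prev_name, prev_n), (name, n) in zip(funnel, funnel[1:]):
    drop = 1 - n / prev_n
    print(f"{prev_name} -> {name}: {drop:.0%} drop-off")
```

A report like this tells you where to look (the page 2 to page 3 transition); the usability findings above tell you what visitors were looking for when they left.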
In another case, we evaluated a mass merchant's Web site during a conversion assessment. One category page was losing almost 92 percent of visitors. No usability test could have found or solved the problem if analytics hadn't indicated there was a problem in the first place. The solution required swapping one image. The result? A daily increase of tens of thousands of dollars in revenue. The return on investment (ROI) came from taking action on the assessment.
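The revenue math behind a fix like that image swap is simple enough to sketch. All of the figures below are made up for illustration; none are from the actual client:

```python
# Hypothetical back-of-the-envelope estimate of the revenue impact
# of reducing a category page's abandonment rate.
daily_visitors  = 50_000   # visitors reaching the category page
old_abandonment = 0.92     # share leaving from that page before the fix
new_abandonment = 0.80     # share leaving after the fix
downstream_conv = 0.05     # conversion rate of those who continue
avg_order_value = 120.00   # dollars per completed order

extra_continuers = daily_visitors * (old_abandonment - new_abandonment)
extra_revenue = extra_continuers * downstream_conv * avg_order_value
print(f"Estimated extra daily revenue: ${extra_revenue:,.0f}")
```

Even a modest reduction in abandonment compounds into a large daily figure once downstream conversion and order value are factored in.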
Usability Testing Tips
Start respondents at whatever drives your traffic (e.g., a search engine) rather than at your site. You'll glean data other usability testers might ignore. You can gauge testers' first impression of your site relative to other choices, a situation much closer to real life. You may even find visitors have a hard time getting past an entry page or even the search engine results page.
Usability testing needn't cost a fortune. Morae is a usability lab on CD-ROM; it's revolutionized our approach to usability testing.
An Explosive, One-Two Optimization Punch
Visitors don't make decisions in a vacuum. They participate in multiple conversion scenarios, whether you plan them or not. They move through your site in the context of those scenarios, and their decisions are affected.
Usability tests can show where technology or interfaces could stand improvement, but only Web analytics can measure how well a site addresses a visitor's needs.
Using both Web analytics and usability testing correctly makes for an explosive, one-two optimization punch.
answer #1 · answered by Anonymous · 2006-08-28 17:40:10
Once a release is available for testing, the first step is to verify that, for example, each screen of a web-based application loads. This ensures nothing blocks further testing.
Once everything works (or is accessible for further testing), test each piece of functionality as a standalone unit. The last step is to test end-to-end integration scenarios.
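A minimal sketch of this staged approach using pytest markers, assuming a hypothetical web application at example.com; the screen paths and test bodies are placeholders:

```python
import pytest
import requests

BASE_URL = "https://example.com"  # hypothetical application under test
SCREENS = ["/login", "/catalog", "/cart", "/checkout"]

# Stage 1: smoke-test that every screen loads, so nothing
# blocks further testing.
@pytest.mark.smoke
@pytest.mark.parametrize("path", SCREENS)
def test_screen_loads(path):
    assert requests.get(BASE_URL + path, timeout=5).status_code == 200

# Stage 2: exercise each piece of functionality as a standalone unit.
@pytest.mark.functional
def test_add_to_cart():
    ...  # placeholder: drive the cart feature in isolation

# Stage 3: end-to-end integration scenarios run last.
@pytest.mark.e2e
def test_full_purchase_flow():
    ...  # placeholder: login -> add to cart -> checkout -> confirm
```

You could then run the stages in order with "pytest -m smoke", "pytest -m functional", and "pytest -m e2e" (the custom markers would need to be registered, e.g. in pytest.ini, to avoid warnings).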
answer #2 · answered by Life 2 · 2006-08-29 04:36:01
These often take the form of an in-tray exercise. Doing a mock test could actually be counterproductive unless you can find one appropriate to the kind of job you're interviewing for, because different jobs call for different prioritisations. Keep in mind the four categories, and do a quick analysis of how the items fit into them as an employer would perceive them:
1. Urgent and important - to be dealt with immediately.
2. Urgent but not important - handle second, but don't spend too much time on the task.
3. Not urgent but important - to be allocated time in the near future.
4. Neither urgent nor important - to be left alone until time is available.
Remember, too, that even within category 1, some further prioritisation may well be required to decide which task to do first.
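Purely as an illustration of that four-quadrant sort, here is a small Python sketch; the tasks and their urgency/importance flags are invented:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    urgent: bool
    important: bool

def quadrant(task: Task) -> int:
    # 1: urgent and important; 2: urgent only;
    # 3: important only; 4: neither.
    if task.urgent and task.important:
        return 1
    if task.urgent:
        return 2
    if task.important:
        return 3
    return 4

tasks = [
    Task("fix blocking login bug", urgent=True, important=True),
    Task("answer routine email", urgent=True, important=False),
    Task("plan next release", urgent=False, important=True),
    Task("tidy old archives", urgent=False, important=False),
]

# Work through the list in quadrant order.
for t in sorted(tasks, key=quadrant):
    print(quadrant(t), t.name)
```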
answer #4 · answered by ? 4 · 2016-11-28 02:43:59
Prioritizing well means you test the more important items ahead of the stuff waiting to be tested.
answer #5 · answered by Paultech 7 · 2006-08-28 06:39:13
I start with the one I hate the most and work my way up from there.
answer #6 · answered by jon i 2 · 2006-08-28 06:38:35