Arunkumar Khannur's Software Testing Knowledge Center

13.8. Usability Inspection

Usability inspection is a review of users’ potential task performance with a product, designed to help engineers review a product and find a large number of usability defects. Usability inspection methods involve only expert evaluators, who inspect the user interface to find possible usability problems, provide judgments based on their knowledge, and make recommendations for fixing the problems and improving the usability of the application. It is very similar to the code inspection methods with which software developers are familiar: it is carried out by the engineer designing the product together with a team of peers, looking for defects.

In the usability inspection approach, usability specialists (and sometimes software developers, users, and other professionals) examine usability-related aspects of a user interface. Commonly used inspection methods are:
  • Cognitive Walkthroughs
  • Feature Inspection
  • Heuristic Evaluation
  • Participatory Heuristic Evaluation
  • Pluralistic Walkthrough
  • Perspective-based Inspection

Cognitive Walkthroughs
Cognitive Walkthrough is an expert analysis of a series of tasks in which no users are needed; instead, the expert puts themselves in the user's shoes and steps through the task sequence.

Cognitive Walkthrough aims to look at how easy and obvious goals and actions are, and to highlight areas of possible confusion:
  • Is it obvious what to do?
    • Will people formulate the right goals?
    • Will they realize a goal has been achieved?
    • Will they undertake inappropriate goals?
    • Will they inadvertently kill off some higher or related goal?
  • Is it obvious how to do it?
    • Will people identify the correct actions?
    • Will the action contribute to achieving the goal?
    • Do actions match people’s goals?
    • Are there physical difficulties performing actions?

In order to carry out a Cognitive Walkthrough, the expert must list each action needed to accomplish the goal, then carry out each action while observing how easy it is for users to identify what to do and how to do it. All such observations are recorded in the form of endnotes or differently formatted text.
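The recording step above can be sketched as a small script. This is an illustrative sketch only: the task, actions, field names, and notes are hypothetical examples, not prescribed by the method.

```python
# Illustrative sketch of recording cognitive-walkthrough observations.
# The task, actions, and notes below are hypothetical examples.

task = "Save a document under a new name"
actions = ["Open the File menu", "Choose 'Save As'",
           "Type the new name", "Press Enter"]

walkthrough_notes = []
for step, action in enumerate(actions, start=1):
    walkthrough_notes.append({
        "step": step,
        "action": action,
        # "Is it obvious what to do?" - will people formulate the right goal?
        "right_goal_formed": True,
        # "Is it obvious how to do it?" - will people identify the correct action?
        "correct_action_identified": True,
        "observation": "",
    })

# Flag a hypothetical point of confusion found during the walkthrough.
walkthrough_notes[1]["correct_action_identified"] = False
walkthrough_notes[1]["observation"] = "'Save As' is hidden in a submenu"

# Collect the steps that need designer attention.
problems = [n for n in walkthrough_notes if not n["correct_action_identified"]]
for p in problems:
    print(f'Step {p["step"]} ({p["action"]}): {p["observation"]}')
```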

Feature Inspection
This inspection technique focuses on the feature set of a product. The inspectors are usually given use cases with the end result to be obtained from the use of the product. Each feature is analyzed for its availability, understandability, and other aspects of usability. For example, a common user scenario for the use of a word processor is to produce a letter. The features that would be used include entering text, formatting text, spell-checking, saving the text to a file, and printing the letter. Each set of features used to produce the required output (a letter) is analyzed for its availability, understandability, and general usefulness. One time-tested way to perform feature inspection is to have the documentation staff attempt to document each user scenario as procedures. Features that are hard to describe in the documentation are probably hard to find for the user in the first place.

In order to carry out a feature inspection, the experts first list the features of the product in the sequence in which they would be used to perform various tasks. Each task is then carried out as prescribed. While doing so, the expert examines the accessibility of each feature in the context of the task, trying to answer questions like: Can the user get to each feature without much trouble? Are the features well named and easily recognized? The results of such observations are recorded.

Heuristic Evaluation
'Heuristic Evaluation' of a system is an 'Expert Evaluation': the evaluator grades the system (a website, multimedia application, mobile handset) against a checklist of heuristics. The evaluation is carried out by 4 or 5 experts, each of whom may be the designer of the system, another designer, a person with good knowledge of the system, a user of the system, or a person with more than one of these characteristics.

In heuristic evaluation, which is a checklist-based approach, we need a checklist specific to the system being evaluated, containing points such as the following:
  • Familiarity of Elements
    Use of familiar icons and layouts helps reinforce user learning and confidence
  • Consistency of Elements
    The interface should reinforce any expectations from previous contact with the system or other similar systems.
    The user should not be expected to learn one method of performing an action in one area of the system and another method of performing the same action in another area of the same, or similar, systems.
    There should be a consistent format for menus, messages, etc.
  • Clarity of Elements
    On screen information must be short and relevant, but it must still make sense.
    Making sense to the design team is not the same as making sense to everyone else.
    There should be a minimum keystroke effort.
    Requests for input should be relevant.
    Output should be easy to understand.
    A good interface should appear to be natural, i.e. that it is a good way of performing the task.
  • Appropriate language
    It should use the user's words and/or the jargon of the task, as opposed to the jargon of IT.
  • User control and freedom
    The system should adapt to the needs of the user, not the reverse.
  • Navigational support
    • Where am I?
    • How did I get here?
    • What is happening?
    • Where can I go next?
    • How do I get there?
    • What can I do?
  • Recognition rather than Recall
  • Timely Feedback
  • Error Prevention
  • Flexibility of Use
    The interface should accommodate differences in user requirements, preferences, and level of performance
  • Aesthetics

Using this checklist, experts must inspect the system and, for each problem found, record: the screen(s) / element affected; the heuristic guideline(s) it does not comply with; a description of the particular error identified; a description of the user difficulty presented; a suggested solution; and the severity of the problem, rated from 1 to 5, with 5 being the most severe.

The list should be ordered in descending order of severity, from most severe to least, and then by screen.
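The recording and ordering described above can be sketched in a few lines of code. The field names and sample findings here are assumptions for illustration, not part of the method itself.

```python
# Illustrative sketch: recording heuristic-evaluation findings and
# ordering them by descending severity, then by screen.
# Field names and the sample findings are hypothetical.

findings = [
    {"screen": "Checkout", "heuristic": "Error Prevention",
     "error": "No confirmation before clearing the cart",
     "difficulty": "Accidental loss of the user's selections",
     "solution": "Add a confirmation dialog", "severity": 5},
    {"screen": "Login", "heuristic": "Appropriate Language",
     "error": "Message shows an internal error code",
     "difficulty": "User cannot interpret the failure",
     "solution": "Use plain-language messages", "severity": 3},
    {"screen": "Checkout", "heuristic": "Timely Feedback",
     "error": "No progress indicator during payment",
     "difficulty": "User may resubmit the payment",
     "solution": "Show a progress indicator", "severity": 4},
]

# Most severe first (5 down to 1), then grouped by screen name.
ordered = sorted(findings, key=lambda f: (-f["severity"], f["screen"]))

for f in ordered:
    print(f'{f["severity"]}  {f["screen"]}: {f["error"]}')
```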

Participatory Heuristic Evaluation
Participatory Heuristic Evaluation (PHE) is an extension of traditional Heuristic Evaluation, in which a number of design guidelines are applied to a design or prototype by usability experts. PHE uses the same techniques; however, users are included as ‘work-domain expert inspectors’, and extra heuristics are added to cover the user experience. In addition to the 13 heuristics identified in heuristic evaluation, Participatory Heuristic Evaluation facilitates the checking of:
  • Task flow
  • Suitability of design to task
  • Suitability of design to user
  • Quality of work produced

This method uncovers previously unknown problems. In particular, the users involved will have better knowledge of the existing system and may be aware of issues of which the designers were not. Working with the prototype, they may easily be able to identify issues of quality or the suitability of the prototype for different tasks.

According to Nielsen, training for heuristic evaluation takes only a few hours, and this can be built into the evaluation schedule.

Pluralistic Walkthrough
At the design stage, when a paper prototype is available, a group of users, developers, and human factors engineers meets to step through a set of tasks together, discussing and evaluating the usability of the system.

Group walkthroughs have the advantage of bringing a diverse range of skills and perspectives to bear on usability problems. As with any inspection, the more people looking for problems, the higher the probability of finding them. The interaction within the team during the walkthrough also helps to resolve usability issues faster. A Pluralistic Walkthrough requires several participants, who may include product developers, interface designers, a coordinator, and at least two users.

In a Pluralistic Walkthrough, a paper prototype of the interface is used and made available to the participants. As a first step, a walkthrough team is established that includes representative users, product developers, human factors engineers, and a coordinator. The coordinator then asks all other participants to assume the role of the user during the walkthrough. All participants are presented with the interface design in the form of a screen panel and asked to write down, independently, the action they would take. Participants write their actions in as much detail as possible, down to the keystroke (or other input action) level, e.g. "Press the down arrow key three times, then press 'Enter'." After all participants have written the actions they would take for the task, a discussion begins in which the users speak first. Only when the users' comments are exhausted do the usability experts and the product developers offer their opinions. After the discussion, the coordinator tells the participants which actions they were supposed to take according to the user interface design and presents the new screen panel that follows from those actions. The walkthrough then moves on to the next step.

Perspective-based Inspection
In the Perspective-based Inspection technique, each inspection session focuses on one of three defined perspectives: novice use, expert use, and error handling.

In order to carry out a Perspective-based Inspection, a set of well-defined task scenarios for each perspective is first created from the user's point of view. The expert takes the scenarios related to one perspective at a time and carries out the tasks described in them. Any difficulties or issues that come up during each step are recorded and provided to the designers for appropriate action.