progress out loud, there is often very useful feedback. For instance, a user may say
things such as “Where is the menu choice? I would expect it to be right there” or “I
certainly would not purchase anything from this website. It looks so unprofessional.”
Even younger users can make useful comments during a usability session (see the
Leescircus sidebar). It is important to be aware that how comfortable someone may
feel about speaking aloud during the tasks may be culturally influenced, and there-
fore people from some cultures may not feel comfortable expressing their concerns
immediately (Shi and Clemmensen, 2008). Also, the more users talk, the more their task and time performance data may be influenced: the more they talk, the longer their task times will be (Dumas and Loring, 2008). If
you want both true user comments and very accurate task and time performance data,
it is possible to run a reflection session, also known as an interpretation session or a
retrospective session, after the tasks are performed. In an interpretation session, the
users watch raw video of themselves immediately after attempting a series of tasks;
working with the evaluators, they interpret the problems they encountered and where
they feel that the major interface flaws are (Frokjaer and Hornbæk, 2005). In more
traditional research with larger numbers of participants, the goal might be to cat-
egorize the qualitative comments using content analysis and look for patterns. With
usability testing, we’re trying to use these comments to help improve the interface.
Certainly, hearing the same comment multiple times sends an even stronger message to researchers, but even a single comment is important.
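For researchers who do categorize and tally think-aloud comments in this way, a minimal sketch of the counting step is shown below. It is purely illustrative Python: the participant IDs, comments, and category labels are invented for the example and are not drawn from any actual study.

```python
from collections import Counter

# Hypothetical think-aloud comments, each already hand-coded into a
# category during content analysis (all values here are invented).
coded_comments = [
    ("P01", "Where is the menu choice? I expected it right there", "navigation"),
    ("P02", "It looks so unprofessional", "trust/appearance"),
    ("P03", "I can't find the checkout button", "navigation"),
    ("P04", "The icons are too small to click", "target size"),
    ("P05", "I would not buy anything from this site", "trust/appearance"),
]

# Tally how often each category occurs; categories that recur across
# participants point to patterns, while single comments still get reviewed.
category_counts = Counter(category for _, _, category in coded_comments)

for category, count in category_counts.most_common():
    print(f"{category}: {count} comment(s)")
```

A sketch like this only supports the pattern-finding step; the substantive work of coding each comment into a category remains a manual, qualitative judgment.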
USABILITY TESTING OF THE SOFTWARE LEESCIRCUS
Usability testing took place for an educational software package called
Leescircus, designed for 6- and 7-year-old children in the Netherlands. One
example of a typical task was to match pictures that rhyme. A total of 70 Dutch
children (32 girls and 38 boys), aged 6 or 7, took part in the usability testing.
Most of the children had previous experience with computers and some had
previous experience with the program. The children were asked to find problems
with this version of the software. There were four sets of eight or nine tasks and
each child performed only one set of tasks. Usability evaluators observed the
children while they were performing the tasks. The children were encouraged
to speak their comments aloud while using the software. The time period was
limited to 30 minutes, as it was expected that the attention span of the children
wouldn’t last much longer. Although only 28 children made comments out
loud, the novice students (with less computer experience) tended to make more
comments than the experts. Usability findings included the need to enlarge the
clickable objects, clarify the meaning of icons, and improve consistency (so
that it was clear whether an icon could or could not be clicked) (Donker and
Reitsma, 2004). This case study shows that children, too, can provide feedback
using the “think aloud” protocol during a usability test, although not all will feel comfortable enough to speak up.