Interviews and focus groups might also be examined for stories, responses, or comments that are particularly insightful, interesting, or otherwise important. Known as critical-incident analysis, this technique can be useful for identifying opportunities for digging deeper in search of useful information (Preece et al., 2015). In an interview, a critical incident might be a story that describes a notable failure of an existing system or a desired list of criteria for its replacement. Because each critical incident becomes a case study, chosen not as a representative incident but rather as one that can provide useful information, the techniques described in Chapter 7 may be applicable.


                  8.10.3   VALIDITY
Analyses based on the interpretation of texts often face questions of validity. Due to the necessarily subjective nature of reading texts, any single analysis may be influenced in subtle (or not-so-subtle) ways by the viewpoints and biases of the individual analyst. If validity is a particular concern, as it might be when your goal is to make a general claim, you might want to have multiple researchers conduct independent analyses of your interviews. Ideally, their analyses will largely agree with each other. High values on measures of interrater reliability can support your analysis (see Chapter 11).
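If you do use multiple coders, Cohen's kappa is one widely used measure of interrater reliability for categorical codes. The short Python sketch below is a minimal illustration, assuming two hypothetical coders who have each assigned a single code to the same ten interview excerpts; the codes and data are invented for the example.

from collections import Counter

def cohens_kappa(coder_a, coder_b):
    # Cohen's kappa: agreement between two coders, corrected for chance agreement.
    n = len(coder_a)
    # Observed agreement: proportion of excerpts given the same code by both coders.
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: summed products of each coder's marginal code proportions.
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    p_chance = sum((counts_a[lab] / n) * (counts_b[lab] / n) for lab in labels)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical codes assigned independently by two analysts to ten excerpts.
coder_a = ["navigation", "error", "error", "speed", "navigation",
           "error", "speed", "navigation", "error", "error"]
coder_b = ["navigation", "error", "speed", "speed", "navigation",
           "error", "speed", "navigation", "error", "navigation"]
print(f"kappa = {cohens_kappa(coder_a, coder_b):.2f}")  # roughly 0.70 for this data

Values near 1 indicate strong agreement between the coders; values near 0 indicate agreement no better than chance.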
Validity may not be a particular concern if your interviews are aimed at understanding user requirements. If you are working closely with users and customers, you will probably present your findings to them once your analysis is complete. If you have a good working relationship, they will let you know when your analysis has gone wrong. This feedback is very useful for refining your understanding.


                  8.10.4   REPORTING RESULTS

After you have conducted countless interviews and spent untold hours analyzing responses, you must report the results. Expectations vary among contexts; descriptions of a given set of results in an academic publication might differ significantly from how the same results would be presented in a corporate memo or a presentation for a client. Despite these differences, some common principles apply.
Your presentation of interview results should be as clear and specific as possible. Tabulating the frequencies of responses can help you give specific reports. Instead of saying “many users complained about…,” say “seven out of 10 interviewees who responded complained about…” Replacing terms such as “many,” “most,” “often,” “frequently,” “rarely,” and other vague quantifiers with concrete counts helps readers understand not only the specific points but also their relative importance.
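As a small illustration of such tabulation, the Python sketch below counts one hypothetical top-level complaint code per interviewee and prints the counts in the “seven out of 10” style recommended above; the codes and numbers are invented for the example.

from collections import Counter

# Hypothetical data: one top-level complaint code recorded per interviewee.
complaints = ["search", "search", "navigation", "search", "login",
              "search", "navigation", "search", "search", "search"]

counts = Counter(complaints)
total = len(complaints)
for code, count in counts.most_common():
    # Report concrete counts rather than vague quantifiers such as "many" or "most".
    print(f"{count} out of {total} interviewees complained about {code}")

Even for a handful of interviews, keeping explicit counts like this makes it harder to slip back into vague quantifiers when writing up results.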
You can also use respondents’ words to make your reporting more concrete. Instead of paraphrasing or summarizing, use direct quotes. A small number of direct quotes illustrating interviewee sentiment can make your arguments much more concrete. This strategy can be particularly effective when coupled with frequency counts indicating widespread agreement with the quoted views.