
13.2  Eye tracking  373




                  in virtual environments were looking (Steptoe et al., 2008). Other studies have used
                  eye-gaze history data to help users monitor semiautonomous agents, using visual cues
                  from prior gaze information to highlight where users should look for a monitoring
                  task (Taylor, 2015). Eye-tracking systems have also been developed for GUI interface
                  control including pointing and clicking (Kumar et al., 2007), window selection (Fono
                  and Vertegaal, 2005), multimodal interfaces (Stellmach and Dachselt, 2013; Pfeuffer
                  et al., 2016), and for remote collaboration (Higuch et al., 2016).
                     Researchers have used eye tracking to study user behavior with a wide range
                  of computer interfaces. Web browsing and navigation have been particularly well-
                  studied in this regard. In a pair of studies, researchers at Microsoft used an eye-
                  tracking system to examine the impact of factors such as the placement of a target
                  link in a list of results and the length of the contextual text snippet that accompanies
                  the results (Cutrell and Guan, 2007; Guan and Cutrell, 2007). In the study of placement,
                  users were observed to be more likely to look at links early in a list than later and to
                  spend more time looking at the earlier links (Guan and Cutrell, 2007). Consideration
                  of the length of text summaries led to interesting results: when looking for a specific
                  link, users tended to focus on more search results as the summaries got longer. This
                  effect was less notable for open-ended “informational” tasks that were not focused
                  on a specific goal. The researchers speculated that this difference was due to the
                  relevance of the summaries in each case: summaries that were useful in the informational
                  task were distractions that obscured the specific link name in the other tasks (Cutrell
                  and Guan, 2007). Other studies have examined patterns in eye movements as us-
                  ers interact with websites, moving both within individual pages and across multiple
                  pages (Card et al., 2001; Goldberg et al., 2002; Buscher et al., 2009).
                     Other experiments have used eye tracking to understand the progression of eye
                  focus during menu selection tasks. One study found that eye-focus patterns in tasks
                  involving reading menu items differed significantly from selecting items. Although
                  users fixated on each item when reading menus, they tended to use sequences of eye
                  movements in a given direction—known as “sweeps”—when performing selection
                  tasks (Aaltonen et al., 1998). Eye tracking has also been used to study how user
                  attention differs across alternative visualizations of hierarchical structures
                  (Pirolli et al., 2000), and to build document summaries based on eye-gaze data de-
                  scribing areas that were the focus of user attention (Xu et al., 2009).
                     Given the complexity of eye tracking, some researchers might be tempted to look
                  for other measurements that could hint at where a user's attention is
                  focused. For GUI-based systems, mouse position and movement might be seen as a
                  proxy for eye gaze, as we might tend to look where the pointer goes as we move the
                  mouse. A strong correlation between mouse movement and gaze might completely
                  eliminate the need for eye tracking in some GUI-based contexts. Alas, the reality is
                  somewhat more complicated. A number of studies have attempted to track the rela-
                  tionship between gaze and mouse movement, developing algorithms for using mouse
                  position to predict gaze (Chen et al., 2001; Bieg et al., 2010; Huang et al., 2012; Diaz
                  et al., 2013; Navalpakkam et al., 2013), although the nature of the relationship might
                  be somewhat task dependent (Liebling and Dumais, 2014).
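At its simplest, the gaze–mouse studies cited above come down to measuring how well one signal tracks the other. A minimal sketch of that idea, computing a Pearson correlation over time-aligned coordinate samples (the data and sample values here are invented purely for illustration, not drawn from any of the cited studies):

```python
# Illustrative sketch: how strongly does mouse position track gaze position?
# Real studies align eye-tracker and mouse logs on a shared clock; here we
# use small hypothetical lists of already-aligned x-coordinates (in pixels).

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Hypothetical samples: gaze moves steadily right; mouse follows with noise.
gaze_x = [100, 140, 180, 220, 260, 300, 340, 380]
mouse_x = [90, 150, 170, 230, 250, 310, 330, 390]

r = pearson(gaze_x, mouse_x)
print(f"gaze-mouse correlation (x axis): {r:.3f}")
```

A correlation near 1.0 over many users and tasks would support using the mouse as a gaze proxy; the task-dependence finding noted above suggests that in practice such a coefficient varies with what the user is doing.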