                  games at http://www.genegames.org challenging users to complete tasks such as cu-
                  rating gene-disease associations (Good et al., 2012).
                     Despite the success of games with a purpose and related tools, not all tasks in need
                  of human input are easily converted into small subtasks amenable to competition or
                  collaboration between participants. Longer, more complex tasks may take more time to
                  complete and require additional training or expertise. Crowdsourcing studies¹ use online
                  platforms to collect data from participants over the web, usually through the use of web
                  software designed to enroll participants, provide training, and complete relevant tasks.
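
                     For example, a study recruiting through a commercial platform such as Amazon
                  Mechanical Turk might post its tasks programmatically. The sketch below is a minimal,
                  hypothetical example using the boto3 MTurk client to create a task that points workers
                  at the study's own web software; the reward, time limits, and task URL are illustrative
                  assumptions that would be replaced with values appropriate to a given study.

# Minimal sketch: posting a crowdsourcing task (a "HIT") on Amazon Mechanical Turk.
# The reward, durations, and ExternalURL below are hypothetical placeholders.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# An ExternalQuestion sends workers to the study's own web software, which
# handles enrollment, training, and the task itself.
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.org/my-study/task</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="Identify relationships in short text passages",
    Description="Read a short passage and answer a few questions.",
    Keywords="annotation, reading, research study",
    Reward="0.50",                        # payment per assignment, in US dollars
    MaxAssignments=5,                     # number of workers asked to complete each task
    LifetimeInSeconds=7 * 24 * 3600,      # how long the task remains available
    AssignmentDurationInSeconds=30 * 60,  # time allowed per worker
    Question=question_xml,
)
print("HIT created:", hit["HIT"]["HITId"])

                  In practice, a script like this would typically be run first against the platform's
                  sandbox environment for piloting, before any real participants are recruited.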
                     Crowdsourced research studies can be (roughly) divided into two key groupings.
                  Studies involving systems based on crowdsourced data explore applications of user-
                  contributed data to develop novel solutions to challenging problems. Like CAPTCHA
                  and other human computation tasks described earlier, these studies are all focused
                  around some task(s) that humans can do better than computers. Examples include an-
                  notating research reports to identify discussions of potentially harmful drug-drug in-
                  teractions (Hochheiser et al., 2016); extracting relationships between texts and tables
                  in written reports (Kong et al., 2014); delivering crowd-based emotional support in
                  online interventions for depression (Morris et al., 2015); translating text (Hu et al.,
                  2014); prototyping user interface designs (Lasecki et al., 2015); and using real-time
                  crowd interpretation of cell phone images to help blind people identify nearby objects
                  (Bigham et al., 2010; Lasecki et al., 2014), to name just a few of many.
                     A second model involves crowdsourced HCI experiments: web-
                  based studies involving large numbers of participants in more or less traditional em-
                  pirical evaluations of interfaces or visualizations. As the goal of these studies is to
                  evaluate how humans use a tool to accomplish a task, they are not necessarily strictly
                  human computation: some studies in this category may include tasks that might, in
                  fact, be done by computers. However, other elements are similar, in that large numbers
                  of people will be asked to complete tasks through an online infrastructure
                  supporting recruitment, enrollment, and data collection. Crowdsourced experi-
                  ments have been used in studies evaluating visualization designs (Heer and Bostock,
                  2010; Abdul-Rahman et al., 2014; Micallef et al., 2012), mobile applications (Zhang
                  et al., 2016), and even (via a creative proxy) haptic interfaces (Schneider et al., 2016).

                  14.3.2   CONDUCTING HUMAN COMPUTATION STUDIES

                  Using crowdsourcing services to inexpensively identify and enroll a large pool of
                  study participants might appear to be a very appealing prospect. However, matters
                  are (perhaps unsurprisingly) not quite that simple, as previous work has identified
                  concerns that might impact the quality of the data collected. Consideration of these
                  concerns, and of recommendations originating in these earlier studies, can help you
                  design tasks and use task performance data to ensure that your experiments generate
                  the high-quality data that you need to move your research forward.
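
                     As one illustration of using task performance data for quality control, the sketch
                  below screens crowdsourced responses before analysis, dropping any that fail an em-
                  bedded attention-check item or that were completed implausibly quickly. The field
                  names and the 30-second threshold are assumptions made for illustration; appropriate
                  checks and cutoffs depend on the task.

# Sketch: screening crowdsourced responses before analysis.
# The field names ("attention_check", "seconds_on_task") and the 30-second
# cutoff are illustrative assumptions, not fixed recommendations.
import csv

MIN_SECONDS = 30  # assumed lower bound on a plausible completion time

def passes_screening(row):
    # Keep responses that answered the attention-check item correctly
    # and spent at least MIN_SECONDS on the task.
    return (row["attention_check"] == "correct"
            and float(row["seconds_on_task"]) >= MIN_SECONDS)

with open("responses.csv", newline="") as f:
    responses = list(csv.DictReader(f))

kept = [r for r in responses if passes_screening(r)]
print(f"Kept {len(kept)} of {len(responses)} responses after screening.")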

                  ¹ Not to be confused with crowdsourcing content, which refers to the process of combining the efforts
                  of multiple authors and editors to write articles such as those found on Wikipedia.