2.5 Interaction paradigms
communication presented many opportunities for thinking about how to embed
such technologies on people in the clothes they wear. Jewelry, head-mounted caps,
glasses, shoes, and jackets have all been experimented with to provide the user with
a means of interacting with digital information while on the move in the physical
world. Applications that have been developed include automatic diaries that keep
users up to date on what is happening and what they need to do throughout the
day, and tour guides that inform users of relevant information as they walk through
an exhibition and other public places (Rhodes et al., 1999).
Tangible bits, augmented reality, and physical/virtual integration. Another de-
velopment that has evolved from ubiquitous computing is tangible user interfaces
or tangible bits (Ishii and Ullmer, 1997). The focus of this paradigm is the "integra-
tion of computational augmentations into the physical environment", in other
words, finding ways to combine digital information with physical objects and sur-
faces (e.g., buildings) to allow people to carry out their everyday activities. Exam-
ples include physical books embedded with digital information, greeting cards that
play a digital animation when opened, and physical bricks attached to virtual ob-
jects that when grasped have a similar effect on the virtual objects. Another illus-
tration of this approach is the one described in Chapter 1 of an enjoyable interface,
in which a person could use a physical hammer to hit a physical key with corre-
sponding virtual representations of the action being displayed on a screen.
Another part of this paradigm is augmented reality, where virtual representa-
tions are superimposed on physical devices and objects (as shown in Figure 2.1 on
Color Plate 2). Bridging the gulf between physical and virtual worlds is also cur-
rently undergoing much research. One of the earlier precursors of this work was
the Digital Desk (Wellner, 1993). Physical office tools, like books, documents and
paper, were integrated with virtual representations, using projectors and video
cameras. Both virtual and real documents were seamlessly combined.
Attentive environments and transparent computing. This interaction paradigm
proposes that the computer attend to the user's needs by anticipating what the
user wants to do. Instead of users being in control, deciding what they want to do and
where to go, the burden is shifted onto the computer. In this sense the mode
of interaction is much more implicit: computer interfaces respond to the user's ex-
pressions and gestures. Sensor-rich environments are used to detect the user's cur-
rent state and needs. For example, cameras can detect where people are looking on
a screen and decide what to display accordingly. The system should be able to de-
termine when someone wants to make a call and which websites they want to visit
at particular times. IBM's BlueEyes project is developing a range of computational
devices that use non-obtrusive sensing technology, including videos and micro-
phones, to track and identify users' actions. This information is then analyzed with
respect to where users are looking, what they are doing, their gestures, and their fa-
cial expressions. In turn, this is coded in terms of the users' physical, emotional or
informational state and is then used to determine what information they would
like. For example, a BlueEyes-enabled computer could become active when a user
first walks into a room, bringing up any new email messages that have arrived. If the
user shakes his or her head, the computer would interpret this as "I don't
want to read them," and instead show a listing of the user's appointments for that day.