CHAPTER 1
Introduction
In this book we explore the design issues for embodied interaction design, and specifically for tangible and gesture interaction. Four examples of tangible and gesture-based interactive systems serve as vehicles for exploring the design issues and methods relevant to embodied interaction design. In recent years, new interaction styles have emerged. Researchers in human-computer interaction (HCI) have explored embodied interaction, which seeks to explain bodily action, human experience, and physicality in the context of interaction with computational technology (Antle et al., 2009, 2011; Klemmer et al., 2006). Dourish (2004) set out a theoretical foundation for embodiment. The concept of embodiment in tangible user interfaces (TUIs) describes how physical objects may serve simultaneously as input and output for computational processes. Similarly, in gesture-based interaction the concept of embodiment recognizes and takes advantage of the fact that humans have bodies, and that people can use those bodies when interacting with technology in the same ways they use them in the natural physical world (Antle et al., 2011). This is an attractive approach to interaction design because it builds on our previous experience and makes new systems easier to learn.
The success of interaction design depends on providing appropriate interaction methods for the task at hand that improve discoverability and learnability. Designers should consider the user's mental model, formed by previous experience, when defining how the user can interact with the system, and should give the user clues about expected behavior before they take action. Feedback is equally important: it shows the user how an interaction is completed. Since interaction is a conversation between the user(s) and the system, the design of gesture input methods and of real-time feedback to the user(s) should be considered very carefully. Well-timed, appropriate feedback helps users notice that the system is interactive, learn how to interact with it, and stay motivated to interact with it. Ideally, feedback communicates to the user that the system status has changed as the user intended (Dillon, 2003).
As we move toward embodied interaction, we maintain these basic principles of good interaction design: the user initiates interaction with some form of action, and the system responds or changes state as the user intended. However, embodied interaction admits very broad and varied ways in which the user may act to initiate interaction, and the iterative action-response cycle must be discovered and learned. For example, laptop and touchscreen interactions are ubiquitous enough that there are established guidelines and design patterns that designers adhere to (Norman, 1983). These patterns and guidelines give users certain expectations of how a system might work even before they begin interacting with it. Embodied interaction, by contrast, is relatively new and does not yet have a coherent set of consistent design patterns. We therefore transition from an expectation of consistent and effective interaction design using keyboard, mouse, and display toward novel interactive systems in which the user explores and learns which actions lead to expected changes in the state of the system. We propose that HCI is a cognitive process in which the user's mental model is the basis for their exploration and use of the interactive system: users decide how to interact on the basis of expectation and prior experience, and the affordances of the specific interactive system modify the user's mental model.
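The action-response cycle described above can be sketched as a minimal state machine: the user acts, and the system either changes state and confirms the change, or signals that the action was not recognized. This is an illustrative sketch only; the class, action, and state names are hypothetical, not drawn from any of the systems in this book.

```python
# A minimal sketch of the interaction cycle: user action -> state change ->
# feedback confirming (or disconfirming) the change. All names hypothetical.

class InteractiveSystem:
    def __init__(self):
        self.state = "idle"
        # Hypothetical mapping from (current state, recognized action)
        # to the next state.
        self.transitions = {
            ("idle", "touch"): "active",
            ("active", "swipe"): "browsing",
            ("browsing", "touch"): "selected",
        }

    def handle(self, action):
        """Apply a user action; return feedback describing the result."""
        new_state = self.transitions.get((self.state, action))
        if new_state is None:
            # Feedback on unrecognized actions helps users refine their
            # mental model of what the system can sense.
            return f"'{action}' not recognized in state '{self.state}'"
        self.state = new_state
        return f"state changed to '{new_state}'"

system = InteractiveSystem()
print(system.handle("touch"))  # state changed to 'active'
print(system.handle("swipe"))  # state changed to 'browsing'
print(system.handle("wave"))   # 'wave' not recognized in state 'browsing'
```

The point of the sketch is the feedback on the unrecognized action: in an embodied system without established design patterns, such responses are how the user discovers which actions the system can sense.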
As Dourish (2001) says, when users approach an embodied interactive system, they must construct a new understanding of how it works on the basis of their physical exploration. Different people may have unique experiences and expectations, which will affect the way in which they initially explore a system and, ultimately, the mental model they construct of how the system works (Dillon, 2003). Embodied interaction has been used to describe the interactions of users with a wide range of interactive technologies, including tangible and gesture-based interfaces.
We posit that good tangible and gesture interaction design depends on an understanding of the cognitive issues associated with these modalities. We organize these issues into four categories: embodiment, affordances, metaphor, and epistemic actions. These four categories can be used as clues that the designer can give the user to aid the user in understanding how the interactive system is to be operated. If these concepts are integrated into the design process, the user’s mental model and knowledge can be activated and extended as they try to use and understand the interactive system. While these cognitive issues require further exploration and empirical validation (Antle et al., 2011), we present specific projects that explore various aspects of embodied HCI.
1.1 EMBODIMENT
Interaction through tangible and gesture-based systems is intrinsically embodied, and therefore decisions about which embodied actions can be recognized by the interactive system are part of the design process. Human gestures are expressive body motions involving physical movements of the fingers, hands, arms, head, face, or body that may convey meaningful information or be performed to interact with the environment (Dan and Mohod, 2014). Designing embodied interaction is not just about designing computing ability, but is also about designing the human experience and anticipated human behavior.
Research has shown that gestures play an integral role in human cognition. Psychologists and cognitive scientists have explored the relationship between gesture and thought for several decades. McNeill (1992, 2008) explains that gesture and thought are tightly connected, and he establishes a categorization of gestures and their roles in human cognition and communication. There is evidence that gesturing aids thinking. Several studies have shown that learning to count is facilitated by touching physical objects (Efron, 1941; Kessell and Tversky, 2006; Cook et al., 2008). Kessell and Tversky (2006) show that when people are solving and explaining spatial insight problems, gesturing facilitates finding solutions. Goldin-Meadow and Beilock (2010) summarize the findings: "gesture influences thought by linking it to action" (p. 669), "producing gesture changes thought" (p. 670), and gesture can "create new knowledge" (p. 671). These studies show that gesture, while originally associated with communication, is also related to thinking. Embodied interaction design creates an environment that is activated by gestures and actions on objects, and therefore induces cognitive effects that traditional user interaction does not.
One challenge for embodied interaction is that, while it is built upon natural actions, it still requires some level of discovery, especially in a public display. Tangible and gesture-based interaction designers must consider both the integration of technology and its effects on human experience. The major consideration that has emerged to influence tangible design is the physical embodiment of computing: interaction design is no longer just screen-based digital interaction. Tangible interaction designers should think about physical, graspable objects that give cues for understanding and provide the basis for interaction. Gesture interaction designers should think about how various human movements can be recognized and interpreted as changing the state and response of the computational system. Interactive platforms can be interpreted as spaces to act and move in, and they effectively determine interaction patterns.
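As a deliberately simplified illustration of recognizing a human movement and interpreting it as a discrete command, the sketch below classifies a horizontal hand trajectory as a left or right swipe. The function name, coordinate units, and distance threshold are assumptions made for illustration, not part of any real gesture-recognition API.

```python
# Illustrative sketch: mapping a raw body movement (a sequence of horizontal
# hand positions, in pixels) to a discrete gesture. Threshold is hypothetical.

def classify_swipe(xs, min_distance=100):
    """Classify a sequence of horizontal hand positions.

    Returns 'swipe-right', 'swipe-left', or None when the movement is too
    small to count as intentional, which reduces false positives from
    incidental motion.
    """
    if len(xs) < 2:
        return None
    displacement = xs[-1] - xs[0]
    if displacement >= min_distance:
        return "swipe-right"
    if displacement <= -min_distance:
        return "swipe-left"
    return None

print(classify_swipe([10, 60, 150]))   # swipe-right
print(classify_swipe([200, 150, 40]))  # swipe-left
print(classify_swipe([100, 105]))      # None (below threshold)
```

The threshold embodies a design decision discussed throughout this book: the system must distinguish movements intended as interaction from the everyday movements people make anyway.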
Dourish (2004) explores the physicality of embodied interaction and its effect on moving human-computer interaction toward more social environments. He describes an approach to embodiment grounded in phenomenology and claims that any understanding we have of the world is the result of some initial physical exploration. Embodied interaction is about establishing meaning, and it is through embodied interaction that we develop an understanding of the meaning of the system. As users construct their mental model, they are influenced by the phenomena they are experiencing at that moment as well as by their prior experience and understanding of how technology works.
In this book, we take a cognitive view of embodied interaction design: discovering the interaction model relies on pre-existing mental models derived from physical experiences, and executing intentional physical movements during interaction has an effect on cognition. We demonstrate and elaborate on this view of embodiment through four projects, describing for each the gestures that enable interaction, the design methods, and the usability issues.
1.2 AFFORDANCE
The concept of affordance was introduced to the HCI community by Norman (1988) and Gibson (1982). According to Norman (1988), an affordance is the design aspect of an object that allows people to know how to use it and that gives a clue to its function and use. Norman discusses affordances as properties of an object that allow specific actions: a handle affords holding and turning, a button affords pressing, and each makes its function clear. Tangible interaction design is arguably more influenced by physical affordances than visual or gesture interaction design is.
TUIs change the way we interact with digital information, offering physical affordances that are distinctly different from pointing and keyboard/mouse interaction. According to Wang et al. (2002), tangible interaction has two advantages: first, it allows direct, naïve manipulability and intuitive understanding; second, the sense of touch provides an additional dimension. The tactile feedback afforded by TUIs is consistent with the early empiricist argument that kinesthetic information provides us with the ability to construct a spatial map of objects that we touch (Lederman and Klatzky, 1993; Loomis and Lederman, 1986). Fitzmaurice (Fitzmaurice, 1996; Fitzmaurice and Buxton, 1997) demonstrated that having multiple graspable interactive devices encourages two-handed interaction that calls upon everyday coordination skills. Leganchuk et al. (1998) explored the potential benefits of such two-handed input through experimental tasks and found that bimanual manipulation may bring two types of advantage to HCI: manual and cognitive. Two-handed interaction doubles the degrees of freedom simultaneously available to the user and reduces the cognitive load of input performance.
The potential affordances of TUIs, such as manipulability and physical arrangement, may reduce the cognitive load associated with spatial reasoning, resulting in enhanced spatial and creative cognition. Brereton and McGarry (2000) studied the role of objects in supporting design thinking as a precursor to designing tangible interaction; they found that design thinking depends on gesturing with objects, and recommend that the design of tangible devices consider the tradeoff in exploiting the ambiguous and varied affordances of specific physical objects. The affordances of design tools facilitate specific aspects of designing. As we move away from traditional WIMP (Windows, Icons, Menus, and Pointer) interaction, we encounter new kinds of affordances in interactive digital design tools (Burlamaqui and Dong, 2015). Tangible interaction design takes advantage of natural physical affordances (Ishii and Ullmer, 1997) by exploiting the knowledge that people already have from their experience with nondigital objects to design novel forms of interaction and discovery. In this book, we focus on the affordances of the interaction that can be sensed by the interactive devices. Well-designed objects make it clear how they work just by looking at them. The successful design of embodied interaction systems does not ignore the affordances of the physical and visual aspects of the system.
1.3 METAPHOR
While the affordances of physical objects are closely related to our experience with their physical properties, the properties of tangible interaction objects have both physical and digital relationships. In contrast to physical objects, on-screen objects are clusters of pixels without a physical dimension. A common way to lend the appearance of physical affordances to on-screen objects is the use of metaphor in designing interface elements (Szabó, 1995). By creating a visual reference on screen to familiar physical objects, the on-screen objects take on some of the affordances of the metaphorical object (Mohnkern, 1997).
The use of a metaphor in design makes the unknown or unfamiliar familiar by connecting it with the user's previous experience (Dillon, 2003). The best-known example is the "desktop metaphor" used in current operating systems. Another common example is the trash can: you can grab a file with the mouse, drag it over the trash can, and release it. A designer can use the shape, size, color, weight, and texture of an object to invoke any number of metaphorical links (Fishkin, 2004).
Metaphors are an important concept for embodied interaction. An interaction model based on embodied metaphors effectively implements a mapping between action and output that is consistent with the metaphorical object. Through design, we can map human behaviors and bodily experiences onto abstract concepts in interactive environments (Bakker et al., 2012). Metaphor gives users a known model for an unknown system. Metaphor can help ease the transition to a new situation, so it is good for creating systems that will be used primarily by novices, like public displays. For embodied interaction design, in which there are few standards and fewer user manuals, the role of metaphor in design may be critical in creating easily discovered and learnable interactive systems.
1.4 EPISTEMIC ACTIONS
Epistemic action is exploratory motor activity aimed at uncovering information that is hard to compute mentally. Kirsh and Maglio (1994) distinguish between epistemic and pragmatic actions. A pragmatic action is an action needed to actually perform the task; an epistemic action helps the person explore the task and guides them toward the solution. As such, epistemic actions enable people to use physical objects and their environment to aid their cognition (Kirsh and Maglio, 1994; van den Hoven and Mazalek, 2011). Therefore, having a variety of tangible objects and physical arrangements may aid problem solving while interacting with a tangible interactive system. Fitzmaurice (1996) discussed the concepts of pragmatic and epistemic actions to provide the underlying theoretical support for workable graspable user interfaces (UIs): pragmatic action refers to performatory motor activity that directs the user toward the final goal, while epistemic action refers to exploratory motor activity that may uncover hidden information that would otherwise require a great deal of mental computation.
Kim and Maher (2007) found an increase in epistemic actions in a design task while using a tangible UI, and through protocol analysis also observed an increase in the cognitive processes typically associated with creative design. The projects in this book build on that result to design tangible interfaces based on physical objects that offer more opportunities for epistemic (i.e., exploratory) actions than pragmatic (i.e., performatory) actions. Exploration through epistemic actions enables a better perception of the environment and supports learning more about the properties of the objects. When designing gesture-based interaction, the process of discovering the interaction model can be supported by encouraging and responding to epistemic actions.
1.5 THIS BOOK
In this book we present tangible and gesture interaction design with an underlying assumption that embodiment, affordances, metaphor, and epistemic actions are critical cognitive issues that can influence the quality of the design. If the interaction design is not well conceived with respect to these cognitive issues, users suffer frustration, discomfort, stress, and fatigue. Applying appropriate design methods is crucial and should help bridge the gap between the designer's view of the system and the user's mental model. It is important to conduct user research to learn how to incorporate insights from users' experiences into the design. In this book, user research and design methods such as gesture elicitation, protocol analysis, heuristic evaluation, prototyping, bodystorming, role-playing, personas, and image boards are described to show how designers understand the potential user mental models of the interactive system. We describe these methods in the context of their use in the four design examples: Tangible Keyboard, Tangible Models, walk-up-and-use information display, and the willful marionette.
This book can provide HCI practitioners and researchers with new principles for better designs and new ideas for research in embodied interaction. For HCI practitioners, the book describes specific design projects and the methods used during design and evaluation. These methods are specific to designing for tangible and gesture interaction. The description of these methods will help practitioners understand how these methods are applied, and, when appropriate, how these methods are uniquely suited to tangible and gesture interaction. For the HCI researcher, the book identifies the cognitive and design research issues that are raised when designing for tangible and gesture interaction. Many of the methods described in the design projects are also applicable in a research context.
The organization of this book is as follows. Chapter 2 presents the concepts and significance of tangible interaction design. In Chapter 3, we describe our experience in designing the Tangible Keyboard and Tangible Models. Chapter 4 presents gesture interaction design in terms of its technology and significance. We follow this with a description of our experience in designing the walk-up-and-use information display and the willful marionette in Chapter 5. In Chapter 6, we conclude with our understanding of the research challenges in designing for embodied interaction.