Human-Computer Interaction (ISBN: 177407348X, 9781774073483)

Human-Computer Interaction takes into account the development of human-computer interaction methods. It includes the dev


English Pages 419 [422] Year 2019



Table of contents :
Cover
Title Page
Copyright
DECLARATION
ABOUT THE EDITOR
TABLE OF CONTENTS
List of Contributors
List of Abbreviations
Preface
SECTION I: DEVELOPMENT AND DESIGN OF HCI METHODS
Chapter 1 With or Against Each Other? The Influence of a Virtual Agent’s (Non) cooperative Behavior on User’s Cooperation Behavior in the Prisoners’ Dilemma
Abstract
Introduction
Materials and Methods
Results
Discussion, Limitations, And Future Work
Conclusion
Acknowledgments
References
Chapter 2 Development of a Human Computer Interface System using EOG
Abstract
Introduction
Base Fundamental Principles of System
System Implementation
EOG Signal Preprocessing
Experiments
Conclusions
Acknowledgement
References
Chapter 3 Development and Preliminary Investigation of a Semiautonomous Socially Assistive Robot (SAR) Designed to Elicit Communication, Motor Skills, Emotion, and Visual Regard (Engagement) from Young Children with Complex Cerebral Palsy: A Pilot Comparative Trial
Abstract
Introduction
Materials and Methods
Results and Discussion
Conclusions
Acknowledgments
References
Chapter 4 Ergonomic Design of Human-CNC Machine Interface
Introduction
Related Works
Methodology
Discussion
Conclusion
Acknowledgment
References
SECTION II: REAL-WORLD APPLICATIONS OF HCI TECHNIQUES
Chapter 5 Non-intrusive Physiological Monitoring for Affective Sensing of Computer Users
Introduction
Affective Computing and its Requirements
Affective Sensing
Rationale For Physiological Monitoring Towards Affective Sensing
Selection of Physiological Signals That Can Be Monitored Non-Intrusively
Effects of ANS Changes in GSR, BVP, ST and PD Signals
Affective Sensing Systems Based on Physiological Monitoring
Future Research Direction
Conclusion
References
Chapter 6 Modeling Human-Computer Interaction in Smart Spaces: Existing and Emerging Techniques
Introduction
Background and Motivation
Dimensions In Applicable Modeling Techniques
Design Elements and Semantics
Applying The Formalism
Discussion
Conclusion and Future Work
Acknowledgements
References
Chapter 7 Application Prospects of The Augmented Reality Technology for Improving Safety of Machine Operators
Introduction
Case Studies of Emergency Signs Applied to Machinery
The Augmented Reality in a Workplace
Some AR System Designs Suggested for Warning Signal Generation
Methods
Conclusions
Acknowledgments
References
Chapter 8 Miniaturized Human 3D Motion Input
Introduction
3D Input Technologies
A Miniature 3D Input Device
Outlook
Acknowledgments
References
SECTION III: DATA ENTRY AND PHYSICAL INTERACTION SYSTEMS
Chapter 9 Structured Light Illumination Methods for Continuous Motion Hand and Face-Computer Interaction
Introduction
Structured Light Illumination
Structured Light Illumination Motion Capture Methods
Augmented Reality 3D Computer Mouse Demonstration
Results
Conclusion
References
Chapter 10 A Method for Designing Physical User Interfaces for Intelligent Production Environments
Abstract
Introduction
Materials and Methods
Results and Discussion
Conclusions
Outlook and Future Work
References
Chapter 11 A New PC-Based Text Entry System Based on EOG Coding
Abstract
Introduction
Materials and Methods
Results and Discussion
Conclusion
References
Chapter 12 Head-Mounted Displays in Ultrasound Scanning
Introduction
Purpose of the Study
Results
Conclusions
References
SECTION IV: ASSISTIVE AND COGNITIVE ROLE OF THE HCI TECHNIQUES
Chapter 13 Improving Human-Computer Interaction by Developing Culture-sensitive Applications based on Common Sense Knowledge
Introduction
Building Up Common Sense Knowledge Bases
The Project Omcs-BR
Designing and Developing Culture-Sensitive Applications
Conclusion and Future Works
Acknowledgements
References
Chapter 14 The Role of Head-Up Display in Computer-Assisted Instruction
Introduction
HUD Technology
Human Factors of HUDs
HUD-Based CAI
Case Study
Experiment
Discussion
Conclusion
References
Chapter 15 Elderly People Benefit More from Positive Feedback Based on Their Reactions in the Form of Facial Expressions during Human-Computer Interaction
Abstract
Introduction
Methods
Results
Discussion
Acknowledgements
References
Chapter 16 How Influential Are Mental Models on Interaction Performance? Exploring the Gap between Users’ and Designers’ Mental Models through a New Quantitative Method
Abstract
Introduction
Literature Review
Methodology
Results and Discussion
Discussion
Conclusion and Future Research
Acknowledgments
References
Index
Back Cover

HUMAN-COMPUTER INTERACTION

HUMAN-COMPUTER INTERACTION

Edited by:

Zoran Gacovski

Arcler Press

www.arclerpress.com

Human-Computer Interaction
Zoran Gacovski

Arcler Press
2010 Winston Park Drive, 2nd Floor
Oakville, ON L6H 5R7
Canada
www.arclerpress.com
Tel: 001-289-291-7705, 001-905-616-2116
Fax: 001-289-291-7601
Email: [email protected]

e-book Edition 2020
ISBN: 978-1-77407-414-5 (e-book)

This book contains information obtained from highly regarded resources. Sources of reprinted material are indicated. Copyright for individual articles remains with the authors as indicated; such articles are published under a Creative Commons License. A wide variety of references are listed. Reasonable efforts have been made to publish reliable data; the views articulated in the chapters are those of the individual contributors and not necessarily those of the editors or publishers. The editors and publishers are not responsible for the accuracy of the information in the published chapters or for the consequences of its use. The publisher assumes no responsibility for any damage or grievance to persons or property arising out of the use of any materials, instructions, methods, or ideas in the book. The editors and the publisher have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission has not been obtained. If any copyright holder has not been acknowledged, please write to us so that we may rectify the omission.

Notice: Registered trademarks of products or corporate names are used only for explanation and identification, without intent to infringe.

© 2020 Arcler Press
ISBN: 978-1-77407-348-3 (Hardcover)

Arcler Press publishes a wide variety of books and eBooks. For more information about Arcler Press and its products, visit our website at www.arclerpress.com

DECLARATION

Some content or chapters in this book are open access, copyright-free published research works, which are published under a Creative Commons License and are indicated with their citations. We are thankful to the publishers and authors of this content and these chapters, as without them this book would not have been possible.

ABOUT THE EDITOR

Dr. Zoran Gacovski earned his PhD degree at the Faculty of Electrical Engineering, Skopje. His research interests include intelligent systems and software engineering, fuzzy systems, graphical models (Petri, neural, and Bayesian networks), and IT security. He has published over 50 journal and conference papers and has served as a reviewer for renowned journals. Currently, he is a professor of Computer Engineering at European University, Skopje, Macedonia.

TABLE OF CONTENTS

List of Contributors .......................................................................................xv List of Abbreviations .................................................................................... xix Preface.................................................................................................... ....xxi SECTION I: DEVELOPMENT AND DESIGN OF HCI METHODS Chapter 1

With or Against Each Other? The Influence of a Virtual Agent’s (Non) cooperative Behavior on User’s Cooperation Behavior in the Prisoners’ Dilemma ................................................................................... 3 Abstract ..................................................................................................... 3 Introduction ............................................................................................... 4 Materials and Methods .............................................................................. 6 Results ..................................................................................................... 11 Discussion, Limitations, And Future Work................................................ 14 Conclusion .............................................................................................. 16 Acknowledgments ................................................................................... 16 References ............................................................................................... 18

Chapter 2

Development of a Human Computer Interface System using EOG ......... 21 Abstract ................................................................................................... 21 Introduction ............................................................................................. 22 Base Fundamental Principles of System ................................................... 23 System Implementation............................................................................ 26 Eog Signal Preprocessing ......................................................................... 27 Experiments ............................................................................................. 33 Conclutions ............................................................................................. 34 Acknowledgement ................................................................................... 34 References ............................................................................................... 35

Chapter 3

Development and Preliminary Investigation of a Semiautonomous Socially Assistive Robot (SAR) Designed to Elicit Communication, Motor Skills, Emotion, and Visual Regard (Engagement) from Young Children with Complex Cerebral Palsy: A Pilot Comparative Trial.......... 37 Abstract ................................................................................................... 38 Introduction ............................................................................................. 38 Materials and Methods ............................................................................ 41 Results and Discussion ............................................................................ 51 Conclusions ............................................................................................. 60 Acknowledgments ................................................................................... 61 References ............................................................................................... 62

Chapter 4

Ergonomic Design of Human-CNC MachineInterface ............................. 67 Introduction ............................................................................................. 67 Related Works ......................................................................................... 72 Methodology ........................................................................................... 79 Discussion ............................................................................................... 89 Conclusion .............................................................................................. 92 Acknowledgment ..................................................................................... 93 References ............................................................................................... 94 SECTION II: REAL-WORLD APPLICATIONS OF HCI TECHNIQUES

Chapter 5

Non-intrusive Physiological Monitoring for Affective Sensing of Computer Users ................................................................... 101 Introduction ........................................................................................... 101 Affective Computing and its Requirements ............................................. 102 Affective Sensing ................................................................................... 103 Rationale For Physiological Monitoring Towards Affective Sensing ......... 106 Selection of Physiological Signals That Can Be Monitored Non-Intrusively ............................................................................ 108 Effects of ANS Changes in GSR, BVP, ST and PD Signals ........................ 110 Affective Sensing Systems Based on Physiological Monitoring ............... 114 Future Research Direction ..................................................................... 117 Conclusion ............................................................................................ 118 References ............................................................................................. 120


Chapter 6

Modeling Human-Computer Interaction in Smart Spaces: Existing and Emerging Techniques ............................................ 125 Introduction ........................................................................................... 125 Background and Motivation ................................................................... 127 Dimensions In Applicable Modeling Techniques .................................... 129 Design Elements and Semantics ............................................................. 132 Applying The Formalism ........................................................................ 135 Discussion ............................................................................................. 141 Conclusion and Future Work ................................................................. 143 Acknowledgements ............................................................................... 144 References ............................................................................................. 145

Chapter 7

Application Prospects of The Augmented Reality Technology for Improving Safety of Machine Operators .......................................... 147 Introduction ........................................................................................... 147 Case Studies of Emergency Signs Applied to Machinery ......................... 148 The Augmented Reality in a Workplace ................................................. 151 Some Ar System Designs Suggested for Warning Signal Generation ....... 153 Methods ................................................................................................ 154 Conclusions ........................................................................................... 161 Acknowledgments ................................................................................. 163 References ............................................................................................. 164

Chapter 8

Miniaturized Human 3D Motion Input ................................................. 169 Introduction ........................................................................................... 169 3D Input Technologies ........................................................................... 170 A Miniature 3D Input Device................................................................. 175 Outlook ................................................................................................. 180 Acknowledgments ................................................................................. 181 References ............................................................................................. 182

SECTION III: DATA ENTRY AND PHYSICAL INTERACTION SYSTEMS Chapter 9

Structured Light Illumination Methods for Continuous Motion Hand and Face-Computer Interaction ................................................... 185 Introduction ........................................................................................... 185 Structured Light Illumination.................................................................. 186


Structured Light Illumination Motion Capture Methods .......................... 187 Augmented Reality 3D Computer Mouse Demonstration ....................... 192 Results ................................................................................................... 194 Conclusion ............................................................................................ 197 References ............................................................................................. 199 Chapter 10 A Method for Designing Physical User Interfaces for Intelligent Production Environments ...................................................................... 201 Abstract ................................................................................................. 201 Introduction ........................................................................................... 202 Materials and Methods .......................................................................... 206 Results and Discussion .......................................................................... 228 Conclusions ........................................................................................... 235 Outlook and Future Work ...................................................................... 237 References ............................................................................................. 239 Chapter 11 A New PC-Based Text Entry System Based on EOG Coding .................. 247 Abstract ................................................................................................. 247 Introduction ........................................................................................... 248 Materials and Methods .......................................................................... 250 Results and Discussion .......................................................................... 257 Conclusion ............................................................................................ 259 References ............................................................................................. 261 Chapter 12 Head-Mounted Displays in Ultrasound Scanning .................................. 265 Introduction ........................................................................................... 265 Purpose of the Study .............................................................................. 268 Results ................................................................................................... 271 Conclusions ........................................................................................... 273 References ............................................................................................. 275 SECTION IV: ASSISTIVE AND COGNITIVE ROLE OF THE HCI TECHNIQUES Chapter 13 Improving Human-Computer Interaction by Developing Culturesensitive Applications based on Common Sense Knowledge ................. 281 Introduction ........................................................................................... 281 Building Up Common Sense Knowledge Bases ...................................... 284


The Project Omcs-BR............................................................................. 287 Designing and Developing Culture-Sensitive Applications ..................... 304 Conclusion and Future Works ................................................................ 317 Acknowledgements ............................................................................... 317 References ............................................................................................. 319 Chapter 14 The Role of Head-Up Display in Computer-Assisted Instruction ........... 325 Introduction ........................................................................................... 325 Hud Technology .................................................................................... 326 Human Factors of HUDs........................................................................ 330 Hud-Based Cai ...................................................................................... 331 Case Study ............................................................................................. 334 Experiment ............................................................................................ 338 Discussion ............................................................................................. 343 Conclusion ............................................................................................ 343 References ............................................................................................. 345 Chapter 15 Elderly People Benefit More from Positive Feedback Based on Their Reactions in the Form of Facial Expressions during Human-Computer Interaction.................................................... 349 Abstract ................................................................................................. 349 Introduction ........................................................................................... 350 Methods ................................................................................................ 352 Results ................................................................................................... 353 Discussion ............................................................................................. 355 Acknowledgements ............................................................................... 356 References ............................................................................................. 357 Chapter 16 How Influential Are Mental Models on Interaction Performance? Exploring the Gap between Users’ and Designers’ Mental Models through a New Quantitative Method .................................................... 361 Abstract ................................................................................................. 361 Introduction ........................................................................................... 362 Literature Review ................................................................................... 363 Methodology ......................................................................................... 368 Results and Discussion .......................................................................... 376 Discussion ............................................................................................. 382 xiii

Conclusion and Future Research ............................................................ 385 Acknowledgments ................................................................................. 386 References ............................................................................................. 387 Index ..................................................................................................... 393


LIST OF CONTRIBUTORS

Carolin Straßmann
Social Psychology: Media and Communication, University Duisburg-Essen, 47057 Duisburg, Germany

Astrid M. Rosenthal-von der Pütten
Individual and Technology, RWTH Aachen University, 52062 Aachen, Germany

Nicole C. Krämer
Social Psychology: Media and Communication, University Duisburg-Essen, 47057 Duisburg, Germany

Zhao Lv
The Key Laboratory of Intelligent Computing & Signal Processing, Ministry of Education of China, Anhui University, Hefei, Anhui Province
The First Aeronautical College of Air-Force, Xinyang, Henan Province, China

Xiao-Pei Wu
The Key Laboratory of Intelligent Computing & Signal Processing, Ministry of Education of China, Anhui University, Hefei, Anhui Province

Mi Li
The First Aeronautical College of Air-Force, Xinyang, Henan Province, China

De-Xiang Zhang
The Key Laboratory of Intelligent Computing & Signal Processing, Ministry of Education of China, Anhui University, Hefei, Anhui Province

Cecilia Clark
Assistive Technology Partners, UCD Auraria Campus, Hub, 1224 5th Street, Denver, CO 80204, USA

Levin Sliker
Assistive Technology Partners, UCD Auraria Campus, Hub, 1224 5th Street, Denver, CO 80204, USA

Jim Sandstrum
Assistive Technology Partners, UCD Auraria Campus, Hub, 1224 5th Street, Denver, CO 80204, USA

Brian Burne
Assistive Technology Partners, UCD Auraria Campus, Admin Building, 1201 5th Street, Suite 240, Denver, CO 80204, USA

Victoria Haggett
Assistive Technology Partners, UCD Auraria Campus, Hub, 1224 5th Street, Denver, CO 80204, USA

Cathy Bodine
Assistive Technology Partners, UCD Auraria Campus, Hub, 1224 5th Street, Denver, CO 80204, USA

Imtiaz Ali Khan
Department of Mechanical Engineering, Aligarh Muslim University, Aligarh, India

Armando Barreto
Florida International University, USA

Yngve Dahl
Telenor Research and Innovation, Norway

Marek Dzwiarek
Central Institute for Labour Protection-National Research Institute, Poland

Anna Łuczak
Central Institute for Labour Protection-National Research Institute, Poland

Kynan Eng
Institute of Neuroinformatics, University of Zurich and ETH Zurich, Switzerland

Charles J. Casey
University of Kentucky, Department of Electrical and Computer Engineering, Lexington, Kentucky, USA

Laurence G. Hassebrook
University of Kentucky, Department of Electrical and Computer Engineering, Lexington, Kentucky, USA

Daniel L. Lau
University of Kentucky, Department of Electrical and Computer Engineering, Lexington, Kentucky, USA

Pierre Taner Kirisci
University of Bremen, Faculty of Production Engineering, Bagdasteiner Str. 1, 28359 Bremen, Germany
Bremer Institut für Produktion und Logistik GmbH (BIBA), University of Bremen, Hochschulring 20, 28359 Bremen, Germany

Klaus-Dieter Thoben
University of Bremen, Faculty of Production Engineering, Bagdasteiner Str. 1, 28359 Bremen, Germany
Bremer Institut für Produktion und Logistik GmbH (BIBA), University of Bremen, Hochschulring 20, 28359 Bremen, Germany

Metin Yildiz
Department of Biomedical Engineering, Baskent University, Ankara, Turkey

Hesna Özbek Ülkütaş
Institute of Science and Engineering, Baskent University, Ankara, Turkey

Juha Havukumpu
University of Helsinki, Department of Psychology, Finland

Jari Takatalo
University of Helsinki, Department of Psychology, Finland

Göte Nyman
University of Helsinki, Department of Psychology, Finland

Jukka Häkkinen
University of Helsinki, Department of Psychology, Finland
Nokia Research Center, Finland

Junia Coutinho Anacleto
Advanced Interaction Lab / Computing Department / Federal University of São Carlos, Brazil

Aparecido Fabiano Pinatti de Carvalho
Advanced Interaction Lab / Computing Department / Federal University of São Carlos, Brazil

Kikuo Asai
National Institute of Multimedia Education, Japan

Stefanie Rukavina
Department of Psychosomatic Medicine and Psychotherapy, Medical Psychology, Ulm University, Ulm, Germany

Sascha Gruss
Department of Psychosomatic Medicine and Psychotherapy, Medical Psychology, Ulm University, Ulm, Germany

Holger Hoffmann
Department of Psychosomatic Medicine and Psychotherapy, Medical Psychology, Ulm University, Ulm, Germany

Harald C. Traue
Department of Psychosomatic Medicine and Psychotherapy, Medical Psychology, Ulm University, Ulm, Germany

Bingjun Xie
Department of Industrial Engineering, Chongqing University, Chongqing 400044, China

Jia Zhou
Department of Industrial Engineering, Chongqing University, Chongqing 400044, China

Huilin Wang
Department of Industrial Engineering, Chongqing University, Chongqing 400044, China

LIST OF ABBREVIATIONS

ALS – Amyotrophic Lateral Sclerosis
AI – Artificial Intelligence
AR – Assistive Robots
AR – Augmented Reality
ANS – Autonomic Nervous System
BVP – Blood volume pulse
BCI – Brain Computer Interface
CAD – Computer-aided design
DOF – Degrees of freedom
EDA – Electrodermal activity
EOG – Electrooculogram
EOG – Electro-oculography
ES – Experimental sequences
FACS – Facial Action Coding System
FACES – Facial Expression Coding System
FOV – Field of View
GSR – Galvanic skin response
GUIs – Graphic user interfaces
HMD – Head-mounted display
HUD – Head-up display
HTA – Hierarchical Task Analysis
HCI – Human-computer interaction
IT – Information technology
IOP – Intraocular pressure
MIT – Massachusetts Institute of Technology
MVCs – Maximum voluntary contractions
MVE – Maximum voluntary excitation
MBUID – Model-based user interface development
NIDRR – National Institute on Disability and Rehabilitation Research
PDA – Personal digital assistant
PMP – Phase Modulation Profilometry
PD – Pupil diameter
PLR – Pupillary Light Reflex
SFFS – Sequential Floating Forward Search
SIR – Socially Interactive Robots
SLI – Structured light illumination
TMMs – Team Mental Models


PREFACE

Observing the evolution of our society through its historical stages, it is obvious that society is continually going through changes aimed at correcting defects, replacing old values with new ones, and thus bringing more quality and functionality. This raises the question "Is it possible to create the perfect system?", whether that system is a social one or, in our case, a computer system and its interaction with humans. The perfect system is impossible to define, and the same is true for defining its users. Millions of different user requests, desires, needs, problems, and limitations await ever more attempts to be satisfied, accomplished, improved, and resolved. Sometimes good solutions stem from years of experience and work, and sometimes they arise by accident, or simply by adding something new. The key to any progress, including progress in improving the interaction between human and computer, is constant testing, experimentation, and the search for new solutions.

Regarding online interaction: due to the overcrowded content of the Web, users face a huge number of different links and buttons, which can be quite frustrating. With millions of items on offer, a user who does not know the exact name of an item can search only among the best-selling and most popular ones, so the value of such a rich offer is lost because it cannot be explored in its completeness. Defining a clear, easy-to-use, intuitive user interface should therefore be a primary goal for every company. By usability we mean five aspects of the interface: satisfaction, efficiency, learnability, memorability, and a low number of errors. The role of user-centered design is to create a cohesive, expected, and desirable effect for the target users. By focusing on users at a deep level, companies make the user experience more thorough. This change in experience is accomplished by gathering emotional responses from users that are tied to their actions and accomplishments. One of the main purposes of UX design is to add context to the natural behavior of users and, by doing so, to provide them with a story that they can take away from the experience. The goal of this edition is to cover different aspects of human-computer interaction, both theoretical and practical.

Section 1 focuses on the development and design of HCI methods, describing the influence of a virtual agent’s (non)cooperative behavior on the user’s cooperation behavior in the Prisoners’ Dilemma, the development of a human-computer interface system using EOG, the development and preliminary investigation of a semiautonomous socially assistive robot, and the ergonomic design of a human-CNC machine interface. Section 2 focuses on real-world applications of HCI techniques, describing non-intrusive physiological monitoring for affective sensing of computer users, modeling human-computer interaction in smart spaces, application prospects of augmented reality technology, and miniaturized human 3D motion input. Section 3 focuses on data entry and physical interaction systems, describing structured light illumination methods for continuous-motion hand- and face-computer interaction, a method for designing physical user interfaces for intelligent production environments, a new PC-based text entry system based on EOG coding, and head-mounted displays in ultrasound scanning. Section 4 focuses on the assistive and cognitive role of HCI techniques, describing improved human-computer interaction through culture-sensitive applications based on common sense knowledge, the role of head-up display in computer-assisted instruction, reactions in the form of facial expressions during human-computer interaction, and the gap between users’ and designers’ mental models.


SECTION I: DEVELOPMENT AND DESIGN OF HCI METHODS

With or Against Each Other? The Influence of a Virtual Agent’s (Non)cooperative Behavior on User’s Cooperation Behavior in the Prisoners’ Dilemma

Carolin Straßmann1, Astrid M. Rosenthal-von der Pütten2, and Nicole C. Krämer1

1 Social Psychology: Media and Communication, University Duisburg-Essen, 47057 Duisburg, Germany

2 Individual and Technology, RWTH Aachen University, 52062 Aachen, Germany

ABSTRACT

Most applications for virtual agents require the user to cooperate. Thus, it is helpful to investigate different strategies for virtual agents to evoke the user’s cooperation. In the present work (N = 80), we experimentally tested the influence of an agent’s (non)cooperative nonverbal behavior and actual decision-making behavior on user’s cooperation in the Prisoners’ Dilemma considering different age groups (students and seniors). Therefore, we used a 2 (nonverbal behavior) x 2 (age group) between-subjects design in a Wizard-of-Oz study. Results show age differences, with seniors cooperating more often than students do. The nonverbal behavior had no effect on the users’ willingness to cooperate nor on the evaluation of the agent’s cooperativeness. However, the agent’s decision-making behavior in the game influenced the users’ willingness to cooperate. In summary, the nonverbal behavior seemed to be too subtle, while the actions of the agent were important in terms of cooperation.

Citation: Carolin Strassmann, Astrid M. Rosenthal-von der Pütten, and Nicole C. Krämer, “With or against Each Other? The Influence of a Virtual Agent’s (Non)cooperative Behavior on User’s Cooperation Behavior in the Prisoners’ Dilemma,” Advances in Human-Computer Interaction, vol. 2018, Article ID 2589542, 7 pages, 2018. https://doi.org/10.1155/2018/2589542.

Copyright: © 2018 Carolin Straßmann et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

INTRODUCTION

The long-term goal of most endeavors regarding virtual agents is to create engaging experiences and interactions that are beneficial for the user. In most scenarios, however, the success of these systems depends on the user’s willingness to cooperate with the virtual agent. In some scenarios this is even more crucial, for instance, with regard to interactions that are essential for the user’s health. Systems supporting elderly users in their daily life, organizing appointments, and reminding them to take their medication rely on users who are willing to share information in order to keep record of all appointments or to guarantee successful medicinal treatment (see [1]). The challenge is to create interactions in which users do not feel patronized by the system but to facilitate cooperation. Interlocutors can show cooperation via different channels: (a) via nonverbal behavior signaling the intent for cooperation, (b) via direct verbal messages, and (c) via their actual cooperative or noncooperative decision-making behavior. Since there is ample evidence that specific nonverbal behavior can lead to higher cooperativeness perception [2–6], the present study focuses on the effects of the agent’s nonverbal behavior. In most situations, it is risky to trust and cooperate with a dishonest person. In order to avoid putting trust into the wrong person, interlocutors use nonverbal behavior as indicators for commitment and cooperation [7]. Nonverbal behavior might function as an indicator for cooperation in such a subtle way that the agent can evoke users’ willingness to cooperate without making them feel forced to cooperate. Boone and Buck [3] claimed that nonverbal expressiveness plays an important role in the cooperation process. The authors further suggest testing this assumption
by integrating nonverbal emotional displays (nonverbal expressiveness) within a social dilemma setting (e.g., Prisoner’s Dilemma). In line with this claim, prior research demonstrated that interlocutors are sensitive to subtle facial expressions like smiling when they evaluate the other person’s cooperativeness [5] and that smiling can evoke more cooperation behavior [6]. It has been shown that people who are more cooperative showed more expressive facial displays (both positive and negative) [8]. These facial expressions are assumed to signal honest intentions to cooperate [2]. Thus, a smiling agent should be perceived to signal cooperativeness, and therefore users are more willing to cooperate with this agent in return. The effect of expressiveness on the perception of cooperativeness is not limited to facial expressions; expressiveness in body movements (large and highly frequent gestures) was also found to elicit higher perceptions of cooperativeness [4]. Moreover, gaze behavior and head-tilts are related to trust, a concept highly correlated with commitment and cooperation. Hence, gazing towards the user while displaying a lateral head-tilt has also been found to evoke a more cooperative impression of the interlocutor [4]. In sum, prior findings suggest that specific nonverbal behavior (e.g., high expressiveness, gaze behavior, and head-tilt) will lead to the attribution of cooperativeness, which recursively fosters the cooperation behavior of the interlocutor. Therefore, we hypothesize the following:

H1: Users interacting with an agent showing cooperative nonverbal behavior cooperate more often with the agent.

Besides the nonverbal behavior of interlocutors, their actual decision-making behavior in the Prisoner’s Dilemma is of course also a crucial indicator for trust and cooperation. Pletzer and colleagues [9] found that people are more willing to cooperate when they expect their counterpart to cooperate. Hence, once the trust between the interlocutors has been destroyed by not being cooperative, the perceived cooperativeness of the counterpart will also decrease. Following this, we assume that if the agent decides not to cooperate, users’ cooperative behavior will drop in return.

H2: The agent’s actual decision-making behavior in the Prisoner’s Dilemma influences users’ willingness to cooperate. After choosing not to cooperate, users’ cooperative decision-making behavior will decrease.

As mentioned above, especially applications for special target groups like seniors (e.g., daily life assistance) require the user to cooperate. However, these target groups have special needs due to their features and possibilities. It has been shown that elderly people are more nervous during an interaction
with a virtual agent [10] and, thus, it is especially important to calm them down and find a subtle way to gain their willingness to cooperate. One potential factor to evoke cooperation behavior might be nonverbal cues, but prior findings suggest that seniors showed a higher tendency to misinterpret emotional displays [11]. Hence, it is possible that seniors might also show a higher tendency to misinterpret nonverbal cooperation indicators. Since nonverbal behavior is assumed to be a good natural way to induce cooperation [3], it is a matter of concern whether this is also the case for elderly people, who would particularly benefit from it. Thus, this research aims to investigate age-related differences and the following research question will be examined: RQ1: Are there age-related differences in the perception of the agent’s cooperativeness and the users’ willingness to cooperate?

MATERIALS AND METHODS

Experimental Design and Experimental Tasks

Depending on the experimental group, the virtual agent in this study showed either cooperative or noncooperative nonverbal behavior prior to and during the Prisoners’ Dilemma game. The study was set up as a Wizard-of-Oz scenario in a 2x2 between-subjects design with the independent factors “age group” (students versus seniors) and “nonverbal behavior” (cooperative versus noncooperative). For the implementation of the two types of nonverbal behaviors, we rely on prior research [4] in which nonverbal behaviors were systematically tested with regard to how users perceived the virtual agent’s expressivity and gaze behavior in terms of cooperativity. For the present study, we adopted these behaviors. Therefore, in the cooperative condition the agent (“Billie”) showed expressive facial and body movements, turned the gaze towards the user, and showed a lateral head-tilt. Since expressiveness is one of the main indicators for cooperation, the agent in the noncooperative condition in contrast barely displayed nonverbal behavior and avoided the user’s gaze more often. Figure 1 presents examples of these behaviors.


Figure 1: Examples of the presented nonverbal behavior smiling, expressive gesture, lateral head-tilt (cooperative nonverbal behavior), and avoiding gaze (noncooperative nonverbal behavior).

To test the impact of (non)cooperative nonverbal behavior we created situations in which cooperation is key to the interaction. Participants played a virtual version of the well-established Prisoners’ Dilemma (cf. Figure 2). This version of the game is money-based. The players played four rounds in which both parties decided whether to place or to hold their money (10 €) without knowing in advance the decision of the other player. The gains in this game depended on the mutual decision to cooperate: when both parties cooperated and placed their money, they both doubled their placement (both gained 20 €). If just one player placed his or her money and the other player decided to hold, the cooperating player lost the placement (0 €) and the other tripled his or her stakes (30 €). When both players decided to hold their stakes, nothing happened and both stakes remained (10 €). Within the four rounds, the virtual agent showed the same decision-making behavior for each participant: first, he cooperates, then he does not cooperate, and in the last two turns, he cooperates again.


Figure 2: Prisoner’s Dilemma.
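
The payoff structure described above can be stated compactly as a small lookup table. The sketch below is purely illustrative; the names PAYOFFS, AGENT_SEQUENCE, and total_payoffs are hypothetical and are not taken from the study materials:

```python
# Hypothetical sketch of the round payoffs described in the text (amounts in euros).
# Keys are (participant_places_money, agent_places_money); values are
# (participant_payoff, agent_payoff) for that round.
PAYOFFS = {
    (True, True): (20, 20),    # both place their 10 EUR stake and double it
    (True, False): (0, 30),    # participant places, agent holds
    (False, True): (30, 0),    # participant holds, agent places
    (False, False): (10, 10),  # both hold; stakes remain unchanged
}

# The agent's fixed decision sequence over the four rounds:
# cooperate, defect, cooperate, cooperate.
AGENT_SEQUENCE = [True, False, True, True]

def total_payoffs(participant_sequence):
    """Sum the payoffs over the four rounds for a given participant strategy."""
    participant_total = agent_total = 0
    for user_move, agent_move in zip(participant_sequence, AGENT_SEQUENCE):
        user_gain, agent_gain = PAYOFFS[(user_move, agent_move)]
        participant_total += user_gain
        agent_total += agent_gain
    return participant_total, agent_total

# Example: a participant who cooperates in every round.
print(total_payoffs([True, True, True, True]))  # (60, 90)
```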

Before participants played the Prisoner’s Dilemma, they had the chance to get to know the agent. Participants interacted with the virtual agent in a calendar setting, where they managed appointments and decided whether to follow suggestions of the virtual agent. In particular, participants planned three appointments together with the virtual agent Billie and entered the appointments into the virtual calendar by stating activity as well as date, starting time, and duration of the appointment. During this interaction, the agent already displayed the nonverbal behavior (either cooperative or noncooperative), so that participants also had the chance to perceive the manipulated cooperativity of the agent during this joint task and prior to the dilemma task.

Participants and Procedure

Overall, 80 people participated in this lab experiment, 40 of whom were students (age: M = 22.03, SD = 2.61; 24 female and 16 male) and 40 seniors (age: M = 68.15, SD = 5.59; 22 female and 18 male). Students received either course credits or 10 € and seniors received 10 € for participation. Upon arrival, participants read and signed an informed consent form. Afterwards, they first interacted with the agent in the calendar setting, negotiating appointments, and filled in a short questionnaire evaluating their person perception of the agent. Subsequently, they were instructed about the mechanisms of the Prisoners’ Dilemma and then played four rounds of the game. After completion of the game, participants filled in questionnaires
evaluating the agent and its abilities, the interaction, and the experimental tasks as well as scales assessing the explanatory variables.

Measurements

Cooperation Behavior

We analyzed users’ behavior within the Prisoners’ Dilemma. Participants could choose to cooperate or not during each of the four turns. We also calculated the sum score for all four decisions.

Person Perception

Participants were asked to evaluate the agent with regard to person perception on five dimensions (Likability, Intelligence, Cooperativity, Dominance, and Autonomy), indicating their agreement with 35 items on a 5-point Likert scale ranging from “I do not agree at all” to “I fully agree”. The scales are based on a person perception scale previously used in human-agent/robot interaction studies [12]. Participants stated their evaluation of the agent’s person perception twice: after the calendar interaction (T1) and after the Prisoners’ Dilemma (T2). The dimension Autonomy was measured only once, at T2. The dimension Likability was measured with eight items (e.g., friendly, likable; Cronbach’s α T1 = .911; T2 = .882). Intelligence was measured using five items (e.g., dumb (rev), intelligent; Cronbach’s α T1 = .820; T2 = .845). Cooperativity was measured using seven items (e.g., ask Billie for advice, is cooperative, is supportive; Cronbach’s α T1 = .891; T2 = .859). Autonomy was measured using six items (e.g., not autonomous (rev), self-dependent; Cronbach’s α T2 = .773). Dominance was measured using nine items (e.g., dominant, submissive (rev)). Reliability was too low for this subscale and did not improve when using a subset of the items. Therefore, we decided to use only the single item dominant for further analyses.
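
The internal consistencies reported throughout this section are Cronbach’s alpha values. As a reference, the coefficient can be computed from a respondents-by-items matrix as the ratio of inter-item covariance to total scale variance. The following sketch is illustrative only; the data and variable names are hypothetical and are not taken from the study:

```python
import numpy as np

def cronbach_alpha(item_scores) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert ratings."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                           # number of items in the scale
    sum_item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)    # variance of the scale totals
    return (k / (k - 1)) * (1.0 - sum_item_var / total_var)

# Illustrative example: five respondents rating a hypothetical eight-item scale
# on a 5-point Likert scale (1 = "I do not agree at all", 5 = "I fully agree").
ratings = np.array([
    [4, 5, 4, 4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5, 5, 5, 4],
    [2, 3, 2, 3, 2, 2, 3, 2],
    [4, 4, 5, 4, 4, 5, 4, 4],
])
print(round(cronbach_alpha(ratings), 3))
```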

Communicative Abilities

We assessed how participants rated the agent’s verbal and nonverbal communicative abilities with regard to mutual verbal understanding, dominance in conversation, nonverbal behavior production, and nonverbal behavior recognition/understanding. Mutual verbal understanding was measured with nine items (e.g., Billie understood me well, did not hear
me, did not understand me; Cronbach’s α =.757; the item “the conversation with Billie was stiff” was omitted to enhance reliability). Dominance in conversation was measured using five items (e.g., I was able to control the conversation (rev), Billie was leading the conversation; Cronbach’s α =.744). Nonverbal behavior production was measured with five items (e.g., I had the feeling that Billie’s gestures were expressive, unambiguous; Cronbach’s α =.832). Nonverbal behavior understanding was measured with six items (e.g., I had the feeling that Billie was able to recognize my gestures, was able to recognize my mimic; Cronbach’s α =.888).

Physical Presence and Social Presence

We assessed participants’ sense of copresence with the Nowak and Biocca Presence Scale [13], which contains 12 items on the concept of “perceived other’s copresence” (e.g., my interaction partner was intensely involved in our interaction; Cronbach’s α = .784) and 6 items on “self-reported copresence” (e.g., I was unwilling to share personal information with my interaction partner; Cronbach’s α = .685), both rated on a 5-point Likert scale. Furthermore, we used Bailenson et al.’s Social Presence Scale [14] with 5 items rated on a 5-point Likert scale (e.g., I perceived the presence of another person in the room with me; Cronbach’s α = .699). Since the reliability of the Social Presence Scale and the subscale “self-reported copresence” was low, we excluded these measures from further analyses.

Perception of the Game

We asked participants about their perceptions of Billie and the Prisoners’ Dilemma with fifteen ad hoc generated items. These items asked participants, for example, about the difficulty of the game (e.g., the game was easy to master), the perceived cooperation in the game (e.g., Billie and I were partners during the game), the trust towards the agent (e.g., Billie gave me the feeling that he knows better and I am better off not to trust him), or felt competitiveness (e.g., during the game I was mostly concerned with how I could maximize my own gains). All items were rated on a 5-point Likert scale ranging from “I do not agree at all” to “I fully agree”.

Future Usage of Agent System

Participants also indicated how likely they would be to use the virtual agent in the future (e.g., I can imagine interacting with Billie more often, Billie could also help me with other tasks in my everyday life; Cronbach’s α = .955).


Moderating Variables

As moderating variables, we used the short version of the NEO-FFI Scale [15], the cooperation subscale of the Temperament and Personality Questionnaire [16], and the nonverbal immediacy scale [17].

RESULTS

Decision-Making Behavior

A mixed-design repeated measures ANOVA was calculated with the agent’s nonverbal behavior and age group as between-subjects factors and users’ decision-making behavior during the game (the user’s decision to cooperate or not within the four rounds) as the repeated measure. A Greenhouse-Geisser correction (ε = .78) was used, since the assumption of sphericity had been violated. No main effect of the presented nonverbal behavior emerged, while there was a significant main effect for the age group (F(1,76) = 9.438, p = .003, η2 = .110) and a linear effect of the cooperation behavior itself (F(2.35,178.63) = 9.852, p
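
Assuming the reported η2 is partial eta squared (the degrees of freedom reported above are consistent with this), the effect size can be recovered directly from the F statistic, which offers a quick plausibility check on the reported values. The sketch below is illustrative only and is not the authors’ analysis code:

```python
def partial_eta_squared(f_value: float, df_effect: float, df_error: float) -> float:
    """Partial eta squared recovered from an F statistic and its degrees of freedom."""
    return (f_value * df_effect) / (f_value * df_effect + df_error)

# Age-group main effect reported above: F(1, 76) = 9.438, eta_p^2 = .110
print(round(partial_eta_squared(9.438, 1, 76), 3))  # prints 0.11
```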