Touch-Based Human-Machine Interaction: Principles and Applications

Shuo Gao • Shuo Yan • Hang Zhao • Arokia Nathan
Shuo Gao, School of Instrumentation and Optoelectronic Engineering, Beihang University, Beijing, China
Shuo Yan, School of New Media Art and Design, State Key Laboratory of Virtual Reality Technology and Systems, Beihang University, Beijing, China
Hang Zhao, Institute for Interdisciplinary Information Sciences, Tsinghua University, Beijing, China
Arokia Nathan, Darwin College, University of Cambridge, Cambridge, UK
ISBN 978-3-030-68947-6
ISBN 978-3-030-68948-3 (eBook)
https://doi.org/10.1007/978-3-030-68948-3
© Springer Nature Switzerland AG 2021

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland.
Contents

1 Ambient Touch Interactivities
  References
2 Properties of Touch Events
  2.1 Finger's Physical Properties
    2.1.1 Physical Characteristics of Human Fingers and Touch Events
    2.1.2 Characterization of Touch Events
  2.2 User's Touch Behavior
    2.2.1 Dynamic Force Touch Behavior
    2.2.2 Static Force Touch Behavior
  References
3 Touch Detection Technologies
  3.1 Contact-Based Touch Detection Techniques
    3.1.1 Resistive-Based Techniques
    3.1.2 Capacitive-Based Techniques
    3.1.3 Acoustic-Based Techniques
    3.1.4 Optical-Based Techniques
    3.1.5 Piezoelectric-Based Techniques
    3.1.6 Piezoresistive-Based Techniques
    3.1.7 Pyroelectric-Based Techniques
    3.1.8 Triboelectric-Based Techniques
  3.2 Non-contact Techniques
    3.2.1 Camera-Based Techniques
    3.2.2 Inertial Motion Unit-Based Techniques
    3.2.3 Electromyogram-Based Detection Techniques
    3.2.4 Electrical Capacitance Tomography-Based Techniques
    3.2.5 Electrical Impedance Tomography-Based Techniques
  References
4 Haptic Feedback
  4.1 A Brief Introduction of Feedback Technology
    4.1.1 Feedback Technology in HMI Systems
    4.1.2 Limitations of Conventional Visual and Auditory Feedback
  4.2 Haptic Feedback Technology
    4.2.1 Vibration Technology
    4.2.2 Bioelectrical Stimulation Technique
  References
5 Performance Optimization
  5.1 Capacitive TSP for 2D Detection
    5.1.1 Noise Reduction
    5.1.2 Noise Source
    5.1.3 Solution for Noise Reduction
    5.1.4 Techniques for Fast Readout Speed and Low Power Consumption
    5.1.5 Time and Frequency Domain
    5.1.6 Spatial Domain
    5.1.7 Conclusion
  5.2 Piezoelectric TSP for 3D Force Touch Detection
    5.2.1 Touch Event-Related Instable Responsivity Issues
    5.2.2 Touch Panel's Mechanical Property-Induced Low Force Detection Accuracy
    5.2.3 Conclusion
  References
6 User Experience Evaluation
  6.1 The Definition of User Experience
    6.1.1 The Importance of User Experience Evaluation
  6.2 The Development of User Experience Evaluation
    6.2.1 From Behavior to Perception
    6.2.2 Immersive Assessment
    6.2.3 Studies for Special Design
  6.3 Evaluation Methods
    6.3.1 Usability Test
    6.3.2 Behavioral Observation
    6.3.3 Behavioral Measures
    6.3.4 Physiological Signals
    6.3.5 Affective Index
    6.3.6 Survey and Questionnaires
  References
7 Emerging Applications
  7.1 Interactivity for Flexible Displays
    7.1.1 Materials for Flexible Touch Panels
    7.1.2 Flexible Touch-Sensing Techniques
    7.1.3 Commercial Products Used in Interactive Displays
  7.2 Usage in Extreme Conditions
    7.2.1 Water
    7.2.2 Vibration
    7.2.3 Extreme Temperatures
    7.2.4 Sunlight
  7.3 Interactivity with Virtual Reality
    7.3.1 The System Composition of VR
    7.3.2 Applications of VR Devices
  7.4 Touch and Speech Combined Interactivity
    7.4.1 Characteristics of Touch and Speech Interactions
    7.4.2 Enabling Accurate and Efficient Input
    7.4.3 Accommodating a Wide Range of Users and More Conditions
    7.4.4 New Applications
  7.5 Emotion Detection
  7.6 Big-Data-Enabled Cyber Security
  References
8 Conclusion
Index
Chapter 1
Ambient Touch Interactivities
Touch interactivity refers to interaction between humans and machines through touch (as conceptually shown in Fig. 1.1). Although touch interactivities are already used intensively, there is no strict definition of the term "touch interactivity." In this book, to keep the concept clear and concise for readers, we define touch interactivities as interactions between humans and electronic systems conducted with the user's hand, including both contact and non-contact touch events. For example, a finger contacting a smartphone's panel surface is deemed a contact touch interactivity, whereas gesture-based interactivities for virtual reality (VR) are treated as non-contact touch interactivities. Note that keystrokes on keyboards and mouse clicks are not considered touch interactivities here.

The history of touch interactivity can be traced to the last century: E. Johnson at the Royal Radar Establishment (UK) constructed the first touch panel, for air traffic control, in 1963 [1, 2]. In 1967, a detailed article [3] describing the capacitive-based technique designed for this touch panel was published, revealing that copper wires were placed in front of the cathode-ray tube (CRT) display and that an AC bridge circuit was used to interpret the touch-induced capacitance change. The copper-wire touch panel could register only a single touch at a time, yet this capacitive touch-sensing technique remained in use for air traffic control until 1995.

In 1973, F. Beck and B. Stumpe, working at the European Organization for Nuclear Research (CERN), presented the first true capacitive touch panel [4], essentially the same type we use today. In their design, an array of transparent capacitive sensors was placed underneath the screen (conceptually depicted in Fig. 1.2a). When a finger touches a specific location, the dielectric of the corresponding capacitor is altered, and the change is sensed by the system [5].
Fig. 1.1 Conceptual description of (a) inputting and (b) retrieving information through touch-enabled human–machine interactivities
Fig. 1.2 Structures of (a) capacitive-, (b) resistive-, and (c) IR-based touch panels
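To make this capacitive sensing principle concrete, the sketch below shows one minimal way firmware might scan such a sensor array and declare a touch from per-cell capacitance changes. It is our illustration, not the Beck–Stumpe design: the array size, the threshold value, and the data source are assumptions, and a real analog front end (e.g., an AC bridge as in Johnson's panel) would supply the measured values.

```c
#include <stdio.h>
#include <stdbool.h>

#define ROWS 8
#define COLS 8
#define TOUCH_THRESHOLD_PF 0.5f  /* assumed trip point; tuned per panel */

/* Baseline (untouched) and currently measured capacitances, in pF.
 * In a real controller these come from the analog front end. */
static float baseline[ROWS][COLS];
static float measured[ROWS][COLS];

/* Scan the array and report the cell with the largest touch-induced
 * capacitance change; declare a touch only above the threshold. */
static bool detect_touch(int *row, int *col)
{
    float max_delta = 0.0f;
    for (int r = 0; r < ROWS; r++) {
        for (int c = 0; c < COLS; c++) {
            /* A fingertip near a cell alters the local dielectric,
             * shifting that cell's capacitance away from baseline. */
            float delta = measured[r][c] - baseline[r][c];
            if (delta > max_delta) {
                max_delta = delta;
                *row = r;
                *col = c;
            }
        }
    }
    return max_delta > TOUCH_THRESHOLD_PF;
}

int main(void)
{
    measured[3][5] = baseline[3][5] + 1.2f;  /* simulate a touch */
    int r, c;
    if (detect_touch(&r, &c))
        printf("touch at cell (%d, %d)\n", r, c);
    return 0;
}
```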
Also in 1973, S. Hurst of the University of Kentucky developed the first resistive-based touch panel [6], which was selected as one of the 100 most significant technical products of that year in the USA. The resistive architecture employed two transparent, electrically resistive layers separated by spacer dots and connected to conductive bars on the horizontal (x-axis) and vertical (y-axis) sides. A voltage applied to one layer can be sensed by the other layer, and vice versa. When the user touches the screen, the two layers connect at the touch point and act as a voltage divider, from which the touch location is calculated (conceptually shown in Fig. 1.2b). This resistive touch-sensing technique is still used today in low-cost end terminals.
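In circuit terms, the readout reduces to a ratio: with one layer driven across its bars, the voltage tapped by the other layer divides in proportion to the touch position along the driven axis. The sketch below is a minimal illustration, assuming a 4-wire panel, a 12-bit ADC whose full scale corresponds to the drive voltage, and hypothetical panel dimensions; it is not taken from Hurst's patent [6].

```c
#include <stdio.h>

#define ADC_MAX 4095.0f   /* 12-bit ADC; assumed resolution */
#define PANEL_W 320.0f    /* panel width in pixels; illustrative */
#define PANEL_H 480.0f    /* panel height in pixels; illustrative */

/* In a 4-wire resistive panel, driving one layer across its conductive
 * bars turns the touch point into a voltage divider: the voltage picked
 * up by the opposite layer is proportional to the touch position along
 * the driven axis. */
static float position_from_adc(unsigned adc_reading, float axis_length)
{
    return (adc_reading / ADC_MAX) * axis_length;
}

int main(void)
{
    /* Hypothetical ADC samples: x-layer driven first, then y-layer. */
    unsigned adc_x = 2048, adc_y = 1024;
    printf("touch at (%.1f, %.1f)\n",
           position_from_adc(adc_x, PANEL_W),
           position_from_adc(adc_y, PANEL_H));
    return 0;
}
```

Practical controllers add filtering and alternate the drive and sense roles between the two layers, but the core position calculation is essentially this ratio.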
During the same period, the first optical touch panel, consisting of arrays of light-emitting diodes (LEDs) and phototransistors arranged at the edges of the panel as depicted in Fig. 1.2c, was developed by Ebeling and his colleagues at the University of Illinois [7]. When the user's finger touches the screen to select an object, the original light path is blocked, thereby revealing the touch location [7]. A similar technique was commercially employed in Hewlett-Packard's HP-150 computer in 1983 [8].

The first multi-touch-supported touch panel system was presented by the Input Research Group at the University of Toronto in 1982 and used touch images taken by a camera integrated behind the touch panel surface. The same group later developed the first capacitive-based multi-touch-supported tablet system in collaboration with B. Buxton [9].

In the more than 20 years of development following Eric Johnson's first screen, fruitful research results were demonstrated, and several commercialized products emerged on the market. However, for several years, customers were not satisfied with the accuracy of touch-based location detection; at that time, on-screen objects had to be designed larger than the average finger to avoid location detection errors, limiting the successful use of touch panels in consumer electronic products. This situation persisted until 1988, when a "lift-off strategy" algorithm was proposed by the Human–Computer Interaction Lab at the University of Maryland: a feedback scheme tells users which object is selected while they are touching, allowing them to adjust their finger's position onto the desired object before lifting the finger from the screen [10].
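The lift-off strategy can be read as a small state machine: while the finger is down, the object under it is merely highlighted as feedback, and the selection is committed only at the moment the finger leaves the screen. The following sketch is our own rendering of that idea, not code from [10]; the hit-test stub and every name in it are hypothetical.

```c
#include <stdio.h>

#define NO_OBJECT (-1)

/* State for "lift-off" selection: the highlight follows the finger
 * while it is down; the selection is committed only when it lifts. */
typedef struct {
    int highlighted;  /* object under the finger, or NO_OBJECT */
} LiftOffState;

/* Hypothetical hit-test mapping a touch coordinate to an object id;
 * here each 50-pixel row of the screen is one object. */
static int object_at(float x, float y)
{
    (void)x;
    return (int)(y / 50.0f);
}

/* Called for every touch-down/move sample: update the visible
 * highlight so the user can slide onto the intended object. */
static void on_touch_move(LiftOffState *s, float x, float y)
{
    s->highlighted = object_at(x, y);
    printf("highlighting object %d\n", s->highlighted);
}

/* Called on lift-off: only now is the selection committed. */
static int on_touch_up(LiftOffState *s)
{
    int selected = s->highlighted;
    s->highlighted = NO_OBJECT;
    return selected;
}

int main(void)
{
    LiftOffState s = { NO_OBJECT };
    on_touch_move(&s, 10.0f, 140.0f);  /* finger lands near object 2 */
    on_touch_move(&s, 10.0f, 160.0f);  /* user corrects to object 3 */
    printf("selected object %d\n", on_touch_up(&s));
    return 0;
}
```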
After the touch location detection accuracy was enhanced, touch panels were integrated into smartphones. In 1994, the first touch panel-based smartphone, the IBM Simon Personal Communicator, was released by International Business Machines Corporation (IBM). Later, leading technology companies such as HP, Apple, and Microsoft released flagship touch-supported smart personal digital assistants (PDAs). Notably, the iPhone, released by Apple in 2007, adopted a capacitive touch-sensing technique and was deemed a revolutionary product that changed the manner of human–machine interactivity.

To date, touch interactivities have been widely accepted by customers. Touch panels have been installed in consumer, industrial, medical, vehicular, and aerospace electronics, and diverse touch-sensing techniques allow customers to choose appropriate products for specific applications. The market for touch-enabled interactive displays reached 43.2 billion US dollars in 2019 [11], and many more units are expected to ship with the fast development of the Internet of Things (IoT) and smart city-associated scenarios. Meanwhile, touch detection is not limited to two-dimensional position identification; three-dimensional force sensing has also been realized through capacitive-, piezoresistive-, and piezoelectric-based architectures [12–14]. Furthermore, non-contact touch interactivities, e.g., gesture sensing, long depicted in science fiction, are now entering people's daily lives, mainly with the help of optical technology, thus advancing the popularization of VR and motion-sensing applications [15–17].

After 60 years of development, touch provides irreplaceable functionality for human–machine interfaces (HMIs), and touch interactivity is attracting global attention from both academia and industry. The primary means of learning touch interactivity-related techniques is reading journal articles, conference papers, and patents. However, touch interactivity is a multidisciplinary area covering domains as different as materials science, electrical engineering, and computer science, and reading research publications can be inconvenient for researchers and engineers (especially beginners) interested in the area. Hence, a systematic study of this fast-developing field is highly desirable, and the present book was composed against this background.

The chapters of this book are arranged as follows. First, the biological characteristics of human fingers and user touch behaviors are overviewed in Chap. 2. In Chap. 3, the working principles of mainstream contact and non-contact touch detection techniques, together with their corresponding applications and recent research advances, are detailed. Touch-related haptic techniques are briefly covered in Chap. 4 to give readers an overall picture of how touch functions work with haptic feedback to provide a good user experience. Performance optimization of touch interactivity-supported systems and user experience evaluation methodologies are then provided in Chaps. 5 and 6, respectively. Finally, Chap. 7 introduces touch interactivities associated with emerging applications, which are expected to provide novel and advanced user experiences along with greater human–machine interactive efficiency.
References

1. A. Nathan, S. Gao, Interactive displays: the next omnipresent technology [point of view]. Proc. IEEE 104(8), 1503–1507 (2016)
2. E.A. Johnson, Touch display—a novel input/output device for computers. Electron. Lett. 1(8), 219–220 (1965)
3. E.A. Johnson, Touch displays: a programmed man-machine interface. Ergonomics 10(2), 271–277 (1967)
4. J.F. Lowe, Computer creates custom control panel. Design News 29(22), 54–55 (1974)
5. CERN Document Server, Another of CERN's many inventions! https://cds.cern.ch/record/1248908
6. G.S. Hurst, J.W.C. Colwell, Discriminating contact sensor. U.S. Patent No. 3,911,215, 7 Oct 1975
7. F. Ebeling, R. Johnson, R. Goldhor, Infrared light beam xy position encoder for display devices. U.S. Patent No. 3,775,560, 27 Nov 1973
8. Hewlett-Packard Development Company, L.P., HP-150 Touchscreen Personal Computer with HP 9121 Dual Drives (1983). http://www.hp.com/hpinfo/abouthp/histnfacts/museum/personalsystems/0031/
9. AMT Lab @ CMU, Multi-Touch Technology and the Museum: An Introduction (2015). https://amt-lab.org/blog/2015/10/multi-touch-technology-and-the-museum-an-introduction
10. R.L. Potter, L.J. Weldon, B. Shneiderman, Improving the accuracy of touch screens: an experimental evaluation of three strategies, in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM, New York, NY, 1988), pp. 27–32
11. Orion Market Reports, Global Touch Screen Display Market Analysis, Trends, and Forecasts 2019–2025 (2019)
12. M.P. Coulson, C.J. Brown, D. Slamkul, Capacitive touch panel with force sensing. U.S. Patent No. 9,182,859, 10 Nov 2015
13. S. Gao, X. Wu, H. Ma, J. Robertson, A. Nathan, Ultrathin multifunctional graphene-PVDF layers for multidimensional touch interactivity for flexible displays. ACS Appl. Mater. Interfaces 9(22), 18410–18416 (2017)
14. S. Yue, W.A. Moussa, A piezoresistive tactile sensor array for touchscreen panels. IEEE Sensors J. 18(4), 1685–1693 (2017)
15. P. Kyriakou, S. Hermon, Can I touch this? Using natural interaction in a museum augmented reality system. Dig. Appl. Archaeol. Cult. Heritage 12, e00088 (2019)
16. I. Ahmed, V. Harjunen, G. Jacucci, E. Hoggan, N. Ravaja, M.M. Spapé, Reach out and touch me: effects of four distinct haptic technologies on affective touch in virtual reality, in Proceedings of the 18th ACM International Conference on Multimodal Interaction (ACM, New York, NY, 2016), pp. 341–348
17. A.U. Batmaz, A.K. Mutasim, M. Malekmakan, E. Sadr, W. Stuerzlinger, Touch the wall: comparison of virtual and augmented reality with conventional 2D screen eye-hand coordination training systems, in 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (IEEE, Piscataway, NJ, 2020), pp. 184–193
Chapter 2
Properties of Touch Events
2.1 Finger's Physical Properties

2.1.1 Physical Characteristics of Human Fingers and Touch Events
The user's finger-related parameters include finger width and length, the shapes of the fingertip [1, 2] and finger pulp, and impedance. These parameters are important for touch interactive systems [3–6], both to distinguish a touch event from the variety of potentially detected signals (e.g., cheek contact during a phone call) and to achieve high touch location registration accuracy [7–10].

Finger width and length were the first parameters studied, as they determine the general shape of a finger. In [11], the finger geometries of 14 children and 14 adults were measured, and the results are given in Table 2.1. The average finger geometry of children is around 70% of that of adults, indicating that the threshold for registering a touch event needs to be designed carefully for users of diverse age ranges.

Besides finger length and width, the fingertip is a crucial factor, as in most finger touch events the fingertip contacts the panel surface directly [12–16]. In [17], researchers studied the fingertip characteristics of ten people. The experimental results in terms of fingertip width and length are shown in Table 2.2, indicating that the algorithms for calculating the touch centers of distinct fingers must differ.

In addition to fingertip width and length, the tip curvature is important, as it is widely used in the development of sub-pixel algorithms [18]. For example, in capacitive touch panels, the curvature-related capacitance distribution is used to estimate the tip center, which is taken as the touch location. In [18], researchers concluded that the best detection accuracy is achieved when the curvatures of the fingertips are modeled as Gaussian or elliptical curves.

The impedance of a human finger is not a necessary consideration for most touch-based techniques, such as resistive, optical, and acoustic techniques, but it is of great importance for capacitive-based techniques.
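As a concrete illustration of the curvature-based sub-pixel estimation mentioned above, a common generic approach is to take the capacitance-weighted centroid of neighboring electrode responses, which lands between electrodes when the fingertip profile is smooth (e.g., approximately Gaussian). The sketch below is such a generic centroid estimator, not the specific algorithm of [18]; the electrode count, pitch, and sample values are assumptions.

```c
#include <stdio.h>

#define COLS 8
#define PITCH_MM 4.0f   /* electrode pitch; illustrative */

/* Estimate the touch center along one axis as the capacitance-weighted
 * centroid of the electrode responses.  For a fingertip whose profile
 * is roughly Gaussian or elliptical, the estimate falls between
 * electrodes, i.e., at sub-pixel (sub-electrode) resolution. */
static float centroid_mm(const float delta[COLS])
{
    float num = 0.0f, den = 0.0f;
    for (int c = 0; c < COLS; c++) {
        num += delta[c] * c * PITCH_MM;
        den += delta[c];
    }
    return (den > 0.0f) ? num / den : -1.0f;  /* -1: no signal */
}

int main(void)
{
    /* Hypothetical per-electrode capacitance changes (pF), peaked
     * between electrodes 3 and 4. */
    float delta[COLS] = {0.0f, 0.1f, 0.6f, 1.4f, 1.2f, 0.4f, 0.1f, 0.0f};
    printf("estimated center: %.2f mm\n", centroid_mm(delta));
    return 0;
}
```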
Table 2.1 Children and adults' finger geometries (data are from [11])

Measurement                 Children (Mean, SD)   Adults (Mean, SD)
Shoulder breadth (cm)       31.0, 2.2             42.5, 2.3
Arm length (cm)             50.8, 4.7             71.3, 4.5
Hand length (cm)            13.6, 1.0             18.8, 1.1
Hand width (cm)             7.5, 0.6              10.0, 0.9
Index finger length (cm)    4.8, 0.1              7.2, 0.5
Index finger width (mm)     14.6, 1.1             20.3, 2.4
Index finger mass (g)       10.1, 1.9             28.0, 7.8