Advances in Experimental Medicine and Biology 1170
João Silva Sequeira Editor
Robotics in Healthcare: Field Examples and Challenges
Advances in Experimental Medicine and Biology Volume 1170
Editorial Board:
Wim E. Crusio, CNRS and University of Bordeaux UMR 5287, Institut de Neurosciences Cognitives et Intégratives d’Aquitaine, Pessac Cedex, France
John D. Lambris, University of Pennsylvania, Philadelphia, PA, USA
Heinfried H. Radeke, Clinic of the Goethe University Frankfurt Main, Institute of Pharmacology & Toxicology, Frankfurt am Main, Germany
Nima Rezaei, Research Center for Immunodeficiencies, Children’s Medical Center, Tehran University of Medical Sciences, Tehran, Iran
Advances in Experimental Medicine and Biology provides a platform for scientific contributions in the main disciplines of biomedicine and the life sciences. This book series publishes thematic volumes on contemporary research in the areas of microbiology, immunology, neurosciences, biochemistry, biomedical engineering, genetics, physiology, and cancer research. Covering emerging topics and techniques in basic and clinical science, it brings together clinicians and researchers from various fields. Advances in Experimental Medicine and Biology has been publishing exceptional works in the field for over 40 years, and is indexed in SCOPUS, Medline (PubMed), Journal Citation Reports/Science Edition, Science Citation Index Expanded (SciSearch, Web of Science), EMBASE, BIOSIS, Reaxys, EMBiology, the Chemical Abstracts Service (CAS), and Pathway Studio. 2018 Impact Factor: 2.126.
More information about this series at http://www.springer.com/series/5584
Editor João Silva Sequeira Institute for Systems and Robotics Instituto Superior Técnico Lisbon, Portugal
ISSN 0065-2598 ISSN 2214-8019 (electronic) Advances in Experimental Medicine and Biology ISBN 978-3-030-24229-9 ISBN 978-3-030-24230-5 (eBook) https://doi.org/10.1007/978-3-030-24230-5 © Springer Nature Switzerland AG 2019 This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations. This Springer imprint is published by the registered company Springer Nature Switzerland AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface
This book is an effort to present a collection of perspectives by researchers involved in the exciting area that mingles robotics and healthcare. The growing number of technologies available to the world of robotics is bringing robots closer to people. The number of active R&D projects in the field is a clear indicator of the interest from the scientific communities. Most notably, healthcare is becoming a natural domain in which to apply the new ideas originating from robotics. As world demographics put pressure on natural resources, science in general, and robotics in particular, are assuming the responsibility of contributing to the well-being of societies, and healthcare emerges as the application domain. The works selected are by no means exhaustive. Instead, the goal was to identify a mixture of scientific, technical, and social challenges in the overlap of healthcare and robotics. The selection ranges from systems modeling and control, motion strategies, and behavioral modeling to the expectations in field trials of proven robots. An emphasis on field trials illustrates the underlying desire of the scientific community to bring robotics closer to people. The text is organized so as to start with an overview of rehabilitation robotics and a taxonomy for the field, followed by presentations of robots and related systems in several healthcare areas, together with challenges and reflections on their use. In Chap. 1, Esyin Chew and David Turner provide an overview of rehabilitation robotics, namely in the case of stroke, and discuss how the explosion of work in the field of robotics for healthcare is preventing an organized growth. Such a noisy landscape creates difficulties and may be a source of confusion not only for healthcare professionals, when trying to build a global picture of existing technologies, but also for scientists and engineers working in the field, when defining research directions.
Chew and Turner argue that additional attention, by robotics engineers and computer scientists, is needed to the multidisciplinary nature of healthcare robotics if the goal is to enhance human well-being. In Chap. 2, Rui Moreira, Joana Alves, Ana Matias, and Cristina Santos describe a smart walker device for locomotion rehabilitation. Pathologies such as degenerative diseases, trauma injuries, and musculoskeletal deformities often result in imbalance and lack of limb coordination and hence reduced mobility. Focusing on ataxia, a condition resulting from a neurological disorder, Moreira et al. detail construction and usability aspects, and clinical trials at a hospital, of a walker device capable of gait assistance adapted to the user's characteristics. Chapter 3 visits the field of exoskeletons with a strong rehabilitation flavor. Massimo Totaro, Christian Di Natali, Irene Bernardeschi, and Lucia Beccai review the main devices currently available for rehabilitation applications and focus on sensing principles for soft exoskeletons. A lighter and less bulky alternative to rigid exoskeletons, soft exoskeletons present challenges in both sensing and actuation. In the sensing domain, coping with the diversity of rehabilitation scenarios, while providing reliable data that can be used to close the loop between user and exoskeleton, and while maintaining wearability and mechanical robustness at acceptable levels, is a major challenge currently being addressed by a large research community. In Chap. 4, Yasmeen Abu-Kheil, Omar Al Trad, Lakmal Seneviratne, and Jorge Dias explore the simulation of mobility principles for an active endoscopic (robot) capsule using immersive vision techniques. Chapter 5 is devoted to the use of robots as complementary therapies, namely in aging-related diseases. Ana Nunes Barata reflects on social robots in homecare therapies, namely using the Paro robot, from the perspective of a healthcare professional. The literature on the Paro robot and its applications is extensive and reports positive results. However, the homecare context tends to be significantly different from the institutional context, e.g., with additional biases such as family support ranging from extreme to null. The challenge of adapting complementary therapies to socioeconomic conditions is likely to require extensive field trials, where the natural feedback process can also contribute to novel robotic devices. Finally, in Chap.
6, João Silva Sequeira describes ongoing work, building on the FP7 European project Monarch, on developing a social robot for edutainment activities in the pediatric ward of an oncological hospital. The blossoming of social robotics in recent times is bringing expectations and fears about the role robots will play in the near future in human societies. This case study focuses on the available technologies and usage constraints, as imposed by privacy regulations, discussing field situations experienced and lessons learned over the period since the Monarch robot was deployed at the hospital. An underlying conclusion of the study is that raising people's awareness of social robots and their capabilities continues to be a much-needed effort. Lisbon, Portugal April 2019
João Silva Sequeira
Acknowledgments
The editor is deeply grateful to all the authors for their contributions. At Springer Nature, the support of Inês Alves and Ilse Kooijman was instrumental. Inês Alves started this challenge of putting together contributions from people working in the overlap of robotics and healthcare. Ilse Kooijman provided all the necessary assistance at the final stages.
Contents
1 Can a Robot Bring Your Life Back? A Systematic Review for Robotics in Rehabilitation........................................... 1 Esyin Chew and David A. Turner 2 Smart and Assistive Walker – ASBGo: Rehabilitation Robotics: A Smart–Walker to Assist Ataxic Patients.................... 37 Rui Moreira, Joana Alves, Ana Matias, and Cristina Santos 3 Mechanical Sensing for Lower Limb Soft Exoskeletons: Recent Progress and Challenges..................................................... 69 Massimo Totaro, Christian Di Natali, Irene Bernardeschi, Jesus Ortiz, and Lucia Beccai 4 A Proposed Clinical Evaluation of a Simulation Environment for Magnetically-Driven Active Endoscopic Capsules........................................................................ 87 Yasmeen Abu-Kheil, Omar Al Trad, Lakmal Seneviratne, and Jorge Dias 5 Social Robots as a Complementary Therapy in Chronic, Progressive Diseases......................................................................... 95 Ana Nunes Barata 6 Developing a Social Robot – A Case Study.................................... 103 João S. Sequeira Index.......................................................................................................... 127
Can a Robot Bring Your Life Back? A Systematic Review for Robotics in Rehabilitation Esyin Chew and David A. Turner
Abstract
Stroke is a leading cause of disability in the world, and the use of robots in rehabilitation has become increasingly common. The Fourth Industrial Revolution has created a novel and wide range of options for the involvement of computer-guided and artificially intelligent machines in rehabilitation. In this chapter we critically review some of the literature on the use of robots in rehabilitation, and emphasize the diversity of approaches in this burgeoning field. We argue that there is a need to consolidate interdisciplinary evidence on robotics and rehabilitation in a systematic way, as the alternative is to have a literature that continues to grow, following the interests of various specialists, but without offering a synoptic assessment of what is available to medical specialists and patients. A literature review using Scopus and Web of Science, coupled with the Joanna Briggs Institute's Critical Appraisal Tool: Checklist for Case Reports, was conducted. The two databases were systematically searched using interdisciplinary keywords in February 2019. An initial search of the databases produced 9894 articles. After rigorous reviews, 35 articles were screened and selected for further interpretation. We examined the current studies on the efficiency and effectiveness of robot interventions and produced a taxonomy of the review. Original findings on the current robotics-in-rehabilitation landscape are critically presented, with recommendations and concluding remarks concerning interdisciplinary impact.

E. Chew (*) EUREKA Robotics Lab, Cardiff School of Technologies, Cardiff Metropolitan University, Cardiff, UK; e-mail: [email protected]; https://www.cardiffmet.ac.uk/eureka
D. A. Turner Institute for International and Comparative Education, Beijing Normal University, Beijing, China; South Wales University, Wales, UK

Keywords
Robot in rehabilitation · Robotics in healthcare · Interdisciplinary robotics research
1.1 Introduction and Research Methodology
© Springer Nature Switzerland AG 2019 J. S. Sequeira (ed.), Robotics in Healthcare, Advances in Experimental Medicine and Biology 1170, https://doi.org/10.1007/978-3-030-24230-5_1

1.1.1 Background and Aims

Cerebrovascular accidents (CVA), or strokes, are a leading cause of death and serious disability in the UK, the US and developing countries [4, 5, 17]. While CVAs are preventable, and some are treatable and well supported in developed countries, they remain a huge burden on healthcare worldwide, especially in the Asia Pacific region [4, 12, 20, 29, 44]. The cost of the healthcare for CVA patients includes direct and indirect support from their families, healthcare professionals, friends, counsellors, and social and physical carers, who may use various supporting equipment and medicine over extended periods of time. In the healthcare research environment, the cost of bringing a CVA patient back to normal life includes the knowledge partnership of interdisciplinary professionals such as healthcare and medical scientists, engineers, computer scientists, and data and social scientists. There is a growing interest in using advanced technologies such as robotics and artificial intelligence (AI) for interdisciplinary research and services for CVA patients. The WEF [55] states that the First Industrial Revolution used steam and hydropower to automate manufacturing, the Second Industrial Revolution used electric power to create mass production, and the Third Industrial Revolution used electronics and information technology to digitise production and life. The Fourth Industrial Revolution, in which we are now involved, arises from the fusion of robotics and AI technologies that is blurring the lines between the physical, digital, ethical, social, economic, medical and biological spheres [54, 55]. We argue that the fusion of diverse elements in the Fourth Industrial Revolution has two immediate consequences. First, there is a huge growth in studies of the products of the Fourth Industrial Revolution, where researchers apply their own disciplinary approaches to the subject matter. Second, there is an urgent need for interdisciplinary studies, so that a comprehensive understanding of the field can be produced that overcomes the present state of fragmentation. This may be particularly important in such cases as the support of CVA patients, where practical answers are needed that respond to important human needs.
Such an interdisciplinary innovation will require interaction and conversation for new research directions and insights: “Disciplines have a lot to learn from each other and interdisciplinary research, with all its challenges, is invaluable” [32].
There have been many attempts to disrupt disciplinary silos to solve real-life problems, but responses have not been uniform; there are researchers who defend disciplinary silos and others who champion interdisciplinary research [21, 43, 46]. Current robotics-for-rehabilitation research is dominated by engineering and computer science. This is understandable in terms of the considerable technical difficulties that have had to be overcome to achieve the Fourth Industrial Revolution. However, as robots come of age and address the physical and psychosocial needs of human patients, other aspects will need to receive equal emphasis. As interdisciplinary academics in Computer Science, Education and Engineering, we offer the following foci for the research reported in this chapter: (1) Is there a disciplinary-silo issue in current robotics research in rehabilitation? (2) What is the interdisciplinary research landscape for robotics in rehabilitation in the global literature? (3) How can a robot blur the interdisciplinary spheres as a fusion of intelligent technology and healthcare, in particular in reducing the healthcare burden for CVA patients?
1.1.2 Research Methodology

We commenced a systematic literature review, aiming to answer the above research questions by investigating peer-reviewed publications from two major databases: Scopus and Web of Science. These databases are used in the most widely cited world university rankings [38, 50] and the UK Research Excellence Framework [40]. Scopus was the main database used in the disciplinary analysis, as it is more inclusive of article types such as quality conference papers and book chapters. Web of Science was used for triangulation with Scopus, as it draws on high-impact social science (SSCI) journal articles. From 15th February to 15th March 2019, we used combinations of a small set of keywords to map the overall research landscape of robotics in rehabilitation, as well as interdisciplinary research (Engineering,
Computer Science, Medicine, Mathematics, Neuroscience, Health, and Social Science). These keyword searches were conducted over the fields Article Title, Abstract and Keywords:

1. "robot AND rehabilitation";
2. "(robot AND rehabilitation) AND (LIMIT-TO (SUBJAREA, "ENGI") OR LIMIT-TO (SUBJAREA, "COMP") OR LIMIT-TO (SUBJAREA, "MEDI") OR LIMIT-TO (SUBJAREA, "MATH") OR LIMIT-TO (SUBJAREA, "NEUR") OR LIMIT-TO (SUBJAREA, "HEAL"))";
3. "(robot AND rehabilitation) AND (LIMIT-TO (SUBJAREA, "SOCI"))";
4. "(robot AND rehabilitation) AND (LIMIT-TO (SUBJAREA, "MEDI"))";
5. "(robot AND rehabilitation) AND (LIMIT-TO (SUBJAREA, "MATH"))";
6. "(robot AND rehabilitation) AND (LIMIT-TO (SUBJAREA, "ENGI"))";
7. "(robot AND rehabilitation) AND (LIMIT-TO (SUBJAREA, "COMP"))".

We removed duplicate references across the two databases using Mendeley and two rounds of manual reviews. Articles in less prevalent disciplinary approaches to robotics, such as accounting, agriculture and environmental science, were removed from the study. Secondary references of the multidisciplinary articles, government policy reports or resultant articles were included to recognise important relevant literature. The systematic review incorporated the Joanna Briggs Institute's [51] Critical Appraisal Tool: Checklist for Case Reports as the inclusion and exclusion criteria for the interdisciplinary articles found. Grounded in the Joanna Briggs Institute's [51] methodological quality measurement, the purpose of this systematic literature review is to determine the extent to which a study of robots in rehabilitation has addressed the possibility of bias in its design, conduct and analysis:

1. Were the patient's demographic characteristics clearly described?
2. Was the current clinical condition of the patient on presentation clearly described?
3. Was the intervention(s) or treatment procedure(s) clearly described?
4. Was the post-intervention clinical condition clearly described? If not, was it an innovative intervention, reported in terms of innovative methods and stakeholders' experiences and acceptance?
5. Were adverse events (harms) or unanticipated events identified and described? Does the case report provide takeaway lessons?

The first author (EC) screened the titles and abstracts of all articles found in Scopus, and the second author (DT) screened the titles and abstracts of all articles found in Web of Science. Both authors then independently exchanged and reviewed the selections and agreed on the final set of articles to be included in the analysis. The results of this critical appraisal were examined by two external critical appraisers (JL and JB) to confirm the use of the selected articles to inform the synthesis and interpretation of the results of the study (Fig. 1.1).
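The deduplication step described above, merging Scopus and Web of Science records and removing duplicates before screening, can be sketched in a few lines of Python. This is an illustrative example only, not the authors' actual pipeline (they used Mendeley plus two rounds of manual review), and the record fields ("title", "source") are assumptions made for the sketch.

```python
# Illustrative sketch of cross-database deduplication by normalised title.
# Record fields are hypothetical; real exports carry DOIs, authors, etc.
import re

def normalise(title: str) -> str:
    """Lower-case a title and collapse punctuation/whitespace for matching."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def deduplicate(records):
    """Keep the first record seen for each normalised title."""
    seen, unique = set(), []
    for rec in records:
        key = normalise(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

scopus = [{"title": "Robot-Assisted Gait Training", "source": "Scopus"}]
wos = [{"title": "Robot-assisted gait training.", "source": "Web of Science"}]

# The two variants of the same title collapse into one record.
merged = deduplicate(scopus + wos)
print(len(merged))
```

In practice title normalisation alone is too coarse for a real review; matching on DOI where available, with title matching as a fallback, is the usual refinement, and borderline matches still go to manual review, as done here.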
1.2 Bibliometric Investigation for "Robot" and "Rehabilitation"
1.2.1 The Historical and International Profiles of Publications

The earliest work in robotics and rehabilitation started in the 1970s with the "prosthetic man" and a conceptual model of artificial limbs, the hand prosthesis using myoelectrical control, which was a revolutionary event in the rehabilitation field at that time [1, 52]. Robots have been used since the 1980s in orthopaedic and physical therapy treatments, and those robots can be categorised as surgical robots or physical therapy/assistant robots [24]. Kwakkel et al. [25] conducted an extensive literature review and claim that "no overall significant effect in favour of robot-assisted therapy was found in the present meta-analysis, however, subsequent sensitivity analysis showed a significant improvement in upper limb motor function after stroke for upper
Fig. 1.1 Flow diagram of the research methodology. Commence with the search strategy (keywords) aligned with the research questions. Total "robot and rehab" articles in all disciplines: n = 9894 (Scopus: 6799; Web of Science: 3095). Total "robot and rehab" articles in science-related disciplines (Engineering, Computer Science, Medicine, Mathematics, Neuroscience and Health Professions): n = 6590; articles in less prevalent disciplines excluded: n = 3304. Total "robot and rehab" articles in Social Science or multidisciplinary: n = 138 (Scopus: 104; Web of Science: 34); duplicates removed: n = 2; articles excluded based on the Critical Appraisal Tool: n = 101. Articles included: n = 35
Fig. 1.2 Historical results analysis – Scopus (left) and Web of Science (right)
arm robotics" (p. 111). Kwakkel et al. [25] further suggest that future research on robot-assisted therapy should focus on kinematic analysis and distinguish between upper and lower robotic arm training. The relevant research began to blossom from the early 2000s, as shown in Fig. 1.2.
From the historical footprint of the Scopus data, the US, China, Japan, Italy, the UK, Germany and South Korea lead the volume of publications for robotics in rehabilitation. China, the US, Italy, the UK, Japan, South Korea and Spain are the countries that produce the most publications in higher-impact journals indexed by Web of Science. Six countries out of the top 15 are Asian. Most leaders in the field of study are from developed countries, except for two developing countries, China and Malaysia (Fig. 1.3). Up to the present time, the US and China are the leaders in the field in terms of research publications. The US remains the leader in engineering, computer science, medicine and social science for robotics research in rehabilitation. China is the leader in mathematics-related publications for robotics in rehabilitation. Italy and Switzerland are also prominent in medicine-related research in the field (see further details in Appendix A). One expected result (as depicted in Fig. 1.4) is that the related research is dominated by engineering and computer science (62.6%), followed by medicine (13.8%), mathematics (6.2%), neuroscience (3.2%) and the health professions (2.9%). Most of the publication venues for robotics-in-rehabilitation research are in engineering, computer science and medicine, as shown in Fig. 1.5, and the document types are
Fig. 1.3 Historical results by country – Scopus (left) and Web of Science (right). Countries shown: China, USA, Italy, UK/England, Japan, South Korea, Spain, Germany, Switzerland, Singapore, Canada, New Zealand, Turkey, Taiwan, Malaysia

Fig. 1.4 Scopus historical results by discipline
Fig. 1.5 Scopus historical results by publication venue
conference papers, except for the Journal of NeuroEngineering and Rehabilitation and IEEE Transactions on Neural Systems and Rehabilitation Engineering. This is because the field is young and innovative, and the commonly preferred publication venues are conference proceedings. In addition, the two most influential researchers in the field are Hogan (286 publications with 20,413 citations) and Krebs from MIT (142 publications with 9529 citations). As the most productive author in the field, Riener from ETH Zurich, Switzerland, has published 346 publications with 8023 citations (see further details in Appendix A). All of these researchers are from the computer science and engineering disciplines. The international landscape of publications about robots and rehabilitation is dominated by science-based disciplines (engineering, medicine, computer science, mathematics), which produced 96% of the publications, in contrast with social science research and interdisciplinary research combining science-based and social-science-based work (Fig. 1.4). This finding is no surprise, given the disciplinary-silo background of robotics, and the research is likely to be interdisciplinary only across the science base, such as computer science and medicine, or engineering and the health professions. As interdisciplinary researchers with backgrounds in the social
sciences and sciences, we wish to pursue the literature review through the lens of social science and interdisciplinary studies, rather than the dominant science-based field of robotics and rehabilitation.
1.2.2 Robot and Rehabilitation in Social Science Research

We commenced an alternative systematic review from the angle of social science. As stated in Sect. 1.1.2, we analysed the 138 "robot and rehabilitation" articles (categorised under "Social Science") indexed by Scopus and Web of Science and found that interdisciplinary research started from 2003. Figure 1.6 shows that social science and interdisciplinary research is of little concern in China for high-impact journal publications, compared with Fig. 1.3, where China is ranked in first place in Web of Science publications. Intriguingly, the study shows that the US, UK, Switzerland and France indicate a high level of interest and are productive in interdisciplinary research on robotics in rehabilitation. The leading publication venue for this interdisciplinary field is IEEE Transactions on Human-Machine Systems (see further details in Appendix A). Thirty-five of the 138 articles were included to inform the synthesis and
Fig. 1.6 Historical results in social science or interdisciplinary publications – Scopus (left) and Web of Science (right)
Fig. 1.7 Historical results in social science or interdisciplinary publications by country – Scopus (left) and Web of Science (right)
interpretation of the systematic review in the next sections. The summary of the selected articles can be found in Appendix C (Fig. 1.7).
1.2.3 Taxonomy for Robotics in Healthcare and Robotics Technologies in Rehabilitation

Rehabilitation may not reverse the effects of a stroke, but it is one of the most important phases of recovery for CVA patients, and assistant therapy robots aim to help CVA patients to
relearn or redefine how they live [24, 41]. The following are the major categories of healthcare robots [24]; the branches in the right frame in red can be categorised as robotics in rehabilitation (Fig. 1.8). Grounded in Kim et al. [24] and Regen [41], we classified the 35 articles according to the following taxonomy, with three broad themes: (1) communication, social and cognitive skills; (2) self-care and mobility skills; and (3) overall literature review articles that cut across upper limbs and lower limbs, or (1) and (2) (Fig. 1.9):
Fig. 1.8 Taxonomy for robotics in healthcare. [24]
Fig. 1.9 Taxonomy for robotics in rehabilitation through the lens of social science
Communication, Social & Cognitive Skills – Cognitive Robot-Patient Interactions or Speech Therapies for elderly care or children: 7 articles
Communication, Social & Cognitive Skills – Brain-Computer Interface (BCI) or Brain-Machine Interface (BMI): 4 articles
Overall Literature Review: 6 articles
Self-care & Mobility Skills – Robotics for Upper Limb Rehabilitation (shoulders, arms and hands): 8 articles
Self-care & Mobility Skills – Robotics for Lower Limb Rehabilitation (legs and feet): 10 articles
In the overall literature review, home-based robotics can automate rehabilitation and produce measurable data and an optimal dose [13, 18]. Most of the past interdisciplinary studies reporting positive, improving and promising results concern robotics for lower limb rehabilitation [6, 8, 14, 22, 27, 35, 57]. In the area of upper limb self-care and mobility studies, Hughes et al. [19] developed H-Man, a novel portable and low-cost robot requiring limited supervision. Palermo et al. [36] claimed that "statistical analysis showed a remarkable difference in most parameters between pre- and post-treatment" (p. 1) for the Armeo Power robot, an assistive exoskeleton for performing 3D upper limb movements. Zhang et al. [57] were the first researchers to assert that the effectiveness of a robot-assisted rehabilitation intervention cannot be concluded, due to the lack of universal assessment standards
and measures for various robotic devices and control strategies. Other recent studies critically propound that robots can enhance the intensity of rehabilitation exercises with cost and labour efficiency (i.e. increasing the quantity of walking practice early post-stroke, or enabling a therapist to train 1.5–6 times more patients for the same cost), but the clinical healthcare improvement underlying these effects remains poorly understood. There are no statistically significant differences between conventional rehabilitation and intervention with robots [5, 10, 37, 42]. Perhaps trivially, a study can claim that robot-assisted rehabilitation that improves CVA patients' lower limb recovery will promote their return to their family and society [28]. However, we would argue that past research struggles to transform mechanical, digital and assistive robots into real social impact. Shamsuddin et al.
[45] suggest that "a robotic animal is the solution to provide constant mental support and induce warm and empathetic feelings from the patients" (p. 52), without providing empirical evidence or a rigorous account of the social implications. There are gaps in past and current studies, which fail to examine the true social impact of rehabilitation robots using interdisciplinary approaches. Such approaches would appear to be necessary in the light of the power of the Fourth Industrial Revolution to fuse the physical, ethical, social, economic, medical and biological spheres. These assistive technologies may help in training and restoring the physical functioning of a CVA patient, but rehabilitation robots will remain a cold and boring technology until they achieve substantial improvement in the social life of CVA patients, and thereby become true social robots [26]. For instance, there is an exciting development in rehabilitation robotics to shift research from promoting mobility to promoting socialisation among infants and toddlers [2]. However, there is a question mark over the extent to which this transformation is more than superficial, given the silo-disciplinary nature of past studies.
1.3 Can Robots Bring Your Life Back? The Controversies and Recommendations
Some empirical case studies across the globe have demonstrated some very successful robots in rehabilitation. The list includes the Zora Robot for elderly care services [53], SAR, the Socially Assisted Robot, for children's therapy sessions [11], SLT, the Speech-Language Robotic Therapist, for improving cerebral palsy and communication disorders in children (Ochoa-Guaraca et al. [34]), and the MAKRO Robot for therapeutic exercises for children with cerebral palsy and similar movement disorders [16]. These cognitive robot-patient interactions for cognitive and speech therapies seem to be limited to elderly care or pediatric applications. Few studies have addressed a broader demographic, or have included cross-country comparison. In the present state of research it is unclear whether these
two groups are the focus of attention because that is where there is most medical need, or whether it results merely from the accidental interests of researchers. More social science and interdisciplinary studies would provide a context for such analysis, by relating robotic applications to the level of perceived need in different demographic groups, or by producing an impetus to extend research to other groups. Such research might include other demographic characteristics such as age group in public acceptance, quantitative clinical results on effectiveness, and in-depth qualitative or grounded-theory studies of the long-term impacts of robot intervention in social and economic terms. Belda-Lois et al. [7] recommend that future research should analyse and optimise the impact of rehabilitation based on signals from the cerebral cortex using functional near-infrared spectroscopy (fNIRS). Brain-machine interfaces (BMI) are a recent trend in the field of rehabilitation robotics [23, 48]. Miao et al. [30] argue that when the cortex is responsive to peripheral input, physiological signal control is an effective way of avoiding slacking and providing robotic support (Fig. 1.10). From this in-depth systematic review using the lens of social science, we conclude that robotics in rehabilitation has, so far, only been studied in disciplinary silos. There is limited fusion of the study of rehabilitation robots in interdisciplinary studies, even though we have reviewed the social science and interdisciplinary articles from both databases. The question posed at this point is how we can best approach research into robots that by their very nature blur the interdisciplinary boundaries between intelligent technology and healthcare, especially in eliminating the healthcare burden for CVA patients in the long run. Past research has explored the healthcare burden and cost-effectiveness only to a very limited extent. We support the notion of Bellingham et al. [9] that these technologies will affect future labour productivity, human longevity and disability, and have a transformative influence on broad socio-economic and political areas to drive human capacities beyond innate physiological levels in both anthropomorphic and non-anthropomorphic
E. Chew and D. A. Turner
Fig. 1.10 Summary of the literature review of upper-limb rehabilitation techniques from 45 studies [30]. AAN assist-as-needed, EMG electromyography, EEG electroencephalography, MI motor imagery, SSVEP steady-state visual evoked potential
extended bodies. Furthermore, BMI studies may lead to a breakthrough. Comparative studies on various interdisciplinary ergonomic design and engineering or computer science methods with convincing empirical results are highly valued [49]. The future will depend on how robotics engineers and computer scientists bring together truly interdisciplinary experts in developing universal assessment standards and measures for various robotics devices and control strategies.
1.4
Concluding Remarks
Two immediate conclusions can be drawn from this review of the literature. First, the field of robotics in rehabilitation is diverse. And second, the research focus has, so far, been very firmly on the technical aspects of robotics, as though the Fourth Industrial Revolution were in some way set apart and distinct from everything that had gone before. However, while the current technological advances are in some ways revolutionary and change everything, there are also continuities. Mankind has always used tools and mechanical aids to support and extend the capabilities of our bodies. The classification of robotic purposes that we set out in Fig. 1.9 addresses the aspect of diversity. There are robotic interventions that address communication, social and cognitive skills, and there are interventions that address
self-care and mobility skills. And within those two categories there is a further division between robotics that are integrated and become, as it were, part of the body of the patient, and robotic interventions that remain separate and stand, so to speak, in opposition to the patient. To clarify those distinctions, and to emphasize the continuities in the area of patient rehabilitation, we might think, in the area of self-care and mobility, of the difference between an exercise treadmill, which addresses physical fitness and muscle tone without ever becoming part of the patient's everyday life, and a wheelchair, which becomes a part of the patient's daily routine. The exoskeleton described by Hansen et al. [18], which picks up signals from the hand and amplifies and reinforces the movements it detects, is extraordinary and a wonderful advance, but not very different in principle from a heart pacemaker. At the extreme end of this spectrum are interventions, such as those described by Khan and Hong [24], where signals taken from the cerebral cortex are used to control machinery that is external to the patient. The extraordinary technical achievement involved perhaps explains why the research should focus on the technical and engineering aspects, but the very important social and psychological aspects of such innovations deserve equal attention. Similarly, in the field of cognitive and communication skills, there is a distinction between
1 Can a Robot Bring Your Life Back? A Systematic Review for Robotics in Rehabilitation
those robotic interventions that a patient can use to drill and exercise certain abilities, but which, at the end of the day, are left in the office of the speech therapist or occupational therapist, and those aids to life which effectively become part of the body of the patient. One might, perhaps, think of the distinction between a teaching machine and a mobile phone, although with advances in technology that distinction is becoming less relevant. The point of emphasizing the continuities is to suggest that there is a great deal of research into social, ethical, economic and communicative behaviour that might be relevant to the understanding of human interactions with robots. At the most elementary level, people wear clothes, which they use to signal aspects of their personality, profession, attitudes and intentions. These can become very firmly integrated into their personality, and how they see themselves in relation to the world. For example, Weick [56] describes cases in which firefighters refuse to abandon equipment and flee a fire, even though retaining equipment endangers their lives, because they see the equipment as part of their professional personalities. This suggests that having a prosthetic limb may affect a person’s psychology and social interactions in ways that are at least as radical, and that the effects may be increased by the increased capabilities of such interventions. Care for a patient’s well-being implies that research should look further afield than just the issue of an engineering application in the realm of prosthetics. There is a risk, especially when considering brain-machine interfaces, that we will think of the person only as a brain. Tallis [47] has written about the importance of having a body (and by implication the form that body takes and the tools that extend its capabilities) in how we experience the world. He argues that we have come increasingly to view machines as human, and humans as machines. 
We see the tendency to view machines as humans in the naming of cars and the attribution of ill-will to washing machines and vacuum cleaners. But Tallis' point is that this is a mistake and leads us into moral confusions. Yet we see robotic interventions exploiting this weakness of people, making robots humanoid in order to reduce their strangeness. Even here, however, the issues involved are not clear-cut. We can understand a person's attachment to machines that he or she lives with and uses or repairs regularly. We only find it odd or dysfunctional if a person does not recognise the difference between a machine and a person, and fails to see that what they are dealing with is a machine; awareness of the distinctions is part of a mature understanding. But it is by no means certain that we would see the same "mistake" on the part of a child as being pathological; a child's relationship with her doll may be very different from a woman's relationship to her car. There is much here that needs exploring. We do not pretend to know all the answers, any more than anybody else does. But by drawing attention to the multitude of different ways in which different people engage with different machines, we wish to draw attention to two main points. First, the social, psychological, economic and ethical aspects of using robotics in rehabilitation are as important as mere physical competence. And second, the current state of research in the field does not in any way reflect the importance of the cross-disciplinary, multi-disciplinary and trans-disciplinary approaches that will be needed to understand and enhance human well-being. Robots for rehabilitation range from those addressing self-care and mobility skills (upper and lower limbs) to those addressing communication, social and cognitive skills, and all may offer functional enhancement. There are still a number of open-ended challenges to resolve in the field, especially the fusion of the robotic, ethical, socio-economic, medical and biological spheres.
In this sense, the application of robots in rehabilitation involves multi-faceted complexities; studying it not only informs researchers about how we got where we are today, but also sheds light on how rehabilitation processes fused with robots can be better understood by taking a composite perspective and going beyond silo disciplines.
Appendices

Appendix A: Disciplines that Lead in Robotics in Rehabilitation (Figs. 1.11, 1.12, 1.13, 1.14, 1.15, 1.16, 1.17 and 1.18)
Fig. 1.11 Robotics in Rehab – Engineering Publications
Fig. 1.12 Robotics in Rehab – Computer Science Publications
Fig. 1.13 Robotics in Rehab – Medicine Publications
Fig. 1.14 Robotics in Rehab – Maths Publications
Fig. 1.15 Robotics in Rehab – Social Science and Interdisciplinary Publications
Fig. 1.16 Robotics in Rehab – The Leading Researchers in the Field
Fig. 1.17 Robotics in Rehab – The Leading Researchers in the Field (Details)
Fig. 1.18 Historical Results by Publication Venues – Social Science and Interdisciplinary Research
Appendix B: Sample of Inclusion Screening Using JBI
Appendix C: Summary of Included Articles

Robot-patient interactions

1. Tuisku et al. [53] — Zora (NAO) Robot
Research methods: Empirical case study (interpretative content analysis) of the implementation of the Zora robot in elderly-care services. The data consist of interviews with personnel (n = 39) who operated and worked with Zora, and 107 comments from the general public collected from online and print media.
Key findings & limitations: The results show that public opinion is mainly negative. The personnel had more positive views; they saw the robot as a recreational tool, not as a replacement for their own roles.
Recommendations & future work: There is clearly a need for more information, for a better-informed discussion on how robots can be used in elderly care, and for constructive ways to involve the general public in this discussion.

2. Carrillo et al. [11] — NAO Robot
Research methods: Report on physiotherapists' acceptance of a Socially Assistive Robot (SAR) as a therapeutic aid for paediatric rehabilitation. As part of the clinical care of patients at the Royal Children's Hospital in Melbourne, Australia, the SAR is equipped to lead rehabilitation sessions of up to 30 min under the guidance of a therapist, without technician support or Wizard-of-Oz operation. Both quantitative and qualitative data were collected from 8 therapists participating across 19 rehabilitation sessions.
Key findings & limitations: Results show the SAR achieves a high degree of acceptance, i.e. perceived usefulness and ease of use. Multiple sessions operating the SAR appear to strengthen positive perceptions of the system. The emphasis on robust performance over ambitious AI has contributed to the system's successful integration. The current study is not a clinical trial and thus cannot provide conclusive statements regarding actual therapeutic benefits attributable to the SAR.
Recommendations & future work: This engagement has also delivered a prototype that is acceptable to therapists as part of clinical practice. Analysis of survey results reveals overall positive perceptions of the SAR as a therapeutic aid, with particularly strong results for its perceived usefulness and usability.

3. Andriella et al. [3] — Robotic Arms
Research methods: A tool for caregivers to monitor patients affected by Mild Cognitive Impairment (MCI) and Alzheimer's Disease (AD). A robot/decision system can assist a patient through several levels of interaction, and can be employed to train and evaluate a patient by playing the Syndrom Kurztest (SKT) neuropsychological battery. Double loop of interaction: caregiver-robot interaction and patient-robot interaction.
Key findings & limitations: The results indicated that the robot can profit from the initial interaction with the caregiver to provide quicker personalisation, and that it can adapt to different user responses and provide support and assistance at different levels of interaction. Human emotions are difficult to represent and analyse but, when available, they can be used to provide better Human-Robot Interaction (HRI) experiences.
Recommendations & future work: New and better instruments will be crucial to assess the disease severity and progression, as well as to improve its treatment, stimulation and rehabilitation.
4. Gnjatovic et al. [16] — MARKO Robot
Research methods: This paper reports on a pilot corpus of child-robot interaction in therapeutic settings. The corpus comprises recordings of the interactions between twenty-one children and the conversational humanoid robot MARKO, in the kinesitherapeutic room at the Clinic of Paediatric Rehabilitation in Novi Sad, Serbia. The subject group included both healthy children and children with cerebral palsy and similar movement disorders. Approximately 156 min of session time was recorded. All dialogues were transcribed, and nonverbal acts were annotated.
Key findings & limitations: The initial evaluation of the corpus indicates that children respond positively to MARKO, engage in interaction with MARKO, perform verbal instructions given by MARKO, and experience increased motivation for therapy. This technical ability is essential for establishing a long-term attachment of children to the robotic system, which in turn has an important role in facilitating human-machine coexistence and cognitive infocommunication in the observed therapeutic setting.
Recommendations & future work: In the next experimental phase, the interaction scenario will be adapted to particular therapeutic exercises. The goal of that phase will be to develop a computational model of the interaction context of therapeutic exercises for children with cerebral palsy and similar movement disorders, and to introduce a set of adaptive behavioural strategies for the robot MARKO.

5. Shamsuddin et al. [45] — PARO Robot
Research methods: Participants will be selected from the SOCSO Rehabilitation Centre (SRC), Melaka, Malaysia. Criteria for potential participants include, but are not limited to, having symptoms of depression, e.g. low mood, withdrawal from social activities, sleep disturbance, loss of appetite and short attention span. The interaction includes, but is not limited to, group therapy consisting of 6-10 participants undergoing group intervention with PARO as an assistive device. Group sessions run twice a week for 2 months, each lasting 30 min, and each participant has the opportunity to interact with PARO within the stipulated time.
Key findings & limitations: A robotic animal is a proposed solution to provide constant mental support and induce warm and empathetic feelings from the patients. Comparing assessment scores pre and post robotic therapy shall shed light on the suitability of PARO to help patients with depression. Sessions are assessed in terms of: (1) visual, where it can be seen from the faces of people interacting with PARO that they are happier and always smiling; (2) verbal, where people who interact often with PARO have the courage to start conversations; and (3) psychological, where after interacting with PARO, people are less stressed and find that their mood improves.
Recommendations & future work: An animal robot, though small, is a therapeutic robotic companion that can make people happy with its comforting presence. Moreover, the study will deploy the concepts and structures of robot therapy in the field of rehabilitation robotics. This study is the first in Malaysia to propose an animal robot to provide mental support to patients with depression. The main objective of this paper is to introduce PARO the seal robot as a remedy to reduce the need for psychotropic drugs during depression therapy at a rehabilitation centre.

6. Naganuma et al. [31] — Sony AIBO
Research methods: Trial studies of the application of robotic pets in therapeutic rehabilitation at several hospitals and geriatric nursing homes are ongoing. In these settings, a variety of robotic devices have been introduced and implemented, mainly for physical rehabilitation. In contrast with rehabilitation using living animals, robotic animals have advantages in that they avoid the problem of infection, can be controlled, and can sense and record human states in conjunction with information communication technology.
Key findings & limitations: The robotic pet used in this study was the Sony AIBO, and all other components used were commercially available, allowing the easy implementation of these activities by any interested medical or welfare institution.
Recommendations & future work: The key issue in the promotion and successful completion of rehabilitation by older people is how to cultivate feelings of self-efficacy. With this in mind, this chapter describes the results of a study with two aims: first, to introduce a feeling of play and games and, second, to reverse the participant's role from passive to active by giving them control over the robot rather than it being controlled by a therapist or operator.
7. Ochoa-Guaraca et al. [32] — SLT Robot
Research methods: This paper presents a low-cost robotic assistant able to support children's rehabilitation through Speech-Language Therapy (SLT). It has been tested with 29 children with cerebral palsy and communication disorders.
Key findings & limitations: The results are encouraging, given that it was possible to successfully integrate the robot into the therapy sessions.
Recommendations & future work: This robotic assistant is able to interact with a mobile application aimed at conducting reinforcement activities at home with the children's parents.

Robotics for upper limbs rehabilitation

1. Hansen et al. [18] — Virtual Hand Exoskeleton
Research methods: We propose to validate the design of a hand exoskeleton in a fully digital environment, without the need for a physical prototype. The purpose of this study is thus to examine whether finger kinematics are altered when using a given hand exoskeleton. User-specific musculoskeletal models were created and driven by a motion capture system to evaluate the fingers' joint kinematics when performing two industry-related tasks. The kinematic chain of the exoskeleton was added to the musculoskeletal models and its compliance with the hand movements was evaluated.
Key findings & limitations: Our results show that the proposed exoskeleton design does not influence fingers' joint angles, the coefficient of determination between the model with and without the exoskeleton being consistently high (R² = 0.93) and the nRMSE consistently low (nRMSE = 5.42°).
Recommendations & future work: These results are promising, and this approach combining musculoskeletal and robotic modelling driven by motion capture data could be a key factor in the ergonomic validation of the design of orthotic devices and exoskeletons prior to manufacturing.

2. Hughes et al. [19] — Portable robot, H-Man
Research methods: In this paper we introduce a low-cost, limited-supervision, portable robot (H-Man) designed for upper-extremity rehabilitation.
Key findings & limitations: A usability and feasibility study indicates that out-patient robotic treatment with the H-Man leads to positive improvements in arm movement, and that the system is usable and well accepted by key stakeholders.
Recommendations & future work: This paper also introduces an implementation strategy to assess the effectiveness, benefits and barriers of using the H-Man robot for community-based neurorehabilitation in underserved populations, such as those living in low-income neighbourhoods or in rural areas.

3. Galloway et al. [15] — Robotic Glove
Research methods: We present the development of a soft robotic glove designed to support basic hand function. The glove uses soft fluidic actuators programmed to apply assistive forces to support the range of motion of a human hand. More specifically, we present a method of fabrication and characterization of these soft actuators, and consider an approach for controlling the glove.
Key findings & limitations: The analysis concludes with results from preliminary human-subjects testing, where glove performance was evaluated on a healthy and an impaired subject.
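Hansen et al. [18] summarise the agreement between finger-joint trajectories with and without the exoskeleton using R² and nRMSE. As a rough illustration of how such agreement metrics are computed (the normalisation convention, helper names and the synthetic trajectories below are assumptions for this sketch, not taken from the study):

```python
import numpy as np

def r_squared(y_ref, y_est):
    """Coefficient of determination between a reference and an
    estimated joint-angle trajectory."""
    y_ref, y_est = np.asarray(y_ref, float), np.asarray(y_est, float)
    ss_res = np.sum((y_ref - y_est) ** 2)
    ss_tot = np.sum((y_ref - y_ref.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def nrmse(y_ref, y_est):
    """Root-mean-square error normalised by the range of the reference
    signal, as a percentage (one common convention among several)."""
    y_ref, y_est = np.asarray(y_ref, float), np.asarray(y_est, float)
    rmse = np.sqrt(np.mean((y_ref - y_est) ** 2))
    return 100.0 * rmse / (y_ref.max() - y_ref.min())

# Hypothetical finger-joint flexion angle over one grasp cycle,
# with and without a (simulated) exoskeleton attached.
rng = np.random.default_rng(42)
t = np.linspace(0.0, 1.0, 200)
angle_free = 45.0 * np.sin(np.pi * t)              # degrees, no exoskeleton
angle_exo = angle_free + rng.normal(0.0, 1.0, t.size)  # small deviation

print(r_squared(angle_free, angle_exo))
print(nrmse(angle_free, angle_exo))
```

With deviations this small, R² stays close to 1 and the nRMSE close to zero, which is the pattern the study reports when concluding that the exoskeleton does not alter finger kinematics.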
4. Palermo et al. [36] — Armeo Power robot
Research methods: Despite the fact that many studies claim the validity of robot-mediated therapy in post-stroke patient rehabilitation, it is still difficult to assess to what extent its adoption improves the efficacy of traditional therapy in daily life, in part because most studies involved planar robots. We report the effects of a 20-session rehabilitation project involving the Armeo Power robot, an assistive exoskeleton for 3D upper-limb movements, in addition to conventional rehabilitation therapy, on 10 subacute stroke survivors. Patients were evaluated through clinical scales and a kinematic assessment of the upper limbs, both pre- and post-treatment. A set of indices based on the patients' 3D kinematic data, gathered from an optoelectronic system, was calculated.
Key findings & limitations: Statistical analysis showed a remarkable difference in most parameters between pre- and post-treatment. Significant correlations between the kinematic parameters and clinical scales were found.
Recommendations & future work: Our findings suggest that 3D robot-mediated rehabilitation, in addition to conventional therapy, could represent an effective means for the recovery of upper-limb disability. Kinematic assessment may represent a valid tool for objectively evaluating the efficacy of the rehabilitation treatment.

5. Niyetkaliyev et al. [33] — Review of various robots for upper limbs
Research methods: Robotic rehabilitation devices are increasingly used for the physical therapy of people with upper-limb weakness, the most common type of stroke-induced disability. This is a challenging task for one of the most biomechanically complex joints of the human body, the shoulder; hence specific considerations have shaped the development of existing robotic shoulder rehabilitation orthoses, which employ different types of actuation, degrees of freedom (DOFs) and control strategies. This paper presents a comprehensive review of these shoulder rehabilitation orthoses. Recent advancements in mechanism design, their advantages and disadvantages, and overviews of hardware, actuation systems and power transmission are discussed in detail, with emphasis on the assisted DOFs for shoulder motion.
Key findings & limitations: Control strategies for robotic upper-limb rehabilitation orthoses are developed to repetitively guide the patients' limbs on anatomically and ergonomically feasible trajectories so that patients can regain muscular strength. The main challenges are that these exoskeletons should be accurately aligned with the human joints, safely adjusted to match different individuals' sizes, and provide naturalistic, complex shoulder movements. Orthoses that consider only three rotational shoulder DOFs provide less workspace for patients and cause discomfort during training sessions; hence, to avoid misalignments between the exoskeleton and human joints and to provide larger ranges of motion, shoulder-girdle mechanisms should be designed and implemented.
Recommendations & future work: The design of robotic exoskeletons could be enhanced by using biomechanical principles of human motion. It is therefore important for robotics specialists to thoroughly study shoulder biomechanics and to cooperate with physiologists when designing future robotic orthoses. Understanding the shoulder anatomy and movement characteristics, the structure of the bones and articulations, and muscle functions and their points of attachment will give a greater perspective toward the development of future robotic rehabilitation orthoses that can stimulate the natural movements of the shoulder complex.
6. Miao et al. [30]
Research methods: A comprehensive review of high-level control techniques for upper-limb robotic training, aiming to compare and discuss the potential of different control algorithms and to specify future research directions. Included studies mainly come from papers selected in four review articles. To make the selection complete and comprehensive, especially regarding recently developed upper-limb robotic devices, a further search was conducted in IEEE Xplore, Google Scholar, Scopus and Web of Science using the keywords ('upper limb*' or 'upper body*') and ('rehabilitation*' or 'treatment*') and ('robot*' or 'device*' or 'exoskeleton*'), limited to English-language articles published between January 2013 and December 2017.
Key findings & limitations: Comparative analysis shows that high-level interaction control strategies can be implemented by a range of methods, mainly including impedance/admittance-based strategies, adaptive control techniques, and physiological signal control. Even though the potential of existing interactive control strategies has been demonstrated, it is hard to identify the one leading to maximum encouragement of human users. However, it is reasonable to suggest that future studies should combine different control strategies to be application-specific, and deliver appropriate robotic assistance based on the physical disability levels of human users.
Recommendations & future work: To summarise the field of control strategies for interactive rehabilitation training: (1) the impedance and admittance method is simple to implement and has intuitive properties; (2) adaptive control is needed to incorporate the time-varying capabilities of human users; and (3) physiological signal control is an effective way of avoiding slacking, providing robotic support only when the brain is particularly responsive to peripheral input.

7. Univadis
Research methods: Conference report: 'Robot-assisted stroke rehab benefits patients with upper limb motor functional impairments', ESC 2017 Conference reports – RSi Communications. https://www.univadis.co.uk/viewarticle/robot-assisted-stroke-rehab-benefits-patients-withupper-limb-motor-functional-impairments-esc522765?s1=news
Key findings & limitations: Patients who receive robot-assisted rehabilitation of the upper limb following acute stroke, in addition to conventional therapy, experience greater reduction in motor impairment and greater improvement in function relative to patients who receive conventional therapy alone.
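Miao et al. [30] single out impedance/admittance-based strategies as the most simply implemented of the interaction control methods. A minimal joint-space impedance law, in which the robot behaves like a virtual spring-damper pulled toward the reference trajectory so that the patient can deviate from it at a cost set by the gains, can be sketched as follows (the two-joint setup and all gain values are illustrative assumptions, not taken from any reviewed device):

```python
import numpy as np

def impedance_torque(q, qd, q_des, qd_des, K, D):
    """Joint-space impedance law: corrective torque proportional to the
    position error (stiffness K) and velocity error (damping D)."""
    q, qd = np.asarray(q, float), np.asarray(qd, float)
    return K @ (np.asarray(q_des, float) - q) + D @ (np.asarray(qd_des, float) - qd)

# Hypothetical 2-joint arm (shoulder, elbow): stiffer shoulder, softer elbow.
K = np.diag([30.0, 15.0])   # stiffness, N·m/rad
D = np.diag([3.0, 1.5])     # damping, N·m·s/rad

tau = impedance_torque(q=[0.2, 0.5], qd=[0.0, 0.0],
                       q_des=[0.3, 0.5], qd_des=[0.0, 0.0], K=K, D=D)
# Only the shoulder is off-target, so only it receives corrective torque.
print(tau)  # → [3. 0.]
```

Lowering K and D makes the same controller more permissive, which is how such schemes grade assistance to the patient's disability level, as the review recommends.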
Robotics for lower limbs rehabilitation

1. Rachakorakit and Charoensuk [39] — Lehab robot
Research methods: To restore musculature, it is necessary to prepare for the next step of therapeutics. Passive exercise is one method of maintaining musculature and preventing complications such as deep vein thrombosis (DVT) while the patient cannot move the leg; once the patient has recovered, active exercise can be applied to maintain leg muscle and recover strength. In the past, physical therapists carried out this treatment, but passive and active exercise require long sessions that fatigue the therapists. The Lehab robot was designed with four degrees of freedom to assist the therapist and improve the quality of treatment.
Key findings & limitations: The Lehab robot can function smoothly with four degrees of freedom: hip flexion-extension and adduction-abduction, knee flexion-extension, and ankle dorsiflexion-plantarflexion, with a PID controller applied to each joint of the robot. The gear-based force transmission can lift a distal weight of up to 41 kg, and the wheelchair-like design can be moved from place to place, which is easier than moving the patient.
Recommendations & future work: The robot can be operated by a dedicated computer program developed from the human interface, and the subject can also operate it through three interfaces: manual, active and passive. A future prototype will consider adding other devices, such as electromyography (EMG), to assess muscle ability; the system could also be extended with another mode to increase muscle strength.

2. Ballesteros et al. [6] — BDWA with a robotic rollator
Research methods: Shared control is a strategy used in assistive platforms to combine human and robot orders to achieve a goal. Collaborative control is a specific shared-control approach in which the user's and the robot's commands are merged into an emergent one in a continuous way. Robot commands tend to improve efficiency and safety; however, assistance can be rejected by users when their commands are altered too much, provoking frustration and stress and usually decreasing emergent efficiency. To improve acceptance, robot navigation algorithms can be adapted to mimic human behaviour where possible. We propose a novel variation of the well-known dynamic window approach (DWA) that we call the biomimetical DWA (BDWA). The BDWA relies on a reward function extracted from real traces of volunteers with different motor disabilities navigating in a hospital environment using a rollator for support.
Key findings & limitations: We have compared the BDWA with other reactive algorithms in terms of similarity to paths completed by people with disabilities using a robotic rollator in a rehabilitation hospital unit. The BDWA outperforms all tested algorithms in terms of likeness to human paths and success rate.
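Ballesteros et al. [6] build their BDWA on the classic dynamic window approach, in which candidate velocity commands are sampled inside the currently reachable window, forward-simulated, and scored. The sketch below shows that basic loop with a simple hand-tuned heading-plus-speed score standing in for the BDWA's learned reward function (the function name, window discretisation and weights are illustrative assumptions; obstacle clearance and angle wrapping are omitted for brevity):

```python
import math

def dwa_step(x, y, theta, goal, v_range, w_range, dt=0.5,
             n=7, w_heading=1.0, w_speed=0.1):
    """One step of a much-simplified dynamic window approach: sample
    linear/angular velocity pairs, roll each pair forward for dt seconds,
    and keep the pair whose predicted pose scores best."""
    best, best_score = None, -math.inf
    for i in range(n):
        v = v_range[0] + (v_range[1] - v_range[0]) * i / (n - 1)
        for j in range(n):
            w = w_range[0] + (w_range[1] - w_range[0]) * j / (n - 1)
            # Forward-simulate a constant (v, w) command.
            nt = theta + w * dt
            nx = x + v * math.cos(nt) * dt
            ny = y + v * math.sin(nt) * dt
            # Score: face the goal, prefer making progress.
            heading_err = abs(math.atan2(goal[1] - ny, goal[0] - nx) - nt)
            score = -w_heading * heading_err + w_speed * v
            if score > best_score:
                best, best_score = (v, w), score
    return best

# Robot at the origin facing +x with the goal straight ahead: the best
# command is full speed with no turning.
v, w = dwa_step(0.0, 0.0, 0.0, goal=(5.0, 0.0),
                v_range=(0.0, 1.0), w_range=(-1.0, 1.0))
```

The BDWA's contribution is precisely to replace the hand-tuned `score` with a reward fitted to traces of real rollator users, so that the emergent command deviates less from what the user would have done.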
3. Ozaki et al. [35] – Balance exercise assist robot (BEAR)
Research methods: To examine the efficacy of postural strategy training using a balance exercise assist robot (BEAR) as compared with conventional balance training for frail older adults. The study was designed as a cross-over trial without a washout term. A total of 27 community-dwelling frail or prefrail elderly residents (7 men, 20 women; age range 65-85 years) were selected from a volunteer sample. Two exercises were prepared for the interventions: robotic exercise, moving the centre of gravity with the balance exercise assist robot system; and conventional balance training, combining muscle-strengthening exercise, postural strategy training and applied motion exercise. Each exercise was carried out twice a week for 6 weeks. Participants were allocated randomly to either the robotic-exercise-first group or the conventional-balance-exercise-first group.
Key findings & limitations: Main outcome measures: preferred and maximal gait speeds, tandem gait speed, timed up-and-go test, functional reach test, functional base of support, centre of pressure, and muscle strength of the lower extremities, assessed before and after completion of each exercise programme. Robotic exercise achieved significant improvements in tandem gait speed (P = 0.012), functional reach test (P = 0.002), timed up-and-go test (P = 0.023) and lower-extremity muscle strength (P = 0.001-0.030) compared with conventional exercise.
Recommendations & future work: In frail or prefrail older adults, robotic exercise was more effective than conventional exercise for improving dynamic balance and lower-extremity muscle strength. These findings suggest that postural strategy training with the balance exercise assist robot is effective in improving the gait instability and muscle weakness often seen in frail older adults.

4. Galli et al. [14] – End-effector robotic gait training in Parkinson's disease
Research methods: The aim of this research was to quantify the effects of end-effector robotic locomotion training in a group of Parkinson's disease (PD) patients using 3D gait analysis (GA). In particular, spatiotemporal parameters and kinematic variables, summarized by synthetic indexes (Gait Profile Score, GPS, and its Gait Variable Scores, GVSs), were computed from GA at baseline, before treatment (T0), and at the end of the rehabilitative programme (T1). Patients underwent a cycle of out-patient rehabilitation consisting of at least a daily 3-h session, divided into 45 min of lower-limb treatment with the robotic device and occupational therapy for the upper limb.
Key findings & limitations: At T1, statistically significant improvements were found, particularly in spatiotemporal parameters (velocity, step length and cadence). No changes were observed in GPS, while a trend towards improvement was found in the GVSs of pelvis and hip on the frontal plane. This approach can contribute to short-term lower-limb motor recovery in PD patients.
Recommendations & future work: Gait analysis provided quantitative data about the end-effector robotic rehabilitation, highlighting the joints most sensitive to the treatment. Robotic locomotion training appears to improve gait pattern in patients with PD, with the effect mainly on spatiotemporal parameters.
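The GPS and GVS indexes cited for the Galli et al. study are, in the usual formulation, root-mean-square deviations of a patient's kinematic curves from a healthy reference over one normalized gait cycle, with the GPS being the RMS of the individual GVSs. A minimal sketch of that computation, using synthetic curves rather than clinical data:

```python
import numpy as np


def gait_variable_score(subject_curve, reference_curve):
    """GVS: RMS difference between a subject's kinematic curve and the
    reference (healthy) mean over one normalized gait cycle."""
    subject_curve = np.asarray(subject_curve, dtype=float)
    reference_curve = np.asarray(reference_curve, dtype=float)
    return float(np.sqrt(np.mean((subject_curve - reference_curve) ** 2)))


def gait_profile_score(gvs_values):
    """GPS: RMS of the individual Gait Variable Scores."""
    gvs_values = np.asarray(gvs_values, dtype=float)
    return float(np.sqrt(np.mean(gvs_values ** 2)))


# Illustrative data: 51 samples of one gait cycle for a single variable.
t = np.linspace(0.0, 1.0, 51)
reference = 30.0 * np.sin(2.0 * np.pi * t)   # healthy mean curve (degrees)
patient = reference + 5.0                    # constant 5-degree offset

gvs_hip = gait_variable_score(patient, reference)   # constant offset -> GVS = 5.0
gps = gait_profile_score([gvs_hip, 3.0])            # combine with a second, assumed GVS
```

In clinical use the GPS aggregates nine standard gait variables (pelvis and hip in three planes, knee and ankle sagittal angles, foot progression); here only two illustrative values are combined.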
24 E. Chew and D. A. Turner
5. Kang and Agrawal [22] – Robot-enhanced walkers for children with cerebral palsy
Research methods: Children with cerebral palsy (CP) often suffer from movement disorders; they show poor balance and motor coordination and typically use passive walkers in their early years. However, no prior studies document the effects of robot-enhanced walkers on functional improvements in these children. This paper reports the results of two pilot studies in which children were trained to walk with a robot and perform a series of tasks of increasing difficulty over a number of training sessions.
Key findings & limitations: The outcome measures are based both on data collected by the robot, such as travel distance, average speed, and success ratio for a given task, and on clinical variables characterizing the children's levels of disability and motor function.
Recommendations & future work: This pilot study documents the training outcomes for children with CP and compares results (i) between small and large numbers of training sessions and (ii) between toddlers and older children.

6. Zhang et al. [57] – Robot-assisted ankle rehabilitation (systematic review)
Research methods: The aim of this study was to provide a systematic review of studies investigating the effectiveness of robot-assisted therapy on ankle motor and function recovery from musculoskeletal or neurologic ankle injuries. Thirteen electronic databases of articles published from January 1980 to June 2012 were searched using the keywords 'ankle∗', 'robot∗', 'rehabilitat∗' or 'treat∗', and a free search in Google Scholar on the effects of ankle rehabilitation robots was also conducted. References listed in relevant publications were further screened. Eventually, twenty-nine articles focusing on the effects of robot-assisted ankle rehabilitation were selected for review.
Key findings & limitations: Twenty-nine studies met the inclusion criteria, with a total of 164 patients and 24 healthy subjects participating in the trials. Ankle performance and gait function were the main outcome measures used to assess the therapeutic effects of robot-assisted ankle rehabilitation. The protocols and therapy treatments varied, which made comparison among studies difficult or impossible. Few comparative trials were conducted among different devices or control strategies. Moreover, the majority of study designs met levels of evidence no higher than American Academy for Cerebral Palsy and Developmental Medicine (AACPDM) level IV; only one study used a randomized controlled trial (RCT) approach with the evidence level discussed.
Recommendations & future work: All the selected studies showed improvements in ankle performance or gait function after a period of robot-assisted ankle rehabilitation training. The most effective robot-assisted intervention cannot be determined owing to the lack of universal evaluation criteria for the various devices and control strategies. Future research should be based on universal evaluation criteria, which could determine the most effective method of intervention; it is also essential to conduct trials analysing the differences among devices and control strategies.
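The Kang and Agrawal entry bases part of its outcome measures on data logged by the robot itself: travel distance, average speed, and the success ratio of a given task. A sketch of how such session metrics might be derived from a walker's position log; the log format and function name are hypothetical, since the study does not specify them:

```python
import math


def session_metrics(positions, timestamps, task_results):
    """Compute walker-training metrics from one session log.

    positions    -- list of (x, y) walker positions in metres
    timestamps   -- matching timestamps in seconds
    task_results -- list of booleans, one per attempted task
    (Hypothetical log format for illustration only.)
    """
    # Travel distance: sum of straight-line segments between samples.
    distance = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
    duration = timestamps[-1] - timestamps[0]
    avg_speed = distance / duration if duration > 0 else 0.0
    success_ratio = sum(task_results) / len(task_results) if task_results else 0.0
    return {
        "distance_m": distance,
        "avg_speed_mps": avg_speed,
        "success_ratio": success_ratio,
    }


# Example session: three logged positions over 10 s, four task attempts.
log = session_metrics(
    positions=[(0.0, 0.0), (3.0, 4.0), (3.0, 10.0)],
    timestamps=[0.0, 5.0, 10.0],
    task_results=[True, True, False, True],
)
```

Clinical variables (disability and motor-function scales) would be recorded separately by therapists and combined with these robot-logged metrics in the analysis.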
7. Lee et al. [27] – Robot-assisted gait training (Lokomat) for spinal cord injury
Research methods: This review included 10 trials involving 502 participants in a meta-analysis. The acute robot-assisted gait training (RAGT) groups showed significantly greater improvements in gait distance, leg strength, and functional level of mobility and independence than the over-ground training (OGT) groups.
Key findings & limitations: Greater improvements in the RAGT groups (120 participants) were observed than in the group with no intervention. Thus, RAGT improves mobility-related outcomes to a greater degree than conventional OGT for patients with incomplete SCI, particularly during the acute stage. RAGT is a promising technique to restore functional walking and improve locomotor ability, which might enable SCI patients to maintain a healthy lifestyle and increase their level of physical activity.

8. Piau et al. [37] – SafeWalker® robotic walking aid
Study design: 20 participants with psychomotor disadaptation admitted to an academic rehabilitation ward were randomised to receive physiotherapist care supported by the SafeWalker® robotic walking aid or standard care only, for 10 days. SafeWalker® supports the body weight whilst securing postural stability without relying on upper-body strength or high cognitive demand. The primary outcome was the feasibility and acceptability of rehabilitation sessions at 5 and 10 days, based on (i) questionnaires completed by patient and physiotherapist, (ii) the number of steps performed during sessions, and (iii) replacement of a robotic session by a conventional one.
Key findings & limitations: The mean age of the participants was 85.2 years. The robotic procedure was significantly simpler according to participants; regarding acceptability, there were no differences between the two groups. During follow-up, no robotic session had to be replaced by a conventional session, and all participants who benefited from the programme completed the protocol.
9. Belda-Lois et al. [8] – CPWalker robotic platform for pediatric gait rehabilitation
Research methods: CPWalker consists of a smart walker with body-weight and autonomous locomotion support and an exoskeleton for joint motion support. CPWalker likewise enables strategies to improve postural control during walking. The integrated robotic platform provides a means of testing novel gait rehabilitation therapies in subjects with CP and similar motor disorders. Patient-tailored therapies were programmed in the device for evaluation in three children with spastic diplegia over 5 weeks.
Key findings & limitations: After ten sessions of personalized training with CPWalker, the children improved mean velocity (51.94 ± 41.97%), cadence (29.19 ± 33.36%) and step length (26.49 ± 19.58%) in each leg. Post-treatment 3D gait assessments yielded kinematic outcomes closer to normal values than the pre-treatment assessments.
Recommendations & future work: The results show the potential of the novel robotic platform to serve as a rehabilitation tool. The autonomous locomotion and impedance control enhanced the children's participation during therapy. Moreover, the participants' postural control improved substantially, which indicates the usefulness of an approach that promotes the patient's trunk control while the locomotion therapy is executed.

10. Li et al. [28] – Lower-limb robot-assisted training after total knee replacement
Research methods: The aim of this study was to explore the value of a lower-limb robot-assisted training system for gait rehabilitation after total knee replacement (TKR). A total of 60 patients with osteoarthritis of the knee were equally randomized into traditional and robot-assisted rehabilitation training groups within 1 week after TKR. All patients received 2 weeks of training.
Key findings & limitations: Hospital for Special Surgery (HSS) scores, knee kinesthesia grades, knee proprioception grades, functional ambulation category (FAC) scores, Berg balance scores, 10-m sitting-standing times, and 6-min walking distances were compared between the groups. The HSS score, Berg score, 10-m sitting-standing time, and 6-min walking distance of the robot-assisted training group were significantly higher than those of the control group (P < 0.05).
Recommendations & future work: Lower-limb robot-assisted rehabilitation training improves post-TKR patients' knee proprioception and stability more effectively than the traditional method. It improves patients' gait and symptoms, increases their walking speed, and prolongs their walking distances, which benefits their return to family and society.
Brain-computer interface (BCI) / brain-machine interface (BMI)

1. Tang et al. [48] – BCI-controlled robotic arm
Research methods: Brain-computer interfaces (BCIs) directly translate human thought into machine commands, providing a new and promising method for the rehabilitation of persons with disabilities. A BCI-actuated robotic arm is an effective rehabilitation route for patients with upper-limb disability. This paper proposed a method of combining electromyography (EMG) and electroencephalography (EEG) to control the manipulator. Specifically, EMG signals are collected from the human leg, and the leg movements are used to quickly and reliably select the joints that are currently activated, while the robot-arm joints are precisely controlled by a movement-imagination (MI) brain-computer interface. The use of two non-homologous signals distributes the burden on the brain and therefore reduces workload; in addition, the scheme allows two kinds of operation at the same time, making it flexible and efficient. An offline experiment was designed to construct the classifier and choose optimal parameters; in the online experiment, subjects were instructed to control the robot arm to move an object from one location to another.
Key findings & limitations: Three subjects participated in the experiment. The accuracy rates of the classifiers in the offline experiment exceeded 95%, and all subjects completed the online control.

2. Khan and Hong [23] – BMI-controlled quadcopter
Research methods: A hybrid electroencephalography-functional near-infrared spectroscopy (EEG-fNIRS) scheme is presented to decode eight active brain commands from the frontal brain region for a brain-computer interface. A total of eight commands are decoded: by fNIRS, positioned on the prefrontal cortex, and by EEG, around the frontal, parietal, and visual cortices. Mental arithmetic, mental counting, mental rotation, and word-formation tasks are decoded with fNIRS, for which the selected features for classification and command generation are the peak, minimum, and mean ∆HbO values within a 2-s moving window. In the case of EEG, two eye-blinks, three eye-blinks, and eye movement in the up/down and left/right directions are used for four-command generation; the features here are the number of peaks and the mean of the EEG signal during a 1-s window. The generated commands were tested on a quadcopter in an open space.
Key findings & limitations: An average accuracy of 75.6% was achieved with fNIRS for four-command decoding and 86% with EEG for the other four-command decoding. The testing results show the possibility of controlling a quadcopter online and in real time using eight commands from the prefrontal and frontal cortices via the proposed hybrid EEG-fNIRS interface.
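The Khan and Hong entry names concrete window-based features: peak, minimum, and mean ∆HbO over a 2-s fNIRS window, and the number of peaks plus the mean of the EEG signal over a 1-s window. A sketch of that feature extraction; the function names, the non-overlapping windowing, and the peak criterion (a local maximum above a threshold) are assumptions, since the entry does not specify them:

```python
import numpy as np


def fnirs_features(hbo, fs, window_s=2.0):
    """Peak, minimum and mean ΔHbO within consecutive moving windows,
    as in the feature set described for the fNIRS commands (2-s window)."""
    win = int(window_s * fs)
    feats = []
    for start in range(0, len(hbo) - win + 1, win):
        seg = hbo[start:start + win]
        feats.append((float(np.max(seg)), float(np.min(seg)), float(np.mean(seg))))
    return feats


def eeg_features(eeg, fs, window_s=1.0, thresh=1.0):
    """Number of peaks and mean of the EEG signal in a 1-s window.
    The peak criterion (local maximum above `thresh`) is an assumption."""
    win = int(window_s * fs)
    seg = eeg[:win]
    peaks = sum(
        1 for i in range(1, len(seg) - 1)
        if seg[i] > seg[i - 1] and seg[i] > seg[i + 1] and seg[i] > thresh
    )
    return peaks, float(np.mean(seg))


# Usage on synthetic signals (fs in Hz):
hbo_feats = fnirs_features(np.array([0., 1., 2., 3., -1., 0., 0., 0.]), fs=4)
eeg_peaks, eeg_mean = eeg_features(np.array([0., 2., 0., 3., 0., 0., 0., 0.]), fs=8)
```

Each feature tuple would then be fed to a classifier (the study uses linear discriminant analysis-style classification) to produce one of the eight commands.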
3. Belda-Lois et al. [7] – Gait rehabilitation after stroke: a top-down review
Research methods: This document reviews the techniques and therapies used in gait rehabilitation after stroke. It also examines the possible benefits of including assistive robotic devices and brain-computer interfaces in this field, according to a top-down approach in which rehabilitation is driven by neural plasticity. The methods reviewed comprise classical gait rehabilitation techniques (neurophysiological and motor-learning approaches), functional electrical stimulation (FES), robotic devices, and brain-computer interfaces (BCIs).
Key findings & limitations: Regarding classical rehabilitation techniques, there is insufficient evidence to state that any particular approach is more effective in promoting gait recovery than another. A combination of different rehabilitation strategies seems more effective than over-ground gait training alone. Robotic devices need further research to show their suitability for walking training and their effects on over-ground gait. The use of FES combined with different walking retraining strategies has been shown to improve hemiplegic gait.
Recommendations & future work: Reports on non-invasive BCIs for stroke recovery are limited to the rehabilitation of upper limbs; however, some works suggest that a common mechanism may influence upper- and lower-limb recovery simultaneously, independently of the limb chosen for the rehabilitation therapy.

4. Bellingham et al. [9] – Science for robotics and robotics for science
Research methods: Advances in robotics have extended human sensory experience, cognition, and physical abilities. Direct brain control has offered disabled individuals a possibility of restoring basic motor function. Soekadar et al. (4) give an example of how a non-invasive, hybrid electroencephalography- and electrooculography-based brain and neural hand exoskeleton can restore intuitive control of grasping motions for quadriplegic patients, allowing them to perform basic activities of daily living.
Key findings & limitations: As noted by H. Herr, an advisory board member of Science Robotics, "future technologies will not only compensate for human disability but will drive human capacities beyond innate physiological levels, enabling humans to perform a diverse set of tasks with both anthropomorphic and non-anthropomorphic extended bodies." Such augmentative technologies "will have a transformative influence on broad social, political, and economic spheres, affecting the future of sport, labor productivity, human longevity, and disability."
Journal articles on overall literature review

1. Bustamante Valles et al. [10] – Robotic rehabilitation (Robot Gym)
Research methods: A typical group of stroke patients was randomly allocated to an intervention group (n = 10) or a control group (n = 10). The intervention group received rehabilitation using the devices in the robot gym, whereas the control group received time-matched standard care. All study subjects underwent 24 two-hour therapy sessions over a period of 6-8 weeks. Several clinical assessment tests for the upper and lower extremities were used to evaluate motor function pre- and post-intervention, and a cost analysis was done to compare the cost-effectiveness of the two therapies.
Key findings & limitations: The Robot Gym consisted of low- and high-tech systems for upper- and lower-limb rehabilitation. The hypothesis was that the Robot Gym can provide a cost- and labour-efficient alternative for post-stroke rehabilitation while being as effective as, or more effective than, traditional physical and occupational therapy approaches. No statistically significant differences were found between the groups.
Recommendations & future work: The robot-gym therapy was more cost-effective than the traditional one-to-one therapy used during this study, in that it enabled therapists to train 1.5-6 times more patients for approximately the same cost in the long term.

2. Schröder et al. [42] – Repetitive gait training early after stroke
Research methods: Aim: to evaluate the feasibility of repetitive gait training within the first 3 months post-stroke and its effects on gait-specific outcomes. The PubMed, Web of Science, Cochrane Library, Rehab Data and PEDro databases were searched systematically. Randomized controlled trials were included to descriptively analyse feasibility and quantitatively investigate the effectiveness of repetitive gait training compared with conventional therapy; fifteen randomized controlled trials were included.
Key findings & limitations: Repetitive training can safely be provided through body-weight support and locomotor assistance from therapists or a robotic device. No difference in drop-out rates was reported despite the demanding nature of the intervention. The meta-analysis yielded significant, but small, effects on walking independence and endurance. Training with end-effector robots appears most effective; however, the mechanisms underlying these effects remain poorly understood.
Recommendations & future work: Robots enable a substantial, yet feasible, increase in the quantity of walking practice early post-stroke, which might enhance functional recovery.
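The pooled effects reported in meta-analyses such as the Schröder et al. entry are conventionally computed as inverse-variance weighted averages of per-study effect sizes (the fixed-effect model). A generic sketch of that pooling step; the numbers below are illustrative, not the review's data:

```python
import math


def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of study effect sizes.

    Each study is weighted by the reciprocal of its variance, so more
    precise studies contribute more to the pooled estimate.
    Returns (pooled estimate, standard error, 95% confidence interval).
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)  # 95% CI, normal approximation
    return pooled, se, ci


# Three hypothetical trials: standardized mean differences and their variances.
effect, se, ci = pooled_effect([0.30, 0.45, 0.20], [0.04, 0.09, 0.05])
```

A random-effects model (e.g. DerSimonian-Laird) would add a between-study variance term to each weight; the inverse-variance principle is the same.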
3. Chen et al. [13] – Home-based rehab technologies (Saebo Mobile Arm Support, Haptic Master, Hand Mentor Pro (HMP), Hand Mentor, and Myomo mPower 1000)
Research methods: This systematic review synthesizes current knowledge of technologies and human factors in home-based technologies for stroke rehabilitation. A systematic literature search was conducted in three electronic databases (IEEE, ACM, PubMed), including secondary citations from the search. Included articles used technological means to help stroke patients conduct rehabilitation at home, reported empirical studies that evaluated the technologies with patients in the home environment, and were published in English. The types of technology in the reviewed articles included games, telerehabilitation, robotic devices, virtual-reality devices, sensors, and tablets, and the merits and limitations of each type are presented.
Key findings & limitations: The search yielded 832 potentially relevant articles, of which 31 were included for in-depth analysis. Two main human factors in designing home-based technologies for stroke rehabilitation are derived: designing for engagement (including external and internal motivation) and designing for the home environment (including understanding the social context, practical challenges, and technical proficiency).
Recommendations & future work: This systematic review presents an overview of key technologies and human factors for designing home-based technologies for stroke rehabilitation. The robotic devices mainly aid movement of the arm, wrist, and hands to improve the active flexion and extension range of motion. Robotic devices automate the therapy procedure and generate a wide variety of forces and motions for training; another benefit is their ability to deliver a measurable and optimal dose and intensity for intensive therapy.

4. Law [26] – Social robotics in rehabilitation programmes in Hong Kong
The chapter argues that if assistive technologies only restore the biological and physical functioning of the disabled, they remain merely robots; when these technologies also manage to improve the social life of the disabled, they can turn into social robots. Following this line of thought, the chapter argues that rehabilitation programmes facilitating the use of assistive technologies in Hong Kong have difficulties in transforming assistive technologies into social robots, and it concludes by elaborating these difficulties at both the micro- and macro-level.

5. Agrawal [2] – The encyclopedia of medical robotics: rehabilitation robotics for infants and toddlers
The encyclopedia covers four distinct areas of medical robotics: minimally invasive surgical robotics, micro- and nano-robotics in medicine, image-guided surgical procedures and interventions, and rehabilitation robotics. Rehabilitation robotics is dedicated to the state of the art of an emerging interdisciplinary field where robotics, sensors, and feedback are used in novel ways to relearn, improve, or restore functional movements in humans. One section addresses an important emphasis in the field of medicine today that strives to bring rehabilitation out of the clinic and into the home environment, so that these medical aids are more readily available to users; it also covers the significant advances in and novel designs of soft actuators and wearable systems that have emerged for prosthetic lower limbs and ankles in recent years, which offer potential for both rehabilitation and human augmentation. Rehabilitation devices for the pediatric population are covered as well: these children's impairments are life-long, so rehabilitation robotics can have an even bigger impact over their lifespan, and in recent years a number of new developments have been made to promote mobility, socialization, and rehabilitation among the very young, the infants and toddlers.
6. Kim et al. [24] – Robotics for healthcare
Research methods: Since the first industrial robot was introduced in the 1960s, robotic technologies have helped extend the physical limits of human workers in terms of repeatability, safety, durability, and accuracy in many industrial factories. In the twenty-first century, robots are expected to be applied further in healthcare, which requires procedures that are objective, repetitive, robust and safe for users.
Key findings & limitations: The chapter focuses on research and clinical activities that have followed successful demonstrations of early pioneering robots such as the da Vinci telesurgical robot and the LOKOMAT training robot.
Recommendations & future work: First, major areas of healthcare robotics are categorized; second, robotics for surgical operating rooms is discussed; third, rehabilitation and assistive technologies are reviewed.
References
1. Abram HS (1970) The prosthetic man. Compr Psychiatry 11(5):475–481
2. Agrawal S (2018) The encyclopedia of medical robotics. World Scientific, Singapore, p 280. https://doi.org/10.1142/10770-vol4
3. Andriella A, GuillemMés AR, Joan FH, Carme T (2018) Deciding the different robot roles for patient cognitive training. Int J Hum Comput Stud 117:20–29. https://doi.org/10.1016/j.ijhcs.2018.03.004
4. Aziz ZA, Lee YY, Ngah BA, Sidek NN, Looi I, Hanip MR, Basri HB (2015) Acute stroke registry Malaysia, 2010–2014: results from the national neurology registry. J Stroke Cerebrovasc Dis 24. https://doi.org/10.1016/j.jstrokecerebrovasdis.2015.07.025
5. Babaiasl M, Hamed S, Poorya J, Mojtaba Y (2016) A review of technological and clinical aspects of robot-aided rehabilitation of upper-extremity after stroke. Disabil Rehabil Assist Technol 11(4):263–280. https://doi.org/10.3109/17483107.2014.1002539
6. Ballesteros J et al (2017) A biomimetical dynamic window approach to navigation for collaborative control. IEEE Trans Hum Mach Syst 47(6):1123–1133. https://doi.org/10.1109/THMS.2017.2700633
7. Belda-Lois JM et al (2011) Rehabilitation of gait after stroke: a review towards a top-down approach. J Neuroeng Rehabil 8(1):66. https://doi.org/10.1186/1743-0003-8-66
8. Belda-Lois JM et al (2016) Locomotor training through a novel robotic platform for gait rehabilitation in pediatric population: short report. J Neuroeng Rehabil 13(1):1–6. https://doi.org/10.1186/s12984-016-0206-x
9. Bellingham J et al (2016) Science for robotics and robotics for science. Sci Robot 1(1):eaal2099. https://doi.org/10.1126/scirobotics.aal2099
10. Bustamante Valles K et al (2016) Technology-assisted stroke rehabilitation in Mexico: a pilot randomized trial comparing traditional therapy to circuit training in a robot/technology-assisted therapy gym. J Neuroeng Rehabil 13(1):1–15. https://doi.org/10.1186/s12984-016-0190-1
11. Carrillo FM et al (2018) Physiotherapists' acceptance of a socially assistive robot in ongoing clinical deployment. In: Proceedings of 27th IEEE international symposium on Robot and Human Interactive Communication, RO-MAN'18, Nanjing, China, August 27–31, 2018, pp 850–855. https://doi.org/10.1109/ROMAN.2018.8525508
12. Chen C-Y, Huang Y-B, Tzu-Chi Lee C (2013) Epidemiology and disease burden of ischemic stroke in Taiwan. Int J Neurosci 123(10):724–731. https://doi.org/10.3109/00207454.2013.796552
13. Chen Y, Abel KT, Janecek JT, Chen Y, Zheng K, Cramer SC (2018) Home-based technologies for stroke rehabilitation: a systematic review. Int J Med Inform:11–22. https://doi.org/10.1016/j.ijmedinf.2018.12.001
14. Galli M et al (2016) Use of the gait profile score for the quantification of the effects of robot-assisted gait training in patients with Parkinson's disease. In: Proceedings of 2016 IEEE 2nd international forum on Research and Technologies for Society and Industry leveraging a better tomorrow, RTSI 2016, Bologna, Italy, September 7–9, 2016, pp 1–4. https://doi.org/10.1109/RTSI.2016.7740603
15. Galloway KC et al (2018) Soft robotic glove for combined assistance and rehabilitation during activities of daily living. In: The encyclopedia of medical robotics. World Scientific, pp 135–157. https://doi.org/10.1142/9789813232327_0006
16. Gnjatovic M et al (2018) Pilot corpus of child–robot interaction in therapeutic settings. In: Proceedings of 8th IEEE international conference on Cognitive Infocommunications, CogInfoCom 2018, Debrecen, Hungary, September 11–14, 2018, pp 253–258. https://doi.org/10.1109/CogInfoCom.2017.8268252
17. Gov.uk (2018) New figures show larger proportion of strokes in the middle aged. UK Government Health and Social Care. https://www.gov.uk/government/news/new-figures-show-larger-proportion-of-strokes-in-the-middle-aged. Accessed 10 Feb 2019
18. Hansen C et al (2017) Design-validation of a hand exoskeleton using musculoskeletal modeling. Appl Ergon 68:283–288. https://doi.org/10.1016/j.apergo.2017.11.015
19. Hughes CML et al (2016) Community-based neurorehabilitation in underserved populations. In: Proceedings of IEEE Global Humanitarian Technology Conference: technology for the benefit of humanity, GHTC 2016, Seattle, WA, October 13–16, 2016, pp 576–588. https://doi.org/10.1109/GHTC.2016.7857338
20. Islam MN et al (2013) Burden of stroke in Bangladesh. Int J Stroke 8(3):211–213. https://doi.org/10.1111/j.1747-4949.2012.00885.x
21. Jacobs JA (2014) In defense of disciplines: interdisciplinarity and specialization in the research university. Print, Oxford. https://doi.org/10.7208/chicago/9780226069463.001.0001
22. Kang J, Agrawal SK (2018) Robot-enhanced walkers for training of children with cerebral palsy: pilot studies. In: The encyclopedia of medical robotics. World Scientific, Singapore, pp 217–240. https://doi.org/10.1142/9789813232327_0009
23. Khan MJ, Hong K-S (2017) Hybrid EEG–fNIRS-based eight-command decoding for BCI: application to quadcopter control. Front Neurorobot 11:6. https://www.frontiersin.org/article/10.3389/fnbot.2017.00006
24. Kim J, Gu GM, Heo P (2016) Robotics for healthcare. In: Jo H et al (eds) Biomedical engineering: frontier research and converging technologies. Springer International Publishing, Cham, pp 489–509. https://doi.org/10.1007/978-3-319-21813-7_21
25. Kwakkel G, Kollen BJ, Krebs HI (2008) Effects of robot-assisted therapy on upper limb recovery after stroke: a systematic review. Neurorehabil Neural Repair 22(2):111–121
26. Law P (2015) Social robotics in health-care service: the case of rehabilitation programmes in Hong Kong. In: Vincent J et al (eds) Social robots from a human perspective. Springer International Publishing, Cham, pp 55–65. https://doi.org/10.1007/978-3-319-15672-9_5
27. Lee HJ et al (2017) Robot-assisted gait training (Lokomat) improves walking function and activity in people with spinal cord injury: a systematic review. J Neuroeng Rehabil 14(1):1–13. https://doi.org/10.1186/s12984-017-0232-3
28. Li J et al (2014) A pilot study of post-total knee replacement gait rehabilitation using lower limbs robot-assisted training system. Eur J Orthop Surg Traumatol 24(2):203–208. https://doi.org/10.1007/s00590-012-1159-9
29. Loo KW, Gan SH (2012) Burden of stroke in Malaysia. Int J Stroke 7(2):165–167. https://doi.org/10.1111/j.1747-4949.2011.00767.x
30. Miao Q et al (2018) Reviewing high-level control techniques on robot-assisted upper-limb rehabilitation. Adv Robot 32(24):1253–1268. https://doi.org/10.1080/01691864.2018.1546617
31. Naganuma M, Ohkubo E, Kato N (2017) Promotion of rehabilitation practice for elderly people using robotic pets. In: van Hoof J, Demiris G, Wouters EJM (eds) Handbook of smart homes, health care and well-being. Springer International Publishing, Cham, pp 543–554. https://doi.org/10.1007/978-3-319-01583-5_65
32. Nature (2017) Enabling interdisciplinary research. Nat Hum Behav 1(12):845. https://doi.org/10.1038/s41562-017-0272-5
33. Niyetkaliyev AS et al (2017) Review on design and control aspects of robotic shoulder rehabilitation orthoses. IEEE Trans Hum Mach Syst 47(6):1134–1145. https://doi.org/10.1109/THMS.2017.2700634
34. Ochoa-Guaraca M, Pulla-Sánchez D, Robles-Bykbaev V, López-Nores M, Carpio-Moreta M, García-Duque J (2017) A hybrid system based on robotic assistants and mobile applications to support in speech therapy for children with disabilities and communication disorders. Campus Virtuales 6(1):77–87
35. Ozaki K et al (2017) Training with a balance exercise assist robot is more effective than conventional training for frail older adults. Geriatr Gerontol Int 17(11):1982–1990. https://doi.org/10.1111/ggi.13009
36. Palermo E et al (2018) Translational effects of robot-mediated therapy in subacute stroke patients: an experimental evaluation of upper limb motor recovery. PeerJ 6:e5544. https://doi.org/10.7717/peerj.5544
37. Piau A et al (2019) Use of a robotic walking aid in rehabilitation to reduce fear of falling is feasible and acceptable from the end user's perspective: a randomised comparative study. Maturitas 120:40–46. https://doi.org/10.1016/j.maturitas.2018.11.008
38. QS (2019) QS intelligent unit: citations per faculty. http://www.iu.qs.com/university-rankings/indicator-citations-per-faculty/. Accessed 10 Feb 2019
39. Rachakorakit M, Charoensuk W (2017) Development of LeHab robot for human lower limb movement rehabilitation. In: BMEiCON 2017 – 10th biomedical engineering international conference, pp 1–5. https://doi.org/10.1109/BMEiCON.2017.8229148
40. REF (2014) Research excellence framework, the UK higher education institutions. https://www.ref.ac.uk/2014/about/guidance/citationdata. Accessed 10 Feb 2019
41. Regen (2019) Rehab programs. http://www.regen.rehab/index.php/what-is-rehabilitation/ourcare#rehabprogram. Accessed 25 Feb 2019
42. Schröder J, Truijen S, Van Criekinge TA, Saeys W (2019) Feasibility and effectiveness of repetitive gait training early after stroke: a systematic review and meta-analysis. J Rehabil Med 51(2):78–88
43. Scoones (2018) Breaking disciplinary silos to solve real-life puzzles. Institute of Development Studies, Ecosystem Services for Poverty Alleviation. https://www.espa.ac.uk/news-blogs/blog/breaking-disciplinary-silos-solve-real-life-puzzles. Accessed 10 Feb 2019
44. Shaik MM, Loo KW, Gan SH (2012) Burden of stroke in Nepal. Int J Stroke 7(6):517–520. https://doi.org/10.1111/j.1747-4949.2012.00799.x
45. Shamsuddin S et al (2017) Preliminary study on the use of therapeutic seal robot for patients with depression. In: Proceedings of IEEE 4th International Symposium on Robotics and Intelligent Sensors: empowering robots with smart sensors, IRIS 2016, Tokyo, Japan, December 17–20, 2016, pp 52–56. https://doi.org/10.1109/IRIS.2016.8066065
46. Stirling A (2015) Disciplinary dilemma: working across research silos is harder than it looks. Guardian Political Science Blog, 11th June 2014. https://doi.org/10.13140/RG.2.1.1919.3680
47. Tallis R (2011) Aping mankind: Neuromania, Darwinitis and the misrepresentation of humanity. Acumen, Durham
48. Tang J, Zhou Z, Yu Y (2017) A hybrid computer interface for robot arm control. In: Proceedings of 8th International Conference on Information Technology in Medicine and Education, ITME 2016, Fuzhou, China, December 23–25, 2016, pp 365–369. https://doi.org/10.1109/ITME.2016.0088
49. Tang Z, Yang H, Zhang L, Liu P (2018) Effect of shoulder angle variation on sEMG-based elbow joint angle estimation. Int J Ind Ergon 68:280–289
50. THE (2019) Times Higher Education World University Ranking 2019. https://www.timeshighereducation.com/sites/default/files/the_2019_world_university_rankings_methodology_pwc.pdf. Accessed 10 Feb 2019
51. The Joanna Briggs Institute (2017) The Joanna Briggs Institute critical appraisal tools for use in JBI systematic review: checklists for case reports. http://joannabriggs.org/assets/docs/critical-appraisaltools/JBI_Critical_Appraisal-Checklist_for_Case_Reports2017.pdf. Accessed 10 Feb 2019
52. Tomovic R (1971) Systems approach to skeletal control: concept of the system. Adv Electron Electron Phys 30(C):273–282
53. Tuisku O et al (2018) Robots do not replace a nurse with a beating heart. Inf Technol People 32(1):47–67. https://doi.org/10.1108/itp-06-2018-0277
54. UK Parliament (2019) Fourth Industrial Revolution, Education Committee, UK Parliament. https://www.parliament.uk/business/committees/committees-a-z/commons-select/education-committee/inquiries/parliament-2017/fourth-industrial-revolution-inquiry-17-19/
55. WEF (2016) The fourth industrial revolution: what it means, how to respond. World Economic Forum. https://www.weforum.org/agenda/2016/01/the-fourth-industrial-revolution-what-it-means-and-how-to-respond
56. Weick KE (1999) That's moving: theories that matter. J Manag Inq 8(2):134–142
57. Zhang M, Davies TC, Xie S (2013) Effectiveness of robot-assisted therapy on ankle rehabilitation – a systematic review. J Neuroeng Rehabil 10(1). https://doi.org/10.1186/1743-0003-10-30
2 Smart and Assistive Walker – ASBGo: Rehabilitation Robotics: A Smart Walker to Assist Ataxic Patients

Rui Moreira, Joana Alves, Ana Matias, and Cristina Santos
Abstract

Locomotion is an important human faculty that affects an individual's life, bringing not only physical and psychosocial implications but also heavy social-economic consequences. Thus, it becomes paramount to find means (augmentative/assistive devices) to empower the user's residual capacities and promote functional recovery. In this context, a smart walker (SW) is explored for further clinical evaluation of ataxic patients during walker-assisted gait and to serve as a functional compensation and assist-as-needed personalized/customized rehabilitation tool, autonomously adapting assistance to the users' needs through an innovative combination of real-time multimodal sensory information from SW built-in sensors. To meet the users' needs, its design was weighed considering for whom it is intended. Thereby, this paper presents the system overview, focusing on design considerations, mechanical structure (frame and main components) and electronic and mechatronic components, followed by its functionalities. Lastly, it presents results regarding the main functionalities, addressing clinical evidence.

Keywords: Smart Walker · Ataxia · Mobility assistance · Motor coordination

R. Moreira · J. Alves · C. Santos (*) Universidade do Minho, Guimarães, Portugal; Center for Microelectromechanical Systems (CMEMS) / Academic Clinical Center (2CA Braga, Braga Hospital), Guimarães, Portugal. e-mail: [email protected]

A. Matias, Hospital de Braga, Braga, Portugal; Departamento de Medicina Física e de Reabilitação, Braga, Portugal

© Springer Nature Switzerland AG 2019. J. S. Sequeira (ed.), Robotics in Healthcare, Advances in Experimental Medicine and Biology 1170, https://doi.org/10.1007/978-3-030-24230-5_2

2.1 Introduction

Locomotion/mobility plays an important role in the daily living activities of individuals [1]. However, different types of pathologies, such as poliomyelitis, degenerative joint diseases, spinal cord injuries, multiple sclerosis, traumas, musculoskeletal deformities and ataxia, greatly impair human mobility at different levels, provoking partial or complete loss of this faculty and consequently compromising the performance of daily tasks with ease [1, 2].

Mobility restrictions induce not only profound physical and psychosocial implications but also heavy social-economic consequences, which further jeopardize quality of life [2–5]: early retirement, increasing third-party and informal caregiver dependence costs and, in more serious cases, palliative care or institutionalization. Ataxia is one of the most startling mobility disorders, induced by focal neurological deficits in the cerebellum and afferent proprioceptive pathways, which are responsible for ensuring proper voluntary movement coordination and control [6]. The World Health Organization reports an annual global prevalence of 1.3–20.2 cases per 100,000 population, and in Spain the annual health cost per patient with spinocerebellar ataxia is estimated at around €18,776 [7]. Additionally, lesions in the cerebellum can result in postural sway and backward balance reactions, making those subjects extremely prone to falls and fall-related injuries, which greatly jeopardizes their independence and exacerbates the social-economic consequences [8]. In fact, 73.6% and 74% of the spinocerebellar ataxic patients involved in a 1-year study reported fall occurrences and fall-related injuries, respectively [9].

There is neither a pharmacological nor a surgical solution that can fully reverse the motor disturbances, making Physical Medicine and Rehabilitation (PMR) interventions key to promoting functional recovery in ataxic patients [6, 10, 11]. Intensive coordinative physiotherapeutic training is supreme, harnessing functional residual capacities through stimulation of the motor learning capabilities of the cerebellum [12]. Rehabilitation programmes found in the current literature, including balance, coordination and postural reaction training, occupational therapy and hydrotherapy, have shown good results in terms of muscle strengthening, increased physical resilience, postural stability and motor coordination [11]. Recently, techniques such as transcranial magnetic stimulation, virtual reality, biofeedback (e.g. the PhysioSensing platform), treadmill exercises with supported bodyweight and torso weighting have also shown potential [11].
Thus, researchers have been addressing the need for therapeutic alternatives, including the selection and prescription of assistive/augmentative devices to provide adequate functional compensation and recovery [5]. Augmentative devices
aim to empower the user's natural means of locomotion, taking advantage of the remaining motor capabilities. In this sense, emphasis has been placed on conventional walkers as a promising solution for ambulatory rehabilitation, since they support bipedalism and dynamic gait stability. Conventional walkers promote an increase of the base of support and a decrease in the weight-bearing on the lower limbs [13, 14], relegating to a secondary role alternative devices such as wheelchairs or canes/crutches, which are strongly discouraged for their limited lower-limb weight support and lateral stability (base of support) [13]. Negative experiences (e.g. falls) arising from cane/crutch use often cause patients to adopt a more sedentary life, not walking as much as recommended, which progressively deteriorates their motor condition and promotes the onset of other diseases (e.g. diabetes mellitus). The mechanical structure of conventional walkers simultaneously promotes muscle strengthening and increased physical resilience, since it relieves weight-bearing on the lower limbs and compensates the decline of postural stability of ataxic patients through a wider base of support [13]. Conventional walkers are classified according to the type of support with the ground: (1) standard, with rubber tips, which must be lifted off the ground and placed forward while walking; (2) front-wheeled, for patients who have difficulty lifting a standard walker; and (3) four-wheeled, which promotes the most natural gait patterns and can be used if the patient does not rely on the walker to bear weight [5, 15]. Nevertheless, prescription of a walker implies taking into account not only the patient's locomotion deficits but also cognitive or sensory impairments [16].
This way, deficits related to dexterity, cognitive ability, motor coordination and maneuverability of conventional walkers still provoke destabilization of biomechanical forces, leading to a lack of balance and potential falls [13]. As a result, many patients prefer not to use any walking aid, progressively deteriorating their motor condition and aggravating the social-economic consequences [16]. In this sense, to address this problem, the incorporation of robotic technologies has started to emerge, in
order to achieve a multifunctional powertrain smart walker, which promotes a better service by combining safety, ease of use and low cognitive requirements, allowing patients to fully concentrate on their gait rehabilitation.

Recently, several multifunctional powertrain smart walkers have been developed. Even so, the majority of those do not assemble a comprehensive set of functionalities in a single smart walker that could serve as a generic functional compensation and recovery tool [17]. Further, they often promote early onset of fatigue, impose a higher cognitive workload and often result in destabilization of biomechanical forces, since most of them deprive the user of an assist-as-needed experience tailored to each user's needs and motion intentions towards assisted living environments [18]. Thus, they deprive the user of an assist-as-needed experience, demand high cognitive effort, promote early onset of fatigue and often provoke destabilization of biomechanical forces, leading to falls [5].

This chapter describes a new PMR intervention with a smart walker – ASBGo∗ (Smart Walker for Mobility Assistance and monitoring System Aid) – towards efficient rehabilitation, promoting ambulatory daily exercises that enhance the quality of life of people in general and of cerebellar ataxic patients in particular. The prototype includes distinct functionalities so as to be adjustable to different users' needs. Among those are physical support; autonomous guidance by interpreting the surrounding environment; manual guidance through an advanced human-machine interface designed to extract users' intentions and modulate the SW behavior accordingly [19]; and, lastly, user's state monitoring to track the patient's motor condition throughout a set of rehabilitation sessions.

In the following sections, a brief literature review of SWs is presented. In Sect.
2.3, an overview of the ASBGo∗ prototype is presented, extending from its mechanical and electrical features to its design and ergonomics considerations. Next, Sect. 2.4 presents the results obtained from the validation of the ASBGo∗ functionalities, culminating with the presentation and discussion of the hospital trial results for each
functionality proposed. Lastly, Sect. 2.5 presents the major conclusions drawn.
2.2 Related Work
Bateni et al. report that most conventional walkers are neither tailored nor recommended for individuals with mobility dysfunctions combined with visual and cognitive impairments, nor for patients suffering from balance issues, including ataxic patients [20]. The emergence of SWs seeks to fill the gaps of conventional walkers, but their design presents unique challenges to researchers in this area, related to the specific demands of people with impaired mobility and balance. To the authors' knowledge, current smart walker-based systems have not yet addressed the required adaptation of the device to the patient's disorder [21, 22]. To successfully design a walker, it is paramount to look beyond motor disabilities and take into consideration users' cognitive or sensory impairments as well. Additionally, usability issues including safety, comfort, simplicity of use and low cognitive demand must not be discarded, in order to attain a patient-oriented system design [16]. Thus, SWs can be classified according to their main functionalities: (1) physical support, (2) manual guidance, (3) autonomous and shared-control guidance, and (4) security and status monitoring of the user.
2.2.1 Physical Support

Physical support functionality in a SW addresses mechanical enhancements to the standard four-wheeled walker, in order to ensure dynamic and static stability. Its mechanical architecture must be adaptable to different needs and requirements. For instance, the Simbiosis, i-Walker, Yoshihiro et al. and JARoW (JAIST Active Robotic Walker) devices provide physical support while ambulating through forearm support platforms, in contrast to conventional handlebars [23, 24]. Forearm supports are also recommended for
patients with hand and/or wrist pathology [25]. As a result, muscle fatigue and weight-bearing on the lower-extremity joints (e.g. knee, ankle and/or hip) are severely diminished [4, 15, 26]. In addition, forearm supports provide an increased degree of support and stability, although they do not allow correct centralization and verticalization of the user's trunk when there is no height-adjustment mechanism. Adjustment of the device's height and width must therefore be taken into consideration. As an example, Ye et al. proposed a SW with a width-adjustable base mechanism, based on an electric cylinder that stretches out or contracts the rods in spacious environments or narrower spaces, respectively [27]. On the other hand, the i-go walker presents a height-adjustable handle mechanism [28]. Sit-to-stand support is equally essential in a walker, as is the case with the Chugo Takese et al. mobile walker and Walkmate [27–29]. ASBGo∗ seeks to fill a gap in the literature by including a wooden table with a curvature in the removable trunk contact area, which allows postural control, increased stability and decreased tremors and dysmetria in patients with ataxia. The ASBGo walker also includes a system similar to [27–29], through electric lifting columns and the support provided by the wooden table, assisting patients in the transition from sitting to standing.
2.2.2 Manual Guidance

Another research concern relates to manual guidance, for which a user-friendly, low-cognitive-effort human-machine interface, adaptable to different levels of physical and mental ability [32, 33], is vital to promote adequate maneuverability. These interfaces must be able to anticipate the user's command intentions in order to drive the device accordingly and effectively contribute to the user's rehabilitation training. Since SW guidance is controlled by the user without any feedback from the controller to aid in decision-making, this guidance mode merely provides power assistance for patients who are cognitively able to make
command decisions but lack proper control over their walking velocity and gait pattern. This mode is only recommended for patients without cognitive or visual limitations who simultaneously possess the motor coordination and strength to manipulate the handlebar. To this end, some sensory modalities are frequently implemented to establish human-machine interfaces relying on physical interaction [34]. Several robotic walkers, including Morris et al. [21], GUIDO [24], Walkmate [31] and the RMP (Robotic Mobility Platform) [35], explored force sensors coupled to the handlebars of the device, with users' intentions transmitted through physical interaction with ease. Force signals are converted into guidance commands through filtering and classification strategies [10]. Simbiosis [36] also explored a 3D upper-body force interaction concept through force sensors installed under the forearm supports [36]. Forearm supports promote a better posture and stability during gait [37]. Additionally, the i-go Walker and Ye et al. also exploit upper-body force interaction for manual control [28]. More recently, Huang et al. exploited a Lasso model and PCA algorithms to establish a mapping between the forces measured by piezoresistive force sensors and user intentions, and then explored a fuzzy-neural network controller to predict the SW motion accordingly. An alternative low-cost approach is based on a joystick mechanically coupled with a spring in the SW handlebars, whose movement follows the user's manipulation. Nonetheless, interfaces with joysticks integrated into handlebars present unreliable behavior (delay and hysteresis) due to walking vibration. Switches, buttons and touch screens have also been explored, but all of these sensory interface modalities translate into discrete and unnatural movements [22, 33, 34, 37] and, particularly, demand a high mental workload and may cause confusion.
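As an illustration of how force-sensing handlebars can be converted into motion commands, the sketch below low-pass filters two handle force readings and maps their sum and difference to linear and angular velocity. All gains, limits and the sensor layout are hypothetical; this is not the controller of ASBGo∗ or of any of the cited walkers.

```python
# Minimal sketch of a force-based guidance interface: hypothetical gains
# and sensor layout, not the controller of ASBGo or any cited walker.
# Forward-push forces at the two handles are low-pass filtered; their sum
# drives linear velocity and their left/right asymmetry drives turning.

class ForceGuidance:
    def __init__(self, k_lin=0.01, k_ang=0.02, alpha=0.2,
                 v_max=0.8, w_max=0.6, deadband=2.0):
        self.k_lin, self.k_ang = k_lin, k_ang    # admittance gains
        self.alpha = alpha                       # low-pass smoothing factor
        self.v_max, self.w_max = v_max, w_max    # command limits (m/s, rad/s)
        self.deadband = deadband                 # ignore small forces (N)
        self._fl = self._fr = 0.0                # filtered handle forces

    def update(self, f_left, f_right):
        # First-order low-pass filter suppresses gait-induced vibration
        self._fl += self.alpha * (f_left - self._fl)
        self._fr += self.alpha * (f_right - self._fr)
        push = self._fl + self._fr               # total forward push (N)
        steer = self._fr - self._fl              # turning asymmetry (N)
        if abs(push) < self.deadband:
            push = 0.0
        v = max(-self.v_max, min(self.v_max, self.k_lin * push))
        w = max(-self.w_max, min(self.w_max, self.k_ang * steer))
        return v, w                              # velocity command

g = ForceGuidance()
for _ in range(50):                              # steady 30 N push per handle
    v, w = g.update(30.0, 30.0)
# symmetric push converges to straight-ahead motion: v ≈ 0.6 m/s, w = 0
```

The deadband and low-pass filter stand in for the filtering step that the cited works apply before classifying force signals into guidance commands.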
Recently, interfaces using voice communication have emerged for visually impaired users, as a bilateral communication tool for transferring effective high-level commands [33]. Yet, verbal instructions are not optimal, considering voice recognition variability and restrictions in the presence of noisy environments. Given the concerns still posed by this topic, interfaces independent of physical interaction, including ultrasonic and laser sensors, sought to actively control the walker motion based on gait pattern recognition, which imposes a low mental workload [37]. Yet, evidence on generating adequate motion from pathological gait patterns is still scarce. Similarly, ASBGo∗ integrates a manual guidance functionality, but it seeks to accomplish a manual guidance approach adequate for ataxic individuals, through an intuitive, user-friendly and easy-to-handle handlebar capable of transmitting the intentions of the user.

2.2.3 Autonomous and Shared-Control Guidance

Deficits in spatial orientation and wayfinding within unfamiliar and familiar environments are known to deteriorate with age [38]. In this sense, incorporating a smart spatial orientation functionality in the SW may be highly beneficial to promote mobility in individuals suffering from the cognitive disturbance, sensory degradation and loss of memory associated with neurodegenerative diseases such as Alzheimer's or Parkinson's [10, 38]. In addition, from a rehabilitation point of view, an autonomous guidance functionality is paramount for a SW to provide safe navigational decisions and collision avoidance [39], consequently allowing the patient to fully concentrate on their gait pattern and mobility. This feature is also known as sensorial and cognitive assistance, since this type of interface does not require the user to produce any specific command to generate walker motion, being particularly suitable for individuals with visual or cognitive dysfunctions [10]. In particular, the PAMM-AID walker takes advantage of sensorial information to guide blind people [17]. Autonomous navigation is accomplished either by exploring ultrasonic, vision or infrared sensors for real-time detection and obstacle avoidance [40], or by following predetermined paths in indoor settings to reach a certain location specified in the map [41]. Following a predefined map is ideal for indoor environments, particularly homes or clinical settings, whereas obstacle negotiation is recommended for open-space and dynamic environments [10].

Several robotic walkers integrate cognitive and sensorial assistance, including [39]. JARoW, Ye et al., PAMM-AID (Personal Adaptive Mobility AID), the i-Walker developed by Cortés et al., i-Go and the Huang et al. walker provide cognitive assistance [17]. Huang et al. proposed a CO-Operative Locomotion Aide (COOL Aide) SW directed at safety, predicting falls and adjusting the SW motion to accommodate them [17]. As for Cortés et al. and the PAMM SW, their studies address safety by incorporating inclinometers, which can identify positive and negative slopes and adjust the velocity to ensure the SW does not uncontrollably accelerate downwards and provoke falls [17]. Ye et al. [27] present a walking aid robot embedded with a width-changeable obstacle avoidance system to adapt to different environments. The robotic walker has ultrasonic sensors placed around it to detect the surrounding range, leading the rods of the rear casters to contract when passing through narrow spaces and to stretch in spacious environments [27].

Other robotic walker examples exploring sonar, infrared sensors, wheel encoders and laser range finder data for obstacle avoidance are MARC [42] and the RT walker [43]. They also entail a control system to guide the user along a predefined path delineated according to a map. Other relevant examples of SWs with path-planning control systems are CAIROW (Context-aware Assisted Interactive Robotic Walker) [44] and the Omni RT walker-II (ORTW-II) [45]. In addition, interaction often occurs through visual, auditory or haptic feedback, to provide instructions or alerts to the user regarding the environment [10].

Some research works conducted with SWs have also adopted the shared-control concept. SWs exploring simultaneously a navigation system (autonomous) and a user-interaction system demand a shared-control system to determine whether the user or the SW yields control, since their intentions often misalign. The COOL Aide passive SW [32] sought to accomplish a shared navigational control strategy based on real-time obstacle negotiation and user interaction for propulsion. Cortés et al. and the PAMM Smart Walker also include a shared adaptive controller to determine whether the machine or the user holds control of the motion; yet, its use is limited to specialized indoor environments. Recently, research works conducted with smart walkers have adopted preliminary strategies to attempt autonomous guidance based on implicit user intentions inferred from human gait patterns [33], which implies a paradigm change from force-sensing interfaces to vision-based ones [39]. Nonetheless, neither of these studies considers the deficits of the conscious proprioceptive sensory system present in ataxia, which cause ataxic gait patterns to be peculiar and unpredictable, compromising the walker navigation strategies based on lower-limb patterns that they explore. Similarly to the systems presented, ASBGo∗ includes sonar, camera and laser range finder sensors for navigating and interpreting the user's intentions during assisted ambulation. Novelty arises from closed-loop assist-as-needed motion control strategies able to adapt autonomously, through an innovative combination of real-time multimodal sensory information from SW built-in sensors to predict users' motion intentions, without imposing a cognitive load.
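The shared-control idea of deciding whether the user or the walker holds authority can be sketched as a simple proximity-weighted blend of the two velocity commands. The blending rule and thresholds below are invented for illustration; they are not the COOL Aide, PAMM or ASBGo∗ strategy.

```python
# Illustrative shared-control blend (an invented rule, not the COOL Aide,
# PAMM or ASBGo strategy): the user's velocity command and the autonomous
# planner's command are mixed with a weight that shifts toward the planner
# as obstacles get closer.

def blend_commands(v_user, v_auto, obstacle_dist, d_safe=0.5, d_free=2.0):
    """Full user authority beyond d_free (m), full autonomous authority
    inside d_safe (m), linear hand-over in between."""
    if obstacle_dist >= d_free:
        k = 1.0                          # user in full control
    elif obstacle_dist <= d_safe:
        k = 0.0                          # walker takes over
    else:
        k = (obstacle_dist - d_safe) / (d_free - d_safe)
    return k * v_user + (1.0 - k) * v_auto

far = blend_commands(0.8, 0.2, 3.0)      # open space -> returns 0.8 (user)
near = blend_commands(0.8, 0.2, 0.3)     # obstacle close -> returns 0.2 (auto)
```

A continuous hand-over like this avoids the abrupt switches of a binary "user or machine" arbitration, which is one motivation the shared-control literature gives for adaptive controllers.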
2.2.4 User's State Monitoring and Security

Through interactive and embedded SW sensors, quantitative, continuous and relevant features regarding user activity can be extracted. These features can assist clinicians in tracking patients' status throughout a set of rehabilitation sessions and actively contribute to treatment decision-making, consequently allowing rehabilitation programmes to be customized [10]. The final aim would be to alleviate the burden on physiotherapists by reducing direct assistance, since it is
unfeasible to continuously monitor patients' rehabilitation interventions, and self-assessments by patients are often unreliable. In addition, this also opens the possibility of ambulatory user activity monitoring [10]. Accurate recognition of human locomotion implies a well-established real-time lower- and upper-limb tracking algorithm, potentially based on statistical methods/heuristic rules or machine learning methods.

Initially, some authors hypothesized that, by analyzing the distribution of the weight load applied over the handlebars, inferences could be made regarding the gait cycle, since those cyclic load changes reflect it. Once those load changes are determined, the corresponding gait features can be readily identified. Results have reported peaks in the vertical direction to be related to heel initial contact, and peaks in the forward direction to the toe-off event [31, 46, 47]. As an example, Alwan et al. [22] extracted gait features, including heel strikes and toe-off events, as well as double-support and right/left single-support phases, using only two 6-DOF load cells. Another study exploring force interaction (direct interaction) between the user's upper body and the walker is Abellanas et al., which extracts cadence based on a Weighted Frequency Fourier Linear Combiner and precisely identifies heel-strike (HS) and toe-off (TO) events. The HS and TO estimation methods present mean errors of 1.35% and 0.55% with respect to the gait cycle; for continuous cadence estimation, the mean square error is below 3.3 steps per minute. Given these gait features, not only can the user's state be inferred, but a motion control strategy can also be adopted, promoting safety. Extraction of gait features based on force sensors imposes a limitation, since weight load merely provides temporal patterns. Hence, a paradigm transition occurred, going from direct interaction (force sensors) to indirect interaction approaches.
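The load-based gait-feature idea can be illustrated with a toy sketch: a synthetic vertical handlebar load with one peak per step, from which heel-strike candidates and cadence are recovered by simple peak counting. This is deliberately much simpler than the cited Weighted Frequency Fourier Linear Combiner, and all numbers are invented.

```python
import math

# Toy illustration of the load-based gait-feature idea: a synthetic
# vertical handlebar load oscillating once per step, from which
# heel-strike events and cadence are recovered by peak counting.
# All numbers are invented; the cited works use more robust methods
# (e.g. a Weighted Frequency Fourier Linear Combiner).

def detect_peaks(signal, threshold):
    """Indices of local maxima above threshold (candidate heel strikes)."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]]

fs = 100.0                               # sampling rate (Hz)
step_freq = 1.8                          # steps per second (108 steps/min)
n = int(10 * fs)                         # 10 s of data
load = [50 + 10 * math.sin(2 * math.pi * step_freq * i / fs)
        for i in range(n)]               # vertical load (N), one peak/step

peaks = detect_peaks(load, threshold=55)
cadence = len(peaks) / (n / fs) * 60     # steps per minute
# for this synthetic signal: 18 peaks in 10 s -> cadence of 108 steps/min
```

Real handlebar signals are far noisier than this sinusoid, which is why the literature applies adaptive filtering before event detection.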
Exploring on-board SW sensors to acquire clinical insight by tracking feet and leg trajectories has proven promising, since a robotic walker, beyond being primarily an assistive locomotion device, is also capable of stimulating the active participation of patients in functional recovery (rehabilitation) [10].
By means of lower-limb segment tracking with sonar sensors [4], accelerometers, laser range finder (LRF) sensors [48], infrared sensors [33] or cameras [49], it is possible to extract surrogate markers of incipient disease manifestation or identify abnormal gait patterns associated with disease progression [50], in order to properly customize/adjust a treatment or rehabilitation programme. In addition, tracking the lower limbs not only offers the ability to compute gait parameters but also addresses safety measures. The autonomous and shared-control guidance section described safety measures adopted to ensure safe interaction and navigation of the user within the surrounding environment through obstacle avoidance (e.g. ultrasound). Recently, by analyzing the human-SW interaction instead of the interaction of the SW with the surrounding environment, some works have adopted distinct preliminary safety measures. For instance, the CAIROW robotic walker is equipped with a laser range finder to recognize human locomotion, and simultaneously adopts a safety strategy by adjusting the walker velocity according to the distance measured between the walker and the user's legs [44]. Another robotic walker that explores lower-limb tracking to simultaneously extract gait features and promote safety is Frizera et al. [4]. Based on a direct transmission technique between two sonar transmitters placed on each leg and one sonar receiver coupled to the walker frame, it is feasible to extract gait features. Moreover, the information obtained can be used to modulate the velocity of the device's motors so as to prevent potential falls [4]. Wu et al. [51] is another robotic walker study, exploring an ultrasonic receiver and transmitters tied to each leg, making it feasible to infer the position and orientation of the user's feet based on straightforward ultrasonic wave propagation math.
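The "straightforward ultrasonic wave propagation math" and the distance-based velocity modulation just described can be sketched as follows. The thresholds and the one-way transmitter(leg)-to-receiver(frame) geometry are invented for illustration and are not taken from the cited systems.

```python
# Sketch of the ultrasonic leg-localization and velocity-modulation ideas
# above. Thresholds and the one-way transmitter(leg)-to-receiver(frame)
# geometry are invented for illustration, not taken from the cited systems.

SPEED_OF_SOUND = 343.0                   # m/s in air at ~20 °C

def tof_to_distance(time_of_flight):
    """One-way time of flight -> leg-to-frame distance in metres."""
    return SPEED_OF_SOUND * time_of_flight

def modulate_velocity(leg_dist, v_nominal=0.6, d_near=0.3, d_far=0.7):
    """Stop if the user is dangerously close to the frame; slow down when
    the legs lag far behind; otherwise keep the nominal speed."""
    if leg_dist < d_near:
        return 0.0                       # too close: stop the walker
    if leg_dist > d_far:
        return 0.5 * v_nominal           # user lagging: slow down
    return v_nominal                     # comfortable band: full speed

d = tof_to_distance(1.5e-3)              # a 1.5 ms flight -> about 0.51 m
v = modulate_velocity(d)                 # inside the band -> nominal speed
```

With two receivers on the frame, the same time-of-flight distances can be triangulated into a 2D leg position, which is essentially the propagation math the text refers to.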
Given the 3D feet pose, the authors propose a motion control strategy accordingly, to promote safety. Subsequently, studies exploring infrared sensors and laser range finders have been widely addressed. For instance, the RT-Walker [52] is also equipped with an LRF and performs an estimation of the
kinematics of a 7-link human model. The model is only used to estimate the position of the user's centre of gravity (CoG) in 3D; the LRF acquires the position of the knees with respect to the walker. Although the model has been tested in a real-world environment, no real walker users have tested this system. In addition, the JAIST active robotic walker (JARoW) tackles the challenge of accomplishing a natural user interface between a user and JARoW for ambulatory application. It proposes a particle-filtered interface function to estimate and predict the location of lower-body segments based on a pair of rotating infrared and laser range finder inputs [33]. The feedback motion control strategy acts according to the lower-body segment locations to adapt the intended motion (e.g. velocity). Yet, its potential application is still limited, considering the issues reported due to human gait variability [48]. Up to now, infrared sensor and LRF systems have been widely explored, but relying on these sensors often leads to false detections, resulting in an impracticable algorithm; Pallejà et al. [53] reported miscalculations of spatiotemporal gait parameters. Recently, the paradigm for 3D feet pose tracking has evolved to technologies using depth/RGB images [33]. As an example, Hu et al. [49] employ a probabilistic approach based on particle filtering to obtain an accurate 3D pose estimation of the user's lower-limb segments by exploring depth sensor data along the coronal plane (Kinect), with the sensor coupled to the lower part of a four-wheeled walker. Nonetheless, the position errors reported by the authors (less than 60 mm) are considerably bigger than the ones obtained with markers (27 mm) with VICON [54]. Similarly, Chung Lim et al. [55] propose a markerless gait tracking analysis system based on a depth image sensor onboard the robotic walker, exploring the previously mentioned probabilistic approach as well. Lastly, Joly et al.
[56] propose a standard four-wheeled walker with a Kinect coupled in the axial plane for biomechanical gait analysis through feet position estimation during assisted walking, based on the same probabilistic approach of [49]. The authors reported orientation errors to be less than 15°.
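The particle-filtering approach these tracking systems share can be illustrated with a toy 2D version: particles diffuse under a random-walk motion model, are weighted by a Gaussian likelihood of each noisy position detection, and are resampled. The noise parameters and motion model are invented; the cited systems estimate full 3D limb poses from depth images.

```python
import math
import random

# Toy 2-D particle filter for tracking one foot from noisy position
# detections (an illustrative sketch of the cited probabilistic approach,
# not the actual implementations; all parameters are invented).

def pf_step(particles, z, motion_noise=0.02, meas_noise=0.05):
    """One predict / update / resample cycle; particles are (x, y) tuples."""
    # Predict: random-walk motion model
    particles = [(x + random.gauss(0, motion_noise),
                  y + random.gauss(0, motion_noise)) for x, y in particles]
    # Update: Gaussian likelihood of the measurement z = (zx, zy)
    zx, zy = z
    w = [math.exp(-((x - zx) ** 2 + (y - zy) ** 2) / (2 * meas_noise ** 2))
         for x, y in particles]
    total = sum(w)
    if total == 0.0:                     # degenerate case: uniform weights
        w, total = [1.0] * len(particles), float(len(particles))
    w = [wi / total for wi in w]
    # Resample proportionally to the weights
    return random.choices(particles, weights=w, k=len(particles))

def pf_estimate(particles):
    n = len(particles)
    return (sum(x for x, _ in particles) / n,
            sum(y for _, y in particles) / n)

random.seed(42)
particles = [(random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5))
             for _ in range(400)]
for k in range(30):                      # foot drifting slowly forward in x
    true_pos = (0.01 * k, 0.0)
    z = (true_pos[0] + random.gauss(0, 0.05),
         true_pos[1] + random.gauss(0, 0.05))
    particles = pf_step(particles, z)
est_x, est_y = pf_estimate(particles)
# the estimate ends up near the final true position (0.29, 0.0)
```

The gait-variability problem noted for ataxic patients shows up here as a mismatch between the assumed motion model and the true foot trajectory, which is one reason these filters degrade on pathological gait.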
Concerns still remain around vision-based capture systems for biomechanical analysis, the major one being human gait variability, which is particularly high for ataxic patients. Neither of these studies addresses pathological gait patterns. ASBGo tackles innovation through the combination of real-time multimodal sensory information from SW built-in sensors (e.g. camera, infrared, force sensors, etc.) for proper pathological human locomotion recognition.
2.3 ASBGo∗ Smart Walker
A Smart Walker is intended to be a device that can act as a versatile rehabilitation and functional compensation tool. It should adapt to the necessities of its user, and its use should be safe. Patients present different necessities according to their intrinsic characteristics, their disorder and their therapies; in order to help them, a Smart Walker should provide different functionalities. The creation and development of a medical device such as this one must take into account for whom it is intended (the end user), which imposes crucial characteristics and limitations on the development of the final prototype. Therefore, it is important that a list of goals is specified before any other point of prototype creation is set. The first goal is to guarantee the safety of the device for its user: the walker should be robust and reliable in order to reduce any risk of injury to a minimum. The second goal is the attractiveness of the device, which means that it has to be economical and comfortable. Another goal is multifunctionality: the walker should be adjustable to the user and able to address various problems, such as being motorized and helping its user in various tasks (e.g. sitting down on and standing up from a chair). Also, the SW's design must be suitable for its aim of use, i.e. as a functional compensation and rehabilitation tool; thus, the device must have an ergonomic design that can provide the necessary support for the patient's treatment. Finally, its use must be practical, and it must be easy to transport, store and adjust.
This section presents the project in general, focusing on design considerations, the mechanical structure (frame and main components), the walker's system, the electronic and mechatronic components, including the sensory system, and finally its functionalities as a gait assessment tool.
2.3.1 System Overview
The ASBGo (Assistance and monitoring system aid) is a four-wheeled motorized walker developed to provide safety, natural maneuverability and a certain degree of intelligence, assisting the user through multiple sensors. Fig. 2.1 presents the main functions proposed for the SW: structure, motor connection, sensor location, adjustments, and extra-help components. Each main function has several sub-functions that were considered throughout the project, so that several design options could be weighed and the most reliable one chosen for the final prototype. In the design process, the first design proposal is subject to evaluation against the goals, analysis, refinement and development. Sometimes the analysis and evaluation reveal fundamental flaws and needed improvements, in which case the initial concept has to be abandoned, a new concept generated, and the cycle started again. The ASBGo project was no exception: three prototypes were developed before the final version, and their evolution is presented in Fig. 2.2.

The initial version of the ASBGo was built as a proof of concept, to verify some requirements and functions. The structure was rudimentary, needing improvements, and made of iron, which is very heavy. Sensor locations were tested, but the components were fixed and more adjustable positions were required.

2 Smart and Assistive Walker – ASBGo: Rehabilitation Robotics: A Smart–Walker to Assist Ataxic Patients

Fig. 2.1 Main functions proposed for the SmartW prototype. The dark boxes represent the final decisions

Fig. 2.2 ASBGo prototype iterations. From left to right: the first, second and third models

The second version was designed with a circular-tube base and a parallel structure, so it could pass through any environment (elevators, doors, etc.) and occupy a small area for easy storage. However, this latter characteristic turned out to be a poor option for its users, since most of them present a gait with a wide base of support and tripped over the walker structure. The motorization system of this SmartW was the same as in the first prototype. This second prototype was tested with different patients, and important modifications were defined for the third prototype. In the third prototype, a more robust and stable structure was manufactured to give the user a greater sense of confidence and safety. In the base structure, oval tubes were designed and, instead of being parallel, they were angled at 10° to each side. This prototype was used intensively in clinical trials with gait-disabled patients. During that time, flaws were identified, such as buckling of the handlebar's length-adjustment system, low robustness of the height-adjustment system, poor forearm ergonomics, and a support base that was not wide enough for users with disordered coordination between trunk and legs and dysmetria, who constantly hit the lower frame of the walker (i.e. the box compartment and tube structure). After extensive field research and several discussions with the medical staff and physiotherapists
of the Hospital of Braga and their patients, it was possible to conclude that walker users, especially those with ataxia and cerebellar lesions, tend to have a wider gait base of support. Another aspect observed in some patients was asymmetry of support on the walker: they tend to favour one of the arms and therefore walk off-center, forcing one of the upper limbs and creating an incorrect and harmful posture. Therefore, a fourth model (Fig. 2.3), ASBGo∗, was needed, with improvements to the mechanical, electronic and software architecture. This device should integrate all the sensors embedded in the previous prototypes and be based on a modular SW architecture, so that an engineer can easily integrate new functionalities, operating modes and sensors, and make any necessary mechanical and electronic modifications. Above all, this device was specially designed with the rehabilitation treatment of patients with ataxia in mind. For example, an abdominal surface with a curvature in the contact
R. Moreira et al.
area with the user was added in the fourth prototype to center the user and correct his/her posture, independently of his/her anatomy. This surface was built of wood, a cheap and attractive material. Most of the SW weight (electronics and heavy components) was placed in the lower part to reduce the risk of instability and provide better overall equilibrium. An electric lifting system with a load capacity of 800 N, less prone to buckling, was installed. Some extra-help components were also added: to give patients more autonomy and safety, two bars with handles were mounted on the back of the walker to assist the sit-to-stand transition. Finally, the box compartment that accommodates all the electronics was designed for good aesthetics and functionality, with a structure that enables the wide support base ataxic patients need. In summary, the ASBGo∗ has a mechanical structure that allows the installation of motors, sensors and other electronic components. The functionalities and characteristics of the device are presented in Table 2.1. ASBGo∗ has four
Fig. 2.3 ASBGo∗ system overview: (a) mechanical frame and its subsystems and (b) fourth prototype
Table 2.1 ASBGo∗ characteristics and functionalities

ASBGo∗ Smart Walker
  Type of device:                Motorized robotic walker
  End-user:                      Gait-disabled people (ataxic), locomotion pathologies, elderly
  Key functionalities:           Intention recognition, adaptation to the user, navigation, gait assessment tool
  Modes:                         Manual, multitasking game, biofeedback, remote control
  Physical interaction:          Handlebar, GUI
  Steering:                      Motorized rear wheels
  Indirect interaction:          Load cells (strain gauges), infrared sensor, LRF, URF, RealSense camera, IMU
  Safety:                        Fall detection, posture assessment tool, harness, navigation, emergency system (button and visual warnings)
  Communication and programming: STM platform, RTOS
wheels and a supporting structure that holds the user. Its front casters can rotate freely. Two motors drive its right and left rear wheels independently, and each rear wheel is fitted with an encoder. For rehabilitation purposes, the SW must provide the adequate physical stability and safety required in early-stage treatments, and be able to aid the patient's progression as users become more independent in handling the walker. The configuration of the handles provides adequate stability levels and may also be used for human-machine interaction, such as detecting the user's movement intentions. Thus, the ASBGo∗ walker design provides a handlebar, fitted with potentiometers, as a direct interface, plus laser and ultrasonic range finders to be used for navigation and autonomous operation. For safety, force sensors were installed in the forearm supports and an infrared sensor below the wooden support. Also, in order to monitor the patient and work as a gait assessment tool, the SW has an IMU and two RealSense cameras that collect data on gait parameters, stability, equilibrium and posture during the therapy session. All these features will be presented in detail in the following sections.
As mentioned, the fourth design iteration proposed mechanical and design improvements and upgrades over the third ASBGo prototype. Electronic and software improvements were also considered. In this step, it was fundamental to ensure the successful implementation of the electronics from previous prototypes; to have easy access to the various electronic components; to offer easy and intuitive use; and to implement a unified modular system architecture with robust and user-friendly solutions, in order to engineer a trustworthy device that establishes a new rehabilitation concept. The achieved steps were the integration of embedded sensors; the development of a modular software architecture (Fig. 2.4) capable of merging all the different algorithms, which assures the device's autonomy and enhances the user's monitoring and assessment; and the implementation of user-friendly, robust solutions capable of working without failure. The software architecture was built using ROS, a robotic software platform, whose generic tools, emphasis on modular development, documentation, message-passing infrastructure and positive evolution proved to be an asset for the new system. The system is centered on an Intel NUC computer which runs the Main Controller of the system on top of the ROS layer. The Low-Level Controller takes care of the low-level part of the ASBGo∗ and is built upon a Real-Time Operating System (FreeRTOS), a program that schedules execution in a timely manner, manages system resources, and provides a solid base for developing application code in a multitasking environment. Gathering the main functional requirements for the modular software and hardware of a robotic smart device, the new architecture was outlined; it is realized by the embedded sensors and their interaction through well-designed algorithms:

1. Acquire data from a set of sensors in real time;
2. Monitor the battery's voltage;
Fig. 2.4 ASBGo∗ Smart Walker system architecture
3. Stop in case of emergency;
4. Control walker motion based on the user's physical manipulation of a handlebar, or a remote control;
5. Monitor the patient's lower-limb motion (gait parameters) in real time;
6. Monitor the patient's posture and balance in real time;
7. Ensure safe movement and provide error alerts (e.g. processing-unit failure, motor errors, sensor-acquisition communication failure);
8. Provide biofeedback through a local graphical user interface;
9. Achieve intuitive human-walker interaction without demanding cognitive effort;
10. Include a database containing the clinical information of patients and their sessions, enabling the monitoring of each user's progress.

Sensors Data Acquisition and Motors' Control are included in Block 1; Block 2 corresponds to Computer Vision for both the lower limbs and posture; Block 3, through a USB communication, is used for the assistive autonomous navigation functionality; and Block 4, with different types of communication, holds the local interface GUI (monitor) and the remote control. The blue shaded squares represent the modules directly connected to the Main Controller. Next, all the sensors and actuators in each block will be presented along with their
role in the system. First, it is important to describe the mechanical frame and the main mechanical components that shelter all the electronics.
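The modular, message-passing organization described above can be illustrated with a minimal publish/subscribe sketch. Plain Python stands in for the actual ROS layer, and all topic and module names here are illustrative assumptions, not the project's real interfaces:

```python
# Toy message bus: modules subscribe to topics, producers publish to them,
# mimicking the decoupled Block 1..4 structure around the Main Controller.
from collections import defaultdict

class Bus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self._subs[topic]:
            cb(msg)

bus = Bus()
log = []

# Block 1 (low level) publishes handlebar readings; the Main Controller
# converts them into a velocity command consumed by the motor module.
bus.subscribe("handlebar", lambda msg: bus.publish(
    "cmd_vel", {"v": msg["forward"], "w": msg["turn"]}))
bus.subscribe("cmd_vel", lambda msg: log.append(msg))

bus.publish("handlebar", {"forward": 0.3, "turn": -0.1})
print(log)  # [{'v': 0.3, 'w': -0.1}]
```

The point of such a design, as in ROS itself, is that a new sensor or operating mode can be added by subscribing to existing topics without modifying the other modules.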
2.3.2 Mechanical Frame and Main Components
This document is based on the idea that a SW should provide support whenever required, and that it should be an easy-to-use device and a monitoring and rehabilitation tool, presenting the following features: (i) provide dynamic support – whenever the user is walking, standing or sitting, the walker should provide relatively stable support for the user to recover from losing balance; (ii) demand little or no effort to use – i.e. to move and change direction; (iii) be user-friendly – the movement speed and direction are controlled by the user subconsciously, not requiring special training. To achieve all the listed features, it is important to have a structural frame that withstands not only all the robotic components (sensors, computers, hardware) but also the patient and his/her body weight. This section therefore describes the mechanical frame and the main components of the ASBGo∗ SW. The structure is divided into three main parts: the lower section, the middle section and the top section. The lower section is identified in Fig. 2.5 as points one and two (orange and green shaded
Fig. 2.5 Identification of ASBGo∗ mechanical frame main parts
parts), the middle section is solely point three (blue shaded part), and the top section is identified by points four, five and six (pink, yellow and grey shaded parts, respectively).
2.3.2.1 Low Section
The SW base systems are mainly concerned with the balance and stability of the device's frame. The system must be secure and stable, never endangering the user's health and physical state. This challenge was addressed by relying on significant weight to lend stability to the system, placing the electronics and hardware in the lower section of the structure. The structure (1) is composed of oval tubes whose shape provides the necessary space, especially width, for the mid-stance phase of gait in patients with ataxia and cerebellar lesions [57]. Two 24 V DC motors, with a nominal speed of 40 rpm and a nominal torque of 5 Nm, drive the right and left rear wheels independently. The front wheels are smaller than the rear wheels, enabling finer maneuvering control, since a smaller wheel has a smaller perimeter and thus covers less ground per revolution. The front wheels are also casters and rotate freely, further improving the device's maneuverability. The box (2) stores the electronics and hardware. Its lower base houses two 12 V rechargeable batteries with a capacity of 36 Ah. The right side comprises the hardware (Low-Level Controller) and the entire sensor-acquisition system of Block 1 of the system architecture; on the left side are the motor drivers and the power plug connector.
2.3.2.2 Middle Section
The middle section (3) mainly comprises the two support frames for the active depth-sensor cameras and the electric lifting columns. The lower support frame is directed at the user's lower limbs to acquire gait parameters, while the upper support frame holds the sensor for posture and equilibrium assessment. The electric lifting columns can support 800 N of vertical load and provide a stroke length of 0.650 m. This system is used both to adapt the SW to the user's height, providing comfortable maneuverability, and to assist the user in sit-to-stand moments.
2.3.2.3 Top Section
The three points that comprise the top section are the top frame (4), the handlebar (5) and the handlebar-and-monitor frame (6). The top frame is mostly the structure that supports all the top components, as well as the connection to the middle and thus lower sections. At the mechanical level, physical support is provided by a wooden table on which patients can sustain the major part of their weight. This gives the patients a real sensation of security, as they have a large surface to grab on to, increasing their support and reducing the risk of falling. Moreover, the wooden table has an abdominal surface with a curvature in the contact area with the user, to center him/her and correct his/her posture independently of his/her anatomy. Besides that, the wooden table has two comfortable and ergonomic forearm supports, with foam filling and length and width adjustment via a Velcro system. These supports can also be integrated with sensors, as detailed in the next section. A simple normally-closed emergency button was also installed on the top section. When pressed, it opens the circuit, stopping the ASBGo∗ SW in the middle of its use in case of emergency events: falls, system failure and/or uncontrolled guidance by the patient. It is essential to be able to anticipate the user's intentions so that the walker can proceed accordingly, and to verify whether the device is effectively helping the user in his/her rehabilitation training. To do so, an interface must be developed that establishes a bridge of interaction between user and walker. This interface should be able to adapt to users with different levels of physical and cognitive capacity, in a user-friendly, natural and transparent manner that is not demanding at the cognitive level.
Thus, these interfaces have to be able to read and interpret the user's command intentions in order to drive the device accordingly. Many types of interfaces have been used in smart walkers, as seen in the state-of-the-art section. Despite all the advances in the user-walker interaction field, there are still many
unsolved questions and key areas in determining user-friendly and efficient interfaces. In particular, no interface was found in the literature with a user-oriented design, that is, an intuitive device demanding low cognitive effort from patients, such as ataxic individuals, and capable of coping with low motor coordination. To acquire the user's commands, the proposed handlebar needs two-axis sensors to detect the forward and turning forces [19]. These forces are detected with two commercial potentiometers embedded into the handlebar: a linear potentiometer (0–10 kΩ, linear) to detect directional changes and a rotary potentiometer (0–470 kΩ, linear) to detect changes in forward speed. With this system, the user can intuitively manipulate the SW at his/her own pace. The SW interprets these two basic motions and controls the motors' speed and direction accordingly. Walking backwards is not allowed. Since abduction of the wrists should not exceed 20° and the rotary potentiometer has a 300° range, two mechanical battens were integrated to limit the rotation movement. The translational movement, in turn, is limited by springs placed at the center of the axle tube, with the trade-off of not forcing flexion/extension of the users' wrists. In addition, the handlebar is balanced, i.e. when not actuated by the user it returns to its zero position, which corresponds to the device being stopped. This balance is extremely important for user safety, since the users are mostly people with physical weakness. Part number six is the sheet metal that supports the handlebar's frame and holds the monitor to the top section structure. The monitor is placed at a tilt angle of 25° to give the user an ergonomic view of the GUI.
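The handlebar-to-velocity mapping can be sketched as follows. The chapter specifies only that rotation sets forward speed (counterclockwise to accelerate), lateral displacement sets turning, rotation is mechanically limited, and backward motion is not allowed; the scale factors, deadband and speed limits below are illustrative assumptions:

```python
# Hedged sketch: map the two handlebar potentiometer readings to the
# walker's linear (v) and angular (w) velocity commands.
def handlebar_to_velocity(rotation_deg, displacement, v_max=0.6, w_max=0.5):
    """rotation_deg: rotary pot angle, counterclockwise positive (degrees);
    displacement: linear pot reading in [-1, 1], right positive."""
    DEADBAND_DEG = 2.0          # ignore tiny rotations (sensor noise)
    ROTATION_LIMIT_DEG = 20.0   # mechanical battens limit the rotation

    rot = max(-ROTATION_LIMIT_DEG, min(ROTATION_LIMIT_DEG, rotation_deg))
    if abs(rot) < DEADBAND_DEG:
        rot = 0.0
    # Counterclockwise rotation increases speed; never drive backwards.
    v = max(0.0, rot / ROTATION_LIMIT_DEG) * v_max
    w = max(-1.0, min(1.0, displacement)) * w_max
    return v, w

print(handlebar_to_velocity(10.0, -0.5))  # (0.3, -0.25): half speed, turning
print(handlebar_to_velocity(-15.0, 0.0))  # (0.0, 0.0): clockwise, no reverse
```

Clamping `v` at zero encodes the "no walking backwards" rule, and the deadband keeps the walker stopped when the balanced handlebar sits near its zero position.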
2.3.3 Sensors and Actuators
The final version of the ASBGo walker (Fig. 2.3) integrates multiple sensors and other electronic components that give it different functionalities and characteristics. We now present
the structure of the sensory system as well as the different modules and their relations. The smart walker collects and processes information from a series of embedded sensors, allowing it to understand the surrounding environment, infer the user's intentions and act accordingly. Block 1 – Sensors Data Acquisition and Motors' Control – is, as referred previously, mainly housed in the box compartment. An STM32F4 Discovery development board implements the low-level control, i.e. sensor data acquisition, data management and motor actuation. Besides that, it handles emergency events and monitors the battery's voltage, checking whether it is in a critical (discharged) state. The application runs on FreeRTOS. The linear and angular potentiometers acquire the user's commands from the handlebar and thus control the angular and linear velocities of the SW, respectively. This information is processed and sent to the motors, which actuate the device accordingly. The velocity and traveled distance are calculated with two encoders, one coupled to each rear wheel. The strain gauges (load cells) and the infrared sensor are mainly used to monitor the risk of falling, by detecting possible falls, instabilities or moments of imbalance. The uni-axial force sensors (one in each forearm support) detect whether the patient is correctly supported; a load on the walker that is too heavy, or null, may mean that the subject is in a dangerous situation. The infrared sensor, located beneath the abdominal surface table, measures the distance between the device and the patient; the user may be too close to or too far from the walker, either of which indicates a dangerous situation. Depending on the detected state, the walker performs a different action.
Each state therefore provides enough information for the walker to make a decision. This multi-sensor system meets one of the ASBGo∗ project's main goals: to ensure user safety, monitor the different states of a person, and extract patterns and behaviors of the user during assisted gait with the SW.
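The state decision combining the forearm load cells and the infrared distance sensor can be sketched as a simple classifier. All threshold values below are illustrative assumptions, not the calibrated clinical values:

```python
# Hedged sketch of the safety-state decision: total forearm load and
# walker-to-patient infrared distance each map to a danger condition.
def safety_state(left_load_n, right_load_n, ir_distance_m,
                 load_min=20.0, load_max=400.0, d_min=0.15, d_max=0.45):
    total = left_load_n + right_load_n
    if total < load_min:
        return "DANGER: no support detected (possible fall)"
    if total > load_max:
        return "DANGER: excessive load on walker"
    if ir_distance_m < d_min:
        return "DANGER: user too close to walker"
    if ir_distance_m > d_max:
        return "DANGER: user too far from walker"
    return "OK"

print(safety_state(80.0, 90.0, 0.30))  # OK
print(safety_state(5.0, 3.0, 0.30))    # no-support danger state
```

Each returned state would then trigger a different walker action (alert, slow-down or emergency stop), as the text describes.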
Besides this monitoring system, the safety mode is also characterized by a warning system that alerts to the presence of obstacles in front of the walker. This is done with nine sonar sensors distributed over the box compartment in a three-layer configuration to maximize the detection area. A low ring of six forward-oriented sonars detects the majority of ordinary obstacles, such as people, walls or other low obstacles. High obstacles, such as tables or shelves, are detected by a high ring of two sonars pointing upwards. These eight sonars are meant specifically for obstacle avoidance. An extra sonar pointing downwards detects stairs; it does not contribute to the obstacle-avoidance task, but stops the walker when changes in the ground, such as stairs or holes, are detected. Results demonstrated that the sonar configuration mounted on the SW successfully detected several types of hospital obstacles, including dynamic ones [58]. When all the sonar sensors measure a distance greater than a predefined minimum, no alert is given; when an obstacle comes closer than this predefined distance, a sound/visual alert is activated to warn the patient that there is an obstacle near the SW. Another important requirement of a SW is the possibility of performing clinical evaluation during walker-assisted gait. The final sensor integrated in Block 1, the IMU (Inertial Measurement Unit), is mostly used to indicate the stability of the user in terms of the position of his/her center of mass (COM), giving posture and balance information. The sensor is placed on the patient's trunk and measures the COM's displacement. This parameter can also be interpreted as a means of detecting fall-risk situations: if the body is unstable, the probability of falling increases substantially [59, 60].
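The sonar warning logic can be sketched as follows: the eight forward sonars (six low plus two high) trigger an obstacle alert below a predefined distance, while the single downward sonar stops the walker when the ground "drops away" (stairs or holes). The threshold values are illustrative assumptions:

```python
# Hedged sketch of the sonar warning system on the box compartment.
def sonar_check(avoid_ranges_m, down_range_m,
                alert_dist=0.8, floor_dist=0.10, floor_tol=0.05):
    """avoid_ranges_m: 8 readings from the obstacle-avoidance sonars;
    down_range_m: reading from the downward-pointing stair sonar."""
    obstacle_alert = any(r < alert_dist for r in avoid_ranges_m)
    # A downward reading much larger than the nominal floor distance
    # indicates stairs or a hole: stop immediately.
    emergency_stop = down_range_m > floor_dist + floor_tol
    return obstacle_alert, emergency_stop

clear = [2.0] * 8
print(sonar_check(clear, 0.10))             # (False, False)
print(sonar_check([2.0]*7 + [0.5], 0.10))   # (True, False): obstacle alert
print(sonar_check(clear, 0.60))             # (False, True): stairs detected
```

Note the asymmetry: obstacle alerts fire on *short* forward ranges, while the stair stop fires on an unexpectedly *long* downward range.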
Clinical gait analysis is the process by which quantitative information is collected to aid in understanding the etiology of gait abnormalities and in treatment decision-making. Advances in robotics made it possible to integrate a gait analysis tool on a walker to enrich the existing rehabilitation tests with new sets of objective gait
parameters. Further, these systems allow evaluating the evolution of some disorders and enhance diagnostics in ambulatory conditions. The team behind this study developed a feet-detection method to estimate feet position during assisted gait, and implemented an application to extract the patient's upper-body motion and assess balance, posture and the risk of falling. More details will be presented in Sect. 2.3.4, along with the corresponding integrated biofeedback application. One active depth sensor (ADS), localized on the lower support frame of the SW's middle section, at a height of 60 mm and a distance of 40 mm from the patient, tracks the feet to provide the position and orientation of the feet center. Another ADS is placed, also in the middle section, on the upper support frame, pointing upwards at the user's trunk and shoulders. This last sensor is only used when the user is driven by the walker (remote or autonomous maneuverability) while supporting his/her weight on the handle grips at the back of the ASBGo∗ SW; the wooden table is provisionally removed to give the sensor visibility of the user's upper body. All this sensory system is integrated in Block 2 – Computer Vision. Path modulation and generation are classical issues in SW navigation architectures, since the system's intentions are taken into account to compute the final locomotion command used in gait training sessions. Block 3 thus corresponds to autonomous navigation and integrates a Hokuyo URG-04LX-UG01 scanning laser rangefinder, a small and accurate laser scanner with a detectable range of 200 mm to 5.6 m. It scans a 240° area with a 0.36° angular resolution and a scanning time of 100 ms/scan. This autonomous navigation mode allows the user or physiotherapist to define the desired position coordinates while the SW guides itself through the environment.
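Converting such a laser scan into Cartesian points in the walker frame is the usual first step of a navigation pipeline. The sketch below uses the sensor's published geometry (240° arc, 0.36° steps, 0.2–5.6 m range); the frame convention (x forward, y left) and beam-indexing details are our assumptions:

```python
# Hedged sketch: polar laser-scan readings -> Cartesian points.
import math

FOV_DEG = 240.0
STEP_DEG = 0.36
N_BEAMS = int(FOV_DEG / STEP_DEG) + 1   # 667 beams per scan

def scan_to_points(ranges_m, r_min=0.2, r_max=5.6):
    """ranges_m: list of N_BEAMS range readings; out-of-range values are
    skipped. Returns (x, y) points, x forward and y left of the sensor."""
    points = []
    for i, r in enumerate(ranges_m):
        if not (r_min <= r <= r_max):
            continue
        angle = math.radians(-FOV_DEG / 2 + i * STEP_DEG)
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# One valid beam roughly straight ahead (center of the scan):
scan = [0.0] * N_BEAMS
scan[N_BEAMS // 2] = 1.0
print(scan_to_points(scan))  # one point near (1.0, 0.0)
```

The resulting point cloud is what an obstacle-avoidance or path-planning module would consume.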
The local Graphical User Interface (GUI) provides means to configure locomotion-related parameters, perform rehabilitation sessions, check the state of the device, and configure and run application-level features. The QML language was used for the design of this interface, which is included in Block 4 – User Interaction. This user-SW interaction will be further discussed in Sect. 2.3.4. Finally, as for the remote controller: if the patient is not able to drive the device using the handlebar, the medical specialist can drive it remotely; the buttons are also convenient for starting or stopping the system. All the data collected from the sensory system is continuously stored in an embedded database, indexed by patient ID. The information is gathered and organized in folders according to the patient and the date and time of the gait training session. The following sections discuss the functionalities, maneuverability modes and main functions.
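The per-patient, per-session folder organization can be sketched as below. The chapter states only that data is grouped by patient ID and by session date/time; the exact directory scheme, root name and timestamp format are our assumptions:

```python
# Hedged sketch of the session-storage layout for collected sensor data.
from datetime import datetime
from pathlib import PurePosixPath

def session_folder(patient_id, start, root="asbgo_data"):
    """Build a per-session folder path, e.g. <root>/<patient>/<date_time>."""
    stamp = start.strftime("%Y-%m-%d_%H-%M")
    return PurePosixPath(root) / patient_id / stamp

p = session_folder("P042", datetime(2019, 3, 21, 15, 4))
print(p)  # asbgo_data/P042/2019-03-21_15-04
```

Keying folders by patient ID and session timestamp makes it straightforward to retrieve a patient's history and chart progress across sessions.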
2.3.4 Functionalities
As mentioned previously, the ASBGo∗ SW supports clinical evaluation during walker-assisted gait and biomechanical data collection, and offers real interactive applications such as real-time biofeedback and a multitasking game. Besides acting as a gait assessment tool, the ASBGo∗ SW enables smooth and secure maneuverability to support the patient in a dynamic and intuitive rehabilitation. With such functionalities, ASBGo is a versatile, adaptive and safe rehabilitation and functional compensation device for patients with mobility problems for whom its use is prescribed. Versatile, since it can be used by a variety of patients who present mobility difficulties associated with other personal limitations (such as visual and/or cognitive problems). Adaptive, since the parameters of the control systems (such as minimum and maximum speeds) can be adjusted to the physical limitations of the patient. Safe, because the structure of the presented SmartW was developed with a design that provides more stable movement and safety for the patient. The ASBGo∗ SW functionalities and operation modes are therefore structured as shown in Fig. 2.6, through the main modules: Maneuverability and Clinical Evaluation, which comprises the Gait and Posture Assessment Tool, Biofeedback and Multitasking. All these features
Fig. 2.6 ASBGo∗ SW functionalities main modules
are integrated in a safety-module system, with patient-oriented considerations. The red arrows correspond to direct interaction and parallel use between the four main modules. The blue dashed arrows indicate the intrinsic topics of each module. Finally, the black dashed arrow shows the dependency of the Biofeedback on the Clinical Evaluation data. Each of the listed modules will be described in turn.

2.3.4.1 Maneuverability
The main goal of the developed SW (ASBGo∗) is the rehabilitation and functional compensation of patients with mobility and balance problems. Since patients can present different types of difficulties and disorders associated with locomotion, the SW has to be adaptable to these different limitations. Thus, through three modes of maneuverability, the operation of ASBGo∗ can be adapted to the difficulties of the patient, providing safer, more comfortable and more efficient rehabilitation.

The manual operation consists in a double-task training and is recommended for patients with the visual and cognitive capabilities and enough motor coordination to conduct the walker independently under the guidance of the commands defined on the handlebar. In this way, the patient is responsible for supervising the ASBGo∗ movement while avoiding the obstacles in front of the smart walker, thus constituting a well-synchronized double-task training. After placing the hands on the two handgrips, the user acts on them according to the command he/she wants to perform: start walking, accelerate, slow down, or turn left or right. Thus, if the user intends to: (1) increase the walking speed, he/she has to turn the handlebar counterclockwise; (2) decrease the walking speed, he/she has to turn the handlebar clockwise; (3) turn to the right, he/she must move the handlebar to the right side; (4) turn to the left, he/she must move the handlebar to the left side.

Safety is a feature always present in any of the modules. For example, during manual operation the user is supported by the abdominal wooden table and the forearm supports. The latter have embedded load cells that acquire data on the applied load; if the load on the walker is too heavy, or null, this may mean that the subject is in a dangerous situation. It is also possible to assess and correct the patient's posture according to the values given by each sensor and provide a visual message, so that the patient shifts his/her load to the least supported body side. In a similar way, the infrared sensor below the wooden support measures the distance between the device and the patient, and thus monitors the risk of falling.
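The posture-correction feedback derived from the two forearm load cells can be sketched as a symmetry check: when support is markedly asymmetric, the GUI tells the patient which side to load more. The 20% asymmetry threshold is an illustrative assumption:

```python
# Hedged sketch of the load-symmetry posture message shown to the patient.
def posture_message(left_load_n, right_load_n, threshold=0.20):
    total = left_load_n + right_load_n
    if total <= 0:
        return "no support detected"
    asymmetry = (left_load_n - right_load_n) / total
    if asymmetry > threshold:
        return "shift weight to the RIGHT forearm"
    if asymmetry < -threshold:
        return "shift weight to the LEFT forearm"
    return "support is balanced"

print(posture_message(120.0, 60.0))  # shift weight to the RIGHT forearm
print(posture_message(90.0, 85.0))   # support is balanced
```

Normalizing the left–right difference by the total load makes the check independent of how heavily the patient leans on the walker overall.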
The other mode of maneuverability is remote control, developed to allow the physiotherapist to monitor the user's behavior and control the velocity and orientation of the SW accordingly. In this mode, the physiotherapist can analyze the patient's behavior, compensations and reactions to sudden changes in speed and orientation, while defining the commands that control the SW's movement. In addition, it allows the patient to focus on his/her gait pattern and balance rather than on guiding the SW. The use of this mode is advised in parallel with the biofeedback, to reinforce the correction of, and the patient's focus on, his/her locomotion and feet position. With such visual information, the patient has the opportunity to auto-correct his/her movements, becoming aware of the problem and solving it naturally. In this mode, the patient has the option of either supporting himself/herself on the forearm supports or using the handle grips at the back of the walker. To reduce the physiotherapist's effort in monitoring the patient's behavior, letting him/her focus solely on the patient's locomotion rehabilitation, an autonomous navigation mode is proposed. The autonomous mode allows the user or physiotherapist to define a target location for the walker [61]. Neither the patient nor the physiotherapist has control over this mode; decisions are made by the walker itself. This operation mode is suitable for patients who have visual and/or cognitive limitations, or who cannot control the SW manually due to weakness or lack of upper- and lower-limb coordination. This mode is intended to complement the SmartWalker ASBGo∗ with lower-cognitive-load maneuverability than the previous modes, taking into account the needs of the patient using it. The functions of this mode are:

1. Have a map of the environment, in which the user should be able to choose a destination;
2. With an endpoint selected by the user, the walker should be capable of calculating a route from its initial position, i.e. the location where the walker stood when the destination was selected;
R. Moreira et al.
3. With a route planned, the walker should be capable of guiding the patient through that path while: avoiding obstacles, such as walls, plants and the like; not disturbing the surrounding people; and taking the intention of the user into consideration.
During ASBGo∗ SW maneuverability, in any chosen mode, and taking into account the safety of the user, the walker will alert about or avoid obstacles that appear along the way. The monitoring of the environment is characterized by a warning system that alerts to the presence of obstacles in front of the SW; this is done through the nine sonar sensors placed on the front of the box compartment, as discussed in Sect. 2.1 (System Overview) and Sect. 2.3 (Sensors and Actuators).
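The chapter does not detail the planner used for the autonomous mode; as a hypothetical sketch, a route on an occupancy-grid map can be computed with a breadth-first search (the map, the grid resolution and the planner itself are illustrative assumptions, not the ASBGo∗ implementation):

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search on an occupancy grid.
    grid[r][c] == 1 marks an obstacle (wall, plant, person);
    returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    frontier = deque([start])
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:          # walk back to the start
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # destination unreachable

# A toy 4x4 map with a wall segment; the user picks (3, 3) as destination.
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
route = plan_route(grid, (0, 0), (3, 3))
```

In a real deployment the route would of course be replanned continuously as the sonars detect new obstacles and as the user's intention is taken into account.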
2.3.4.2 Clinical Evaluation A SmartW is not only a device to support and guide its user. It should also have the functionality of evaluating the recovery of its user. In order to properly diagnose and follow rehabilitation with the use of a walker, a gait assessment system has to be accurate but also affordable, to reduce unequal access to health care and to improve clinical follow-up (i.e. to allow it to be used in physiotherapists'/physicians' offices). Similarly, the gait assessment system should be portable and adaptable to the majority of walkers. The system should be contactless, to be usable in the daily routine, improve comfort and decrease the time of analysis. If the system enables real-time analysis, data can be used directly during the consultation by the physician and eventually at home, for motivational purposes or for monitoring the quality of walking (to predict any forthcoming deterioration of a user's gait). Such availability of equipment is important to allow an objective assessment of a person's functional physical state. Thus, the inclusion of embedded and portable systems on the walker seems the most appropriate approach for building a gait assessment system to characterize and analyze walker-assisted gait. Advances in robotics have made it possible to integrate sensors on walkers to act as portable gait assessment systems.
2 Smart and Assistive Walker – ASBGo: Rehabilitation Robotics: A Smart–Walker to Assist Ataxic Patients
An ADS (active depth sensor) was used to acquire biomechanical data of the patient's body motion. The definitions of the main concepts and the software applications are presented below:
• Lower limb monitoring. Assess the patient's gait pattern and store the patient's data: spatial and temporal metrics related to the gait cycle, using one Active Depth Sensor (ADS) to track the feet (position and orientation of the feet center). This allows generating easy-to-read plots relative to the symmetry of gait in terms of size and time of passage, average width between feet, distances traveled, among other spatiotemporal gait parameters. With this data, the medical team can objectively measure the patient's evolution and prescribe an individualized treatment. Also, being a clinical tool, it is also a motivating agent which leads patients to achieve better results by improving their gait symmetry.
• Upper limb monitoring. Monitoring and real-time assessment of posture, with an ADS oriented towards the user's upper body to evaluate the patient's posture, equilibrium and stability in real time. The data are collected and stored in the database, which can also be used later by the medical team to measure the patient's evolution.
This algorithm is based on the detection and use of the centroids of the patient's feet (Lower Limb Monitoring) and identifies the user's central points (shoulders, hips and upper body), the velocity of the upper body's center and the patient's hands (Upper Limb Monitoring). The data captured by the camera are disposed in a 3-dimensional system (coordinates in (x, y, z)).
Besides, another system is considered for the clinical evaluation: the use of one IMU placed on the trunk, at the level of the sternum, to estimate the 3D orientation of the trunk and the stability of the user regarding his center of mass (COM) position, posture and balance, by applying an efficient algorithm. Since the common problem of walker users is usually lack of balance, such assessment is fundamental throughout their recovery. The assessment of posture stability and risk of fall in this work is based on the study conducted by Doheny et al. [60].
2.3.4.3 Biofeedback Most ataxic patients are distinguished by weakness in their proprioceptive system. Moreover, during gait, it is common that the patient's legs bump against one another. Due to the degradation of their proprioceptive system, individuals with ataxia are unable to identify precisely the localization of their legs and sense their position relative to the ground, resulting in an unsteady gait [62]. Moreover, ataxic gait has been characterized by a widened base of support, inappropriate timing of foot placement, reduced step frequency, increased step width, and prolonged time in double-limb support. Both impaired postural stability and decomposition of multi-joint leg movements appear to be factors in cerebellar gait ataxia [63].
Thereby, the development of a tool which can give biofeedback related to lower limb performance can be extremely useful for assisted gait training. Hence, the insertion of biofeedback to self-correct gait is one of the main goals of the ASBGo∗ SW. This module is intrinsically connected to the Clinical Evaluation module, since it uses the data collected from the ADS sensor and computes them as alerts and warnings for foot drag, a consequence of inappropriate timing of foot placement/foot drop, and for leg bumping. If the distance is lower than the defined threshold, a warning is activated. The warning consists of a continuous audible alarm and/or a visual image which indicates the undesired limb position. The other alert is activated whenever the distance between the feet is above a pre-defined threshold considered the desired standard for the patient's anatomy.
In addition to the monitoring of the pattern followed by the feet, the variables to evaluate the body balance of the user are also acquired. In a similar way, the upper body biofeedback module acquires data from the Clinical Evaluation and a warning is issued if the patient is incorrectly holding the handle grips, preventing the risk of fall. This biofeedback helps the individual to correct his/her posture in order to prevent a possible fall.
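The leg-bump and step-width alerts can be reduced to a simple threshold check on the inter-feet distance. A minimal sketch (both threshold values are illustrative placeholders; on the ASBGo∗ they are tuned to the patient's anatomy by the clinical team):

```python
def feet_warnings(feet_distance, bump_threshold=0.08, width_threshold=0.35):
    """Map the inter-feet distance (metres) to biofeedback warnings.
    Thresholds are hypothetical defaults, not clinically tuned values."""
    warnings = []
    if feet_distance < bump_threshold:
        # legs too close: risk of one leg bumping against the other
        warnings.append("leg bumping: audible alarm + visual cue")
    elif feet_distance > width_threshold:
        # base of support wider than the patient's desired standard
        warnings.append("step width above patient standard: visual cue")
    return warnings
```

In the running system this check would be applied to every depth frame, with the audible alarm sustained while the condition persists.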
Results regarding both topics of the Biofeedback module will be provided in the corresponding Section.
2.3.4.4 Multitasking Game The relevance of multitasking in gait ambulation is widely acknowledged by the medical community [64, 65]. Multitasking training is an added value in physical rehabilitation and can significantly impact the recovery of functional walking in people with neurological disorders. For these individuals, the sessions become more attractive while at the same time motivating the subjects to achieve better results in their progression. In this context, and according to medical request, a multitasking tool that can measure the reaction time of patients to specific stimuli was designed and implemented. The strategy adopted followed the aspects listed below, considering the literature [66, 67]: 1. The application must be adaptable to different levels of cognitive ability; 2. It must also include audible and visual stimuli, to be suitable for patients with hearing or vision impairment; 3. Visual stimuli should include basic geometric shapes such as circles, triangles and squares; 4. In order to design different visual levels, the geometric figures should be represented in distinct colors; 5. Audible warnings should include sounds that are easily identifiable, such as a train horn; 6. The audible mode must have different difficulty levels and other sounds should be included; 7. Patients without cognitive, audible and visual disabilities must perform multitasking training with both audible and visual stimulus interaction.
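A minimal sketch of one such reaction-time trial (the shape set, the console interaction and the function itself are hypothetical illustrations, not the actual ASBGo∗ application):

```python
import random
import time

def reaction_trial(stimuli=("circle", "triangle", "square"), respond=input):
    """One visual-stimulus trial of a hypothetical reaction-time task:
    present a (named) geometric shape and measure how long the patient
    takes to acknowledge it. `respond` is injectable so the stimulus can
    be routed to a screen, a speaker (audible mode), or a test harness."""
    shape = random.choice(stimuli)
    shown = time.monotonic()
    respond(f"React when you see: {shape} ")  # audible mode would play a sound
    return time.monotonic() - shown
```

Difficulty levels could then vary the stimulus set (colors, sounds) and log the per-trial times for the clinical team.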
2.4 Results and Discussion
The potential of using walking aids in patients with ataxia is promising but lacks clinical evidence. The combination of imbalance and low coordination of the lower limbs suggests a strong
rationale for the use of an intelligent walker in gait training. The ASBGo∗ is based on the assumption that the following characteristics favor this assistance: (1) the actuators (motors) enable systematic speed control; (2) improvement of the walking pattern through repetition of gait cycles using accurate sensory information; (3) stimulation of a normal gait pattern; (4) performance of tasks in parallel, stimulating cognitive and motor skills; (5) gait training in real environments while avoiding obstacles; (6) biofeedback of the patient's body, thus encouraging active participation in therapy and speeding up locomotion recovery; (7) axial support reducing tremor and asymmetry; and (8) natural human-machine interaction without cognitive effort. In this section, the results obtained with the ASBGo∗ SW since the beginning of this project are presented. Results consider the main functionalities and modules, well integrated in a safe usage environment: low cognitive effort maneuverability adaptable to different conditions and needs (manual, remote or autonomous); biofeedback of the body in real time; and clinical evaluation with determination of biomechanical metrics. Hereinafter, the results will be discussed.
2.4.1 Safety A very important aspect of SWs is to provide security/safety such that the user feels safe while controlling the SW, especially in manual maneuverability. On the ASBGo∗, the patient guides the SW and a warning system is activated if a dangerous situation is detected. Both the environment and the patient are monitored. The monitoring of the environment is characterized by a warning system that alerts to the presence of obstacles in front of the SW. The warning system of this operation mode consists of three lights: green, yellow and red. The green light is lit in the absence of obstacles in front of the SW (Fig. 2.7, situation A), i.e. when all the sonar sensors measure a distance greater than a predefined minimum distance (mindist = 1.1 m). In situation B, the yellow light is turned on because there are obstacles at
Fig. 2.7 Three situations detected by the URF sensors used to provide safety features for the ASBGo∗ SW
a distance between maxdist = 0.2 m and mindist. When the SW is at a distance of less than the predefined maximum distance (maxdist), the red light is activated to warn the patient that there is an obstacle near the SW, as in situation C. Additionally, an audible alarm system, with different sound frequencies associated with these different distances, may also be triggered. Note that the parameters (mindist and maxdist) that define the obstacle detection distances can be calibrated in order to adapt the warning system to the type of patient using the ASBGo∗. In terms of detecting the risk of fall of the user, two sensors are used: an infrared sensor and load cells. Figure 2.8 depicts the infrared sensor (IR) signal of a user walking with the ASBGo∗. The IR signal decreases with
the user's approximation to the ASBGo∗. An algorithm was developed to detect abrupt changes in the signal and thereby detect whether the user is falling forward. When such a situation is detected, the ASBGo∗ stops immediately. The situation of falling backwards is similar, but the IR signal changes in the opposite direction (Fig. 2.8b). Secondly, two strain gauges, one on each forearm support, are used to verify whether the user's arms are properly supported on the forearm supports. On one hand, if the user relies on both supports, the measured force signal increases and the ASBGo∗ is enabled to move. On the other hand, if the user is not loading the sensor, the output signal decreases until it reaches zero (Fig. 2.8c) and the ASBGo∗ immediately stops. Therefore, the safety mode implemented in the
Fig. 2.8 Fall risk detection sensors identified on the black dashed rectangle: (a) IR signal for a forward fall, (b) IR fall event for a backward fall, and (c) load cells detecting unload event (fall event)
ASBGo∗ allows detecting the obstacles present in the environment and advising the patient of their presence. In addition, it warns the physiotherapist if the user is falling.
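The safety logic described in this section — the three-light obstacle warning driven by the nine frontal sonars, plus the IR/load-cell fall detection — can be sketched as follows (mindist and maxdist match the values given above; the IR-jump and minimum-load thresholds are illustrative assumptions, not the clinically tuned values):

```python
def obstacle_light(sonar_distances, mindist=1.1, maxdist=0.2):
    """Map the nine frontal sonar readings (metres) to the warning light."""
    closest = min(sonar_distances)
    if closest < maxdist:
        return "red"     # obstacle very near; audible alarm also triggered
    if closest < mindist:
        return "yellow"  # obstacle between maxdist and mindist
    return "green"       # no obstacle within mindist

def should_stop(ir_history, left_load, right_load,
                ir_jump=0.15, min_load=5.0):
    """Decide whether the walker must stop immediately.
    ir_history holds recent IR distance samples (metres): an abrupt change
    between consecutive samples suggests a forward/backward fall.
    left_load/right_load are the forearm-support forces from the strain
    gauges; near-zero load means the arms are no longer supported."""
    abrupt = any(abs(b - a) > ir_jump
                 for a, b in zip(ir_history, ir_history[1:]))
    unloaded = left_load < min_load or right_load < min_load
    return abrupt or unloaded
```

Both checks would run on every sensor cycle, with `should_stop` gating the motor enable signal.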
2.4.2 Biofeedback In previous sections we discussed that assisted gait and posture monitoring could benefit from the measurement of feet position and orientation and also of the body COM. Several works have been dedicated to the detection of the lower limbs. The proposed methods are usually fast, but only detect the position of the legs. Also, they often use markers attached to the feet, which is unsuitable for the daily routine. In this work, and to be embedded in the assistive and monitoring ASBGo∗ SW, we proposed a new method to extract feet position and orientation data from a camera depth sensor. The main advantages of the presented method are that it is markerless, faster than using 3D models, robust against clothing variations, and that it continuously detects the orientations of the feet. The precision of the presented method is better than that of other markerless methods and seems sufficient for gait analysis.
From the ADS sensor we are able to extract color and depth images that are used for visual and audible feedback and for gait and fall risk assessment analysis, respectively. Depth information is obtained through the projection of an infrared light pattern and its deformation, by the ADS's own software. The algorithm developed for gait analysis and posture assessment processes the depth frames and calculates the position of the feet, or upper body, relative to the camera and the distances between them, and also detects steps, strides and the user's central points (shoulders, hips and upper body), the velocity of the upper body's center and the patient's hands. The extraction of the feet, or upper body, from the depth frame is done through background removal techniques. This, for instance, is achieved by calculating the minimum depth value for each pixel over a predefined number of background frames. This process is shown in Fig. 2.9. From the comparison of feet distances between consecutive frames it is possible to deduce the different phases of gait and the biomechanical metrics. Regarding the upper body, with the collected data the software exposes the coronal, axial and sagittal views of the patient, giving important information regarding posture.
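The per-pixel minimum background model described above can be sketched with NumPy (the noise margin and the array shapes are illustrative assumptions):

```python
import numpy as np

def background_model(frames):
    """Per-pixel minimum depth over a set of empty-scene frames,
    as described for the feet/upper-body extraction."""
    return np.min(np.stack(frames), axis=0)

def extract_foreground(depth, background, margin=0.05):
    """Keep only pixels clearly in front of the background model.
    `margin` (metres) guards against sensor noise; the value is
    a hypothetical default, not the one used on the ASBGo*."""
    mask = depth < (background - margin)
    return np.where(mask, depth, 0.0), mask
```

The resulting mask isolates the feet (or upper body) so that centroids and feet distances can be computed frame by frame.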
Fig. 2.9 Color, depth and depth with background removed
Fig. 2.10 Patient undergoing gait training session with the SW and the biofeedback module of the lower limb
As mentioned, the majority of ataxic patients exhibit small support distances, which increase the instability and the risk of fall. Moreover, during gait they may present moments when the legs bump against one another or there is foot dragging/drop. Thus, and due to the degradation of their proprioceptive system, a biofeedback strategy was implemented in order to correct the patient's gait training performance. During the execution of the lower limb monitoring, a window with the patient's gait state information is shown, along with warnings consisting of a continuous audible alarm, which can be disabled, and a
visual image which indicates the wrong position (see Fig. 2.10). More importantly, the color frame is transmitted continuously to provide real, online biofeedback of the patient's performance. Besides increasing patient motivation, we also hypothesize that gait symmetry will improve with the use of the lower limb biofeedback. Regarding upper limb biofeedback, the main goal of this application is the development of a monitoring tool to be used in real time to help the SW's users prevent risk events such as falls. For that purpose, it was first necessary to detect the main sections of the upper body: neck, shoulders, waist and hips. The method uses border extraction (Canny edge detector) on the depth frames, and the results of the detection are shown in Fig. 2.11. The picture in Fig. 2.11a depicts the points used to determine the central point between the shoulders and the neck point. As for the hand support, to use the SW correctly and securely, the user has to place and support his/her body on the respective handle grips. Failure to detect one hand is considered a risk situation (fall); the patient is alerted, and the SW acts accordingly, i.e. stops the movement. To assess user posture and balance, the determination of the user's COM is required. The position of the COM can be roughly assumed to be the midway point between the two central points that mediate the extremities of the upper body (midway point of the shoulders and central point of the waist/hips). To automatically find the waist's position, the algorithm starts by determining the first possible waist points on
Fig. 2.11 Detection of points of interest of the upper body: (a) points of the neck and shoulders; (b) hip point and waist center determination with an accuracy of 80%; (c) definition of hand’s region
each side of the hips. Experimental trials have shown an accuracy higher than 80%.
2.4.3 Clinical Evaluation and Hospital Trials The manual maneuverability is characterized by controlling the movement of the ASBGo∗ SW under the guidance of commands defined on the handlebar by the user. In this mode, the patient is responsible for taking the decisions regarding the ASBGo∗ movement. However, this mode is prescribed by the physician or physiotherapist only for patients without visual and/or cognitive difficulties, with motor coordination and sufficient strength to manipulate the SW handlebar. Most of the validations to verify the potential of assistive technology like the ASBGo∗ SW and its long-term effects in rehabilitation therapies were done considering the manual maneuverability of the device, since it constitutes a well synchronized double-task training. The validation study presented here introduced the smart walker in the rehabilitation of three ataxic patients. Their gait patterns and postural stability were acquired and clinically evaluated. Great improvements in gait parameters as well as in postural stability were observed in all three cases. Important outcomes were highlighted in order to assess the improvement of the three case studies: stride-to-stride variability, symmetry index and COM displacement. Before beginning the gait training with the SW, all baseline data were collected. Patients were evaluated by the application of the BBS, and with static and dynamic tests the information was gathered by several sensors integrated in the device, which allowed characterizing the assisted gait and stability. The static and dynamic tests consisted of 4 conditions: (1) static stance, (2) static semi-tandem stance, (3) walk with the SW and (4) walk alone and/or with an alternative assistive device. In each condition several parameters were acquired, as we will see. Conditions (1) and (2) consisted of 3 trials of 1 min duration each, and in conditions (3) and (4) the patient had to walk 20 meters. It is noteworthy that condition (4) is performed in order to verify which gait and postural modifications were brought about by the SW training. The total average number of gait training sessions with the SW was 20, each one lasting 15 min, at a velocity comfortable for the patient. Before the beginning of the sessions, generic evaluations of balance were performed; however, they will not be covered in this document as they are not the focus of this work. Clinical evaluation during walker-assisted gait is the first step to assess the evolution of a patient during rehabilitation and to identify his needs and difficulties. Advances in robotics made it possible to integrate a gait analysis tool on our SW to enrich the existing rehabilitation tests with new sets of objective gait parameters. As already mentioned, the team of this study developed a feet detection method to estimate the position of the lower limbs during assisted walking, using the active depth sensor (ADS).
Gait events were identified to calculate the following spatiotemporal parameters, previously determined [68]: step and stride length (STP and STR) for each side, stride width (WIDTH), gait cycle (GC), cadence (CAD), velocity (VEL), stance and swing phase duration (STAD and SWD), double support duration (DS) and step time (STPT), for each side. With these spatiotemporal parameters, it is possible to calculate stride-to-stride variability.
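Stride-to-stride variability is often reported as a coefficient of variation over the stride series; the chapter does not spell out the exact formula used, so the following is an assumption:

```python
from statistics import mean, stdev

def stride_variability(strides):
    """Stride-to-stride variability as a coefficient of variation (%):
    100 * std / mean over a series of stride lengths or stride times."""
    return 100.0 * stdev(strides) / mean(strides)
```

A decreasing value across sessions would indicate a more regular, and thus less fall-prone, gait.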
This is a strong indicator of risk of fall. Another important indicator is the symmetry of parameters. This can tell us whether the coordination between the legs is improving or not. Symmetry indices (SI) [69] were calculated for each feature using the formula:

SI = (U_R − U_L) / U_L    (2.1)
where U_R and U_L are any of the aforementioned features for the right (R) and left (L) leg, respectively. Perfect symmetry results when SI is zero; larger positive or negative deviations indicate a greater asymmetry towards the right or left leg. The results obtained for the SI regarding the spatiotemporal parameters are disclosed in the following. We will discuss the SI and stability of the three cases along the training sessions. Starting with Case 1, Fig. 2.12 presents the gait parameter results in terms of symmetry index (SI) for the evaluations done with the ASBGo. As can be seen, all parameters evolved well towards the improvement of the patient's gait pattern. Since most parameters present negative asymmetry, the left leg is the one responsible for the asymmetric gait. Looking at the evolution of the SI, one can see that the SI of all parameters tends to zero from week to week. In a similar way, looking at Fig. 2.13, it is clear that the symmetry tends to zero across the evaluations with the ASBGo. For the third case, in Fig. 2.14, it can be observed that in the first sessions the patient presented great asymmetry of gait, improving over time. Over time, the patient improved coordination and symmetry. Postural stability parameters were calculated during static and dynamic conditions using an IMU sensor, as pointed out in Sect. 2.3.3. Two stance conditions were evaluated: a comfortable stance (CS) and a more unstable and challenging position, semi-tandem stance (for each side, SSL and SSR) [60]. COM displacement was acquired for all conditions (CS, SSL, SSR and ASBGo). In order to better visualize the evolution, in time, of the patient in terms of stability, the COM displacement was approximated to an
Fig. 2.12 Case 1 study SI evaluation while in gait training with ASBGo
Fig. 2.13 Case 2 study SI evaluation while in gait training with ASBGo
Fig. 2.14 Case 3 study SI evaluation while in gait training with ASBGo
ellipse. Taking the outside margins of the COM displacement, an ellipse was drawn in Figs. 2.15, 2.16, and 2.17. For simplicity of this work, the results obtained for the stability will only consider the ASBGo walking support.
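Equation (2.1) and the ellipse approximation of the COM displacement can be sketched as follows (a simplified stand-in for the authors' implementation; the half-range ellipse fit is an assumption based on the "outside margins" description):

```python
import math

def symmetry_index(u_right, u_left):
    """Symmetry index of Eq. (2.1): SI = (U_R - U_L) / U_L.
    Zero means perfect symmetry; the sign indicates the deviating side."""
    return (u_right - u_left) / u_left

def com_ellipse(ml, ap):
    """Approximate the COM displacement by an ellipse whose semi-axes
    are half the ML and AP excursion ranges (the 'outside margins')."""
    a = (max(ml) - min(ml)) / 2.0            # semi-axis, medial-lateral
    b = (max(ap) - min(ap)) / 2.0            # semi-axis, anterior-posterior
    center = ((max(ml) + min(ml)) / 2.0,
              (max(ap) + min(ap)) / 2.0)
    area = math.pi * a * b                   # shrinks as stability improves
    return center, a, b, area
```

For example, a right step length of 0.45 m against a left step length of 0.50 m gives a negative SI, flagging the left leg as the reference of the asymmetry, and a shrinking ellipse area across sessions quantifies the stability gains shown in Figs. 2.15, 2.16, and 2.17.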
From this analysis, we conclude that the patients had a large medial-lateral displacement, meaning that they presented a lateral sway that could cause instability while walking, with a tendency to fall sideways.
[Plot: Smart Walker, AP displacement (mm) vs. ML displacement (mm), evaluations 1st–4th]
Fig. 2.15 Case 1 study postural stability results using the SW
[Plot: Smart Walker, AP displacement (mm) vs. ML displacement (mm), evaluations 1st–10th]
Fig. 2.16 Case 2 study postural stability results using the SW
However, this instability was reduced over the weeks, allowing the patients to better control their posture while walking with the SW. It is noteworthy that the ML displacement was reduced more than the AP displacement, showing greater improvements. In all cases the ellipses decreased in size, meaning a significant enhancement in COM displacement. In this study, three different ataxic patients performed gait training with the ASBGo. Different improvements, in different recovery times, with different functional gains were achieved by the patients. However, similar measures, intervention and protocol were applied. Symmetry index, stride-to-stride variability and COM displacement were considered the best outcomes to evaluate the evolution of this type of patient, giving quantitative information about their improvements. Findings of this study show that gait training equipment can be improved and, consequently, better functional gains can be achieved if quantitative methods for evaluating walking performance are developed, such that it becomes possible
[Plot: Supported Walking vs Smart Walker, AP displacement (mm) vs. ML displacement (mm), evaluations 1st–12th]
Fig. 2.17 Case 3 study postural stability results using the SW
to establish baseline training parameters for each patient and then progress each patient towards an optimal recovery. The team is now improving the algorithm for spatiotemporal parameter detection, using improved cameras and calculating additional gait parameters, and, in parallel, integrating the biofeedback in the new prototype and modular architecture. Besides, the data will be displayed automatically in a graphical database, to be better analyzed by the medical team and by the patient as a conscious and motivating evaluation.
2.5 Conclusions
The smart and assistive walker ASBGo∗ – a contribution to ataxic patients and neurological diseases – is an intelligent, motorized and adaptive walker with interoperability between functions, which aims to offer gait assistance, either in hospitals or clinics, to individuals with imbalance and lack of limb coordination, especially the ataxic population. This device supports the rehabilitation procedure and is generic enough to help people with different locomotion and neurological disorders. The ASBGo∗ SW is an ergonomic prototype, with an aluminium and steel structure especially designed for the ataxic gait, with four wheels,
sized for a wide range of users in terms of weight and height. Its mechanical structure includes a support base for the upper limbs, implemented with forearm and trunk support, that reduces tremor and asymmetry, a very relevant feature for these patients. The information collected by several embedded sensors is used to characterize assisted gait and user-walker interaction. An instrumented handlebar enhances the user's manual and intuitive manoeuvrability and interprets the intentions of the user, allowing the walker to act accordingly without the cognitive effort and weight effort of pushing it. The ASBGo∗ SW acts as a support tool for the rehabilitation of gait and for the diagnosis of the gait spatiotemporal parameters and stability of the patient. This is done through the interactive functionalities and embedded tools, now integrated in a new ROS architecture, that allow a real-time analysis of locomotion parameters and posture, and thus assess the evolution of the patient and adjust his/her treatment. Finally, it integrates a system of biofeedback, hence achieving an effective participation of the patient in his/her own rehabilitation. In this document we described the evolution of the SW to date. We explored the different maneuverability options that make the walker a rehabilitation device and functional compensation tool for patients with mobility
problems. The autonomous module (navigation) allows the user or physiotherapist to define the desired position coordinates, and the walker autonomously moves to that position, avoiding any obstacles in the environment. The manual mode is characterized by the walker's movement under the guidance of commands defined by the user. The safety of the user is always taken into consideration, and during any maneuverability mode a warning system alerts to the presence of obstacles in front of the walker. The remote maneuverability, called remote control mode, has been developed to allow the physiotherapist to control the movement of the walker. Results suggest that these different modes are sufficient for this kind of therapy. Very positive feedback was given by the patients and physiotherapists. The ASBGo∗ proved to be a versatile, adaptive and secure rehabilitation and functional compensation device for the patients with mobility problems for whom it is prescribed. Versatile, since it can be used by a variety of patients who present difficulties in mobility associated with other personal limitations (such as visual and/or cognitive problems). Adaptive, since it allows adapting the parameters of the control systems (such as minimum and maximum speeds) depending on the physical limitations of the patient. Safe, because the structure of the presented SW was developed with a design that provides a more stable movement and safety for the patient. In summary, this multifunctional SW, whose design is user centered, guarantees safety, stability, low cognitive effort usability and natural maneuverability, and provides end-user oriented, adjusted assistance, in a way that improves comfort and recovery in rehabilitation sessions.
The device intends to provide: • Safe training for ataxic patients through cyclic and regular training, enabling an improvement of stability, balance and gait pattern; • Delay of the early use of a wheelchair; • Promotion of the autonomy of ataxic and neurologically impaired patients; • A means to monitor gait and posture parameters without the need of a gait laboratory, during gait training sessions, thus allowing the summative assessment of the functional gains of each patient.
The ASBGo∗ smart walker introduces a new and relevant concept in terms of rehabilitation and clinical follow-up. A prototype of the ASBGo∗ is already in operation at the Hospital de Braga, in the Department of Physical Medicine and Rehabilitation, where selected patients perform their physiotherapy treatments and are followed by the physician and physiotherapist involved. At the same time, we are improving and implementing new functionalities to integrate into the modular system of this device. Future studies will address more experimental work with other types of patients and actuation modules in a hospital environment.
Acknowledgments This research is supported in part by the FEDER Funds through the COMPETE 2020 – Programa Operacional Competitividade e Internacionalização (POCI) and P2020 with the Reference Project EML under Grant POCI-01-0247-FEDER-033067 and through the COMPETE 2020 – Programa Operacional Competitividade e Internacionalização (POCI) – with the Reference Project under Grant POCI-01-0145-FEDER-006941.
References 1. Winter DA (2009) Biomechanics and motor control of human movement, 4th edn. Wiley, Hoboken 2. Buchman AS, Boyle PA, Leurgans SE, Barnes LL, Bennett DA (2011) Cognitive function is associated with the development of mobility impairments in community-dwelling elders. Am J Geriatr Psychiatry 19(6):571–580 3. Arnell P (2010) The biomechanics and motor control of human gait 74(2) 4. Frizera-Neto A, Ceres R, Rocon E, Pons JL (2011) Empowering and assisting natural human mobility: the Simbiosis Walker. Int J Adv Robot Syst 8(3):34–50 5. Van Hook FW, Demonbreun D, Weiss BD (2003) Ambulatory devices for chronic gait disorders in the elderly. Am Fam Physician 67(8):1717–1724 6. Morton SM, Bastian AJ (2004) Cerebellar control of balance and locomotion. Neuroscientist 10:247–259 7. López-Bastida J, Peña-Longobardo LM, Aranda-Reneo I, Tizzano E, Sefton M, Oliva-Moreno J (2017)
66 Social/economic costs and health-related quality of life in patients with spinal muscular atrophy (SMA) in Spain. Orphanet J Rare Dis 12(1):141 8. Cernak K, Stevens V, Price R, Shumway-Cook A (2008) Locomotor training using body-weight support on a treadmill in therapy in a child with severe cerebellar ataxia. Phys Ther 88(1):88–97 9. Schniepp R, Wuehr M, Schlick C, Huth S, Pradhan C, Dieterich M (2014) Increased gait variability is associated with the history of falls in patients with cerebellar ataxia. J Neurol 261(1):213–223. https://doi. org/10.1007/s00415-013-7189-3 10. Neto AF, Elias A, Cifuentes C, Rodriguez C, Bastos T, Carelli R (2015) Smart walkers: advanced robotic human walking-aid systems. In: Mohammed S, Moreno J, Kong K, Amirat Y (eds) Intelligent assistive robots. Springer tracts in advanced robotics, vol 106. Springer, Cham 11. Orsini M, Bastos VH, Leite AA, MRG DF (2014) Neurological rehabilitation in patients with spinocerebellar ataxia: it’s really effective and permanent? Phys Med Rehabil Int 1(1):1–2 12. Ilg W, Synofzik M, Bro D, Giese MA, Schols L (2009) Intensive coordinative training improves motor performance in degenerative cerebellar disease. Neurology 73(22):1823–1830 13. Bradley SM, Hernandez CR, Sinai M, York N, York N (2011) Geriatric assistive devices. Am Fam Physician 84(4):405–411 14. Page S, Saint-Bauzel L, Rumeau P, Pasqui V (2017) Smart walkers: an application-oriented review. Robotica 35(6):1243–1262. https://doi.org/10.1017/ S0263574716000023 15. Martins MM, Santos CP, Frizera-Neto A, Ceres R (2012) Assistive mobility devices focusing on smart walkers: classification and review. Rob Auton Syst 60(4):548–562 16. Martins M, Santos CP, Costa L, Frizera-Neto A (2013) Multivariate analysis of walker-assisted ambulation. In: Proceedings of 3rd Portuguese meeting on bioengineering ENBENG 2013 – B., Table I, Braga, Portugal, February 20–23, 2013, pp 23–26 17. 
Alves J, Seabra E, Caetano I, Gonçalves J, Serra J, Martins M, Santos CP (2016) Considerations and mechanical modifications on a Smart Walker, In: Proceedings of International Conference on Autonomous Robot Systems and Competitions, ICARSC’16, Bragança, Portugal, May 4–6, 2016, pp 1–6 18. Ilg W, Timmann D (2013) Gait ataxia – specific cerebellar influences and their rehabilitation typical signs of ataxic gait. Mov Disord 28(11):1566–1575 19. Martins M, Santos C, Seabra E, Frizera A, Ceres R (2014) Design, implementation and testing of a new user interface for a smart walker. In: Proceedings of IEEE International Conference on Autonomous Robot Systems and Competitions, ICARSC’14, Espinho, Portugal, May 14–15, 2014, pp 217–222 20. Bateni H, Heung E, Zettel J, Mcllroy WE, Maki BE (2004) Can use of walkers or canes impede lateral
R. Moreira et al. compensatory stepping movements? Gait Posture 20(1):74–83 21. Morris A, Donamukkala R, Kapuria A, Steinfeld A, Matthews JT, Dunbar-Jacob J, Thrun S (2003) A robotic walker that provides guidance, In: Proceedings of IEEE International conference on robotics and automation (Cat. No.03CH37422), Taipei, Taiwan, September 14–19, 2003, pp 1–6 22. Rentschler AJ, Simpson R, Cooper RA, Boninger ML (2008) Clinical evaluation of Guido robotic walker. J Rehab Res Dev 45(9):1281–1293 23. Grosset DG, Macphee GJA, Nairn M, Guideline Development Group (2010) Diagnosis and pharmacological management of Parkinson’s disease: summary of SIGN guidelines. BMJ 340:b5614 24. Kai Y, Arihara K, Kitaguchi S (2014) Development of a walking support robot with velocity and torquebased mechanical safety devices. In: Proceedings of IEEE/ASME international conference on Advanced Intelligent Mechatronics, AIM’14, Besançon, France, July 8–11, 2014, pp 1498–1503 25. Ko CY et al (2014) Assessment of forearm and plantar foot load in the elderly using a four-wheeled walker with armrest and the effect of armrest height. Clin Interv Aging 9:1759–1765 26. Kawakami S, Kikuchi T, Hosaka M, Niino K, Anzai K, Tanaka T (2013) Evaluation of line-tracing controller of intelligently controllable walker. Adv Robot 27(7):493–502 27. Ye J, Huang J, He J, Tao C, Wang X (2012) Development of a width-changeable intelligent walking-aid robot. In: Proceedings of international symposium on Micro-nanomechatronics and Human Science MHS’12, Nagoya, Japan, November 4–7, 2012, pp 358–363 28. Lu CK, Huang YC, Lee CJ (2015) Adaptive guidance system design for the assistive robotic walker. Neurocomputing 170:152–160 29. Chugo D, Asawa T, Kitamura T, Songmin J, Takase K (2009) A motion control of a robotic walker for continuous assistance during standing, walking and seating operation. In: Proceedings of IEEE/RSJ international conference on Intelligent Robots and Systems, IROS’09, St. 
Louis, USA, October 11–15, 2009, pp 4487–4492 30. Jun HG et al (2011) Walking and sit-to-stand support system for elderly and disabled. In: Proceedings of IEEE international conference on rehabilitation robotics, Zurich, Switzerland, June 27–July 1, 2011, pp 1–5 31. Shi F, Cao Q, Leng C, Tan H (2010) Based on force sensing-controlled human-machine interaction system for walking assistant robot. In: Proceedings of World Congress Intelligent Control and Automation, WCICA 2010, Jinan, China, July 7–9, 2010, pp 6528–6533 32. Huang C, Wasson G, Alwan M, Sheth P, Ledoux A (2005) Shared navigational control and user intent detection in an intelligent walker. AAAI Fall Symp Tech Rep FS-05-02:59–66
2 Smart and Assistive Walker – ASBGo: Rehabilitation Robotics: A Smart–Walker to Assist Ataxic Patients 33. Lee G, Ohnuma T, Chong NY (2010) Design and control of JAIST active robotic walker. Intell Serv Robot 3(3):125–135 34. Sierra SD, Molina JF, Gomez DA, Munera MC, Cifuentes CA (2018) Development of an interface for human-robot interaction on a robotic platform for gait assistance: AGoRA smart Walker. In: Proceedings of IEEE ANDESCON technology and innovation for Andean Industry, Cali, Colombia, August 22–24, 2018, pp 1–7 35. Grondin SL, Li Q (2013) Intelligent control of a smart walker and its performance evaluation. In: IEEE International Conference on Rehabilitation Robotics, ICORR’13, Seattle, WA, June 24–26, 2013, pp 1–6 36. Martins M, Frizera A, Santos CP (2011) Review and classification of human gait training and rehabilitation devices. Assist Technol Res Ser 29:774–781 37. Cifuentes CA, Rodriguez C, Frizera-Neto A, Bastos- Filho TF, Carelli R (2016) Multimodal human-robot interaction for Walker-assisted gait. IEEE Syst J 10(3):933–943 38. Werner C, Moustris GP, Tzafestas CS, Hauer K (2018) User-oriented evaluation of a robotic rollator that provides navigation assistance in frail older adults with and without cognitive impairment. Gerontology 64(3):278–290 39. Paulo J et al (2017) An innovative robotic walker for mobility assistance and lower limbs rehabilitation. In: Proceedings of IEEE 5th Portuguese meeting on bioengineering, ENBENG’17, Coimbra, Portugal, February 16–18, 2017, pp 1–4 40. MacNamara S, Lacey G (2002) A smart walker for the frail visually impaired. In: Proceedings of 2000 ICRA. Millennium conference. IEEE international conference on robotics and automation. Symposia Proceedings (Cat. No.00CH37065), San Francisco, CA, April 24–28, 2000, pp 1354–1359 41. Graf B (2008) An adaptive guidance system for robotic walking aids. J Comput Inf Technol 17(1):109 42. 
Wasson G, Gunderson J, Graves S (2001) Effective shared control in cooperative mobility aids. In: Proceedings of Fourteenth international Florida Artificial Intelligence Research Society conference, vol 1, Florida, May 21–23, 2001, pp 1–5 43. Hirata Y, Muraki A, Kosuge K (2006) Standing up and sitting down support using intelligent walker based on estimation of user states. In: Proceedings of IEEE International Conference on Mechatronics and Automation, ICMA’06, vol 2006, Luoyang, Henan, China, June 25–28, 2006, pp 13–18 44. Chang M-F, Mou W-H, Liao C-K, Fu L-C (2012) Design and implementation of an active robotic walker for Parkinson’s patients. In: Proceedings of SICE Annual Conference of the Society of Instrument and Control Engineers of Japan, SICE 2012. Akita, Japan, August 20–23, 2012 45. Nejatbakhsh N, Kosuge K (2006) Adaptive guidance for the elderly based on user intent and physical impairment. In: Proceedings of the IEEE international workshop Robot and Human Interactive Communication,
67
RO-MAN’06, Hatfield, UK, September 6–8, 2006, pp 510–514 46. Fast A, Wang FS, Adrezin RS, Cordaro MA, Ramis J, Sosner J (1995) The instrumented walker: usage patterns and forces. Arch Phys Med Rehabil 76(5):484–491 47. Ballesteros J, Urdiales C, Martinez AB, van Dieën JH (2016) On gait analysis estimation errors using force sensors on a smart Rollator. Sensors (Basel) 16(11):1–15 48. Ohnuma T, Lee G, Chong NY (2011) Particle filter based feedback control of JAIST active robotic walker. In: Proceedings of IEEE international workshop on Robot and Human Interactive Communication, RO-MAN’11, Atlanta, GA, July 31–August 3, 2011, pp 264–269 49. Hu RZL, Hartfiel A, Tung J, Fakih A, Hoey J, Poupart P (2011) 3D pose tracking of walker users’ lower limb with a structured-light camera on a moving platform. In: Proceedings of IEEE computer society conference on Computer Vision and Pattern Recognition, CVPR’11. Colorado Springs, CO, June 20–25, 2011 50. Del Din S, Godfrey A, Rochester L (2016) Validation of an accelerometer to quantify a comprehensive battery of gait characteristics in healthy older adults and Parkinson’s disease: toward clinical and at home use. IEEE J Biomed Heal Informatics 20(3):838–847 51. Wu H, Chien C, Jheng Y, Chen C, Chen H, Yu C (2011) Development of Intelligent Walker. J Life Support Eng 16(Supplement):71–72 52. Hirata Y, Muraki A, Kosuge K (2006) Motion control of intelligent passive-type walker for fall-prevention function based on estimation of user state. In: Proceedings of IEEE International Conference on Robotics and Automation, 2006. ICRA 2006, vol 2006, Orrlando, FL, May 15–19, 2006, pp 3498–3503 53. Pallejà T, Teixidó M, Tresanchez M, Palacín J (2009) Measuring gait using a ground laser range sensor. Sensors 9(11):9133–9146 54. Paolini G et al (2014) Validation of a method for real time foot position and orientation tracking with microsoft kinect technology for use in virtual reality and treadmill based gait training programs. 
IEEE Trans Neural Syst Rehabil Eng 22(5):997–1002 55. Lim CD, Cheng CY, Wang CM, Chao Y, Fu LC (2015) Depth image based gait tracking and analysis via robotic walker. In: Proceedings of IEEE International Conference on Robotics and Automation, ICRA’15, Seattle, WA, May 25–30, 2015, pp 5916–5921 56. Joly C, Dune C, Gorce P, Rives P (2013) Feet and legs tracking using a smart rollator equipped with a Kinect. In: Work. Assistance Serv. Robot. a Hum. Environ. conjonction with IEEE/RSJ Int. Conf. Int. Rob. Sys, Tokyo 57. Stolze H et al (2002) Typical features of cer ebellar ataxic gait. J Neurol Neurosurg Psychiatry 73(3):310–312 58. Faria V, Silva J, Martins M, Santos C (2014) Dynamical system approach for obstacle avoidance in a Smart Walker device. In: Proceedings of IEEE
68 International Conference on Autonomous Robot Systems and Competitions, ICARSC’14, Espinho Portugal, May 14–15, 2014, pp 261–266 59. Tereso A, Martins M, Santos CP, da Silva MV, Gonçalves L, Rocha L (2014) Detection of gait events and assessment of fall risk using accelerometers in assisted gait. In: Proceedings of 11th International Conference on Informatics in Control, Automation and Robotics, ICINCO’14, vol 1, Vienna, Austria, September 1–3, 2014, pp 788–793 60. Doheny EP et al (2012) Displacement of Centre of mass during quiet standing assessed using accelerometry in older fallers and non-fallers. In: Proceedings of 2012 annual international conference of the IEEE engineering in medicine and biology society, San Diego, CA, August 28–September 1, 2012, pp 3300–3303 61. Silva J, Santos C, Sequeira J (2013) Navigation architecture for mobile robots with temporal stabilization of movements. In: Proceedings of 9th workshop on Robot Motion and Control, RoMoCo’13, Wasowo, Poland, July 3–5, 2013, pp 209–214 62. Baker JM (2018) Gait disorders. Am J Med 131(6):602–607 63. Cernak K, Stevens V, Price R, Shumway-Cook A (2008) Locomotor training using body-weight support on a treadmill in conjunction with ongoing physical therapy in a child with severe cerebellar ataxia. Phys Ther 88(1):88–97 64. Fritz NE, Cheek FM, Nichols-Larsen DS (2015) Motor-cognitive dual-task training in persons with
R. Moreira et al. neurologic disorders: a systematic review. J Neurol Phys Ther 39(3):142–153 65. Plummer-D’Amato P, Kyvelidou A, Sternad D, Najafi B, Villalobos RM, Zurakowski D (2012) Training dual-task walking in community-dwelling adults within 1 year of stroke: a protocol for a single-blind randomized controlled trial. BMC Neurol 12(1):129 66. Ricklin S, Meyer-Heim A, van Hedel HJA (2018) Dual-task training of children with neuromotor disorders during robot-assisted gait therapy: prerequisites of patients and influence on leg muscle activity. J Neuroeng Rehabil 15(1):82 67. Labruyere R, Gerber CN, Birrer-Brutsch K, Meyer- Heim A, van Hedel HJA (2013) Requirements for and impact of a serious game for neuro- pediatric robot-assisted gait training. Res Dev Disabil 34(11):3906–3915 68. Martins M, Santos CP, Page S, Saint-Bauzel L, Pasqui V, Mézière A (2015) Real-time gait assessment with an active depth sensor placed in a walker. In: Proceedings of IEEE International Conference on Rehabilitation Robotics, ICORR’15, Singapore, August 11–14, 2015, pp 690–695 69. Martinez-Ramirez A, Weenk D, Lecumberri P, Verdonschot N, Pakvis D, Veltink PH (2013) Pre- operative ambulatory measurement of asymmetric lower limb loading during walking in total hip arthroplasty patients. J Neuroeng Rehabil 10:41
3 Mechanical Sensing for Lower Limb Soft Exoskeletons: Recent Progress and Challenges

Massimo Totaro, Christian Di Natali, Irene Bernardeschi, Jesus Ortiz, and Lucia Beccai
Abstract

Soft exoskeletons hold promise for facilitating monitoring and assistance in case of light impairment and for prolonging independent living. In contrast to rigid material-based exoskeletons, they demand new approaches to soft sensing and actuation. This chapter overviews soft exoskeletons in contrast to rigid exoskeletons and focuses on recent advances in movement monitoring in lower limb soft exoskeletons. Compliant materials and soft tactile sensing approaches can be utilized to build smart sensorized garments for joint angle measurement (needed for both control and monitoring). However, several open challenges remain, derived from the required close interaction between the human body and the soft exoskeleton itself, especially related to how sensing function and robustness are strongly affected by wearability; these will need to be overcome in the near future.

Keywords

Exoskeletons · Tactile sensing · Assistive robots · Sensorized garments
M. Totaro · I. Bernardeschi · L. Beccai (*) Center for Micro-BioRobotics, Istituto Italiano di Tecnologia, Pontedera, PI, Italy e-mail: [email protected] C. Di Natali · J. Ortiz Department of Advanced Robotics, Istituto Italiano di Tecnologia, Genova, Italy
3.1 Introduction
Research on robotic exoskeletons has seen a boost in the last decade [1–3]. Exoskeletons are robotic systems that can be worn by human operators to improve their performance. They can be divided into three main categories. The first consists of exoskeletons that provide super-human strength, designed to improve the physical capabilities of healthy subjects. For instance, this kind of exoskeleton, whose main application area is the military field, could be used for carrying heavy loads and walking/running over large distances [4]. The second category of mobile exoskeletons includes assistive robotic devices for impaired individuals with different kinds of disabilities (e.g. patients with upper limb pathologies or with difficulty walking due to Spinal Cord Injury (SCI), stroke or muscle weakness). The needed assistance may vary from total motility restoration [5, 6] to partial support [7, 8], as required by the severity of the pathology. These devices help patients execute movements and give them back partial or total healthy-like capabilities. Industrial exoskeletons also fall into this category. These systems can also help workers prevent possible work-related
© Springer Nature Switzerland AG 2019 J. S. Sequeira (ed.), Robotics in Healthcare, Advances in Experimental Medicine and Biology 1170, https://doi.org/10.1007/978-3-030-24230-5_3
injury during daily tasks by offering partial support [9, 10]. The third category consists of therapeutic exoskeletons for rehabilitation, also known as stationary exoskeletons. The aim of these rehabilitative devices is to safely facilitate the recovery of human movement by supplying recurrent tasks (i.e. repetitive gait training), helping physiotherapists to provide better treatments, and monitoring the patient's progress [11, 12]. From the sensing point of view, such systems require monitoring the position, velocity, and torque of the human joints (e.g. elbow, knee, ankle) in addition to ground reaction forces. Most research and commercial exoskeletons are developed mainly with rigid materials. In this case, the sensing of joint parameters can be easily performed using well-known and reliable technologies (e.g. inertial measurement units (IMUs), encoders) [13]. Conversely, in recent years research on exoskeletons based on soft materials has been very active. These systems have several advantages with respect to their rigid counterparts. In particular: they can be easily worn by the user, providing more comfort in long-term use; they are lighter; and they are much more reconfigurable [14]. However, the use of soft materials introduces many challenges, especially in sensing, since both the human body (tissues) and the exoskeleton (garment) are highly deformable, giving unreliable responses if rigid sensors are employed. Thus, soft mechanical sensing solutions [15] with high sensitivity yet high robustness are mandatory. In this chapter, an overview of both commercial and research solutions for rigid and soft lower-limb exoskeletons is presented. These are highly needed for rehabilitation and assistive purposes and present the most difficult challenges both in terms of actuation (higher torque needed) and sensing (higher ranges of motion and multiple degrees of freedom).
Then, the main challenges for making reliable sensing systems in the case of soft exoskeletons are discussed. Finally, some possible solutions for soft mechanical sensing are indicated.
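As a reference point for the joint-parameter sensing mentioned above, the following is a minimal sketch of how a knee flexion angle could be estimated from two segment-mounted IMUs in the static, accelerometer-only case (the axis convention and the gravity-dominated assumption are simplifications introduced here, not a method from the chapter):

```python
# Illustrative sketch: knee flexion as the relative sagittal-plane tilt of a
# thigh IMU and a shank IMU, each tilt computed from a gravity-dominated
# accelerometer reading. Assumes a quasi-static pose and a shared axis
# convention (x forward, z along the segment) -- both are simplifications.
import math

def inclination_deg(ax: float, az: float) -> float:
    """Segment tilt in the sagittal plane from accelerometer x/z components."""
    return math.degrees(math.atan2(ax, az))

def knee_angle_deg(thigh_acc: tuple, shank_acc: tuple) -> float:
    """Knee flexion estimated as the relative tilt between the two IMUs."""
    return inclination_deg(*shank_acc) - inclination_deg(*thigh_acc)

# Thigh vertical (gravity fully on z), shank tilted 30 degrees forward:
thigh = (0.0, 9.81)
shank = (9.81 * math.sin(math.radians(30.0)), 9.81 * math.cos(math.radians(30.0)))
print(round(knee_angle_deg(thigh, shank), 1))  # 30.0
```

In practice, such accelerometer-only estimates are fused with gyroscope data to remain valid during dynamic movement.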
3.2 Rigid Exoskeletons
The majority of available exoskeletons are made of rigid materials, with links that can provide additional torque in parallel to the human muscles, transmit loads to the ground and support compressive forces. Apart from their functions and the specific requirements that they fulfill, all exoskeletons share some engineering design aspects such as the presence of actuators, sensors, energy supplies, control strategies and materials. Many commercial exoskeletons were designed for industrial or military use in order to augment the capabilities of workers or soldiers. The Berkeley Lower Extremity Exoskeleton (BLEEX) [16] is one of them, developed to allow soldiers to carry heavy loads over long distances. DARPA EPHA [17] is a full body exoskeleton; it supports both arms and legs and is designed to amplify the user's strength. The X1 [18] exoskeleton was developed in a joint collaboration between IHMC (the Institute for Human and Machine Cognition in Pensacola, FL, USA) and NASA. It was designed as a potential device for training astronauts in space, in order to prevent the user's muscles and bones from degrading in microgravity. All the previous examples have some common major limitations: their large mass can affect the inertia of the user's movement and limit the natural range of motion of different human joints. In addition, some of them led to an increase in metabolic energy expenditure. To overcome this issue, Hugh Herr's lab at MIT [19] developed an ankle-foot smart device with an actuated ankle joint. The research group demonstrated a reduction in metabolic cost compared to what is spent without wearing the device. This kind of solution belongs to the assistive robotic exoskeletons for people with disabilities. Of course, one of the most important goals for this class of exoskeletons is to reduce the overall metabolic cost.
However, other purposes are to give patients sufficient safety, support and balance during gait or, in the case of upper limb exoskeletons, when grasping and moving objects.
One of the earliest exoskeletons developed for assisting disabled users is ReWalk [20, 21], which has recently been approved by the Food and Drug Administration (FDA) for use by SCI patients. ReWalk is a bilateral system composed of two modules, one for the knee and one for the hip, and it enables patients to walk and to perform stand-to-sit and sit-to-stand transitions. In this case, some of the limitations are its size and the difficulty of learning how to control it. In the same class of assistive devices several others can be cited, like Hybrid Assistive Limb (HAL) [22], Ekso [23], Indego [24] and Rex Bionics [25]. HAL is an assistive device developed by a Japanese company to help people with deficits that impair walking functions. It includes arm and leg supports, which can be used either separately or in tandem. HAL is one of the few commercial exoskeletons that exploit surface electromyography signals. Ekso Bionics developed a device for helping patients affected by stroke and SCI to get back on their feet, supporting the re-learning of correct step patterns and weight shifting, and potentially mitigating compensatory behaviors. Similar to ReWalk and HAL, Ekso bilaterally actuates the knee and hip joints. Although it is largely used in clinical trials, it still has some limitations regarding gait speed and poor flexibility to adapt to changes in tasks or in the gait environment. Parker-Hannifin is an American company that has helped to develop and market the Indego exoskeleton. It is a powered orthosis worn around the waist and legs. Thanks to this device, individuals with SCI can stand and walk. An important advantage of Indego is that it is modular, so it can be divided into smaller parts for transportation when not in use. In the case of Rex Bionics, the artificial legs of the device completely enclose the human legs and the exoskeleton walks for the user without needing interaction between human and machine.
However, it is quite bulky and walking is very slow. Different exoskeleton designs have been developed for industrial applications, especially for the assistance of healthy subjects. The FORTIS [26] exoskeleton was developed by Lockheed Martin
for industrial use, mainly in shipyards. The FORTIS device allows workers to easily manipulate or handle heavy objects. Exhauss [27] is a French company with multiple exoskeleton solutions for various industrial areas. Some models are intended for users carrying tools or loads for long periods of time and operate by having the user's arms attached to suspending arms extended down from the shoulders: this way the carried load is transferred from the arms and shoulders to the exoskeleton's arms and frame. Another model is intended for users in sitting positions who perform repetitive tasks and motions, and another one specifically targets users carrying cameras that need to record frames with high steadiness. Gravity compensation assistance for heavy loads generally reduces upper-limb muscle activity by redistributing muscle stress from the shoulders and lumbar region to the waist. Active arms from the Robo-Mate European project [28] use electric actuators to provide assistance to the wearer's arms from the waist. A second solution for the same industrial activities is an active back-support to reduce job-related injuries. The Robo-Mate exoskeleton was designed for manual material handling tasks [29]. Other exoskeletons have been developed specifically to provide therapeutic benefits for the user. For instance, a wheelchair-bound individual may suffer loss of muscle tone or brittleness of bones. Thus, it is very important to provide the patient with daily rehabilitative therapy. It is also crucial for the wheelchair-bound individual to feel physically active, for improving their psychophysical health. These kinds of exoskeletons can be either stationary (e.g. Lokomat, LOPES or ALEX) [11, 12] or mobile (AlterG's Bionic Leg) robots [30]. Stationary exoskeletons are mainly employed by rehabilitation laboratories and hospitals, while the latter are commercial devices and are used especially by stroke patients.
AlterG’s Bionic Leg is unilateral and it is worn on the knee of the affected lower limb. In this case, one of the main limitations is the fact that therapy is unilateral. In addition, it is voluminous and heavy.
3.3 Soft Exoskeletons
Despite the large use of rigid exoskeletons in different sectors such as industrial, military, clinical or rehabilitative environments, they still present several issues to be solved. Indeed, they require bulky self-aligning mechanisms, since rigid links can resist the movement of biological joints if they are not perfectly aligned. Rigid systems also present the problem of large inertia. Moreover, because of the mass added to the limbs, the metabolic cost of accelerating and decelerating them increases. Due to these effects, wearing such devices often disturbs human biomechanics, making them uncomfortable and causing pain (e.g. due to pressure on the skin) when worn for a long time. In order to overcome these challenges, soft wearable robots have been investigated in recent years. These are devices that interface with the body using innovative textiles and soft materials to provide a more compliant and unobtrusive means of interfacing with the human body [31, 32]. Soft exoskeletons have several advantages compared with rigid exoskeletons. They can have extremely low inertia, which reduces the metabolic expenditure of wearing them, and very low weight. Ongoing research in this field is pursuing systems with a low profile, so that they can be worn underneath regular clothing and thus increase user acceptability. Since they are mainly composed of textiles and soft materials, they can adapt easily to anatomical variations, and in the near future they will be donned and doffed without the assistance of an external operator. Wearing a soft exoskeleton should feel like wearing a normal shirt or a pair of pants, and not impose any constraint on the wearer. An important feature of a soft exoskeleton is that, if the actuated segments are extended, the supporting cloth can elongate accordingly so that the entire suit is slack. Indeed, bulky, rigid, heavy and high-power exoskeletons are being replaced by light, soft and, hence, less powerful ones.
One of the first attempts was introduced in [33], where soft actuators were employed to assist limbs in rehabilitation. Then,
more recently, a new lower limb soft aid device, the Exosuit, was presented [34] with the specific aims of avoiding excessive pressure on the skin, increasing system autonomy, and enhancing everyday usability and acceptance. In order to optimally control and evaluate soft exoskeletons, sensor systems that are easy to integrate with textiles and soft components are required. Rigid exoskeletons employ sensors such as potentiometers or encoders in robotic joints to monitor angles. These technologies are not compatible with soft exoskeletons. Alternative sensors have been designed to measure human kinematics and suit-human interaction forces; these are compliant, robust, cost effective, and easy to integrate into wearable suits. One of the most relevant examples of soft exoskeletons is the Exosuit developed at Harvard. This system targets lower limb assistance. It was initially developed for military applications [35] and has recently been applied successfully in rehabilitation [36]. A key feature of this exoskeleton is that it minimizes the distal mass attached to the wearer through more proximally mounted actuation systems and flexible transmissions that transmit power to the joints. The actuators follow cable-driven electromechanical approaches. Regarding soft sensing capabilities, the same group designed a separate soft sensing suit for monitoring hip, knee, and ankle sagittal plane joint angles, using hyperelastic resistive strain sensors made with microchannels of liquid metal embedded within silicone elastomers [37]. Notably, they optimized their design with the use of discretized stiffness gradients to improve mechanical durability. This way, sensors could stretch up to 396% of their original lengths with gauge factor sensitivities greater than 2.2, and exhibited less than 2% change in electromechanical specifications through 1500 cycles of loading–unloading.
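A gauge factor of this kind relates relative resistance change to strain; the sketch below shows how such a sensor reading could then be mapped to a joint angle through a linear calibration. The gauge factor is taken at the order of magnitude reported above, but the calibration slope, offset and resistance values are hypothetical, not the suit's actual parameters:

```python
# Illustrative sketch: resistive strain sensor reading -> strain -> joint
# angle. GF matches the order of magnitude reported for such sensors; the
# linear calibration (SLOPE, OFFSET) and resistance values are hypothetical.

GF = 2.2          # gauge factor: (dR/R0) per unit strain
SLOPE = 150.0     # degrees of joint angle per unit strain (hypothetical fit)
OFFSET = 0.0      # degrees at zero strain (hypothetical)

def strain_from_resistance(r: float, r0: float) -> float:
    """Strain = (dR/R0) / GF, assuming a linear piezoresistive response."""
    return (r - r0) / r0 / GF

def joint_angle_deg(r: float, r0: float) -> float:
    """Joint angle via a linear calibration of the estimated strain."""
    return SLOPE * strain_from_resistance(r, r0) + OFFSET

r0 = 100.0  # ohm, resistance of the unstretched sensor (hypothetical)
print(round(joint_angle_deg(122.0, r0), 1))  # 15.0
```

In a real suit the resistance-to-angle map would be fit per subject and per joint, since garment fit shifts both slope and offset.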
Evaluating the accuracy and variability of the soft sensing suit by comparing it with joint angle data obtained through optical motion capture, the sensing suit had root mean square (RMS) errors of less than 5° for a walking speed of 0.89 m/s and reached a maximum RMS error of 15° for a running speed of 2.7 m/s. Researchers found that the relative repeatability of
3 Mechanical Sensing for Lower Limb Soft Exoskeletons: Recent Progress and Challenges
73
the sensing suit's joint angle measurements were statistically equivalent to those of optical motion capture at all speeds. A similar concept can be found in the Myosuit system [38]. This wearable device is designed to provide continuous assistance at the hip and knee joints when working with and against gravity during activities of daily living (ADL). It combines active and passive elements with a closed-loop force controller designed to behave like an external muscle and deliver gravity compensation to the user. The main elements of this exoskeleton are the textile interface, tendon actuators, and bi-articular actuation for providing assistance.

Apart from the soft structure of the system, these two examples have the common feature that the assistive forces are transmitted through cables to the body attachments. Recently, in the European project XoSoft [39], a modular lower limb assistive device was developed, with a novel type of quasi-passive soft actuation mounted directly on the lower limbs, as described in the next section.

3.3.1 XoSoft

XoSoft [39] is a soft, modular, lower limb exoskeleton that elderly and disabled people can wear to assist their leg strength and support, to increase their mobility. It is designed to assist persons with mobility restrictions due to muscle weakness and/or a partial loss of sensory or motor function. The system is not intended to substitute complete function loss, but rather to assist the user in a tailored manner, such as post-stroke, SCI or elderly subjects.

Several generations of prototypes were developed within the project. The alpha version was designed using state-of-the-art technologies; it represented a test bed for the technologies and a mechanism to ensure the design process remained user centered. The beta 1 version was the first integrated system, with unilateral assistance for knee and hip flexion. In the latter, a quasi-passive actuation system was used, based on electromagnetic clutches mounted on the belt, transmitting the forces through Bowden cables, and a first generation of soft capacitive strain sensors mounted on the garment for the estimation of the knee angle was successfully integrated [40]. The beta 2 version was an important improvement, since the electromagnetic clutches were replaced by a novel type of vacuum-based soft linear clutches [41, 42, 79]. This version was fully configurable, with the possibility to assist hip flexion/extension, knee flexion/extension and ankle dorsi/plantar flexion, in unilateral or bilateral configurations, with a maximum of eight simultaneous actuators. The beta 2 prototype could be employed in the clinical validation phase. The final XoSoft version is gamma (shown in Fig. 3.1i), integrating further improvements of the different technologies; it is being tested in home-simulated environments.

Across the different versions, innovative non-traditional sensing and actuation systems based on smart materials and soft structures were designed and evaluated. In particular, in the next sections, after an overview of soft sensing solutions, a detailed description of the soft sensing system used in XoSoft is provided as an example. Regarding the control, the system uses a biomimetic approach, which requires the measurement and identification of the movement of the user. To do that, different sensor sets were used: (i) insole pressure sensors, (ii) IMUs, and (iii) soft strain sensors. Thanks to the insole pressure sensors, it is possible to segment the stance phase of the gait (heel strike, flat foot and toe off). Two IMUs mounted on the thigh and shank provide information about the angle/speed of the knee joint, which is necessary for the segmentation of the swing phase. The information from the soft capacitive sensors mounted on the knee is equivalent to that provided by the IMUs, and they likewise allow segmenting the swing phase. The actuators are then triggered following a time- or event-based approach. The actuation strategy is decided according to the selected configuration and the patient.
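The event-based triggering logic described above can be sketched in a few lines. This is an illustrative assumption, not XoSoft project code: the two-cell insole layout, threshold value, and function names are invented for the example.

```python
# Illustrative sketch (not project code): event-based gait-phase
# segmentation from insole pressure readings, as described above.
# The two-cell layout and the threshold are assumptions.

HEEL, TOE = 0, 1  # indices of heel and toe pressure cells

def stance_event(pressure, threshold=5.0):
    """Classify a single insole sample into a coarse gait event.

    pressure: (heel, toe) readings in arbitrary units.
    Returns 'heel_strike', 'flat_foot', 'toe_off', or 'swing'.
    """
    heel_on = pressure[HEEL] > threshold
    toe_on = pressure[TOE] > threshold
    if heel_on and not toe_on:
        return "heel_strike"
    if heel_on and toe_on:
        return "flat_foot"
    if toe_on and not heel_on:
        return "toe_off"
    return "swing"

# A short synthetic stride: heel contact, full contact, push-off, swing
samples = [(8.0, 1.0), (9.0, 7.5), (0.5, 6.0), (0.2, 0.3)]
phases = [stance_event(s) for s in samples]
print(phases)  # ['heel_strike', 'flat_foot', 'toe_off', 'swing']
```

In a real controller, the swing phase detected from the IMU or capacitive knee sensors would then trigger the actuators per the selected configuration.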
Fig. 3.1 Main examples of hard (a–f) and soft (g–i) lower limb exoskeletons available on the market or in the literature. (a) Reproduced with permission from [1]; (b) Reproduced with permission from [18]; (c) Reproduced with permission from [2]; (d) Reproduced with permission from [1]; (e) Reproduced with permission from [21]; (f) www.rexbionics.com; (g) Reproduced from [38] (CC BY 4.0); (h) Reproduced with permission from [32]; (i) www.xosoft.eu
M. Totaro et al.
3 Mechanical Sensing for Lower Limb Soft Exoskeletons: Recent Progress and Challenges
3.4 Sensing Solutions in Soft Exoskeletons
Different types of sensors are used to control robotic exoskeletons [37–39]. Typically, mechanical sensors are helpful to regulate position, force or torque. Some lower limb robotic exoskeletons use a position controller that specifies a defined joint angle trajectory that the exoskeleton cycles through each step. Many devices use force sensors, such as foot sensors that measure the ground contact force; sometimes these may be used like a simple on/off switch to detect heel strike and toe off. Accelerometers and gyroscopes are occasionally used too. Neural and/or muscular sensing has been employed in some exoskeletons, but it has not yet been used in commercial devices. However, in soft exoskeletons the sensing system cannot rely on rigid components. Regarding the detection of joint movements, a viable solution is the development of sensing structures that respond to the strain caused by the hosting garment worn on the joint.

Soft strain sensors respond to the deformation induced by an externally applied stimulus (e.g. by joint movement) with different mechanisms, depending on the type of materials, micro/nanostructures, and fabrication process. The strain-resistance response of traditional strain gauges originates from geometrical effects and the piezoresistivity of the materials themselves. Unlike traditional strain gauges, mechanisms such as disconnection between sensing elements, crack propagation in thin films, and the tunneling effect have been utilized to develop stretchable strain sensors. Many of these approaches come from the area of skin-like artificial tactile sensing where, in addition to normal force, various mechanical parameters are pursued [43–45] (e.g. tangential forces, texture, vibrations) to mimic the human sense of touch and to obtain the information for generating the perception of, e.g., a contacted object by dexterous robotic hands [46]. In this boundless field, several approaches have been followed [45], strongly relying on detecting the deformations induced in skin-like materials by external stimulations. They are strictly cross-linked with the development of novel materials [44] and device layouts [47], the study of different physical principles [48, 49], and the application in various promising scenarios, including the field of soft robotics [15]. Figure 3.2 provides an overview of different soft sensing developments. In the next sections, several kinds of soft strain sensors, exploiting different transduction principles and functional materials, potentially useful for monitoring joint movements in soft exoskeletons, are overviewed.

3.4.1 Piezoresistive Strain Sensors

In piezoresistive strain/pressure sensors the resistance R = ρL/S varies due to the mechanical stimulation [48]. The relative variation can be written as ΔR/R0 = Δρ/ρ0 + ΔL/L0 − ΔS/S0, where the first term takes into account the variation of resistivity (piezoresistive effect), while the other two terms are due to the geometrical deformation of the device. In the case of uniaxial strain εx = ΔL/L0 and an isotropic material, the resistance variation vs. strain is ΔR/R0(εx) = Δρ/ρ0 + (1 + 2ν)εx, with ν the Poisson ratio of the material. The first term is null if piezoresistive effects are negligible; otherwise it can be dominant, giving a huge gauge factor (GF), exceeding 10^5 [55]. Usually, metallic-based strain gauges rely for their behavior on geometric variations, leading to a GF around 2. Conversely, crystalline Si-based sensors can have a much higher GF (of the order of 100–300) [56], due to their piezoresistive effect. However, both these technologies are unsuitable for wearable applications, since the materials are very rigid, not allowing the needed flexibility and stretchability. For this reason, several examples of resistive sensors based on elastomeric stretchable materials doped with conductive micro/nanocomposites can be found in the literature. Usually, an insulating stretchable polymer embeds conductive particles and the current flows due to percolative effects. Among others, graphene [57, 58], conductive polymers [59, 60], carbon nanotubes (CNTs) [51, 61] and nanowires [62, 63] are used as filling materials.
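The two regimes above (geometry-dominated metal gauges vs. piezoresistivity-dominated semiconductor gauges) can be made concrete with a small numeric sketch of the relation ΔR/R0 = Δρ/ρ0 + (1 + 2ν)εx. The material values used are illustrative assumptions.

```python
# Sketch of the piezoresistive relation quoted above:
#   dR/R0 = drho/rho0 + (1 + 2*nu) * eps_x   (uniaxial strain, isotropic)
# Material numbers are illustrative, not from the chapter.

def relative_resistance_change(eps_x, nu, drho_over_rho=0.0):
    """Relative resistance variation for uniaxial strain eps_x."""
    geometric = (1.0 + 2.0 * nu) * eps_x  # shape change of the resistor
    return drho_over_rho + geometric

def gauge_factor(eps_x, nu, drho_over_rho=0.0):
    """GF = (dR/R0) / eps_x."""
    return relative_resistance_change(eps_x, nu, drho_over_rho) / eps_x

# Metal foil gauge: piezoresistivity negligible, GF set by geometry alone
print(gauge_factor(0.01, nu=0.3))  # geometric GF, about 1.6 (GF ~ 2 in practice)

# Semiconductor gauge: the dominant resistivity term pushes GF past 100
print(gauge_factor(0.001, nu=0.28, drho_over_rho=0.15))
```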
Fig. 3.2 Examples of soft sensors in wearable systems and soft robotics. (a) Hyper-elastic strain sensors based on micro-channels of liquid metal embedded within elastomer and integrated in a lower limb soft exoskeleton for monitoring hip, knee and ankle joints in the sagittal plane. (Adapted with permission from [37]). (b) Highly stretchable capacitive sensor made of conductive knit fabric as electrode and silicone elastomer as dielectric to be used in human articulation detection, soft robotics, and exoskeletons. (Reproduced with permission from [50]).
(c) Ultra-stretchable and skin-mountable strain sensors based on CNTs–Ecoflex nanocomposites. (Adapted with permission from [51]). (d) Stretchable capacitive sensor array made of very low modulus polymers and embedding liquid metal micro-channels. (Adapted with permission from [52]). (e) Stretchable optical waveguides for strain sensing in a soft finger. (Reproduced with permission from [53]). (f) Flexible, inductance-based contraction sensors in the closed-loop motion control of soft actuators (“smart braid”). (Adapted with permission from [54])
An alternative approach consists of using inherently conductive stretchable materials, such as liquid metals [64], ionic conductors [65], or hydrogels [66], or of structuring the material so that it deforms in a guided way upon strain (i.e. wrinkled structures or interlocked deformable nets). One of the main limitations of resistive sensors is their low stability (especially in the case of hydrogel and nanoparticle-based devices) and their high hysteresis, due to creep, plastic deformations and the unstable polymer/particle interface. Also, resistive devices are very sensitive to temperature and humidity variations, and solutions to compensate for these effects are not always compatible with wearable systems.
3.4.2 Capacitive Strain Sensors

In the case of parallel plane electrodes, and neglecting fringing field effects, the capacitance is given by C = kA/d, with k the dielectric constant, A the electrode area, and d the dielectric thickness. For pressure sensors, the mechanical stimulation mainly affects the distance d between the electrodes, while for strain, both area and distance can vary considerably. In the case of uniaxial strain εx and an isotropic material, the relative capacitance variation is ΔC/C0 = εx; the GF therefore cannot exceed unity. To obtain higher sensitivities, either alternative structures (i.e. comb fingers) or anisotropic materials/structures should be used. Capacitive sensors have been widely used in the semiconductor industry for decades, such as in micro electro-mechanical systems (MEMS), for their stability, good frequency response, low temperature drift and low power consumption. Recently, thanks to these good characteristics, capacitive sensors have also been largely developed in soft and stretchable systems. Indeed, a plethora of novel materials, such as conductive fabrics [50, 67], nanocomposites [68], conductive polymers/hydrogels [69], and conductive liquids [52], have been deployed as stretchable electrodes. On the other hand, the continuous miniaturization has required an increasing effort in discriminating small variations, down to the fF range, leading to very high sensitivity to proximity, electrostatic effects, parasitic elements and electromagnetic interference. For all these reasons, especially for wearable sensors, electrical shielding has become a fundamental aspect to consider.
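As a worked example of the parallel-plate relations above, the snippet below evaluates C = ε0 εr A/d and the strain-induced change ΔC = C0 εx. The geometry values are illustrative assumptions, not taken from a specific device.

```python
# Parallel-plate relations from this section: C = eps0*eps_r*A/d, and for
# uniaxial strain of an isotropic sensor dC/C0 = eps_x (so GF <= 1).
# Geometry and permittivity values are illustrative.

EPS0 = 8.854e-12  # permittivity of free space, F/m

def plate_capacitance(eps_r, area_m2, gap_m):
    """Capacitance of an ideal parallel-plate structure."""
    return EPS0 * eps_r * area_m2 / gap_m

c0 = plate_capacitance(eps_r=3.0, area_m2=162e-6, gap_m=0.5e-3)  # ~8.6 pF
eps_x = 0.10                 # 10 % uniaxial strain
dC = c0 * eps_x              # dC/C0 = eps_x for an isotropic dielectric
print(f"C0 = {c0 * 1e12:.2f} pF, dC = {dC * 1e15:.0f} fF")
```

Note that even a 10 % strain only changes the capacitance by a tenth of its nominal value, which is why fF-level read-out resolution and shielding matter.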
3.4.3 Strain Sensors Based on Other Transduction Principles

Besides resistive- and capacitive-based sensors, other transduction principles have also been exploited in the field. In particular, optical-based systems seem a very promising solution. In this case [53, 70], the light emitted by an LED is transmitted through a soft optical fiber or waveguide. If integrated in a soft system (i.e. a robot or an exoskeleton), movements and tactile interactions cause a deformation in the transmission medium and, by consequence, a variation of the detected light at the other end of the waveguide. This technology enables high resolution and large sensing areas by making an array of fibers, or a skin with several emitters/detectors combined together [71]. In addition, the sensing area can be completely free of rigid sensing elements, and the electronic boards can be located at the periphery, preserving as much as possible the mechanical properties of the soft system. On the other hand, these devices require higher power consumption than resistive and capacitive ones, which can limit their use in autonomous or low-power systems.

Finally, inductive sensors have been emerging in recent years, where different mechanisms can correlate mechanical deformations and inductance variations (i.e. coil geometry [72], mutual inductance [54], eddy-current effect [73], or magnetic reluctance [74]). Some recent developments include an inductance-based soft deformation and force sensor for McKibben muscles [54], formed by stretchable helical coils on the fiber-reinforced braid ("smart braid"), and a new type of inductive tactile sensor based on the eddy-current effect [75], which is low in cost, of high performance, robust in harsh environments (e.g., underwater), and durable to repeated contact (demonstrated by a hammer strike test).

One of the main advantages is that the sensitive materials do not need any physical connection to the read-out system. Typical issues related to wiring in soft systems (complexity, reliability, mechanical compliance) can thus be avoided. Nowadays, the main limitations of this technology are the few materials available, the complex signal-conditioning electronics and the high sensitivity to electromagnetic interference.
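As a minimal illustration of the coil-geometry mechanism mentioned above, the textbook solenoid formula L = μ0 N²A/l already links elongation to an inductance change. The coil dimensions below are illustrative assumptions, and this is deliberately not the smart-braid model of [54].

```python
# Textbook solenoid model to illustrate the coil-geometry mechanism:
#   L = mu0 * N^2 * A / l
# Stretching a helical coil increases its length l, so L tracks the
# deformation. Values are illustrative; not the smart-braid model of [54].
import math

MU0 = 4e-7 * math.pi  # permeability of free space, H/m

def solenoid_inductance(turns, radius_m, length_m):
    """Ideal (long) solenoid inductance."""
    area = math.pi * radius_m ** 2
    return MU0 * turns ** 2 * area / length_m

l0 = solenoid_inductance(100, 5e-3, 0.10)           # rest inductance
l_stretched = solenoid_inductance(100, 5e-3, 0.12)  # 20 % elongation
print(f"dL/L0 = {(l_stretched - l0) / l0:+.1%}")    # inductance drops
```

The measured inductance decrease is then mapped back to the strain, with no galvanic contact needed between the coil and the read-out electronics.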
3.5 Example of Soft Sensing for Lower Limb Soft Exoskeletons
Fig. 3.3 Schematic layout (a) and cross-sectional view (b) (both not to scale) of the soft capacitive strain sensor with three-electrode configuration used in XoSoft for monitoring lower limb joint movements. (Reproduced from [40] CC BY 4.0)

Sensorized modules in XoSoft embed soft capacitive strain sensors [40]. They consist of a combination of conductive and dielectric layers. Building a simple parallel-plate capacitor structure is not enough, since parasitic capacitances and proximity effects must be avoided. Thus, a three-electrode configuration was adopted, where the bottom and the top electrodes of the capacitor are connected to ground, while the central electrode is connected to the sensing pin of the conditioning electronic circuit, as shown in Fig. 3.3. In addition, both ground layers have been designed larger than the electrode and dielectric layers, in order to completely embed the device and to shield it. It is also fundamental to avoid parasitic capacitances arising from the electrical connections going from the sensor to the electronics module. Therefore, the shielded sensor and the shielded connections are designed as a whole device integrated in the textile. From an electrical point of view, the nominal capacitance C0_tot of the sensor can be considered as the parallel of C01 and C02, which are the capacitances between the central electrode and the bottom ground layer, and between the central electrode and the top ground layer, respectively, as depicted in Fig. 3.3. In particular, C01 and C02 can be considered as the capacitances of two parallel electrode plates, as follows:
C01 = ε0 εr1 A1 / d01

C02 = ε0 εr2 A2 / d02

where ε0 is the dielectric constant of free space (8.85 × 10^−14 F/cm), εr1 and εr2 are the relative permittivities of the first and the second dielectric material, A1 and A2 are the sensing areas between the electrode and the top and bottom ground layers, and d01 and d02 are the dielectric thicknesses. Therefore C0_tot, which is the parallel of C01 and C02, is given by:

C0_tot = C01 + C02 = ε0 (εr1 A1 / d01 + εr2 A2 / d02)

Considering the employment of the same material for both dielectric layers (thus εr1 = εr2 = εr), with the same thicknesses (i.e., d01 = d02 = d0), and the same sensing area between the electrode and the top and bottom ground layers (i.e., A1 = A2 = A0), the nominal capacitance C0_tot becomes:

C0_tot = 2 ε0 εr A0 / d0

When a strain is applied to the sensor, both the dielectric thicknesses and the sensing area vary, resulting in a capacitance variation ΔC with respect to the nominal value C0_tot, as explained in the following:

ΔC = C − C0_tot = (C0_tot / 2) [((ΔA1/A0) d0 + Δd1) / (d0 − Δd1) + ((ΔA2/A0) d0 + Δd2) / (d0 − Δd2)]

where ΔA1 and ΔA2 are the increases of the sensing areas and Δd1 and Δd2 the reductions of the dielectric thicknesses. This capacitance variation is in the range of a fraction of a picofarad. To measure such a small quantity properly, a read-out circuit is used (Fig. 3.4), based on a capacitance-to-digital converter (CDC) (AD7147, Analog Devices) that is able to measure up to 13 channels in parallel with a resolution of 1 fF. This detail is very important to detect the sensor's subtle signal variations. The input dynamic range of the chip is ±8.192 pF, and the CDC can introduce a virtual offset capacitance (Coff) in the range ±20 pF to shift the capacitance value of the sensor into the input range. However, in the case of quite large capacitances, like those fabricated for the sensorized kneepad and anklet, this offset is not enough, and a capacitor Cs is put in series with the sensor to allow a correct measurement. In addition, the coaxial cable, used to minimize electromagnetic noise and proximity effects in the connections, introduces a capacitance Cp in parallel with the sensor. Then, the capacitance measured by the CDC is:

CM = Cs (Cp + Cx) / (Cs + Cp + Cx) − Coff

with Cx the sensor capacitance, as sketched in Fig. 3.4. When the sensor is strained, a capacitance variation ΔC occurs with respect to its initial value C0. Consequently, the measured capacitance variation is:

ΔCM = Cs (Cp + C0 + ΔC) / (Cs + Cp + C0 + ΔC) − Cs (Cp + C0) / (Cs + Cp + C0) ≈ [Cs / (Cs + Cp + C0)]^2 ΔC

From the above equation, we can observe that the measured variation depends on the square of the partition factor between Cs and Cp + C0. In addition, since Cp is proportional to the coaxial cable length, longer cables introduce larger attenuation factors. This limits the cable length used for connecting the sensors to the electronics.

3.5.1 Knee and Ankle Modules

In the XoSoft project, sensorized modules for the knee and ankle joints were developed, shown in Fig. 3.5. In particular, the knee module should monitor one degree of freedom, namely the flexion/extension angle, while the ankle module should provide feedback on up to three degrees of freedom (i.e. plantar/dorsiflexion, abduction/adduction, inversion/eversion).
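The partition-factor attenuation in the read-out chain discussed above can be sanity-checked numerically. All component values below are illustrative assumptions, not the project's.

```python
# Numeric check of the read-out partition discussed in Sect. 3.5
# (all capacitances in pF; values are illustrative, not project data):
#   CM  = Cs*(Cp + Cx)/(Cs + Cp + Cx) - Coff
#   dCM ~ (Cs/(Cs + Cp + C0))**2 * dC

def measured_capacitance(cx, cs, cp, coff=0.0):
    """Capacitance seen by the CDC through the series/parallel network."""
    return cs * (cp + cx) / (cs + cp + cx) - coff

c0, d_c = 50.0, 0.5   # sensor at rest and its strain-induced change
cs, cp = 10.0, 20.0   # series capacitor and cable parasitic

d_cm = measured_capacitance(c0 + d_c, cs, cp) - measured_capacitance(c0, cs, cp)
approx = (cs / (cs + cp + c0)) ** 2 * d_c
print(d_cm, approx)   # both ~0.0078 pF: a strong attenuation of dC = 0.5 pF

# Increasing Cp (a longer cable) shrinks the partition factor further,
# which is why the cable length to the electronics must be limited.
```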
Fig. 3.4 Schematics of the read-out circuitry for soft capacitive sensors, considering the effect of the cable parasitic capacitance (Cp), the virtual offset introduced by the CDC (Coff), and the series capacitance (Cs) introduced for tuning the proper input range

Fig. 3.5 Smart kneepad and anklet sensorized with soft capacitive strain sensors, developed in the XoSoft project, for monitoring knee and ankle movements. (Reproduced from [40] CC BY 4.0)

The analysis of the knee bending movement reveals that the largest strain occurs in the sagittal axis direction. Accordingly, the strain sensors for the knee brace were designed in a rectangular shape, with the longest side parallel to the sagittal axis. In this way, the deformation of the sensors, and therefore the output signal, is maximized.
More specifically, the sensorized knee brace integrates three sensors. One sensor is positioned at the knee centre, in correspondence with the kneecap (named C2), while the other two sensors are placed at the kneecap's sides (named C1 and C3). Each sensor has a sensing area of 162 mm2. The capacitive elements are sensitive also to pressure solicitations; therefore, the integration of three sensors allows the simultaneous discrimination between strain (due to the bending of the knee) and pressure (due to accidental contact with the surroundings). For the general-purpose monitoring of the ankle angles relative to dorsi/plantar flexion, abduction/adduction, and rotation, five strain sensors were integrated in the commercial ankle brace (Fig. 3.5). Three sensors were integrated on the front side of the ankle brace: one central (named C3, with a sensing area of 93 mm2) and two lateral sensors (named C2 and C4, each having an area of 180 mm2). The other two sensors were positioned on the back side of the brace (i.e., C1 and C5, with a sensing area of 120 mm2 each). By combining all sensor outputs with a proper algorithm, the monitoring of the desired movements is possible.

Both modules have been characterized by comparing the sensor response to the joint movements reconstructed by an OptiTrack system. The main results are summarized below, while further details can be found in [40]. In Fig. 3.6a, b the time response and the output characteristics for the knee module are shown, respectively. In this case, a linear fitting is enough for obtaining a linear correlation in the whole 0–90° range, with a root mean square error (RMSE) of less than 4°.

Fig. 3.6 (a) Estimated angle (red solid line) obtained by the combination of the three sensors on the kneepad, compared with the angle measured by the OptiTrack system (black dashed line) during bending/unbending cycles. (b) Combined output characteristics of the sensorized kneepad. (Adapted from [40] CC BY 4.0)

For the ankle, Fig. 3.7 shows the results obtained by reconstructing the data from the ankle module for three different movements (dorsi/plantar flexion, prono/supination and rotation). In the upper panel, an example of the raw signals as a function of time is depicted. In the lower panels, the reconstruction of the Euler angles is shown and compared to the data obtained by optical tracking, using a polynomial combination (4th and 5th degree). Three different phases are clearly visible from the angle graphs. In particular, in phase 1, which corresponds to dorsi/plantar flexion, the Eul-X variation is much larger than that of the other two components. In phase 2, Eul-X has the smallest variation; this behavior corresponds to an abduction/adduction movement. Then, in phase 3, all angles vary in the same range, indicating that a foot rotation is occurring.
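The linear calibration used for the knee module can be sketched as a least-squares fit from the combined sensor output to the joint angle, checked with an RMSE. The data points below are synthetic; only the reported error bound (RMSE < 4°) comes from the chapter.

```python
# Illustrative sketch of a linear calibration for a knee module:
# a linear fit maps the combined capacitive output to the joint angle,
# then the fit quality is checked via RMSE (synthetic data).

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def rmse(pred, ref):
    """Root mean square error between two equal-length sequences."""
    return (sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(ref)) ** 0.5

cap = [0.0, 0.9, 2.1, 2.9, 4.1]        # combined sensor output (a.u.)
angle = [0.0, 21.0, 44.0, 66.0, 90.0]  # reference angle from mocap (deg)

a, b = linear_fit(cap, angle)
estimated = [a * c + b for c in cap]
print(f"RMSE = {rmse(estimated, angle):.2f} deg")  # well below the 4 deg bound
```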
3.6 Conclusions

In this chapter, the main functional solutions adopted for sensorizing soft exoskeletons are overviewed, focusing on the needs from a functional point of view. In addition, there are some key requirements that are still open challenges of the field. In particular, soft sensors need to be integrated directly in the exoskeleton garment, keeping their functionality but ensuring at the same time: (i) mechanical robustness, especially because during donning and doffing the garment can experience much higher deformations than in the normal working regime; (ii) washability of the sensorized garment, with (iii) easy and reliable attaching/detaching of the conditioning and transmission circuitry; and (iv) wearability, keeping low complexity in the whole sensing system (i.e. connections, wiring, etc.). Regarding the last point in particular, a high number of connections and wires could affect or limit the movements, while being difficult to manage both electronically and algorithmically. From these considerations, the most promising path appears to be the development of soft wearable sensing structures that completely merge the sensing system and the textile substrate, both in terms of materials and structures, as some groups have started investigating [76–78]. In conclusion, in the design and development of wearable sensorized systems, the main future efforts should concentrate on achieving compliance with the hosting garment, while keeping a reliable transduction functionality and being able to discriminate different mechanical stimuli without knowing a priori the body movement and regardless of the softness of the body itself.
Fig. 3.7 Mixed movement monitoring. Upper panel: raw capacitance variations. Lower panels: reconstructed Euler angles compared with the optical tracking system measurements (solid green line). In this case, different movements can be distinguished. (Reproduced from [40] CC BY 4.0)
References

1. Chen B, Ma H, Qin LY, Gao F, Chan KM, Law SW, Qin L, Liao W (2016) Recent developments and challenges of lower extremity exoskeletons. J Orthop Trans 5:26
2. Young AJ, Ferris DP (2017) State of the art and future directions for lower limb robotic exoskeletons. IEEE Trans Neural Syst Rehabil Eng 25(2):171–182
3. Ferris DP, Schlink BR (2017) Robotic devices to enhance human movement performance. Kinesiol Rev 6(1):70–77
4. Bogue R (2009) Exoskeletons and robotic prosthetics: a review of recent developments. Ind Robot Int J 36(5):421–427
5. Farris RJ, Quintero HA, Goldfarb M (2011) Preliminary evaluation of a powered lower limb orthosis to aid walking in paraplegic individuals. IEEE Trans Neural Syst Rehabil Eng 19(6):652–659
6. Murray SA, Ha KH, Goldfarb M (2014) An assistive controller for a lower-limb exoskeleton for rehabilitation after stroke, and preliminary assessment thereof. In: Proceedings of the 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, pp 4083–4086
7. Ikehara T, Nagamura K, Ushida T, Tanaka E, Saegusa S, Kojima S, Yuge L (2011) Development of closed fitting-type walking assistance device for legs and evaluation of muscle activity. In: Proceedings of the 2011 IEEE International Conference on Rehabilitation Robotics (ICORR'11), Zurich, Switzerland, June 27–July 1, 2011, pp 1–7
8. Kong K, Jeon D (2006) Design and control of an exoskeleton for the elderly and patients. IEEE/ASME Trans Mechatron 11(4):428–432
9. Sugar T, Veneman J, Hochberg C, Shourijeh M, Acosta A, Vazquez-Torres R, Marinov B, Nabeshima C (2018) Hip exoskeleton market – review of lift assist wearables. Wearable Robotics Association Conference, Scottsdale, AZ, USA
10. De Looze MP, Bosch T, Krause F, Stadler KS, O'Sullivan LW (2016) Exoskeletons for industrial application and their potential effects on physical work load. Ergonomics 59(5):671–681
11. Jezernik S, Colombo G, Keller T, Frueh H, Morari M (2003) Robotic orthosis Lokomat: a rehabilitation and research tool. Neuromod Technol Neural Interface 6(2):108–115
12. Veneman JF, Kruidhof R, Hekman EE, Ekkelenkamp R, Van Asseldonk EH, Van Der Kooij H (2007) Design and evaluation of the LOPES exoskeleton robot for interactive gait rehabilitation. IEEE Trans Neural Syst Rehabil Eng 15(3):379–386
13. Fong D, Chan Y-Y (2010) The use of wearable inertial motion sensors in human lower limb biomechanics studies: a systematic review. Sensors 10(12):11556–11565
14. Veneman JF, Burdet E, van der Kooij H, Lefeber D (2017) Emerging directions in lower limb externally wearable robots for gait rehabilitation and augmentation – a review. In: Advances in cooperative robotics, pp 840–850
15. Wang H, Totaro M, Beccai L (2018) Toward perceptive soft robots: progress and challenges. Adv Sci 5(9):1800541
16. Zoss AB, Kazerooni H, Chu A (2006) Biomechanical design of the Berkeley lower extremity exoskeleton (BLEEX). IEEE/ASME Trans Mechatron 11(2):128–138
17. Exoskeletons for Human Performance Augmentation, DARPA is soliciting innovative research proposals on exoskeletons. http://www.oocities.org/marksrealm/project450.html
18. Rea R, Beck C, Rovekamp R, Neuhaus P, Diftler M (2013) X1: a robotic exoskeleton for in-space countermeasures and dynamometry. In: Proceedings of the AIAA Space 2013 conference and exposition, p 5510. https://doi.org/10.2514/6.2013-5510
19. Blaya JA, Herr H (2004) Adaptive control of a variable-impedance ankle-foot orthosis to assist drop-foot gait. IEEE Trans Neural Syst Rehabil Eng 12(1):24–31
20. Esquenazi A, Talaty M, Packel A, Saulino M (2012) The ReWalk powered exoskeleton to restore ambulatory function to individuals with thoracic-level motor-complete spinal cord injury. Am J Phys Med Rehabil 91(11):911–921
21. Talaty M, Esquenazi A, Briceño JE (2013) Differentiating ability in users of the ReWalk powered exoskeleton: an analysis of walking kinematics. In: Proceedings of the IEEE 13th International Conference on Rehabilitation Robotics (ICORR'13), Seattle, WA, June 24–26, 2013, pp 1–5
22. Sankai Y. Hybrid assistive limb based on cybernics. In: Robotics research, pp 25–34
23. Ekso Bionics, Richmond, CA, USA. https://eksobionics.com/
24. The Parker Indego® Powered Lower Limb Exoskeleton. http://www.indego.com/indego/en/home
25. Rex Bionics – Reimagining Rehabilitation. https://www.rexbionics.com/
26. Lockheed Martin (2016) FORTIS exoskeleton. https://www.lockheedmartin.com/en-us/products/exoskeleton-technologies/industrial.html
27. EXHAUSS. http://www.exhauss.com/
28. Robo-Mate. www.robo-mate.eu
29. Toxiri S, Koopman AS, Lazzaroni M, Ortiz J, Power V, de Looze MP, O'Sullivan L, Caldwell DG (2018) Rationale, implementation and evaluation of assistive strategies for an active back-support exoskeleton. Front Robot AI 5:53
30. Bionic Leg™, AlterG. https://exoskeletonreport.com/product/bionic-leg/
31. Dollar AM, Herr H (2008) Lower extremity exoskeletons and active orthoses: challenges and state-of-the-art. IEEE Trans Robot 24(1):144–158
32. Asbeck AT, De Rossi SMM, Galiana I, Ding Y, Walsh CJ (2014) Stronger, smarter, softer: next-generation wearable robots. IEEE Robot Autom Mag 21(4):22–33
33. Caldwell DG, Tsagarakis NG, Kousidou S, Costa N, Sarakoglou I (2007) "Soft" exoskeletons for upper and lower body rehabilitation – design, control and testing. Int J Humanoid Robot 4(03):549–573
34. Kesner SB, Jentoft L, Hammond FL, Howe RD, Popovic M (2011) Design considerations for an active soft orthotic system for shoulder rehabilitation. In: 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, pp 8130–8134
35. Asbeck AT, Dyer RJ, Larusson AF, Walsh CJ (2013) Biologically-inspired soft exosuit. In: 2013 IEEE International Conference on Rehabilitation Robotics (ICORR). IEEE, pp 1–8
36. Awad LN, Bae J, O'Donnell K, De Rossi SMM, Hendron K, Sloot LH, Kudzia P et al (2017) A soft robotic exosuit improves walking in patients after stroke. Sci Transl Med 9(400):eaai9084
37. Mengüç Y, Park YL, Pei H, Vogt D, Aubin PM, Winchell E et al (2014) Wearable soft sensing suit for human gait measurement. Int J Robot Res 33(14):1748–1764
38. Schmidt K, Duarte JE, Grimmer M, Sancho-Puchades A, Wei H, Easthope CS, Riener R (2017) The Myosuit: bi-articular anti-gravity exosuit that reduces hip extensor activity in sitting transfers. Front Neurorobot 11:57
39. Di Natali C, Poliero T, Sposito M, Ortiz J, Graf E, Bauer C, Pauli C, Bottenberg E, De Eyto A, O'Sullivan L, Hidalgo AF, Scherly D, Stadler K, Caldwell DG. Design and evaluation of a soft assistive lower limb exoskeleton. Robotica:1–21. https://doi.org/10.1017/S0263574719000067
40. Totaro M, Poliero T, Mondini A, Lucarotti C, Cairoli G, Ortiz J, Beccai L (2017) Soft smart garments for lower limb joint position analysis. Sensors 17(10):2314
41. Sadeghi A, Mondini A, Mazzolai B (2018) Preliminary experimental study on variable stiffness structures based on textile jamming for wearable robotics. In: Proceedings of the International Symposium on Wearable Robotics, Pisa, Italy, October 16–20, 2018. Springer, Cham, pp 49–52
42. Sadeghi A, Mondini A, Mazzolai B (2019) Compact and powerful: a vacuum powered soft textile-based clutch. Preprints. https://doi.org/10.20944/preprints201904.0001.v1
43. Chortos A, Liu J, Bao Z (2016) Pursuing prosthetic electronic skin. Nat Mater 15(9):937
44. Yogeswaran N et al (2015) New materials and advances in making electronic skin for interactive robots. Adv Robot 29(21):1359–1373
45. Yang T, Xie D, Li Z, Zhu H (2017) Recent advances in wearable tactile sensors: materials, sensing mechanisms, and device performance. Mater Sci Eng R Rep 115:1–37
46. Kappassov Z, Corrales JA, Perdereau V (2015) Tactile sensing in dexterous robot hands. Robot Auton Syst 74:195–220
47. Tiwana MI, Redmond SJ, Lovell NH (2012) A review of tactile sensing technologies with applications in biomedical engineering. Sensors Actuators A Phys 179:17–31
48. Stassi S, Cauda V, Canavese G, Pirri C (2014) Flexible tactile sensing based on piezoresistive composites: a review. Sensors 14(3):5296–5332
49. Salim A, Lim S (2017) Review of recent inkjet-printed capacitive tactile sensors. Sensors 17(11):2593
50. Atalay A, Sanchez V, Atalay O, Vogt DM, Haufe F, Wood RJ, Walsh CJ (2017) Batch fabrication of customizable silicone-textile composite capacitive strain sensors for human motion tracking. Adv Mater Technol 2(9):1700136
51. Morteza A, Yong Jin Y, Inkyu P (2015) Ultra-stretchable and skin-mountable strain sensors using carbon nanotubes–Ecoflex nanocomposites. Nanotechnology 26(37):375501
52. Li B, Fontecchio AK, Visell Y (2016) Mutual capacitance of liquid conductors in deformable tactile sensing arrays. Appl Phys Lett 108(1):013502
53. Zhao H, O'Brien K, Li S, Shepherd RF (2016) Optoelectronically innervated soft prosthetic hand via stretchable optical waveguides. Sci Robot 1(1):eaai7529
54. Felt W, Chin KY, Remy CD (2017) Smart braid feedback for the closed-loop control of soft robotic systems. Soft Robot 4(3):261–273
55. Ivančo J, Halahovets Y, Végsö K, Klačková I, Kotlár M, Vojtko A, Micuśík M, Jergel M, Majková E (2016) Cyclopean gauge factor of the strain-resistance transduction of indium oxide films. IOP Conf Ser Mater Sci Eng 108(1):012043, Mykonos, Greece, September 27–30, 2015
56. Yang S, Lu N (2013) Gauge factor and stretchability of silicon-on-polymer strain gauges. Sensors 13(7):8577–8594
57. Tian H, Shu Y, Cui Y-L, Mi W-T, Yang Y, Xie D, Ren T-L (2014) Scalable fabrication of high-performance and flexible graphene strain sensors. Nanoscale 6(2):699–705
58. Bae S-H, Lee Y, Sharma BK, Lee H-J, Kim J-H, Ahn J-H (2013) Graphene-based transparent strain sensor. Carbon 51:236–242
59. Pani D, Dessì A, Saenz-Cogollo JF, Barabino G, Fraboni B, Bonfiglio A (2016) Fully textile, PEDOT:PSS based electrodes for wearable ECG monitoring systems. IEEE Trans Biomed Eng 63(3):540–549
60. Gualandi I, Marzocchi M, Achilli A, Cavedale D, Bonfiglio A, Fraboni B (2016) Textile organic electrochemical transistors as a platform for wearable biosensors. Sci Rep 6:33637
61. Yamada T, Hayamizu Y, Yamamoto Y, Yomogida Y, Izadi-Najafabadi A, Futaba DN, Hata K (2011) A stretchable carbon nanotube strain sensor for human-motion detection. Nat Nanotechnol 6(5):296–301
62. Amjadi M, Pichitpajongkit A, Lee S, Ryu S, Park I (2014) Highly stretchable and sensitive strain sensor based on silver nanowire–elastomer nanocomposite. ACS Nano 8(5):5154
63. Wang J, Jiu J, Nogi M, Sugahara T, Nagao S, Koga H, He P, Suganuma K (2015) A highly sensitive and flexible pressure sensor with electrodes and elastomeric interlayer containing silver nanowires. Nanoscale 7(7):2926
64. Park YL, Chen BR, Wood RJ (2012) Design and fabrication of soft artificial skin using embedded microchannels and liquid conductors. IEEE Sensors J 12(8):2711–2718
65. Helps T, Rossiter J (2018) Proprioceptive flexible fluidic actuators using conductive working fluids. Soft Robot 5(2):175–189
66. Truby RL, Wehner M, Grosskopf AK, Vogt DM, Uzel SG, Wood RJ, Lewis JA (2018) Soft somatosensitive actuators via embedded 3D printing. Adv Mater 30(15):1706383
67. Viry L, Levi A, Totaro M, Mondini A, Mattoli V, Mazzolai B, Beccai L (2014) Flexible three-axial force sensor for soft and highly sensitive artificial touch. Adv Mater 26(17):2659–2664
68. White EL, Yuen MC, Case JC, Kramer RK (2017) Low-cost, facile, and scalable manufacturing of capacitive sensors for soft systems. Adv Mater Technol 2(9):1700072
69. Larson C, Peele B, Li S, Robinson S, Totaro M, Beccai L et al (2016) Highly stretchable electroluminescent skin for optical signaling and tactile sensing. Science 351(6277):1071–1074
70. Yun S, Park S, Park B, Kim Y, Park SK, Nam S, Kyung KU (2014) Polymer-waveguide-based flexible tactile sensor array for dynamic response. Adv Mater 26(26):4474–4480
71. Levi A, Piovanelli M, Furlan S, Mazzolai B, Beccai L (2013) Soft, transparent, electronic skin for distributed and multiple pressure sensing. Sensors 13(5):6578–6604
72. Lazarus N, Bedair SS (2018) Bubble inductors: pneumatic tuning of a stretchable inductor. AIP Adv 8(5):056601
73. Wang H, Kow J, Raske N, de Boer G, Ghajari M, Hewson R et al (2018) Robust and high-performance soft inductive tactile sensors based on the eddy-current effect. Sensors Actuators A Phys 271:44–52
74. Kawasetsu T, Horii T, Ishihara H, Asada M (2017) Size dependency in sensor response of a flexible tactile sensor based on inductance measurement. In: 2017 IEEE SENSORS. IEEE, pp 1–3
75. Wang H, Jones D, de Boer G, Kow J, Beccai L, Alazmani A, Culmer P (2018) Design and characterization of tri-axis soft inductive tactile sensors. IEEE Sensors J 18(19):7793–7801
76. Totaro M, Bottenberg E, Groeneveld R, Erkens L, Mondini A, Brinks GJ, Beccai L (2018) Towards embroidered sensing technologies for a lower limb soft exoskeleton. In: Proceedings of the International Symposium on Wearable Robotics, Pisa, Italy, October 16–20, 2018. Springer, Cham, pp 53–57
77. Parrilla M, Cánovas R, Jeerapan I, Andrade FJ, Wang J (2016) A textile-based stretchable multi-ion potentiometric sensor. Adv Healthc Mater 5(9):996–1001
78. Lee J, Kwon H, Seo J, Shin S, Koo JH, Pang C et al (2015) Conductive fiber-based ultrasensitive textile pressure sensor for wearable electronics. Adv Mater 27(15):2433–2439
79. Sadeghi A, Mondini A, Totaro M, Mazzolai B, Beccai L (2019) A wearable sensory textile-based clutch with high blocking force. Adv Eng Mater (in press). https://doi.org/10.1002/adem.201900886
4
A Proposed Clinical Evaluation of a Simulation Environment for Magnetically-Driven Active Endoscopic Capsules Yasmeen Abu-Kheil, Omar Al Trad, Lakmal Seneviratne, and Jorge Dias
Abstract
Background A simulation environment for magnetically-driven, active endoscopic capsules (Abu-Kheil Y, Seneviratne L, Dias J, A simulation environment for active endoscopic capsules. 2017 IEEE 30th international symposium on Computer Based Medical Systems (CBMS), Thessaloniki, pp 714–719, 2017) can perform four main operations: capsule tele-operation, tracking of a specific region of interest, haptic feedback for capsule navigation, and virtual reality navigation. Methods In this paper, we propose a clinical evaluation of the main functions of the simulation environment. Three main testing procedures for the navigation strategies are proposed: (i) vision-based tele-operation, (ii) vision/haptic-based navigation without head control, and (iii) vision/haptic-based navigation with head control. The navigation methods can be compared with each other in terms of introduction time, visualization and procedure comfort. Human-subject studies are to be conducted in which 20 students and 12 expert gastroenterologists participate.

Y. Abu-Kheil · O. Al Trad
Higher Colleges of Technology, Abu Dhabi Women’s College, Abu Dhabi, United Arab Emirates

L. Seneviratne · J. Dias (*)
Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates
e-mail: [email protected]
Keywords
Haptic guidance · Image-guided surgery · Medical robotics · Active endoscopic capsule · Virtual reality
4.1
Introduction
Colorectal cancer (CRC) ranks third among all cancers in humans [2]. With an annual death toll of 774,000 (2015), CRC imposes a considerable economic burden on affected individuals, healthcare systems and society overall. The annual expenditure for CRC is around $5.3 billion [3]. However, CRC mortality can be reduced if new cases are detected at an early stage. Colonoscopy remains at the core of most population-based screening; however, due to the rigidity of the instrument, colonoscopy is often poorly tolerated by patients. On the other hand, capsule endoscopy (CE) allows non-invasive and painless examination of the gastrointestinal (GI) tract [4]. However, one of the important limitations of CE is the long procedural time due to passive capsule movement based on the peristaltic
© Springer Nature Switzerland AG 2019 J. S. Sequeira (ed.), Robotics in Healthcare, Advances in Experimental Medicine and Biology 1170, https://doi.org/10.1007/978-3-030-24230-5_4
motion of the GI tract. Another limitation of CE is that often only a portion of the GI tract is visualized, due to poor image quality and uncontrolled capsule motion [5, 6]. Therefore, many research groups have been motivated to develop techniques for active capsule locomotion and localization that would enable clinicians to guide the capsule movement so that full coverage of the surface of the GI tract is achieved [4]. Ex vivo validation of active capsule locomotion and localization usually requires the development and manufacturing of capsule hardware, the locomotion system, and a testing environment (e.g. explanted porcine intestine or life-like plastic simulators). Such experimentation requires both considerable time and cost. Simulators have played a vital role in robotics research as tools for quick and efficient testing of new concepts, strategies and algorithms [8]. Using robotic simulation for active endoscopic capsules not only reduces the cost involved in capsule production, but also allows fast testing and demonstration of the developed algorithms to determine their applicability. Therefore, there is a pressing need for simulation platforms for active endoscopic capsules. These platforms can also be used to train medical doctors to select a navigation technique that results in a high-quality inspection.
4.2
Materials & Methods

In this paper, a simulation environment for a magnetically-driven active endoscopic capsule is evaluated. The simulation environment provides three main ways to navigate the capsule inside the colon environment: vision-based tele-operation, and vision/haptic-based navigation with or without head control.

4.2.1 Vision-Based Tele-Operation
Tele-operation is the process of remotely controlling and operating machines. A tele-operation system is based on a man-in-the-loop control approach involving a master/slave architecture. The basic tele-operation system consists of three main elements: a master, a communication link and a slave. In this system, the master (user) can control and adjust the slave (robot) using the master device, based on visual feedback information such as images and videos [7]. Tele-operation can be applied to active endoscopic capsule navigation as shown in Fig. 4.1. In this case, the operator, i.e. an endoscopist, uses a haptic device to control the position of an external permanent magnet (EPM) placed at the end of a robotic arm. During the tele-operation process, the user receives visual feedback from the endoscopic capsule camera on a pair of 3D glasses. The user can also specify a region of interest to track, and this region is bounded by a box during the navigation process inside the colon. Examples of regions of interest are a mucosal ulcer or a polyp.

4.2.2 Vision/Haptic-Based Navigation Without Head Control
The aim of the vision/haptic-based navigation without head control, shown in Fig. 4.2, is to assist endoscopists during navigation inside the colon by providing force feedback to the user through the haptic device. The force feedback allows the user to sense the forces applied by the capsule on its environment. In this case, the operator uses a haptic device to control the position of an EPM placed at the end of a robotic arm. During the tele-operation process, the user receives a rendered view of colon images on the 3D glasses and feels the interaction forces with the intestinal wall through the haptic device. Shared control between vision and force will not only provide better maneuverability, but will also lead to subtle dexterity, especially during minimally invasive surgeries [8]; using vision information only increases the risk of tissue damage during tele-operated surgeries. Haptic guidance allows the user to sense a virtual feedback force if the user is getting
Fig. 4.1 Vision-based tele-operation system
Fig. 4.2 Vision/haptic-based tele-operation system without head control
closer to the boundaries indicated by the colon map. The haptic guidance is also used to keep the user moving along a specific trajectory; in this case, the information stored in the trajectory is used to build a 3D virtual force field that is applied on the haptic device. If the user deviates from the generated trajectory, an attractive virtual force is generated.
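The trajectory guidance just described (a virtual force field that pulls the user back toward a pre-defined path) can be sketched as a spring-like virtual fixture. This is a minimal illustration under assumptions (a polyline trajectory and an arbitrary stiffness constant), not the simulator's actual implementation:

```python
import math

def guidance_force(tip_pos, trajectory, stiffness=50.0):
    """Attractive virtual force (N) pulling the haptic tip back toward
    the nearest point of a pre-defined trajectory (list of 3D points)."""
    # Nearest trajectory point to the current haptic tip position.
    nearest = min(trajectory, key=lambda p: math.dist(p, tip_pos))
    # Spring-like pull toward the trajectory; zero when on the path.
    return tuple(stiffness * (n - t) for n, t in zip(nearest, tip_pos))

# Straight reference path along Z; tip deviated 2 cm along X.
path = [(0.0, 0.0, z / 100.0) for z in range(101)]
force = guidance_force((0.02, 0.0, 0.50), path)
```

With the tip 2 cm off the path, the force points back along −X and vanishes once the user returns to the trajectory, which matches the "attractive virtual force" behaviour described above.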
4.2.3 Vision/Haptic-Based Navigation with Head Control
Similar to the second navigation method, "Vision/Haptic-Based Navigation without Head Control", the user is provided with two types of feedback: 3D visual feedback through 3D rendered bowel maps and 3D pseudo force feedback through a haptic device. The aim of the vision/haptic-based navigation with head control, shown in Fig. 4.3, is to use the head orientation, measured from the 3D glasses, to control the capsule orientation, while using the haptic device to control the translational motion of the capsule. In this case, the pitch movement of the head (looking up and down) corresponds to orienting the camera in the vertical direction, the yaw movement of the head (looking left and right) corresponds to orienting the camera in the horizontal direction, and the horizontal movement of the haptic tip along the X-direction of the haptic device corresponds to the translational movement of the capsule. A position-dependent algorithm [9] was applied to move the capsule based on the user's head orientation. In this algorithm, the angle of the head is related to the position of the EPM; as the user moves away from the initial point, the gain is increased up to a certain limit. In this navigation method, the use of virtual reality glasses ensures the image is always in front of the doctor's eyes and that the doctor is completely immersed in the procedure. Furthermore, integrating haptic feedback into the capsule endoscopy application will allow the doctors to
Fig. 4.3 Vision/haptic-based tele-operation system with head control
control the robotic capsule movements while feeling the interaction forces between the capsule and the colon environment. The haptic guidance system will assist the user in avoiding colon tissue and in following a specific region of interest. In other words, the operator can use the glasses for immersion and to steer the capsule, and the haptic interface to move the capsule and feel the interaction forces. In that case, the human-robot (HR) control interface is complete and guarantees immersion and haptic interaction.
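The head-control mapping of this subsection can be sketched as follows: head yaw/pitch drive the camera orientation, the haptic tip's X displacement drives translation, and a position-dependent gain grows with the distance from the initial point up to a limit. The function name, gain law and numeric values are illustrative assumptions, not taken from [9]:

```python
def head_control_mapping(head_yaw, head_pitch, haptic_x, dist_from_start,
                         base_gain=1.0, gain_rate=0.5, max_gain=3.0):
    """Map head orientation (from the 3D glasses) and the haptic tip's
    X displacement to capsule camera orientation and translation.
    The gain increases with the distance from the initial point,
    capped at max_gain (position-dependent control)."""
    gain = min(base_gain + gain_rate * dist_from_start, max_gain)
    cam_yaw = gain * head_yaw      # looking left/right -> horizontal orientation
    cam_pitch = gain * head_pitch  # looking up/down    -> vertical orientation
    advance = haptic_x             # haptic X motion    -> capsule translation
    return cam_yaw, cam_pitch, advance

cmd = head_control_mapping(0.10, -0.05, 0.01, dist_from_start=1.0)
```

Capping the gain keeps small head motions precise near the starting point while still allowing larger camera sweeps once the capsule is far along the colon.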
4.3
Evaluation Procedure
Human-subject studies can be performed in which 32 subjects, 12 experienced gastroenterologists and 20 students, perform a simulated colonoscopy. Each subject uses three different control methods: a haptic device with vision guidance only, the same haptic device with vision and haptic guidance without head control, and a haptic device with head control. Performance is evaluated on introduction time, percentage of the colon that is visualized, and procedure comfort.
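Since every subject tries all three control methods, a metric such as introduction time could, for example, be compared with a Friedman test. The following is a dependency-free sketch of the (tie-uncorrected) Friedman chi-square statistic, assuming each row holds one subject's scores under the three methods; it is an illustration, not part of the proposed protocol:

```python
def friedman_statistic(scores):
    """Friedman chi-square statistic for a repeated-measures design:
    scores[i] holds one subject's results under each of the k
    control methods (e.g. introduction time for the 3 methods)."""
    n = len(scores)     # number of subjects
    k = len(scores[0])  # number of conditions (here: 3 methods)
    rank_sums = [0.0] * k
    for row in scores:
        # Rank the k conditions within each subject (1 = smallest),
        # averaging ranks on ties.
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for m in range(i, j + 1):
                ranks[order[m]] = avg
            i = j + 1
        for j in range(k):
            rank_sums[j] += ranks[j]
    # Chi-square statistic: 12/(n*k*(k+1)) * sum(R_j^2) - 3*n*(k+1)
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3 * n * (k + 1)
```

For production use, a library routine such as SciPy's `friedmanchisquare` (which also returns a p-value) would be the natural choice.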
Fig. 4.4 Experimental test setup
4.3.1 Test Setup
A test setup was built to enable evaluation of the three navigation methods provided in the simulation environment. An overview of this setup is shown in Fig. 4.4, and a picture of the setup in use is shown in Fig. 4.5.
4.3.2 Procedure In order to be able to make a ‘repeated measures’ comparison, all subjects should perform the three experimental navigation ways. The subjects were instructed to try to reach a pre-defined region of interest quickly and to carefully inspect the colon model for any abnormality. For all navigation ways, two colon models were provided: one without abnormality and another one with abnormality. The users were asked to use the haptic device to move an external magnet inside the simulator interface. Due to the magnetic interaction between the external magnet and the internal magnet, embedded inside the simulated capsule, the capsule will start to move, and colon images will be shown on the display screen and rendered
Fig. 4.5 The test setup in use by one of the operators
inside the 3D glasses to the user. Table 4.1 summarizes the activities and related outcomes of the tests.
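The magnetic coupling that moves the capsule can be illustrated with the simplest point-dipole model: two coaxial magnetic dipoles (the EPM and the capsule's internal magnet) attract with a force falling off as the fourth power of their separation. The magnetic moments below are illustrative assumptions, and a real simulator would use the full dipole field rather than this coaxial special case:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def coaxial_dipole_force(m_epm, m_capsule, r):
    """Attractive force (N) between two aligned, coaxial magnetic
    dipoles separated by distance r (m): F = 3*mu0*m1*m2/(2*pi*r**4)."""
    return 3.0 * MU0 * m_epm * m_capsule / (2.0 * math.pi * r ** 4)

# Hypothetical EPM of 50 A*m^2 and capsule magnet of 0.1 A*m^2 at 10 cm.
f = coaxial_dipole_force(50.0, 0.1, 0.10)
```

The steep 1/r^4 decay is why the robotic arm must keep the EPM close to the abdominal wall for the coupling to remain strong enough to drag the capsule.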
4.4
Discussion
CAPS-Sim is based on the Gazebo open-source simulator and the ROS middleware; both have become standards in robotics research, facilitating the integration of contributions by other researchers through an open-source platform. ROS and Gazebo were chosen to allow the easy addition of further modules and to facilitate the testing of individual components. Gazebo features a robust physics engine and provides realistic rendering of environments, including high-quality graphics such as lighting, shadows, and textures. Gazebo is also capable of simulating populations of robots in an accurate and efficient manner. Common sensors such as laser range finders, RGB-D and stereo cameras are already available in Gazebo and can be connected to any robot inserted in the simulation scenario. Other sensors, such as mono-cameras and inertial measurement units (IMU), have plugins that can be easily used and customized [10]. Finally,
Gazebo is free and can be easily shared by the community. Integrating robotic simulators with ROS allows the user to access various sets of already available algorithms. ROS can be considered a robotic framework that facilitates creating robotic applications by building, writing, and running low-level code. The ultimate purpose of ROS is to support the reuse and sharing of code in the robotics research and development community. ROS can also be seen as middleware that links the operating system with applications, allowing communication of data in distributed systems. Finally, the proposed CAPS-Sim architecture is modular and can be applied to any capsule type, or to any other robotic manipulator, regardless of its underlying technology; this is achieved by changing the models in the environment. The current version of CAPS-Sim does not take into account the contraction (movement) of the colon, nor the properties of the tissue and their models. Therefore, the next version of CAPS-Sim will increase the reliability of the simulator by adding these two features: tissue modeling and colon movements.
Table 4.1 List of activities and outcomes evaluated with the simulation environment

1. General introduction of the simulation environment, covering: introduction, motivation, aims and objectives, platform description, hardware interface, graphical user interface, available functions, and expected outcomes. Outcome: getting familiar with the simulation interface.
2. Instructions on how to use the software, including the hardware and the graphical interface. Outcome: hands-on tests of the simulator.
3. Test #1: evaluate the graphical interface module in terms of realism, clarity, visualization and ease of use. Outcome: indicate the degree to which the simulation environment approximates the real environment (i.e. colon environment, capsule model and image feeds).
4. Test #2: tele-operate the capsule inside the colon using vision only; Test #3: tele-operate the capsule inside the colon using vision and haptic feedback, without considering head movements. Outcome: indicate how useful pseudo forces are in guiding the capsule movement compared with vision guidance only.
5. Test #4: tele-operate the capsule inside the colon using vision and haptic feedback with head movements. Outcome: indicate how head control movements affect the navigation process compared with moving using the haptic device only.
6. Test #5: tele-operate the capsule inside the colon using haptic feedback in different colon environments: parabolic, conic and neiloid force shapes. Outcomes: evaluate and compare the different force shapes used to guide the operator during capsule navigation; identify which force shape is more comfortable and more user friendly in guiding the capsule motion toward a specific target.
7. Test #6: analyze the trajectory performed by each user based on the data collected in the previous tests. Outcomes: a profile showing the trajectory for each user in each trial; a profile of the potential force feedback generated for each user in each trial.
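The per-user trajectory analysis of Test #6 is not specified in detail; as an illustration, two simple metrics (total path length and mean deviation from a reference trajectory) could be computed from the logged capsule positions. The metric choice is an assumption:

```python
import math

def trajectory_metrics(logged, reference):
    """Total path length of a logged trajectory and its mean deviation
    from a reference trajectory (both lists of 3D points)."""
    length = sum(math.dist(a, b) for a, b in zip(logged, logged[1:]))
    mean_dev = sum(min(math.dist(p, q) for q in reference)
                   for p in logged) / len(logged)
    return length, mean_dev

# Logged path identical to the two-point reference: length 5, zero deviation.
length, dev = trajectory_metrics([(0, 0, 0), (3, 4, 0)], [(0, 0, 0), (3, 4, 0)])
```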
Acknowledgment This research is supported by ADEK Award for Research Excellence (AARE-2018).
References

1. Abu-Kheil Y, Seneviratne L, Dias J (2017) A simulation environment for active endoscopic capsules. In: Proceedings of IEEE 30th international symposium on Computer Based Medical Systems, CBMS’17, Thessaloniki, Greece, June 22–24, 2017, pp 714–719
2. World Health Organization [Online]. Available: http://www.who.int/mediacentre/factsheets/fs297/en/
3. Redaelli A, Cranor CW, Okano GJ, Reese PR (2003) Screening, prevention and socioeconomic costs associated with the treatment of colorectal cancer. PharmacoEconomics 21(17):1213–1238
4. Sliker LJ, Ciuti G (2014) Flexible and capsule endoscopy for screening, diagnosis and treatment. Expert Rev Med Devices 11(6):649–666
5. Abu-Kheil Y, Ciuti G, Mura M, Dias J, Dario P, Seneviratne L (2015) Vision and inertial-based image mapping for capsule endoscopy. In: Proceedings of 2015 international conference on Information and Communication Technology Research, ICTRC’15, Abu Dhabi, UAE, May 17–19, 2015, pp 84–87
6. Fan Y, Meng M-H, Li B (2010) 3D reconstruction of wireless capsule endoscopy images. In: Proceedings of annual international conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Buenos Aires, Argentina, August 30–September 4, 2010, pp 5149–5152
7. Sansanayuth T, Nilkhamhang I, Tungpimolrat K (2012) Tele-operation with inverse dynamics control for Phantom Omni haptic device. In: Proceedings of IEEE SICE annual conference, SICE’12, Akita, Japan, August 20–23, 2012, pp 2121–2126
8. Xiong L, Chng CB, Chui CK, Yu P, Li Y (2017) Shared control of a medical robot with haptic guidance. Int J Comput Assist Radiol Surg 12(1):137–147
9. Reilink R, de Bruin G, Franken M, et al (2010) Endoscopic camera control by head movements for thoracic surgery. In: Proceedings of 2010 3rd IEEE RAS & EMBS international conference on biomedical robotics and biomechatronics, BioRob’10, Tokyo, Japan, September 26–29, 2010, pp 510–515
10. Linner T, Shrikathiresan A, Vetrenko M, Ellmann B (2011) Modeling and operating robotic environment using Gazebo/ROS. In: Proceedings of the 28th International Symposium on Automation and Robotics in Construction, ISARC 2011, Seoul, Korea, June 29–July 2, 2011, pp 957–962
5
Social Robots as a Complementary Therapy in Chronic, Progressive Diseases Ana Nunes Barata
Abstract
Globally, the world population is ageing, which increases the prevalence of non-communicable diseases that affect patients both physically and psychologically, as is the case with dementia. Consequently, there is a greater demand on the healthcare system, which needs to develop solutions to answer these needs. The literature review shows that complementary therapies may be applied in dementia to aid symptom management as well as to slow down the progression of the disease. Socially Assistive Robots (SAR) are tools that may be used as a complementary therapy in dementia and have been shown to promote a potentially beneficial relationship. The zoomorphic models of SAR have shown results similar to those of complementary therapies with animals, as they generate positive emotions and promote multisensorial interaction through sight, hearing and touch. The use of SAR is a new tool that has shown benefits in slowing down the progression of the disease, helping to improve the quality of life of the elderly.

Keywords

Socially assistive robots · Complementary therapies · Neurodegenerative diseases
A. N. Barata (*) ACES Amadora, Amadora, Portugal e-mail: [email protected]
5.1
Introduction
Ageing, a process of degeneration of the body that results from the natural flow of time [1], is a phenomenon whose prevalence is increasing globally. The World Health Organization estimates that between 2015 and 2050 the percentage of people aged over 60 will double from 12% to 22%, and that nearly 80% of them will live in low- and middle-income countries [2]. In addition, the number of adults aged 60 and over is expected to exceed the total number of children under five by 2020 [2]. This ageing process leads to an increased prevalence of chronic diseases, which affect both the physical and the psychological nature of patients. These diseases may cause limitations at a personal level – limitations due to dementia (which impairs the capacity for space and time orientation), as well as other limitations such as those caused by falls, pain, osteoarthritis, chronic obstructive pulmonary disease, depressive pathology and diabetes [2]. The increased prevalence of chronic diseases consequently implies a greater need for the healthcare system to organize itself in terms of demand and response, and hence a greater need for resources. Thus, it is
© Springer Nature Switzerland AG 2019 J. S. Sequeira (ed.), Robotics in Healthcare, Advances in Experimental Medicine and Biology 1170, https://doi.org/10.1007/978-3-030-24230-5_5
imperative to adapt the healthcare system, focusing on strategies that provide holistic care to the geriatric population and that promote the quality of life of this age group [2]. Neurodegenerative disease is characterized by a progressive and irreversible destruction of neurons [3], and it is estimated that in 2010, 35.6 million patients suffered from dementia [4]. This disease plays an important role in the consumption of healthcare resources, since, in most cases, patients live several years after the onset of symptoms, which usually worsen over time as the disease progresses. In addition, it is estimated that the annual cost associated with this disease increases even more rapidly than its prevalence [4]. As neurodegenerative diseases are chronic and progressive, it is possible to consider the application of complementary therapies as adjuvants for symptom control that also help delay the progression of the disease [5]. Currently, the most commonly used complementary therapies are mainly focused on occupational therapy (therapies applying music, sand and animals) [5]. In summary, taking into account the context of a greater demand for healthcare resources, the prevalence and characteristics of neurodegenerative diseases, and the perspective of a decreasing number of healthcare professionals [6], there is a need for research and for multidisciplinary collaborations, namely in developing knowledge and research in the technological area. This research includes the development of social robots, a technology developed to interact with humans [7], which, in turn, opens new possibilities in the strategy for treating patients with neurodegenerative diseases. Using innovative solutions such as the application of robotics, it may be possible to strive for a person's well-being, ensuring quality of life.
5.2
Theoretical Framework
Technological developments have allowed significant advances in robotics. Efforts have been made to create robots that are able to establish
active social interactions with humans. The robots developed so far are based on forms of organisms that already exist in our environment, so they have either anthropomorphic or zoomorphic traits. Given the rapid technological evolution, new horizons have opened up, as well as new opportunities and fields of application. Nowadays, it is possible to research the effects of robotic intervention as a facilitator of socialization, assessing the results in the cognitive and behavioral domains of human beings [8–11]. The term Socially Interactive Robots (SIR) was first used by Fong et al. [12] to describe robots developed mainly to interact with humans. The term was introduced in order to differentiate these robots from the ones usually handled by man (such as in robot-guided surgeries). Feil-Seifer et al. [13] defined Socially Assistive Robotics (SAR) as an integration of the concept of "assistance robots" and the SIRs. Currently, there is no established definition of "assistance robots", but a suitable one may be that of a robot that helps and supports the human being in his actions [14]. Research on "assistance robots" includes robots used in rehabilitation, wheelchairs and other mobility solutions, articulated limbs for amputees, and companion and educational robots [14]. In the case of SIRs, the goal is to develop close and effective interactions, the end result being the interaction itself. On the other hand, SARs are not designed to increase the effectiveness of activities of daily living; rather, they aim to provide support through social interaction, so that an improvement in rehabilitation and learning is seen [14]. Thus, SARs can be considered a subcategory of SIRs. Dementia is a neurodegenerative disease that compromises cognitive function, and so it is linked to changes in communication and a tendency toward a more depressive mood.
Consequently, patients may be at risk – there may be a greater tendency toward isolation and the development of feelings of loneliness, which, in turn, will deteriorate the mental health of patients [15]. In this way, there is a window of opportunity for the application of complementary
5 Social Robots as a Complementary Therapy in Chronic, Progressive Diseases
therapies directed at symptomatic relief and at delaying the progression of the disease [5]. These therapies promote stimulation, interaction and a change of routines through the application of animals, music, drawing, painting and other activities, which also include the application of robotics. Given that neurodegenerative disease is a condition that is often incurable, the inclusion of complementary therapies acquires greater relevance in the therapeutic strategy for these patients. Among the complementary treatments of dementia, therapy with domestic animals [16] should be highlighted, due to the millennia-old relationship between humans and other animals (the first fossil evidence dates back to the era of Homo erectus). This relationship of proximity has been maintained, and domestic animals are often chosen today for their affective return and as facilitators of social interaction between humans, as they manage to reduce the feeling of loneliness and contribute to their owners' sense of well-being. There are two theories that explain this interaction: the Biophilia Hypothesis and the Social Support Hypothesis [17]. The first holds that the human being has an innate tendency to pay attention and to be attracted to other animals or living beings. In an evolutionary context, attention to animals would increase the individual's chances of survival, since the animal played the role of a sentinel, alerting to possible dangers and providing a sense of security [18]. Nowadays, animals continue to be a focus of external attention, with the capacity to transmit tranquility to the individuals they live with [19]. The Social Support Hypothesis, on the other hand, affirms that companion animals are, by themselves, a form of social support and also act as facilitators of human interaction [17]. This is because pet animals reduce feelings of loneliness and give their owners a sense of well-being [20].
The reasons for their success as social support elements range from their constant availability to the support and unconditional love they offer to their owners – a strong bond is formed between the owner and the pet and many owners consider their pets as a family member or a friend [17].
However, it is sometimes difficult to study therapeutic intervention with domestic animals because of difficulties that may arise for hygienic, allergic or institutional reasons. This opens the opportunity for the development and application of zoomorphic SARs that aim to overcome this limitation.
5.3
Goal
This chapter performs a narrative review of the published literature on the application of SARs in elderly patients with neurodegenerative diseases, and narrates the experience of applying a SAR in a homecare context. The goal is to understand and characterize both the possibilities and the limitations of applying SARs to promote a better quality of life for patients.
5.4
Methods
A bibliographic search was performed in the PubMed database using the MeSH terms "aged", "aged, 80 and over", "robotics" and "dementia". Twenty-seven articles were identified, and review articles published between 2006 and 2013 were defined as the inclusion criteria, yielding a total of 6 selected articles. In parallel, the experience of applying a SAR in a homecare context is narrated.
5.5
Results
The review articles included one systematic review and five narrative reviews. All studies targeted SAR-type robots. In the area of robotics, there is now a global trend in which it is increasingly common to see humans interacting with robots. However, in order to ensure that this interaction is established, it is necessary to provide some enabling conditions for communication, notably non-verbal communication [21], given that most of the
message flows through non-verbal language. So, it is necessary to give robots some anthropomorphic characteristics (facial expressions and gestures), so that it is easier for people to interact with them [21, 22]. The face of the robot is the most important part for transmitting expressions and, for interaction through this medium, there are currently different models that specialize in the face, such as eMuu©, Feelix©, iCat© and Kismet© [21]. Other models also sought to include gestures by moving the upper body of the robot (trunk, arms and hands), the most well-known being Leonardo©, Infanoid©, Kaspar©, Robovie-IV©, WE-4RII© and Nexi© [21]. There are also more anthropomorphic models, such as ASIMO©, QRIO©, Kobian© and iCub©, that seek to achieve, with their humanoid aspect, an easier interaction between people and the environment [21]. In addition, there are models specialized for therapeutic applications, placing the person at the center of the robot-person interaction: PARO©, Robota©, Keepon© and Huggable© [21]. Studies indicate that the relationship established with a SAR is potentially beneficial and could be used as a therapeutic intervention in patients with neurodegenerative disease, including dementia [10]. As with other innovative interventions, research generally proceeds more slowly, with unexpected results and new research questions frequently encountered in investigations already under way. Overall, it can be affirmed that, at the moment, there is evidence demonstrating the use of social robots as a way to promote interaction between people, creating an effect of calmness, companionship, motivation and, above all, happiness [23, 24]. However, the available studies are still not very robust, and the vast majority still have very small samples [23].
A. N. Barata

Among the complementary therapies commonly used in dementia, animal-assisted therapy is highlighted because of its beneficial effects on patients’ quality of life [25]. However, there are sometimes limitations to the use of animals in specific settings (for hygienic, institutional or allergy-related reasons), which prevents patients from accessing this type of therapy. To answer this need, solutions in the area of robotics have been developed. Most of the available studies with zoomorphic robots were performed with a robot shaped like a baby seal (PARO®), one similar to a dog (AIBO®) and a robotic cat (NeCoRo®), and all were designed to develop positive emotions and promote multisensory interaction through vision, hearing and touch [23, 26, 27]. Of these, PARO® is the model currently being used and studied the most. It is a baby-seal-shaped robot that was developed in Japan in 2004 and approved in 2009 by the Food and Drug Administration as a medical device [28]. It currently has worldwide distribution, and several countries carry out research with this model. Studies using fNIRS brain monitoring showed that brain activity differed between the robot’s operating modes: in the “STOP ON” mode there was a decrease in brain activity in the left frontal area, while in the “OFF” mode a comparative decrease in motor cortex activity was observed [29]. In the elderly, monitoring with EEG demonstrated an improvement in the function of cortical neurons after exposure to the robot [30]. Scales were also used to assess the effect of the robot: after exposure, an improvement in the values of the Quality of Life in Alzheimer’s Disease (QoL-AD) scale was identified, as well as an increase in the relative pleasure score according to the Observed Emotion Rating Scale (OERS) [31]. As for the robot’s suitability depending on the patient’s degree of dementia, a qualitative study carried out in Italy assessed the robot’s effects in patients with different scores on the Mini Mental State Examination (MMSE).
5 Social Robots as a Complementary Therapy in Chronic, Progressive Diseases

An increased capacity to communicate and emotional involvement were seen, while patients expressed different reactions according to their cognitive ability: there was emotional involvement (“you are so dear”), interpretation of PARO®’s will (“you are so clever!”) and of his feelings (“Do not do this because you frighten him!”), as well as an involvement of the patient’s personal sphere, with the patient constructing narratives and interpreting PARO®’s reactions according to the story that he was telling. In this study, patients with very mild cognitive impairment accepted PARO® very well and referred to it as an inanimate object “designed by a very intelligent person” [32]. Similarly, other studies have found that the presence of PARO® is well accepted by professionals working in care homes, who see it as a means to calm and promote well-being among the elderly [33]. Finally, in a randomized study of 40 institutionalized patients in New Zealand, three groups of patients were compared (performing usual activities, performing usual activities with the presence of a dog, and performing usual activities with the presence of PARO®), and the variables assessed were loneliness (UCLA Loneliness Scale), depression (Geriatric Depression Scale) and quality of life (QoL-AD) [34]. There was a greater, statistically significant improvement in the scores when usual activities were performed with PARO® (p = 0.02) than with the presence of a dog (p = 0.08). The greatest results were seen in the levels of loneliness, with patients reporting lower solitude levels after being exposed to PARO® [34].
5.6 PARO in the Homecare Setting (Trial)

When applying a SAR in the community, namely in the homecare setting, other variables need to be taken into account, such as socio-cultural and economic aspects.

A SAR is a resource that aids in the care that is provided, as it facilitates the communication between the homecare team and the patient and, when considering a neurodegenerative disease such as dementia, it contributes as an item that makes it easier for patients to recognize the homecare team.

A trial experience with PARO® in the homecare setting is currently being studied in order to understand the impact of a SAR in this type of setting. The homecare team is made up of nurses and a doctor, with PARO® always being used by the medical professional. Patients with chronic, progressive diseases have been selected, with the research focused on patients with dementia or with cognitive impairment due to, e.g., an oncological disease. Currently, around ten patients are being followed up, having received between two and eight visits with PARO®. There has been no time limit on the exposure to PARO®, and each session has lasted between 20 min and 1 h.

The homecare setting differs greatly from an institution, as in the home the homecare team enters the patient’s private environment. Additionally, cultural aspects gain greater weight in this context, so preformed ideas shape reactions and thoughts to a greater degree. Thus, the first step of this trial was to understand whether a new, unknown object such as PARO® would be accepted by the patient in his/her home. Until now, the results have been favorable. When patients were asked how they felt about PARO®, the words most commonly expressed were “calm”, “happiness” and “at peace”. Upon seeing PARO®, an easier recognition of the homecare team was also observed, and communication has been more fluid as well. A greater initiative by the patient to interact with the homecare team and PARO® was also seen, with spontaneous movement promoted as patients attempt to stroke PARO®.

Though, globally, there seems to be a window of opportunity to use PARO® in the homecare setting, the study still needs to provide more results. This kind of study must rely on long-run trials so that rich enough data is generated.

5.7 Discussion

Global aging and the decrease of the global population are pressing concerns that require research into innovative disciplines.

Dementia is a progressive disease that affects the areas of cognition, affection, communication and behavior. Because of its characteristics, it leaves patients with some degree of incapacity, which results in a greater need for care as time progresses [35].
As dementia is a chronic and progressive disease, there is the possibility of applying complementary therapies and studying their benefits. The most commonly used complementary therapies focus mainly on occupational therapy using resources such as music, sand and animals [5]. Activity models focus on three aspects: familiarization, interaction and communication [5]. In a process of familiarization, the main objective is to explore the surrounding space and also to carry out simple activities. As for interaction, the aim is to promote behaviors that are amusing. When working on communication, the therapist seeks to explore and develop emotional involvement. Complementary therapy with animals is a therapeutic solution that focuses heavily on the familiarization process, since physical contact with the animal promotes a generalized stimulation of the senses and facilitates a feeling of well-being. It is thus a solution that can be applied even in advanced neurodegenerative pathology [5]. Studies of complementary therapies with SARs, in particular zoomorphic robots, have shown results similar to those of complementary therapy with domestic animals [36]: an increased ability to socialize and decreased agitation, aggressiveness and depressive symptomatology have been seen [32, 37, 38], with results supported by analytical, imaging and electrophysiological measurements [29, 39, 40]. In addition, improvements in nutritional intake and reductions in medication use and medical consultations have also been reported. As for changes in the cognitive domain, the evidence is still unclear [10]. The available studies demonstrate that there is a possibility of developing robust research on the application of social robots in neurodegenerative diseases; however, the available randomized studies are still scarce, and the available articles sometimes become redundant by individually analyzing different variables belonging to the same study [10].
Therefore, there is a need to improve the studies that are already available, replicating their findings in more robust designs.
In order to move forward in this area, attention must also be focused on different issues, including ethical issues. It is not the intention of complementary therapy with social robots to replace the action of the human being, but rather to facilitate human interaction. Thus, it is necessary to develop guidelines for the application of social robots in order to achieve the best possible results [41].
5.8 Conclusion
At present, individuals are ageing globally, which creates new challenges in the management of healthcare resources, especially due to the burden related to the demand created by chronic, limiting diseases. Among chronic and progressive diseases, dementia will be one of the most prevalent, affecting individuals for several years. In order to provide better symptomatic control and delay the progression of this neurodegenerative disease, an opportunity opens up for the application of complementary therapies that act as adjuvants for the well-being of individuals. The most used therapies are music, sand and animal therapy, the latter having been shown to be the most integrative and to yield the best results. Also integrated in the complementary therapies are the SARs, whose main objective is to interact with the human being. Studies to date have demonstrated that SARs may be facilitators, acting not only in the prevention of mental diseases, but also in improving the overall condition of elderly patients with neurodegenerative disease. Within SARs, zoomorphic models have had more beneficial results, possibly due to their similarity to animal therapy. The use of SARs should not be underestimated, as it opens up new possibilities for the application and development of technology that aims to improve the quality of life of the population, both living in institutions and at home.
References

1. Ageing definition and meaning (In Portuguese). Definição ou significado de envelhecimento no Dicionário Infopédia da Língua Portuguesa sem Acordo Ortográfico [Internet]. Infopédia – Dicionários Porto Editora. [Cited 2019 Mar 24]. Available from: https://www.infopedia.pt/dicionarios/lingua-portuguesa-aao/envelhecimento
2. WHO | Ageing and health [Internet]. WHO. [Cited 2019 Mar 24]. Available from: http://www.who.int/mediacentre/factsheets/fs404/en/
3. What? | JPND [Internet]. [Cited 2019 Mar 24]. Available from: http://www.neurodegenerationresearch.eu/about/what/
4. Dementia: a public health priority | Alzheimer’s Disease International [Internet]. [Cited 2019 Mar 24]. Available from: https://www.alz.co.uk/WHO-dementia-report
5. Marti P, Bacigalupo M, Giusti L, Mennecozzi C, Shibata T (2006) Socially assistive robotics in the treatment of behavioural and psychological symptoms of dementia. In: The first IEEE/RAS-EMBS international conference on biomedical robotics and biomechatronics, BioRob 2006, pp 483–488
6. WHO | Global strategy on human resources for health: Workforce 2030 [Internet]. WHO. [Cited 2019 Mar 24]. Available from: http://www.who.int/hrh/resources/pub_globstrathrh-2030/en/
7. Dautenhahn K (2007) Socially intelligent robots: dimensions of human–robot interaction. Philos Trans R Soc B Biol Sci 362(1480):679–704
8. Robinson H, MacDonald BA, Kerse N, Broadbent E (2013) Suitability of healthcare robots for a dementia unit and suggested improvements. J Am Med Dir Assoc 14(1):34–40
9. Wu Y-H, Fassert C, Rigaud A-S (2012) Designing robots for the elderly: appearance issue and beyond. Arch Gerontol Geriatr 54(1):121–126
10. Mordoch E, Osterreicher A, Guse L, Roger K, Thompson G (2013) Use of social commitment robots in the care of elderly people with dementia: a literature review. Maturitas 74(1):14–20
11. Shibata T, Wada K (2011) Robot therapy: a new approach for mental healthcare of the elderly – a mini-review. Gerontology 57(4):378–386
12. Fong T. A survey of socially interactive robots: concepts, design, and applications [Internet]. The Robotics Institute, Carnegie Mellon University. [Cited 2019 Mar 24]. Available from: https://www.ri.cmu.edu/publications/a-survey-of-socially-interactive-robots-concepts-design-and-applications/
13. Feil-Seifer D, Mataric MJ (2005) Defining socially assistive robotics. In: Proceedings of the 9th international conference on rehabilitation robotics, ICORR’05, Singapore, August 11–14, 2005, pp 465–468
14. Bemelmans R, Gelderblom GJ, Jonker P, de Witte L (2012) Socially assistive robots in elderly care: a systematic review into effects and effectiveness. J Am Med Dir Assoc 13(2):114–120.e1
15. Wada K, Shibata T, Musha T, Kimura S (2008) Robot therapy for elders affected by dementia. IEEE Eng Med Biol Mag 27(4):53–60
16. Burton A (2013) Dolphins, dogs, and robot seals for the treatment of neurological disease. Lancet Neurol 12(9):851–852
17. O’Haire M (2010) Companion animals and human health: benefits, challenges, and the road ahead. J Vet Behav Clin Appl Res 5(5):226–234
18. Wilson EO (1984) Biophilia. Harvard University Press, Cambridge, MA
19. Gullone E (2000) The Biophilia hypothesis and life in the 21st century: increasing mental health or increasing pathology? J Happiness Stud 1(3):293–322
20. Sable P (1995) Pets, attachment, and well-being across the life cycle. Soc Work 40(3):334–341
21. Goris K, Saldien J, Vanderborght B, Lefeber D (2011) How to achieve the huggable behavior of the social robot Probo? A reflection on the actuators. Mechatronics 21(3):490–500
22. Argyle M, Alkema F, Gilmour R (1971) The communication of friendly and hostile attitudes by verbal and non-verbal signals. Eur J Soc Psychol 1(3):385–402
23. Broekens J, Heerink M, Rosendal H (2009) Assistive social robots in elderly care: a review. Gerontechnology 8(2):94–103
24. Libin AV, Libin EV (2004) Person-robot interactions from the robopsychologists’ point of view: the robotic psychology and robotherapy approach. Proc IEEE 92(11):1789–1803
25. Nordgren L, Engström G (2014) Animal-assisted intervention in dementia: effects on quality of life. Clin Nurs Res 23(1):7–19
26. Libin A, Cohen-Mansfield J (2004) Therapeutic robocat for nursing home residents with dementia: preliminary inquiry. Am J Alzheimers Dis Other Demen 19(2):111–116
27. Wada K, Shibata T (2008) Social and physiological influences of robot therapy in a care house. Interact Stud 9(2):258–276
28. Establishment Registration & Device Listing [Internet]. [Cited 2019 Mar 24]. Available from: https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfrl/rl.cfm?rid=142911
29. Kawaguchi Y, Wada K, Okamoto M, Tsujii T, Shibata T, Sakatani K (2012) Investigation of brain activity after interaction with seal robot measured by fNIRS. In: Proceedings of the 21st IEEE international symposium on robot and human interactive communication, RO-MAN’12, Paris, France, September 9–13, 2012, pp 571–576
30. Wada K, Shibata T, Musha T, Kimura S (2005) Effects of robot therapy for demented patients evaluated by EEG. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems, IROS’05, Hamburg, Germany, September 28–October 2, 2005, pp 1552–1557
31. Moyle W, Cooke M, Beattie E, Jones C, Klein B, Cook G et al (2013) Exploring the effect of companion robots on emotional expression in older adults with dementia: a pilot randomized controlled trial. J Gerontol Nurs 39(5):46–53
32. Giusti L, Marti P (2006) Interpretative dynamics in human robot interaction. In: Proceedings of the 15th IEEE international symposium on robot and human interactive communication, RO-MAN 2006, Hatfield, UK, September 6–8, 2006, pp 111–116
33. Moyle W, Bramble M, Jones C, Murfield J (2016) Care staff perceptions of a social robot called Paro and a look-alike plush toy: a descriptive qualitative approach. Aging Ment Health 0(0):1–6
34. Robinson H, MacDonald B, Kerse N, Broadbent E (2013) The psychosocial effects of a companion robot: a randomized controlled trial. J Am Med Dir Assoc 14(9):661–667
35. About dementia | Alzheimer’s Disease International [Internet]. [Cited 2019 Mar 24]. Available from: https://www.alz.co.uk/about-dementia
36. Bernabei V, De Ronchi D, La Ferla T, Moretti F, Tonelli L, Ferrari B et al (2013) Animal-assisted interventions for elderly patients affected by dementia or psychiatric disorders: a review. J Psychiatr Res 47(6):762–773
37. Šabanović S, Bennett CC, Chang W-L, Huber L (2013) PARO robot affects diverse interaction modalities in group sensory therapy for older adults with dementia. In: Proceedings of the IEEE international conference on rehabilitation robotics, ICORR’13, Seattle, WA, June 24–26, 2013
38. Wada K, Takasawa Y, Shibata T (2013) Robot therapy at facilities for the elderly in Kanagawa prefecture – a report on the experimental result of the first week. In: Proceedings of the 22nd IEEE international symposium on robot and human interactive communication, RO-MAN’13, Gyeongju, South Korea, August 26–29, 2013, pp 757–761
39. Chang W-L, Šabanović S, Huber L (2013) Use of seal-like robot PARO in sensory group therapy for older adults with dementia. In: Proceedings of the 8th ACM/IEEE international conference on human-robot interaction, HRI’13, Tokyo, Japan, March 3–6, 2013, pp 101–102
40. Moyle W, Jones C, Cooke M, O’Dwyer S, Sung B, Drummond S (2013) Social robots helping people with dementia: assessing efficacy of social robots in the nursing home environment. In: Proceedings of the 6th international conference on human system interaction, HSI’13, Gdansk, Poland, June 6–8, 2013, pp 608–613
41. Wada K, Kouzuki Y, Inoue K (2012) Field test of manual for robot therapy using seal robot. In: Proceedings of the 4th IEEE RAS/EMBS international conference on biomedical robotics and biomechatronics, BioRob’12, Rome, Italy, June 24–27, 2012, pp 234–239
6 Developing a Social Robot – A Case Study

João S. Sequeira
Abstract
Social robotics is currently challenging researchers to look at virtually every topic with relevance for human societies. The multidisciplinary nature of this new area is emerging at a fast pace and, at the same time, multiple challenges are also emerging. This chapter addresses the development of a social robot, including baseline work developed during the European FP7 project Monarch on a social robot for edutainment activities for inpatient children in the Pediatrics ward of an oncological hospital, and the work developed in the post-project period. The long period (Monarch started in early 2013) allowed a diversity of experiments that resulted in valuable lessons and in the blossoming of new ideas and challenges. The overview of the Monarch project is the leitmotiv for a critical view of social robotics and the identification of key challenges, in the light of current technologies and social trends and expectations.
Keywords
Social robots · Robot personality traits · Ethics in sensing · Human-robot interaction

J. S. Sequeira (*)
Instituto Superior Técnico/Institute for Systems and Robotics, Lisbon, Portugal
e-mail: [email protected]
6.1 Introduction
An “invasion” of human societies by social robots is under way. Two fronts are currently very active, namely (i) interfacing devices with embedded intelligence, possibly hosted in the cloud (e.g., Alexa, Cortana and Siri – essentially non-mobile systems), and (ii) mobile systems with onboard devices with varying degrees of embedded intelligence (e.g., NAO and Pepper, from Softbank Robotics, and Sanbot, from Sanbot Innovation). The response to this invasion (which some may even qualify as a threat) is already evolving. A genuine interest by people in these “new beings”, empowered by the attention of the media on the subject, is leading to a general reflection on societal changes and expectations and, ultimately, on the human condition. In fact, social robots represent a new breed of interaction agents that combine motion and sensing into social behavioral skills. Programming these new agents is becoming a challenge due to the intrinsic complexity of the dynamics of the networks forming these systems; hence, some intrinsic unpredictability is becoming a trademark of social robots. This, however, fuels the discussion on humans vs. robots because, though humans clearly like some unpredictability (see the discussion ahead on synthetic personality), they also fear the consequences of that unpredictability.
© Springer Nature Switzerland AG 2019 J. S. Sequeira (ed.), Robotics in Healthcare, Advances in Experimental Medicine and Biology 1170, https://doi.org/10.1007/978-3-030-24230-5_6
These fears seem to emerge from a lack of information. In recent years, Artificial Intelligence paradigms went through a major change, from logic-related paradigms to neural-based paradigms. This is raising the expectations on social (intelligent) robots significantly, as “neural” resonates with human biology, and the topic quickly mapped into ethical challenges, among others, maybe echoing fears due to the known complexity of human relations. This chapter discusses a number of challenges in Social Robotics in the light of the lessons from the European FP7 Monarch project.¹ Some of the project landmarks are discussed, together with the implications for ongoing and future work.
6.1.1 What Is a Social Robot After All?

A definition of social robot may be quite broad and involve both static and moving agents. Common dictionaries state that “social” means “to be able to create and maintain relationships with other people … needing companionship” (from the Google or Oxford dictionaries). More formally, to be social is to be able to model a mathematical relation between the outcomes of an individual and those of their (team)mates (see [41]).

An important aspect, intrinsic to both the formal and informal definitions above, is that the types of interaction a social robot aims at must be well defined a priori, as they are likely to constrain, significantly, virtually every aspect of the robot. Though strong social relationships are not strictly necessary for survival (several animal species seem to demonstrate this), it is clear that symbiotic and/or synergetic types of relationships improve both social and individual performance. Studies report a 50% boost in longevity for people engaged in solid social networking [29], and personality theories in Psychology have long highlighted the importance of, for instance, self-esteem in individuals, which is related to the degree of satisfaction with the relationships with other people [33, 63]. Nowadays, both the positive and negative effects of the boost in social networking caused by social media tools are well known.

Therefore, a social robot has the intrinsic goal of establishing some kind of relation with the surrounding people. Moreover, given the aforementioned benefits perceived by people when connecting to other people, it is natural that strong expectations about connecting to a (social) robot emerge. The type of relationships that can evolve among robots and people is a subject open for debate. Though it seems clear that a human can develop some sort of connection towards a robot (Fig. 6.1 shows two examples – note that the behavior shown is unlikely to occur between people and inanimate objects), it remains questionable whether a robot will ever develop feelings for a human (even if this is reduced to basic forms of interaction, e.g., through speech²). The social greeting gesture of light physical contact, from staff towards the robot, is often observed, and multiple hypotheses of its meaning (for humans) can be advanced, e.g., humanizing the environment and/or relieving stress.

For the purpose of this chapter, a social robot is defined through a weak set of conditions, namely (i) having motion skills adapted to the social conditions of the environment, (ii) being able to communicate (conveying information towards the environment), and (iii) being able to observe a subset of the state of the environment. This means, for example, (i) moving carefully, and possibly slowly, if people move frantically, or moving faster if people tend to use short movements (for which the robot can be confident that they will not place themselves directly ahead of it), (ii) issuing verbal and non-verbal utterances, and (iii) perceiving some model of the environment dynamics such that behaviors can be adjusted.

¹ Project Monarch – Multi-Robot Cognitive Systems Operating in Hospitals, grant FP7-ICT-2011-9-601033.
² Interacting through speech is a sophisticated endeavor. Nonetheless, seemingly coherent, unsophisticated speech can be achieved with basic techniques – recall the ELIZA program from the early days of AI.
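The three weak conditions lend themselves to simple behavioral rules. As an illustration of condition (i), the sketch below maps observed people-motion statistics to a speed limit for the robot. The function name, thresholds and values are hypothetical illustrations, not the Monarch implementation:

```python
# Hypothetical sketch of condition (i): adapt the robot's speed limit to how
# the people around it are moving. All thresholds are illustrative only.
from statistics import mean

def speed_limit(people_speeds, v_max=1.0, v_min=0.2, frantic_threshold=1.2):
    """Return a speed limit (m/s) given recent speeds (m/s) of nearby people.

    - Nobody around: the robot may move at its nominal maximum.
    - People moving frantically (high average speed): move carefully and slowly.
    - People making short, slow movements: a faster pace is acceptable.
    """
    if not people_speeds:
        return v_max
    avg = mean(people_speeds)
    if avg > frantic_threshold:  # frantic environment -> minimum speed
        return v_min
    # Scale linearly between v_min and v_max as the environment calms down
    return v_min + (v_max - v_min) * (1.0 - avg / frantic_threshold)

# Example: an empty corridor vs. a busy one
print(speed_limit([]))          # nominal maximum
print(speed_limit([1.5, 1.8]))  # frantic crowd -> minimum speed
```

In a real robot, such a rule would sit between the perception layer (condition (iii)) and the motion controller, with the people-speed estimates coming from onboard or environment sensing.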
Fig. 6.1 People interacting with a social robot (unidirectional interactions) as the robot was moving (all players were duly aware of the presence of a robot in the premises). (a) “Love tap”. View from a camera onboard the robot. (b) Verbal interaction. View from an overhead camera
Embedded in this set of weak conditions is a survival behavior. A truly social robot must be equipped with the behaviors necessary to allow it (i) to socially survive, that is, to establish and maintain relations with other social beings, and (ii) to induce the adequate (goal) perceptions in other inhabitants, i.e., to behave adequately. Avoiding the “social surviving” conditions amounts to considering application-driven social robots, e.g., museum tour guides, as in [42].
6.2 Environments

Developing a social robot involves multiple challenges. When moving experiments out of the laboratory, one of the biggest is certainly dealing with the uncertainty that results from the behavioral variety/richness of the inhabitants. Quoting [37], p. 9, “… laboratories are not real environments, especially for service robots …”. Controlling an environment to change its behavior has been used in multiple areas of human behavior, e.g., establishing a routine may help make a habit consistent among inhabitants [12]. By definition, laboratories are controlled environments, and hence experimenting in such conditions may bias any conclusions. This seems especially relevant when assessing the acceptance of social robots, and may raise ethical issues.

Among the practical problems arising when experimenting in non-laboratory environments, a highly relevant one is the fact that scheduled experiments (as would happen in a laboratory) are in general difficult to set up. In fact, the stochastic nature of real-life environments, where relevant events are not easily known in advance, constrains the scheduling of experiments. This constitutes a major challenge with effects at a functional level, namely requiring that, up to some point, the robots (i) be prepared to deal with unexpected events, and (ii) be capable of operating during long periods of time (waiting for the proper events to happen). In addition, the challenges imposed by the environment may affect the selection of hardware and software, for example regarding battery recharging, activity periods, or software upgrades (which should cause no disturbances, i.e., introduce no biases into the experiment).
6.3 Related Work
The literature on Social Robots and related issues is immense and covers a wide temporal period. Despite a bias towards laboratory experiments, the number of case studies involving real(istic) environments is growing. Research on human-robot interaction (HRI) gained momentum with the experiments on autistic children interacting with robots [13]. These are scenarios that tend to have some structure, such as that imposed by external sensing, and where the behaviors of the robots are required to be well defined, e.g., aiming at having children imitate the behaviors of the robot (see for instance [5]). Technical limitations can easily hinder behavior design and limit the potential of robots, and have been pointed out as a limitation in the use of robots for autism-related therapies [61]. The uncanny valley paradigm [48] implicitly links interaction skills with the complexity of the mechanical design. In fact, the number of degrees-of-freedom (dof) can be identified with the capability to make microexpressions, which are known to be relevant in emotion generation (though this commonly refers to facial microexpressions³ and the seven basic emotions [19, 28], the idea can be extended to the full body; for instance, feet can also be revealing of sentiments and intentions [50]). Table 6.1 shows the number of mechanical degrees-of-freedom (not only those related to facial expressions) for a list of known robots aimed at social purposes. The wide range of mechanical dof listed and the acceptance of some of these robots (for which known commercial acceptance can be used as an indicator) suggest that the relation between the two is not simply proportional. In fact, wear and maintenance issues tend to increase with the number of mechanical dof, and hence a tradeoff between construction and operation complexity and acceptance must be considered. Such was the case in the Monarch project, where the robot developed is of low complexity.
³ Microexpressions have a very short duration, in the range 1/15–1/25 of a second, and may involve the involuntary motion of muscles.
Table 6.1 Compilation of social interaction robots, mechanical complexity and purpose

Robot          Mechanical dof   Purpose
Tamagochi      0                (1996) Handheld digital pet
Furby          1                (1998) Small size robotic pet
Paro           5                (2004) Therapeutic
AIBO 2         22               (2017) Toy pet
QRIO           38               (2006) Humanoid R&D
ASIMO          57               (2000) Humanoid R&D
Robovie        17–22            (2001) HRI R&D
Monarch        5                (2016) HR relations R&D
Erica          19               (2015) Humanoid expressivity R&D
Geminoid-F     12               (2010) Humanoid expressivity R&D
Nao            25               (2006) Assistant
Pepper/Romeo   20               (2014) Assistant
Kismet         21               (1990) Facial expression R&D
Cozmo          9                (2016) Toy pet
Sophia         27               (2016) Entertainment
Common goals for these robots include (i) teaching simple values to children [30], (ii) entertainment using lifelikeness properties [15], and (iii) “supporting people through interactions with body movements and speech” [37], Chap. 2. From Table 6.1, the variety of mechanical complexity suggests the relevance of motor behaviors (though people may hold stereotypes about social robots, and some authors are adamant that motion and appearance matter – see [54]). Social robots have often been developed with the aim of exhibiting specific behaviors in human social environments. Friendly (hugging) behaviors have been demonstrated in [34]. Though such highly focused experiments demonstrate the capabilities of robots and the use of computational concepts to design behaviors (software design patterns), they tend to embed biasing factors. A robot perceived as a device with only a few social skills, e.g., those associated with friendliness, is likely to bias an experiment, as it may lower any expectation thresholds and hence lead to higher acceptance.
The importance of using multimodal interfaces, much as humans do, has been emphasized, for example, in [55]. The coordination of such interfaces, e.g., displaying a facial expression consistent with a verbal utterance, is known to improve the quality of interactions. Guiding behaviors, namely in museums, have been described in [32, 42, 59, 64], among others. The fact that such robots operate in a real environment has been recognized to bring additional challenges. For example, speech recognition becomes harder [37], which naturally unbalances the direction of the interactions. Also, long-run experiments in real social environments tend to require a large number of behaviors; [37] report a number of behaviors in the order of hundreds for the experiments in the Osaka Science Museum and a Tokyo Metro station. Tutoring applications for children have been reported in [8, 38], on the use of a robot to improve the teaching/learning of a second language, and in [49], using a cognitive system supported by IBM Watson®. Of special interest is the conclusion in [49] that, even though the potential is very high, at the current state of technology only a limited number of scenarios is feasible. Healthcare is of special interest where social robots are concerned. Paro, the robot seal cub [67], is likely the most notable example, as it is an FDA⁴-certified medical device. Paro is being used in multiple countries as an auxiliary to cognitive therapies, namely among the elderly. Examples in rehabilitation robotics with the NAO/Pepper™ robots have been extensively reported (see [68] for an assistive robot aimed at increasing the engagement of people in physical exercise with therapeutic objectives, [53] for a collection of games for stroke rehabilitation involving a robot, or [39] for a discussion on the relevance of trust bonds between robots and patients in rehabilitation). In this context, robots compete with video-gaming-based technology.
Other robots include the 7-dof parrot-like KiliRo, to improve social interaction skills of autistic children, [9]. A Wizard-of-Oz (WoZ) technique was used to operate the robot remotely. US Food and Drug Administration.
4
107
The motion of the robot and its communication skills were reported as being immediately acknowledged by the children. As a final remark on related works, the WoZ technique has been used in numerous experiments, namely for decoupling the controls of utterances and motion, i.e., two of the most important features/tools used for socialization, which in general must be active synchronously. Some authors have raised concerns about the use of this technique, namely because it may not yield repeatable experiments, and also because it represents a misguidance of the social environment, which is, in practical terms, deceived by the wizard (see the report in [69]). Even though the ethical concerns are debatable, WoZ can show where to drive technology. Moreover, repeatability can easily be addressed if assumed in a stochastic sense, e.g., after a sufficiently large sequence of experiments some probability distribution can be estimated and used to predict (in a stochastic sense) the outcome of the experiment.
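The stochastic notion of repeatability mentioned above can be made concrete with a short sketch: treating each WoZ trial as a binary outcome, the outcome probability is estimated from repeated trials, together with a confidence interval. The Wilson score interval below is an illustrative choice; the chapter does not prescribe any particular estimator.

```python
import math

def wilson_interval(successes, trials, z=1.96):
    """95% Wilson score interval for the probability of a binary
    experiment outcome estimated from repeated trials."""
    if trials == 0:
        return (0.0, 1.0)
    p = successes / trials
    denom = 1 + z * z / trials
    centre = (p + z * z / (2 * trials)) / denom
    half = z * math.sqrt(p * (1 - p) / trials + z * z / (4 * trials * trials)) / denom
    return (centre - half, centre + half)
```

With, say, 8 "successful" interactions in 10 trials, the interval is roughly (0.49, 0.94), showing how many trials are needed before the estimated distribution becomes predictive.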
6.4 Design of a Social Robot
From a classical robotics perspective, relevant factors in the design of a social robot include perception, reasoning, and action. Perception and action are directly related to human senses and action skills such as locomotion and speech. Each of the interfaces involved accepts/generates information according to some quality model. Usability is a concept used in multiple areas of engineering and product design to qualify an interface (and has been defined in ISO 9241-11). Robots are interfaces to humans, and hence usability principles provide sound support for the design of proper human-robot interaction. [71] refer to five main usability principles to be accounted for, at design time, when implementing interfaces: Learnability, i.e., how easy is it for users to accomplish basic tasks the first time they encounter the design; Efficiency, i.e., once users have learned the design, how quickly can they perform tasks; Memorability, i.e., when users return to the design after a period of not using it, how easily can they reestablish proficiency; Errors, i.e., how many errors do users make, how severe are these errors, and how easily can they recover from them; Satisfaction, i.e., how pleasant is it to use the design. Quality interaction has been studied in areas ranging from engineering to product design, and hence human factors, e.g., memory and emotions (see for instance [14]), are well understood. Principles for good human-robot interaction have also been described in [14, 26]. These include (i) switching interfaces, (ii) giving clues to guide the interaction, (iii) using semantics, (iv) manipulating the environment (instead of the robot), (v) manipulating the relationship between robot and environment, even if the result is not realistic, (vi) allowing people to manipulate the information presented by the robot, and (vii) making all the information quickly perceivable by interacting people. Moreover, users should be involved in the process of designing the interfaces, and human dynamics should be duly accounted for, e.g., the sensory and memory dynamics. In the domain of Information Systems, Technology Acceptance Models (TAM) and similar frameworks, e.g., the Unified Theory of Acceptance and Use of Technology Models and the Theory of Planned Behavior, [7, 43], have also identified perceptions of usefulness and ease of use as key factors for acceptance. Such conclusions can be extended to Social Robotics.
6.4.1 Basic Functionalities
At the core of a social robot there are systems related to motion, perception, and interaction. The aesthetics of the robot provides people with their first perception of the social robot. The Monarch robot (or mbot) aesthetics is dominated by its outer fiberglass shell, shaped to be consistent with children's stereotypes of a robot obtained from a survey study (see [22]); overall, it allows smooth physical interaction with the children. The overall volume and mass distribution (around 40 kg) allow it to be pushed by children during physical interactions (see Fig. 6.2). Figure 6.3 shows the general aspect of the mbot. The robot has two 1-dof arms, a 1-dof neck, and is built over a holonomic platform with
J. S. Sequeira
Fig. 6.2 Children pushing the robot while in operation
four Mecanum wheels. Maximum velocity is 2.5 m/s, compatible with a human walking at a fast pace, though fast movements are usually avoided (as they tend to raise safety concerns in bystanders). Laser range finders at the bottom of the robot are used for navigation purposes, and a touchscreen at the chest provides output and input interfacing. The head encloses LED-based eyes, mouth, and cheeks, enabling basic facial expressions. In addition, the head houses an RFID reader, two RGB-D cameras, a Stargazer device to track special marks on the ceiling, and a simple microphone-speaker system to capture ambient sound and issue interaction utterances. Figure 6.4 shows the relative placement of these devices inside the head of the robot. The facial expression interfaces in the head are capable of displaying multiple elements present in several basic (Ekman's) emotions, namely surprise, sadness, and happiness (fear, disgust, and anger being less relevant in the Pediatrics context, where the robot must aim at cheering up the children – and everyone in general). The navigation system, including a localization system using the laser range finders mounted at the front and back of the robot, has been shown
Fig. 6.3 The mbot robot developed within the EU FP7 Monarch project. (a) Physical dimensions compatible with young children (from [3]). (b) Front view; the head interfaces, chest touchscreen and front laser range finder are visible (photo Exame Informática Magazine, 2016)
Fig. 6.4 Location of the main head interfaces for interaction (from [3])
to (i) maintain a credible localization during extended periods of time, and (ii) generate trajectories that are fully compatible with the intrinsic proxemics of a hospital ward (see Fig. 6.5; the walls of a corridor are clearly visible, with the trajectory of the robot flowing mainly along the middle line; most of the dots in the middle of the corridor correspond to people passing in front of the robot). The obstacle avoidance skills tend to make the robot pass by people and objects at socially acceptable distances. Occasionally, the robot will lightly touch an object while avoiding it, namely with its side. However, people tend to easily accept light physical contact with the robot (and even actively seek it – see ahead). Although the original intent was to operate the robot in a networked system, with multiple fixed cameras acquiring images for people-activity recognition, current privacy regulations prevent the use of any RGB cameras. RGB-D cameras are allowed. However, their use easily conveys a perception that the RGB device is also being used and, as a consequence, triggers privacy concerns (eventually unfounded; RGB cameras were used only for the duration of the Monarch project and are currently not in use). This represents a significant challenge in sensing. Figure 6.6 shows two depth images commonly obtained at the hospital. The left image shows a child bed, an example of an object that is difficult to detect using only information from the laser range finders and/or the ultrasound sensors.
Fig. 6.5 Sample of trajectories produced by the navigation system (laser range measurements shown as red dots, position of the robot shown as blue + marks)
Fig. 6.6 Depth images from the Kinect onboard the robot (black dots correspond to invalid points)
In fact, the distinctive features of these beds are the vertical bars, which are always above the vertical detection limit of the lasers at the base of the robot. This type of object can be detected by convolving a region of interest in the depth image with a suitable mask, e.g., one defining vertical stripes, and looking for multiple maxima above some predefined threshold (see Fig. 6.7). To reduce the computational overhead of the convolution, the size of the region of interest needs to be carefully selected.
Fig. 6.7 Child bed detection using a convolution filter (in the left image, which includes a bed, the number of peaks above a threshold is clearly larger than in the right image, which contains no bed)
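The stripe-convolution idea can be sketched as follows. This is a simplified one-dimensional variant that collapses the region of interest into a column profile of "near" pixels before convolving; the function name, the depth threshold, the stripe width, and the peak threshold are illustrative assumptions, not the actual mbot implementation.

```python
import numpy as np

def count_bar_peaks(depth, roi, near=1.5, stripe_width=5, threshold=0.5):
    """Count column-response peaks suggesting vertical bars (e.g., the
    rails of a child bed) in a region of interest of a depth image.

    The ROI is thresholded into a "near" mask (0 marks invalid points),
    the mask is collapsed into a column profile, and the profile is
    convolved with a stripe-shaped averaging mask; local maxima above
    `threshold` (after normalization) are counted.
    """
    r0, r1, c0, c1 = roi
    window = depth[r0:r1, c0:c1]
    near_mask = (window > 0) & (window < near)
    profile = near_mask.sum(axis=0).astype(float)
    response = np.convolve(profile, np.ones(stripe_width) / stripe_width, mode="same")
    top = response.max()
    if top == 0:
        return 0
    response /= top
    peaks = 0
    for i in range(1, len(response) - 1):
        if response[i] >= threshold and response[i] > response[i - 1] and response[i] >= response[i + 1]:
            peaks += 1
    return peaks
```

A region yielding several peaks (left image of Fig. 6.7) is then classified as containing bed-like vertical bars, whereas a bar-free region yields none.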
Moreover, as the right image in Fig. 6.6 shows, invalid points can represent a significant portion of the image. These are examples of sensory situations that may result in low precision and recall performance and require specific behavior strategies, e.g., a sensor that does not give information consistent with the expectations about the environment may lead the robot to move away from the region where that information is being collected.
6.4.2 People Detection from RFID
The detection of people (or any other object of interest) can be made reliably using the RFID technique. Disposable RFID tags are made available to people, with each tag encoding basic information, e.g., a name by which the person wants to be known (not necessarily related to the person's real name). The method developed uses the anisotropy of the RFID antenna and is illustrated in Fig. 6.8. It returns a rough estimate of the angular position of the RFID tag, i.e., of the person carrying the tag, relative to the robot. As normal social relations involving spatial proximity are not very demanding in terms of angular position accuracy, this becomes an interesting method. Environment conditions can significantly modify the tag detection region around the reader and affect the performance. This, however, tends not to be an issue in social human-robot interaction, as it is commonly socially accepted to interact without being exactly face-to-face.
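The angular estimation principle of Fig. 6.8 can be sketched as follows: the detection probability estimated from a window of binary read outcomes is matched against the antenna's anisotropy profile, and the bearing is taken as the centre of the angular region whose profile probability is close to the estimate. The function name, the 0.1 tolerance, and the averaging rule are illustrative assumptions.

```python
import numpy as np

def estimate_tag_bearing(detections, profile_angles, profile_probs):
    """Rough bearing estimate (degrees) of an RFID tag from a window of
    binary read outcomes, using the anisotropic detection-probability
    profile of the antenna measured offline."""
    # Empirical detection probability over the sliding window of reads.
    p_hat = np.mean(detections)
    # Angles whose profile probability is close to the estimate form the
    # candidate region; its centre is returned as the bearing.
    close = np.abs(np.asarray(profile_probs, dtype=float) - p_hat) < 0.1
    if not close.any():
        return None
    return float(np.mean(np.asarray(profile_angles, dtype=float)[close]))
```

As in the chapter, the result is a region rather than a point estimate, which is adequate given the low angular accuracy demanded by proximal social interaction.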
6.4.3 People Detection from Laser Range Finder
RFID-based people detection has the drawback of requiring a priori interaction to hand over the tag and agree on the information encoded in it. Moreover, estimating the detection probability from the readings may require time that is not compatible with some social interactions. In addition, the constraints on the use of RGB imaging techniques due to privacy regulations naturally lead to the development of alternative strategies. The RFID approach of the previous section individualizes people. Anonymous people, not carrying an RFID tag, can still be detected using the range scans provided by the laser range finders primarily used for navigation. Figure 6.9 shows a front scan sample (on the left) and the corresponding image (on the right). The pair of legs is marked in red in the scan image. Including, whenever possible, sensing strategies that complement each other and/or provide redundancy, as is the case with the multiple people detection strategies, is also a key aspect when designing a social robot.
Fig. 6.8 People detection and localization from RFID (from [57]). (a) Probability of detection as a function of the angular position relative to the RFID antenna, obtained in a free open space. (b) Example of a region of points where tag detection probability is close to the probability estimated from the readings. The position of the person is computed within this region
Fig. 6.9 Leg recognition from range data
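A common approach to leg recognition from range data, consistent with Fig. 6.9, is to cluster consecutive scan returns at range discontinuities, keep leg-sized clusters, and pair clusters that lie within a person-sized gap. The sketch below follows that generic recipe; all thresholds are illustrative, not the values used on the mbot.

```python
import numpy as np

def detect_leg_pairs(ranges, angles, max_jump=0.1, leg_width=(0.05, 0.25), pair_gap=0.5):
    """Detect candidate leg pairs in a 2-D laser scan; returns the
    centre point (x, y) of each detected pair."""
    pts = np.stack([ranges * np.cos(angles), ranges * np.sin(angles)], axis=1)
    # Split the scan into clusters at large range discontinuities.
    clusters, start = [], 0
    for i in range(1, len(ranges)):
        if abs(ranges[i] - ranges[i - 1]) > max_jump:
            clusters.append(pts[start:i])
            start = i
    clusters.append(pts[start:])
    # Keep clusters whose spatial extent matches a leg.
    legs = []
    for c in clusters:
        if len(c) < 2:
            continue
        extent = np.linalg.norm(c[-1] - c[0])
        if leg_width[0] <= extent <= leg_width[1]:
            legs.append(c.mean(axis=0))
    # Pair leg candidates that are within a person-sized gap.
    pairs = []
    for i in range(len(legs)):
        for j in range(i + 1, len(legs)):
            if np.linalg.norm(legs[i] - legs[j]) <= pair_gap:
                pairs.append((legs[i] + legs[j]) / 2)
    return pairs
```

Such a detector complements the RFID method: it requires no prior interaction, at the cost of anonymity and occasional false positives from leg-sized furniture.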
6.4.4 Verbal and Non-verbal Utterances
Sound is undoubtedly an important interface. Both verbal and non-verbal utterances can be used to deliver messages that, besides the explicit sound information, can encode hidden meanings, e.g., using amplitude and/or pitch to modulate the expressivity.
The mbot robot currently uses a combination of an off-the-shelf text-to-speech (TTS) engine (the ReadSpeaker™ commercial TTS software, configured with a female voice at a slightly increased pitch to resemble a childish voice) and pre-recorded sounds and messages. The first is used to speak generic information, e.g., news grabbed from the web, for which voice expressivity may not be relevant, whereas the second is used for childish sounds and verbal expressions with increased emotional expressivity, e.g., to incentivize the children to do some action or to behave in a certain way. The robot was initially programmed to use only pre-recorded utterances, this being the only system for a period of over 3 years. The combination with the synthetic TTS has not led to any noticeable changes in the acceptance of the robot. Among the verbal utterances issued by the robot are the Portuguese version of the "Hello person" utterance, issued when the front laser detects a person, and a personalized greeting issued when an RFID tag is detected (using the name encoded in the tag). Web-grabbed daily news are issued once per minute on average (if internet access latency allows it).
6.4.5 In Search of a Synthetic Personality
The global, macroscopic behavior of a social robot is influenced by the performance of its sensors and actuators. In general, such devices have dynamic characteristics that make their performance vary according to the conditions of the environment. Those variations may affect the perceptions developed by the people interacting with the robot. The collection of those perceptions amounts to (or is perceived as) the personality of the robot. Humans are very good at hiding their own perceptual limitations (up to a point) using their personalities. By carefully selecting which behaviors to use in a given situation, a person can avoid, for example, requiring vision information, relying more on sound information instead. Though this sort of compensation strategy is often naturally developed from physical limitations (as in the case of visual blindness), it can also be artificially generated to take advantage of available sensors and computational resources. Personality seems to result from the interactions among multiple components (in the robot's case, those forming the robot and its environment). This idea arises, for instance, when a systems engineering perspective is used, in behaviorist theories, but also, quite naturally, from the various examples of control architectures. In social robots, a similar systems engineering perspective could, in principle, be used, though the interactions between components often cannot be easily associated with robot behaviors (the complexity of the interactions scales up very easily). Therefore, alternative models of human personality may lead to novel paradigms. Different psychologists define personality differently, giving rise to different models. For example, the definition in [10] states "That which permits a prediction of what a person will do in a given situation", and led to the 16PF model [11], which distinguishes 16 personality factors: warmth, intellect, emotional stability, aggressiveness, liveliness, dutifulness, social assertiveness, sensitivity, paranoia, abstractness, introversion, anxiety, open-mindedness, independence, perfectionism, and tension. Each of these factors spans between low and high scores. [20] defines three "higher-order dimensions", or traits, namely "extraversion/introversion", "neuroticism", and "psychoticism". Each of these traits can be refined according to multiple variations (see for instance [6]). [2] defines personality as "The dynamic organization within the individual of those psychophysical systems that determine his characteristic behavior and thought". In [47], personality is defined as "The distinctive patterns of behavior (including thoughts as well as 'affects,' that is, feelings, and emotions and actions) that characterize each individual enduringly". In [24], "Personality refers to individuals' characteristic patterns of thought, emotion, and behavior, together with the psychological mechanisms – hidden or not – behind those patterns". As for [21], "Although no single definition is acceptable to all personality theorists, we can say that personality is a pattern of relatively permanent traits and unique characteristics that give both consistency and individuality to a person's behavior".
A common trait among the above authors is the concept of "behavior" and, in some cases implicitly, the idea of a model/pattern that allows for some sort of behavioral prediction. Despite the differences in semantics between the different authors, there seems to be an idea of permanency of characteristics used to define personality, which can be extrapolated to social robots, i.e., to synthetic personalities. The "big five" model (see for instance [25, 46, 66, 70]) is considered one of the most suitable models for computational implementation, [51], even though it is not the most complete. The five traits were initially found as a simplification of a 60-adjective list used to describe personality, [65]. Later on, [52, 66] also concluded that five traits can accommodate the fundamental features of personality. Trait models are especially interesting for computational implementation, as they list features (often described by adjectives) that can be computationally detected/recognized/generated. The "big five" model uses "extraversion", "agreeableness", "conscientiousness", "emotional stability", and "openness to experience". Human personality models have also been studied accounting for the social environment a person belongs to. For instance, [4] estimates "big five" traits from observed social interaction behaviors, from which a collection of features representing each trait is estimated. [72] use a graph model of the social network structure, including a learning stage, to estimate the "big five" traits. In both cases it was found that social interaction influences (human) personality. The translation of these concepts to a computational framework must thus be linked to (i) robot behaviors and (ii) the displaying of behavioral patterns (a behavioral pattern can be defined as a model, deterministic or stochastic, that defines the timing consistency – consistency meaning here equivalence or similarity – in behavior activation). [27] refers to variability and stability as two characteristics that personality-oriented behaviors must have, occurring in shorter and longer time scales, respectively. Besides the behavior activation timings, a behavioral model can also change configuration parameters.
The trigger mechanism can use perception information, obtained through sensors, e.g., the RFID system. Without losing generality, behaviors are restricted to actions that affect the environment, i.e., physical movement of the robot body, neck, and arms, making verbal and non-verbal sounds, and changing the facial expression. This excludes behaviors involved in intellectual reasoning, which for the purposes of the mbot social robot is totally acceptable. Despite the variability in the above models, among the definitions of personality trait, behavior, and behavioral pattern, a synthetic personality framework can be supported on finite state automata (FSA), or similar, to supervise the exchange of information between the different components of a social robot. Encoding the sense-think-act pipeline in FSA can be done empirically, by using learning strategies to keep only relevant states (see for instance [40]), or in a hybrid form. From a practical perspective, any initial trial must already encode a priori knowledge of any relevant social norms. Moreover, FSA concepts such as concurrency and hierarchy have easy counterparts in social robotics implementations, namely in the ROS (Robot Operating System, a common, public-domain middleware used in the mbot) framework and its SMACH package. The personality of the mbot is constructed around a backbone behavior that simply makes the robot wander around for a limited period of time, on a daily basis. This backbone represents a personality trait which can be identified with the "liveliness" trait in Cattell's 16PF model: the robot moves (apparently) aimlessly, and without interaction concerns. In the "big five" model this can be identified with some degree of introspection. The motion of the mobile base plays an important role in the definition of the personality of the robot. A low extraversion trait (akin to a low liveliness score), with movement-only behavior, may yield a perception of a zombie personality. Additional verbal and non-verbal sounds are used to convey an increased perception of liveliness and also increase the extraversion trait. Furthermore, this is consistent with the variability property that, some authors argue, behaviors are required to have (see [27]).
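The FSA-based supervision can be sketched with a minimal state machine in the spirit of SMACH. The states, events, and transitions below are illustrative, loosely following the liveliness/charging behaviors described in the text, and are not the actual mbot code.

```python
class State:
    """A named behavior state with an event -> next-state map."""
    def __init__(self, name, transitions):
        self.name = name
        self.transitions = transitions

class BehaviorFSA:
    """Minimal finite-state supervisor; unhandled events keep the state."""
    def __init__(self):
        self.states = {}
        self.current = None

    def add(self, state, initial=False):
        self.states[state.name] = state
        if initial:
            self.current = state.name

    def step(self, event):
        self.current = self.states[self.current].transitions.get(event, self.current)
        return self.current

# Wire up a liveliness-like backbone: wander, greet on person detection,
# recharge on low battery, and ask for human help if docking times out.
fsa = BehaviorFSA()
fsa.add(State("wander", {"person_detected": "greet", "battery_low": "charge"}), initial=True)
fsa.add(State("greet", {"done": "wander", "battery_low": "charge"}))
fsa.add(State("charge", {"charged": "wander", "dock_timeout": "ask_for_help"}))
fsa.add(State("ask_for_help", {"assisted": "charge"}))
```

Concurrency and hierarchy, available in SMACH containers, would then allow watchdog and logging tasks to run in parallel with this backbone.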
Figure 6.10 shows, in state diagram form, a comprehensive part of the liveliness personality trait implementation for the mbot. The diagram shows (i) a main wandering task running in parallel with (ii) interaction tasks designed to convey a perception of liveliness, (iii) an on-demand catch-and-touch game, (iv) survival-related tasks (watchdogs), e.g., battery monitoring, and (v) logging tasks that keep records of relevant variables. The liveliness is mapped into basic actions using the motion of the neck and the body of the robot, and some verbal and non-verbal utterances. A basic survival behavior, such as going to charge the batteries and staying there for an adequate amount of time, is made robust by including a request for assistance by a human in case the robot cannot reach the charging point within a pre-specified amount of time. Additional survival behaviors also requiring human assistance are not represented, e.g., detecting a mismatch in the localization and asking a human to bring the robot to the docking station for a localization reset. The presence of people wearing a duly registered RFID tag in close proximity to the robot triggers a verbal greeting using the name registered in the tag. Small adjustments are made in real time; namely, if no one is detected within a pre-specified time interval, a verbal utterance complaining that "no one is in sight" is issued. Playing games can be framed within the extraversion and openness traits. Being expansive (or in high spirits) while playing fits into extraversion. Being willing to play and challenging people to play together fits into the openness trait of the "big five" model. Figure 6.11 shows a state diagram for a simple catch-and-touch game, started on request by a staff member. Engaging in a game, even one as simple as this, requires that the main corridor has enough free space and that a minimum number of bystanders willing to play the game are present.
Such verifications are better done by a social assistant staff member, who can start the game by waving a specific RFID tag near the head of the robot (see Fig. 6.10). The behavior of a robot resulting from diagrams such as those in Figs. 6.10 and 6.11, with a reasonable number of interactions between states, may be perceived as not easily predictable. Furthermore, such diagrams intrinsically embed choices from their designers, which in a sense are a product of their personalities. Therefore, the personality of a robot tends to inherit traits from its programmers, making the personality of the robot dependent on its intrinsic social circle.
6.5 Setting Up a Simple Social Experiment
A challenging feature when setting up a social experiment in a non-lab environment is the natural unpredictability. Often, experiments can only evolve when that is socially acceptable, i.e., it is not possible to schedule an experiment in advance. This means that experiments result from observations instead of being a priori "designed", i.e., a collection of observations is organized to form an experiment. This makes the acquisition of scientifically rigorous data a lengthy process. Moreover, a social space may induce (or shape) specific dynamics in the people there, e.g., behaviors may be constrained by moral codes advising public behavioral discretion. This, however, tends to apply only to people old enough to be aware of such moral codes. Figure 6.12 shows a child interacting with an mbot in the lobby of a public building. After an initial adaptation period (of around 20 min) the child was completely at ease with the physical presence of the robot. The experiment followed a WoZ approach, with a remote operator in charge of controlling the utterances and, partially, the navigation. The preparation of the experiment included the deployment of the robot and associated infrastructure, keeping the interference in the environment to a minimum. This often translates into (i) hiding sensors, e.g., recording devices, (ii) not manipulating physical objects to comply with the skills of the robot, (iii) minimizing the presence of the development staff, and (iv) avoiding the presence of any bystanders during the setting up and closing down of the experiment, i.e., forcing a "natural" starting and ending of the experiment.
Fig. 6.10 Basic liveliness behavior (or personality trait)
Failures must be managed by the robot itself, in a graceful/smooth manner, as a human would likely do, in order to avoid lowering the expectations of any bystanders. Though not so relevant for young children, who are in general not well aware of technologies, for older people, with developed reasoning skills, poor failure management easily conveys wrong perceptions/expectations. In what concerns Monarch, the deployment period included the installation of computational and communications resources and fixed sensors and, hence, led to abnormal activity on the ward and to a transient rise in people's expectations about social robots.
6.5.1 Behaviors for Entertainment and Social Integration
Social robots have been developed mainly for edutainment (education and entertainment) and simple hosting activities (see Table 6.1), that is, experiments in which the robot
Fig. 6.11 Catch-and-touch gaming behavior – The game is started when a specific RFID tag is detected by the robot
Fig. 6.12 At ease with an mbot after an initial transient habituation period (child aged four)
does not effectively live in a social environment. In such cases, the setting up and closing down are not a concern, as the robot tends to be seen as a tool that can be switched on/off. For example, in the case of a hosting activity the robot does not go into or out of the hosting place by itself (see Fig. 6.13). Figure 6.14 shows snapshots of a variant of the Flow Free board game developed during the Monarch project. This variation of the game (well known on cellphones) used an overhead video projector, mounted on the ceiling, to project the board game on the floor, and an overhead video camera from which the positions of the robot and of the playing child were computed. The game dynamics was made very slow to allow children with reduced mobility to play the game, if necessary with the help of an adult.
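The overhead-camera position computation can be illustrated with a planar homography mapping image pixels to floor coordinates. The matrix H is assumed to have been calibrated offline from at least four point correspondences; this is a generic sketch, not the Monarch implementation.

```python
import numpy as np

def pixel_to_floor(H, u, v):
    """Map an overhead-camera pixel (u, v) to floor coordinates (x, y)
    using a 3x3 homography H (the floor is assumed planar)."""
    p = H @ np.array([u, v, 1.0])
    # Homogeneous coordinates: divide by the third component.
    return p[0] / p[2], p[1] / p[2]
```

The positions of the robot and of the playing child, detected in the image, are then expressed in the same floor frame as the projected board.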
6.5.2 Assessment, Errors and Features
Assessment experiments carried out during the period of the Monarch project have been reported in the project Deliverables (available to the general public). The metrics considered were based on
Fig. 6.13 The Pepper robot as host in stands at the EuroCIS 2017 trade fair for retail technologies in Dusseldorf, Germany (the base of the robot remained static during the hosting activities)
Fig. 6.14 Game playing during the Monarch project (from [58])
Fig. 6.15 Time between consecutive detections of people (blue marks) and touch (red marks) across a period of 2 days (sampling time 1 s). The magenta stars mark the 8 am and 8 pm instants
activation rates of micro-behaviors and required that long video sequences be analyzed a posteriori of the assessment sessions. Likert like questionnaires were considered not adequate as they would likely bias the opinion of some people (the ward population has a wide variety of cultures and belief systems and selecting a statistically good population would not be possible/practical within a limited time frame). Several linear combinations of the activation rates were then used to assess specific aspects of the behavior of the mbot. Statistical evidence of hypothesis formulated using data resulting from video segmentation was not obtained during the project, namely because (i) structured experiments would be required and the normal operation of the ward was not to be disturbed, and (ii) segmenting micro-behaviors represented a significant effort that would consume valuable project resources, easily going above the development effort. Nevertheless, the empirical evidence collected strongly suggested high acceptance, even after the aforementioned transient phase after the initial deployment. Statistical models for the time between similar micro-behaviors, obtained during some of the trials, can be found in [56]. Ongoing work is centered on the identification of statistical models for relevant variables
observed through the privacy-compatible sensors, namely, people detection using the laser range finder, and the touch sensors in the arms. Figure 6.15 shows a sequence of the time between consecutive detections of people and touch. Measurements started around 6 pm and lasted for approximately for 5 non-consecutive days (2 days in the lefthand plot and 3 days in the righthand one). The periods without measurements correspond to night time. In the righthand plot the large period in the middle region could be identified with an obstacle placed in front of the robot that blocked it for more than 1 day. People and touch detections in close proximity can be interpreted as people physically interacting with the robot. Simple people detections may represent both people standing near the robot, or passing by. In both experiments the constant trends are removed from the data. A constant trend is unlikely to be due to human interaction with the robot. Instead it may be due to errors in the sensor and/or the associated computations. Statistical models can be obtained from this data, namely by fitting to probability distributions such as Lognormal, Weibull, or Gamma (common in modeling time between events). As the personality of the robot is perceived differ-
J. S. Sequeira
120
Fig. 6.16 Evolution of the parameters of Weibull, Gamma, and Lognormal distributions over a period of 3 months
ently along time (as a result from the evolution in its skills) changes in the parameters of such distributions are to be expected. Figure 6.16 shows the evolution of such parameters (shape and scale) over a 3 month period during which no major development in the mbot occurred and, hence, the variations can be reasonably expected to be due to changes in the environment itself. Each sample corresponds to a time lapse between 1 and 9 days. All three distributions agree for most of the events occurring during the experiment period, namely by exhibiting synchronized variations in the corresponding scale/shape parameters. The parameters for the Weibull and Gamma distributions show a higher sensitivity to some environment conditions (represented by the strong peaks). The Lognormal distribution is able to detect the same conditions with all the peaks being similarly shaped. From the perspective of detecting environmental changes any of the distributions can be used. In general, it is difficult to determine the nature of the events detected through the distributions. Typical situations include (i) visits of large groups of people to the ward, (ii) placement of objects covering the laser range finders during a long time, and even (iii) software failures. In fact, inferring about the occurrence of socially relevant events from modelling data represents a challenge, possibly requiring the acquisition of long data sequences. Besides showing events intrinsic to the environment, variations in the parameters of the modeling distributions can also be identified with behavioral (personality) changes from the robot and the corresponding reactions of people. For
Fig. 6.17 Sample of trajectories of a robot caused by programming errors (robot position shown as red circle marks, initial and final positions shown as blue + marks)
example, some sort of discomfort among the people at the ward may result from abnormal behaviors. In extreme situations, misbehaviors by the robot may lead to people trying to switch it off.
Figure 6.17 shows a sample of the trajectories of the robot resulting from programming errors. The physical area corresponds to the lower part of the ward environment shown in Fig. 6.5. During normal operation hours, 8:00–19:00,10 the robot was expected to behave according to the liveliness trait in Fig. 6.10. A software problem caused the robot to leave its docking station at 20:00 and circulate in the neighborhood for 3–4 h (limited by the life of the batteries). As shown in the figure, the robot followed a well-defined pattern: a narrow corridor with middle axis defined by the points (1.2, 6) and (1.4, 0.5). The line pattern with axis between the points (1.2, 6) and (0.5, 0) corresponds to a trajectory of return to the docking station, imposed by the system once the batteries drop below a pre-specified level.
An intrinsic characteristic of programming errors is that they are a priori unwanted features and not always directly observable. To a bystander they appear not as errors but as features, with the potential to induce a wrong perception of personality or, at least, a perception of behavioral autonomy. Often they will be reported not as errors but as undesirable/abnormal/different/interesting behaviors (which does not necessarily mean that an error is involved – such was the case in this example), hence requiring complex debugging.
10 This period includes a small (variable) period in which the robot is allowed to move within the ward and a period in which touch, RFID, RGB-D, and utterances are active but the mbot is stationary at the docking station.
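As a concrete sketch of how the statistical models behind Fig. 6.16 can be obtained, the snippet below fits inter-event times to the three candidate distributions with SciPy. The data are synthetic stand-ins for the robot's people/touch detection logs, and the maximum-likelihood fit with the location fixed at zero is an assumption for illustration, not a description of the actual Monarch pipeline.

```python
# Sketch: fit inter-event times to Lognormal, Weibull, and Gamma
# distributions, as done for Fig. 6.16. The data here are synthetic
# stand-ins for the robot's detection logs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
dt = rng.gamma(shape=1.5, scale=120.0, size=500)  # seconds between detections

# Fit each candidate family with the location fixed at zero, since
# waiting times are strictly positive.
candidates = {
    "lognorm": stats.lognorm,
    "weibull_min": stats.weibull_min,
    "gamma": stats.gamma,
}
fits = {name: dist.fit(dt, floc=0) for name, dist in candidates.items()}

for name, params in fits.items():
    # Kolmogorov-Smirnov statistic as a rough goodness-of-fit check.
    ks = stats.kstest(dt, candidates[name].cdf, args=params).statistic
    print(f"{name:12s} shape={params[0]:6.3f} scale={params[-1]:8.3f} KS={ks:.3f}")
```

Repeating such fits over successive time windows and tracking the fitted shape/scale values reduces environment-change detection to watching for synchronized jumps in those parameters, as in Fig. 6.16.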
6.6
Ethics Concerns in Social Robotics
The rapid progression of social robotics has fostered both (i) the emergence of ethics-related challenges in robotics, of which the aforementioned variety in people detection strategies is just an example, and (ii) recent developments in ethics-related legislation. In the robotics literature, [35, 44] highlighted the role of ethics in healthcare. In the legislation domain, key EU documents include those related to the processing of Electronic Health Records, [18] on the definition of "consent", and [17] on behavioral advertising. In fact, the use of Informed Consent (IC) forms is standard practice to ensure that anyone interacting with the robot retains control of any relevant personal information [16] (also recommended in [31]), though it can also easily bias any assessment (some belief systems easily introduce biases). Moreover, paradoxical situations easily arise, namely when buying a modern smartphone, where no explicit IC is considered even though the associated privacy issues are well known. For a broad coverage of ethics issues in robotics see [23].
Ethics-compliant social robotics means matching the capabilities of the technology to the social norms enforced in each environment, i.e., making robots behave in a socially acceptable manner, which implies some form of similarity when interacting with humans.11 Implicitly, this means that robots must be accepted by humans on the grounds that they explicitly comply with the adequate ethics. Regarding liability, the topic handling the consequences of poor ethics, logging every relevant interaction, such that the causal links involved in liability issues can be verified, quickly grows beyond reasonable storage limits. Frameworks allowing transparent access to the data acquired/stored/processed by robots may support forensic-like queries while conveying a perception of transparency, thus bringing expectations to a level compatible with the available technologies and fighting the fear of Artificial Intelligence. During the Monarch period, data-gathering experiments were completely transparent and fully supported by the hospital Ethics board. A posteriori screening was used to select the data for public display, namely images, and Informed Consents were used whenever necessary.
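A back-of-envelope estimate makes the storage claim concrete; the event rate, per-event context size, and operating window below are assumed values chosen only for illustration.

```python
# Rough estimate of how fast an exhaustive interaction log grows when
# each event carries enough sensor context to verify causal links.
# The rate and sizes below are illustrative assumptions, not Monarch data.

EVENTS_PER_MINUTE = 10       # people detections, touches, utterances, ...
CONTEXT_BYTES = 200_000      # snapshot stored with each event (e.g. an image)
HOURS_PER_DAY = 11           # 8:00-19:00 operating window

bytes_per_day = EVENTS_PER_MINUTE * 60 * HOURS_PER_DAY * CONTEXT_BYTES
gb_per_year = bytes_per_day * 365 / 1e9
print(f"~{gb_per_year:.0f} GB per robot per year")  # ~482 GB
```

Even under these modest assumptions the log approaches half a terabyte per robot per year, which motivates transparent-access frameworks over indiscriminate retention.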
6.7
Conclusions and Future Trends
This chapter presented an overview of the challenges faced while deploying and maintaining a social robot in a non-lab environment. The robot, developed during the European FP7 Monarch project, underwent multiple experiments that established its acceptance among people. After the project end, privacy regulations significantly reduced the flexibility of the system, namely in what concerns the design of interesting social behaviors. Though people seem to appreciate that a robot is aware of them (the gesture of waving in front of the face of the robot is common, and people often express satisfaction if the robot reacts accordingly – a behavior claimed to be hardwired in human biology, [1]), they fear possible misuses of the imaging information allegedly captured by the systems embedded in the robot's face. Therefore, making people increasingly aware of social robots is a key step to overcome the limitations and barriers imposed by the legislation. Still, overcoming nonsensical fears represents a major challenge.
The capabilities of current technologies enable robots to be equipped with sensing systems providing information with a quality approaching that of the physical senses of humans. For example, current computer vision is already capable of interesting performances in facial and body recognition (as, for example, in airport technologies already in place at passport control points). Integrating this in social robots presents no significant technological problems and would expand the flexibility in behavior design significantly. Designing interesting behaviors that expand the social skills of social robots is becoming easier as the technology progresses, namely through sophisticated microphone technology and fast internet connections, and, paradoxically, more challenging, as the dependence on privacy-sensitive data must be reduced. Moreover, accounting for ethics principles may require humanizing the robots (see the discussion in [45]) and advances in Psychology that some authors argue might be impossible to apply to HRI, [36]. However, as Psychology-related concepts, namely emotions and personality, are translated to their synthetic/computational counterparts, improved social behaviors can be expected to be developed. In addition, nudging techniques, often used to stimulate fears, can also be used in the opposite way, i.e., to make people aware of the potential benefits of having social robots in their social environments.
11 Strictly, similarity with humans implies that a social robot does not necessarily have to act nice at all times, and it may even be impolite, e.g., saying to someone not to go to a specific area, or showing contempt/sadness if it is being hit repeatedly. Exerting direct authority, from robots towards humans, is frequently not well accepted. Advising/counselling is likely to produce the best results. Indirect suggestions, e.g., nudges (see for instance [62]), also play a valuable role.
This includes dealing with robots that will behave poorly, from a social perspective, and may make mistakes. The current trend in (deep) learning techniques makes them very appealing for designing synthetic personality (SP) architectures. Using a neural network as a map between raw perception data and the space of actions the robot can perform is unequivocally appealing, as it directly implements the robotics sense-think-act pipeline (see for instance [60]). Assigning each personality trait to a different network, complex SPs may be constructed by carefully managing the outputs of the different traits, i.e., interleaving, sequencing, blocking, prioritizing, or any other form of combining data from multiple sources. However, in highly constrained applications, as in the case of the Pediatrics ward, the training/learning time must be very small or null. Furthermore, any disturbances arising from the training/learning process are to be avoided, and hence alternative strategies providing acceptable baseline behaviors are required.
As social robots become ubiquitous and fears of technology are damped out, some of the challenges in this chapter, namely those related to ethics, are likely to vanish. Traditional professions will be challenged by having human professionals compared with robots. In the healthcare domain, professions not requiring a rich integration of motion and reasoning may be natural candidates for quick developments, e.g., pharmacy assistants and host attendants (both professions relying extensively on face-to-face interactions and structured reasoning skills, and not requiring extensive behavioral body motions). However, if humans are willing to improve their social behavior to integrate social robots, relevant results are likely to emerge, namely an improved humanization of societies.
Acknowledgements This work would not have been possible without the collaboration of the Pediatrics ward of the IPOLFG hospital in Lisbon, Portugal, and in particular of the Director of the ward, Dra. Filomena Pereira.
As full Monarch partners, the staff of the Pediatrics ward was always supportive without facilitating the life of the robot, hence naturally contributing to reducing bias in the assessment. The robot stayed at the hospital after the official end of the project (June 2016), and it has been a continuous scientific challenge, always backed by the people in the Pediatrics ward. This work has been partially funded by FCT project [UID/EEA/50009/2019].
6 Developing a Social Robot – A Case Study
References
1. Adolphs R (2013) The biology of fear. Curr Biol 23(2):R79–R93. January 2013, Elsevier Ltd. https://doi.org/10.1016/j.cub.2012.11.055 2. Allport GW (1961) Pattern and growth in personality: by Gordon W. Allport. Holt, Rinehart & Winston, Oxford 3. Alvito P, Marques C, Carriço P, Barbosa M, Estilita J, Antunes D, Gonçalves D (2014) Deliverable D2.2.1 – Monarch robots hardware v.2. December 2014 4. Bai S, Zhu T, Cheng L (2012) Big-five personality prediction based on user behaviors at social network sites. arXiv:1204.4809 [cs.CY] 5. Baraka K, Melo FS, Veloso M (2017) 'Autistic robots' for embodied emulation of behaviors typically seen in children with different autism severities. In: Proceedings of 2017 international conference on social robotics. Springer, Cham, pp 105–114 6. Bech P (2016) Measurement-based care in mental disorders. Springer, New York 7. Beer JM, Prakash A, Mitzner TL, Rogers WA (2011) Understanding robot acceptance, Technical report HFA-TR-1103. School of Psychology – Human Factors and Aging Laboratory. Georgia Institute of Technology, Atlanta 8. Belpaeme T, Kennedy J, Baxter P, Vogt P, Krahmer EJ, Kopp S, Bergmann K, Leseman P, Küntay AC, Göksun T, Pandey AK, Gelin R, Koudelkova P, Debliec T (2011) L2TOR – second language tutoring using social robots. In: Proceedings of 1st international workshop on educational robots at ICSR 2015, Paris, France, October 2015 9. Bharatharaj J, Huang L, Mohan RE, Al-Jumaily A, Krägeloh C (2017) Robot-assisted therapy for learning and social interaction of children with autism spectrum disorder. Robotics 6(1):4. https://doi.org/10.3390/robotics6010004 10. Cattell RB (1950) Personality: a systematic theoretical and factual study. McGraw-Hill, New York 11. Cattell RB (1965) The scientific analysis of personality. Penguin Books, Baltimore 12. Ciotti G (2014) Want to change your habits? Change your environment. Psychology Today, August 2014 [online January 2019] 13.
Dautenhahn K, Werry I (2002) A quantitative technique for analysing robot-human interactions. In: Proceedings of 2002 IEEE/RSJ international conference on Intelligent Robots and Systems. Lausanne, Switzerland, September 30–October 4, 2002 14. Dix A, Finlay J, Abowd GD, Beale R (2004) Human-computer interaction, 3rd edn. Pearson/Prentice Hall, Harlow 15. Dorin A (2004) Artifact and artifice: building artificial life for play. In: Artificial Life, vol 10, no 1, MIT Press, pp 99–112 16. EC (2017) Ethics review in FP7: guidance for applicants: informed consent. European Commission – Research Directorate-General Directorate L – Science,
Economy and Society Unit L3 – Governance and Ethics. Available at http://ec.europa.eu/research/participants/data/%20ref/fp7/89807/%20informed-consent_en.pdf [online April 2017] 17. ECDPO-171 Article 29 of Directive 95/46/EC – Data Protection Working Party, WP 171 18. ECDPO-187 Article 29 of Directive 95/46/EC – Data Protection Working Party, WP 187 19. Ekman P (1985) Telling lies: clues to deceit in the marketplace, politics, and marriage. W.W. Norton & Co., New York 20. Eysenck HJ (1952) The scientific study of personality. Routledge & Kegan Paul, London 21. Feist J, Feist GJ (2009) Theories of personality, 7th edn. McGraw-Hill Primis, Boston, MA 22. Ferreira MI, Sequeira JS (2016) Designing a robotic interface for children: the Monarch robot example. In: Proceedings of 19th international conference on Climbing and Walking Robots and Support Technologies for Mobile Machines, CLAWAR'16. London, September 12–14, 2016 23. Ferreira MIA, Sequeira JS, Tokhi MO, Kadar EE, Virk GS (eds) (2017) A world with robots. Selected papers from the international conference on robot ethics 2015. Springer series in intelligent systems, control and automation: science and engineering, vol 84 24. Funder DC (2001) Personality. Annu Rev Psychol 52:197–221 25. Goldberg LR (1993) The structure of phenotypic personality traits. Am Psychol 28:26–34 26. Goodrich MA, Olsen DR Jr. (2003) Seven principles of efficient human robot interaction. In: Proceedings of IEEE international conference on systems, man, and cybernetics, Washington, DC, October 5–8, 2003, pp. 3943–3948 27. Gundogdu D, Finnerty AN, Staiano J, Teso S, Passerini A, Pianesi F, Lepri B (2017) Investigating the association between social interactions and personality states dynamics. R Soc Open Sci 4:170194 28. Haggard EA, Isaacs KS (1966) Micromomentary facial expressions. In: Gottschalk LA, Auerback AH (eds) Methods of research in psychology. Appleton Century Crofts, New York 29.
Harmon K (2010) Social ties boost survival by 50 percent. Scientific American, July 2010 [online January 2019] 30. Hildmann H, Uhlemann A, Livingstone D (2008) A mobile phone based virtual pet to teach social norms and behaviour to children. Second IEEE International Conference on Digital Games and Intelligent Toys Based Education, 2008 31. IEEE (2016) Ethically aligned design. Version 1 – public discussion. The IEEE global initiative, 2016 32. Jensen B, Froidevaux G, Greppin X, Lorotte A, Mayor L, Meisser M, Ramel G, Siegwart R (2002) Visitor flow management using human-robot interaction at Expo.02. Workshop on robots in exhibitions. In: Proceedings of IEEE/RSJ international conference on Intelligent Robots and Systems, IROS’02. Lausanne Switzerland, September 30–October 4, 2002
33. Jhangiani R, Tarry H (2014) Principles of social psychology – 1st international edition. Adapted from principles of social psychology from Charles Stangor. B.C. Open Textbook Project, 2014 [online January 2019] 34. Kahn P Jr., Freier N, Kanda T, Ishiguro H, Ruckert J, Severson R, Kane S (2008) Design patterns for sociality in human-robot interaction. In: Proceedings of 3rd ACM/IEEE international conference on Human-Robot Interaction, HRI'08. Amsterdam, Netherlands, March 12–15, 2008 35. Kahn PH Jr., Kanda T, Ishiguro H, Gill BT, Ruckert JH, Shen S, Gary HE, Reichert AL, Freier NG, Severson RL (2012) Do people hold a humanoid robot morally accountable for the harm it causes? In: Proceedings of 7th ACM/IEEE international conference on Human-Robot Interaction, HRI'12. Boston, MA, March 5–8, 2012 36. Kahn PH, Ishiguro H, Friedman B, Kanda T (2006) What is a human? – towards psychological benchmarks in the field of human-robot interaction. In: Proceedings of 15th IEEE international symposium on Robot and Human Interactive Communication, RO-MAN'06. Hatfield UK, September 6–8, 2006 37. Kanda T, Ishiguro H (2013) Human robot interaction in social robotics. CRC Press, Boca Raton 38. Kanda T, Hirano T, Eaton D, Ishiguro H (2004) Interactive robots as social partners and peer tutors for children: a field trial. Hum Comput Interact 19:61–84 39. Kellmeyer P, Mueller O, Feingold-Polak R, Levy-Tzedek S (2018) Social robots in rehabilitation: a question of trust. Sci Robot 3(21):eaat1587. https://doi.org/10.1126/scirobotics.aat1587 40. Kerr W, Tran A, Cohen P (2011) Activity recognition with finite state machines. In: Proceedings of 22nd International Joint Conference on Artificial Intelligence, IJCAI'11. Barcelona, Spain, 16–22 July 2011 41. Kline B, Tamer E (2011) Some interpretation of the linear-in-means model of social interactions. Online 2018, https://scholar.harvard.edu/files/tamer/files/kt2june16lim_0.pdf 42.
Kuno Y, Sekiguchi H, Tsubota T, Moriyama S, Yamazaki K, Yamazaki A (2006) Museum guide robot with communicative head motion. In: Proceedings of 15th IEEE international symposium on Robot and Human Interactive Communication, RO-MAN’06. Hatfield UK, September 6–8, 2006 43. Lai PC (2017) The literature review of technology adoption models and theories for the novelty technology. J Inf Syst Technol Manag 14(1):21–38 44. Lee S, Lau IY (2011) Hitting a robot vs. hitting a human: is it the same? In: Proceedings of 6th ACM/ IEEE international conference on Human-Robot Interaction, HRI’11. Lausanne, Switzerland, March 6–9, 2011 45. MacDorman KF, Cowley SJ (2006) Long-term relationships as a benchmark for robot personhood.
In: Proceedings of 15th international symposium on Robot and Human Interactive Communication, RO-MAN'06. Hatfield, UK, September 6–8, 2006 46. McCrae RR, Costa PT Jr (2013) Introduction to empirical and theoretical status of the five-factor model of personality traits. In: Widiger T, Costa P (eds) Personality disorders and the five-factor model of personality, 3rd edn. American Psychological Association, Washington, DC, pp 15–27 47. Mischel W (1999) Introduction to personality. Harcourt Brace College Publishers, Fort Worth 48. Mori M (1970) The uncanny valley. Translated by Karl F. MacDorman and Norri Kageki. IEEE Robotics & Automation Magazine, June 2012 49. Muller S, Bergande B, Brune P (2018) Robot tutoring: on the feasibility of using cognitive systems as tutors in introductory programming education – a teaching experiment. In: Proceedings ECSEE'18, June 14–15, 2018, Seeon/Bavaria, Germany 50. Navarro J (2008) What every body is saying. Harper Collins, New York 51. Neuman Y (2016) Computational personality analysis: introduction, practical applications and novel directions. Springer, Cham 52. Norman WT (1963) Toward an adequate taxonomy of personality attributes: replicated factor structure in peer nomination personality ratings. J Abnorm Soc Psychol 66:574–583 53. Polak RF, Bistritsky A, Gozlan Y, Levy-Tzedek S (2019) Humanoid robotic system for post-stroke upper-limb rehabilitation: the need for personalization. In: Proceedings of 14th ACM/IEEE international conference on Human-Robot Interaction, HRI'19. Daegu, Korea, March 11–14, 2019 54. Robins B, Dautenhahn K, Boekhorst R, Billard A (2004) Robots as assistive technology – does appearance matter? In: Proceedings of IEEE international workshop on Robot and Human Interactive Communication, RO-MAN'04. Kurashiki, Japan, September 20–22, 2004. 55. Salichs M, Gorostiza J, Khamis A, Barber R, Malfaz M, Rivas R, Corrales A (2006) Multimodal human-robot interaction framework for a personal robot.
In: Proceedings of 15th IEEE international symposium on Robot and Human Interactive Communication, RO-MAN’06, Hatfield, UK, September 6–8, 2006 56. Sequeira J (2017) Evaluation of experiments in social robotics: insights from the Monarch Project. In: Proceedings of 26th IEEE international symposium on Robot and Human Interactive Communication, RO-MAN’17. Lisbon, Portugal, August 28– September 1, 2017 57. Sequeira J, Gameiro D (2017) A probabilistic approach to RFID-based localization for human-robot interaction in social robotics. Electronics 6(2):32. Special Issue on RFID Systems and Applications. https://doi.org/10.3390/electronics6020032
58. Sequeira J, Gameiro D, Ventura R, Pereira JN, Martinoli A, Wasik A, Talebpour Z, Alvito P, Gonzalez V, Castro Á, Castillo JC, Salichs MÁ, Pecora F, Saffiotti A, Tomic S, Lima P, Barbosa M, Viswanathan DG (2016) Deliverable D8.8.6 – the Monarch system at IPOL. Project Monarch deliverable, May 2016 59. Shiomi M, Kanda T, Ishiguro H, Hagita N (2006) Interactive humanoid robots for a science museum. In: Proceedings of 1st ACM SIGCHI/SIGART conference on Human-Robot Interaction, HRI'06, pp 305–312. Salt Lake City, UT, March 2–3, 2006 60. Siegel M (2003) The sense-think-act paradigm revisited. In: Proceedings of 1st international workshop on Robotics Sensing, ROSE'03. Orebro, Sweden, June 5–6, 2003 61. Srinivasan SM, Park IK, Neelly LB, Bhat AN (2015) A comparison of the effects of rhythm and robotic interventions on repetitive behaviors and affective states of children with Autism Spectrum Disorder (ASD). Res Autism Spectr Disord 18:51–63 62. Sunstein CR (2014) Nudging: a very short guide, 37 J. consumer Pol'y 583. [Online http://nrs.harvard.edu/urn-3:HUL.InstRepos:16205305, January 2018] 63. Tafarodi RW, Swann WB Jr (1995) Self-liking and self-competence as dimensions of global self-esteem: initial validation of a measure. J Pers Assess 65(2):322–342 64. Thrun S, Bennewitz M, Burgard W, Cremers AB, Dellaert F, Fox D, Hahnel D, Rosenberg C, Roy N, Schulte J, Schulz D (1999) MINERVA: a second generation mobile tour-guide robot. In: Proceedings
of IEEE International Conference on Robotics and Automation, ICRA'99. Detroit, MI, May 10–15, 1999 65. Thurstone LL (1934) The vectors of mind. Psychol Rev 41:1–32 66. Tupes EC, Christal RE (1961) Recurrent personality factors based on trait ratings, Technical report ASD-TR-61-97. Lackland Air Force Base, Personnel Laboratory, Air Force Systems Command 67. Wada K, Shibata T, Musha T, Kimura S (2008) Robot therapy for elders affected by dementia. IEEE Eng Med Biol Mag 27:53–60 68. Winkle K, Caleb-Solly P, Turton A, Bremner P (2018) Social robots for engagement in rehabilitative therapies: design implications from a study with therapists. In: Proceedings of 2018 ACM/IEEE international conference on Human-Robot Interaction, HRI'18. Chicago, IL, March 5–8, 2018 69. Riek LD (2012) Wizard of Oz studies in HRI: a systematic review and new reporting guidelines. J Human-Robot Interaction 1(1):119–136 70. Digman JM (1990) Personality structure: emergence of the five-factor model. Annu Rev Psychol 41:417–440 71. Bevan N, Macleod M (1994) Usability measurement in context. J Behaviour Information Technol 13:132–145 72. Ye Z, Yang D, Li Z (2017) Predicting personality traits of users in social networks. In: Proceedings of 18th Intelligent Data Engineering and Automated Learning, IDEAL'17, vol 10585. Springer LNCS. Guilin, China, October 30–November 1, 2017
Index
A
Ageing, 95, 100
Assistant robots, 3
Assistive devices, 19, 61, 71, 73
Augmentation devices, 38
Autonomous guidance, 39, 41, 43
B
Bibliometric investigation, 8
Big five model, 114, 115
Biofeedback, 38, 47, 48, 52, 54–56, 58, 59, 64
C
Capacitive sensors, 73, 77
Complementary therapies, 95–100
D
Degrees of freedom (DOFs), 21, 23, 42, 70, 79, 106–108
Dementia, 95–100
Developed countries, 5
E
Emotions, 18, 98, 100, 106, 108, 113, 122
Endoscopic capsule, 87
Ethics, 2, 9, 11, 100, 104, 105, 107, 121, 122
Exoskeletons, 8, 10, 20–22, 27, 29, 69–81
F
Force sensor, 40, 42, 44, 47, 51, 75, 77
G
Gait, 24–27, 29, 38, 40–45, 47–49, 51, 52, 54–56, 58–61, 63–65, 70, 73
Gastrointestinal (GI) tract, 87
Gazebo, 92
GI, see Gastrointestinal (GI) tract
Gravity compensation, 71, 73
H
Haptic feedback, 41, 90, 93
Homecare, 97, 99
Human-machine interface, 39, 40, 56
I
IMU, see Inertial measurement units (IMU)
Inductive sensor, 77
Inertial measurement units (IMU), 47, 51, 55, 61, 70, 73, 92
Informed Consent (IC), 121
Insole pressure sensors, 73
Interdisciplinary research, 2, 6, 16
L
Laser range finder (LRF), 41–43, 47, 52, 92, 108, 109, 111, 119, 120
Liveliness, 113–115, 120
Locomotion, 24, 27, 37, 38, 42–44, 47, 52–54, 56, 64, 88, 107
M
Magnetically driven, 87–93
Maneuverability, 38, 40, 44, 49, 52–54, 56, 60, 64, 88
Microexpressions, 106
Mobility, 7–11, 26, 31, 37, 39–41, 52, 53, 64, 73, 96, 117
Motor disturbances, 38
N
Neurodegenerative diseases, 96–100
P
Palliative care, 38
People detection, 111, 119, 121
Personality models, 114
Physical therapy, 3, 21
Piezoresistive sensor, 40, 75–77
Privacy concerns, 110
© Springer Nature Switzerland AG 2019 J. S. Sequeira (ed.), Robotics in Healthcare, Advances in Experimental Medicine and Biology 1170, https://doi.org/10.1007/978-3-030-24230-5
R
Rehabilitation robotics, 1–15, 21, 24, 30, 31, 37–65, 107
RFID tag, 111, 113, 115
Robotics in rehabilitation taxonomy, 7–9
Robotic walker, 39–43, 47
S
Security monitoring, 39, 42–44
Sensory impairment, 38, 39
Shared control, 23, 41, 88
Shared control guidance, 39, 41, 43
Socially Assistive Robotics (SAR), 9, 18, 96–100
Surgical robots, 3, 31
T
Teleoperation, 88
Textile interface, 73
W
Wearable systems, 76, 77
Wizard of Oz techniques, 107