Foundations of Artificial Intelligence in Healthcare and Bioscience: A User Friendly Guide for IT Professionals, Healthcare Providers, Researchers, and Clinicians 0128244771, 9780128244777


English Pages 558 [551] Year 2020


Table of contents:
Foundations of Artificial Intelligence in Healthcare and Bioscience
Copyright
Dedication
Contents
Section I Artificial Intelligence (AI): Understanding the technology
Section II Artificial Intelligence (AI): Applications in Health and Wellness
List of Illustrations
Foreword by Adam Dimitrov
Foreword by Ernst Nicolitz
Preface
Postscript
Acknowledgments
Artificial intelligence (AI): Understanding the technology
Introduction
References
1 The evolution of artificial intelligence (AI)
1.1 Human intelligence
1.2 Defining artificial intelligence (AI)
References
2 The basic computer
2.1 Layers of basic computers
2.1.1 Input layer
2.1.2 Inner (hidden) layer
2.1.3 Output layer
2.2 Basic computer language and programming
2.3 Basic computer hardware
2.4 Basic computer software
2.5 Servers, internet and world wide web (www)
2.5.1 Servers
2.5.2 Internet
2.5.3 World wide web (www)
References
3 The science and technologies of artificial intelligence (AI)
3.1 The theory and science of artificial intelligence (AI)
3.2 Artificial neural network (ANN) model of artificial intelligence (AI)
3.3 AI software (algorithms)
3.3.1 Machine learning
3.3.1.1 Supervised (labeled) data
3.3.2 Neural networking and deep learning
3.3.2.1 Unsupervised (unlabeled) data
3.3.2.2 Reinforcement learning
3.4 AI hardware
3.4.1 RAM (random access memory)
3.4.2 Computer servers (file, mail, print, web, game, apps)
3.4.3 Central processing unit (CPU)
3.4.4 Graphic processing unit (GPU)
3.4.5 Accelerators [61]
3.4.6 Quantum processors using “qubits” (vs digital binary code)
3.4.7 Neuromorphic chips (“self-learning” microchips)
3.4.8 Application specific integrated circuit (ASIC)
3.4.9 Field-programmable gate array (FPGA) integrated circuit with hardware description language (HDL)
3.5 Specialized AI systems
3.5.1 Natural language processing (NLP)
3.5.2 Natural language generation (NLG)
3.5.3 Expert systems
3.5.4 “Internet of things” (IoT)
3.5.5 Cyber-physical system (CPS)
3.5.6 Big data analytics
3.5.7 Blockchain
3.5.8 Robotics
3.6 Sample AI scenarios
3.6.1 “Why is the Mona Lisa smiling?” [103]
3.6.2 The “great steak” experience [106]
References
Artificial intelligence (AI): Applications in health and wellness
Introduction
References
4 AI applications in the business and administration of health care
4.1 AI applications in government agencies (GOVs), non-governmental organizations (NGOs) and third-party health insurers
4.1.1 Primary AI applications in GOVs, NGOs, and third-party health insurers (1, 2, 3)
4.1.2 Additional AI applications to GOVs, NGOs, and third-party health insurers (4, 5, 6)
4.2 Big data analytics in health care [Text #1]
4.2.1 Primary AI literature reviews of big data analytics (1, 2, 3)
4.2.2 Additional AI literature reviews of big data analytics (4, 5, 6)
4.3 Blockchain in health care [Text #2]
4.3.1 Primary AI literature reviews of blockchain (1, 2, 3)
4.3.2 Additional AI literature reviews of blockchain (4, 5, 6)
4.4 Health information and records (electronic health record or EHR) [Text #3]
4.4.1 Primary AI literature reviews of health information and records (EHR) (1, 2, 3)
4.4.2 Additional AI literature reviews of health information and records (EHR) (4, 5, 6)
4.5 Population health [Text #4]
4.5.1 Primary AI literature reviews of population health (1, 2, 3)
4.5.2 Additional AI literature reviews of population health (4, 5, 6)
4.6 Healthcare analytics (descriptive, diagnostic, predictive, prescriptive, discovery) [78] [Text #5]
4.6.1 Descriptive analytics [Text #6] [84]
4.6.2 Diagnostic analytics [Text #7] [85]
4.6.3 Predictive analytics [Text #8] [78]
4.6.4 Prescriptive analytics [Text #9] [83]
4.6.5 Primary AI literature reviews of health analytics (1, 2, 3)
4.6.6 Additional AI literature reviews of health analytics (4, 5, 6)
4.7 Precision health (aka precision medicine or personalized medicine) [Text #10]
4.7.1 Primary AI literature reviews of precision medicine/health (1, 2, 3)
4.7.2 Additional AI literature reviews of precision medicine/health (4, 5, 6)
4.8 Preventive medicine/healthcare [Text #11]
4.8.1 Primary AI literature reviews of preventive medicine/healthcare (1, 2, 3)
4.8.2 Additional AI literature reviews of preventive medicine/healthcare (4, 5, 6)
4.9 Public health [Text #12]
4.9.1 Primary AI literature reviews of public health (1, 2, 3)
4.9.2 Additional AI literature reviews of public health (4, 5, 6)
References
5 AI applications in diagnostic technologies and services
5.1 Major diagnostic technologies [4] and their AI applications
5.1.1 Diagnostic imaging
5.1.1.1 Categories of diagnostic imaging
5.1.1.1.1 AI’s influence on conventional radiography [15]
5.1.1.1.2 Literature reviews re AI’s influence on conventional radiography
5.1.1.1.3 AI’s influence on mammography
5.1.1.1.4 Literature reviews re AI’s influence on mammography
5.1.1.1.5 AI’s influence on fluoroscopy [33]
5.1.1.1.6 Literature reviews re AI’s influence on fluoroscopy
5.1.1.1.7 AI’s influence on radiomics
5.1.1.1.8 Literature reviews re AI’s influence on radiomics
5.1.1.1.9 AI’s influence on computed tomography (CT or CAT) scans [53]
5.1.1.1.10 Literature reviews re AI’s influence on computed tomography (CT or CAT) scans
5.1.1.1.11 AI’s influence on MRI scans
5.1.1.1.12 Literature reviews re AI’s influence on MRI scans
5.1.1.1.13 AI’s influence on nuclear medicine scans [74]
5.1.1.1.14 Literature reviews re AI’s influence on nuclear medicine scans
5.1.1.1.15 AI’s influence on ultrasound (sonography) [78]
5.1.1.1.16 Literature reviews re AI’s influence on ultrasound (sonography)
5.1.1.1.17 AI’s influence on endoscopy [90]
5.1.1.1.18 Literature reviews re AI’s influence on endoscopy
5.1.1.1.19 AI’s influence on fundus imaging [97]
5.1.1.1.20 Literature reviews re AI’s influence on fundus imaging
5.1.1.1.21 AI’s influence on medical (clinical) photography
5.1.1.1.22 Literature reviews re AI’s influence on medical (clinical) photography
5.1.2 Laboratory (clinical diagnostic) testing
5.1.2.1 AI’s influence on laboratory testing
5.1.3 Genetic and genomic screening and diagnosis
5.1.3.1 The science
5.1.3.2 Cytogenetics
5.1.3.3 Genetic testing [128]
5.1.3.4 Big data analytics in genomics [130]
5.1.3.5 AI in genetic cancer screening
5.1.3.6 AI in immunogenetics (see also Immunology, Chapters 6 and 7)
5.1.3.7 Genetics, precision medicine and AI
5.1.3.8 Literature reviews re AI’s influence on genetics and genomics
5.2 Additional diagnostic technologies and their AI applications
5.2.1 Vital signs
5.2.2 Electrodiagnosis
5.2.3 Telemedicine (aka telehealth)
5.2.4 Chatbots
5.2.5 Expert systems
5.2.5.1 Literature reviews re AI’s influences on “additional diagnostic technologies”
References
6 Current AI applications in medical therapies and services
6.1 Medical care (primary, secondary, tertiary, quaternary care)
6.1.1 Big data analytics and AI in medical care
6.1.2 Health information and records (EHR) and AI in medical care
6.1.3 Research/clinical trials and AI in medical care
6.1.4 Blockchain and AI in medical care
6.1.5 Internet of Things (IoT) and AI in medical care [15]
6.1.6 Telehealth and AI in medical care [16]
6.1.7 Chatbots and AI in medical care [16]
6.1.8 Natural language processing (NLP) and AI in medical care
6.1.9 Expert systems and AI in medical care
6.1.10 Robotics and AI in medical care
6.1.11 Population health (demographics and epidemiology) and AI in medical care
6.1.12 Precision medicine/health (personalized health) and AI in medical care
6.1.13 Healthcare analytics and AI in medical care
6.1.14 Preventive health and AI in medical care
6.1.15 Public health and AI in medical care
6.1.16 Access and availability and AI in medical care
6.2 Pharmaceutical and biopharmaceutical care
6.2.1 Big data analytics and AI in pharmaceutical care
6.2.2 Health information and records (EHR) and AI in pharmaceutical care
6.2.3 Research/clinical trials and AI in pharmaceutical care
6.2.4 Blockchain and AI in pharmaceutical care
6.2.5 Internet of Things (IoT) and AI in pharmaceutical care
6.2.6 Telehealth and AI in pharmaceutical care
6.2.7 Chatbots and AI in pharmaceutical care
6.2.8 Natural language processing (NLP) and AI in pharmaceutical care
6.2.9 Expert systems and AI in pharmaceutical care
6.2.10 Robotics and AI in pharmaceutical care
6.2.11 Population health (demographics and epidemiology) and AI in pharmaceutical care
6.2.12 Precision medicine/health (personalized health) and AI in pharmaceutical care
6.2.13 Healthcare analytics and AI in pharmaceutical care
6.2.14 Preventive health and AI in pharmaceutical care
6.2.15 Public health and AI in pharmaceutical care
6.2.16 Access and availability and AI in pharmaceutical care
6.3 Hospital care
6.3.1 Big data analytics and AI in hospital care
6.3.2 Health information and records (EHR) and AI in hospital care
6.3.3 Research/clinical trials and AI in hospital care
6.3.4 Blockchain and AI in hospital care
6.3.5 Internet of Things (IoT) and AI in hospital care [15]
6.3.6 Telehealth and AI in hospital care [114]
6.3.7 Chatbots and AI in hospital care
6.3.8 Natural language processing (NLP) and AI in hospital care
6.3.9 Expert systems and AI in hospital care
6.3.10 Robotics and AI in hospital care
6.3.11 Population health (demographics and epidemiology) and AI in hospital care
6.3.12 Precision medicine/health (personalized health) and AI in hospital care
6.3.13 Healthcare analytics and AI in hospital care
6.3.14 Public health and AI in hospital care [136]
6.3.15 Access and availability and AI in hospital care
6.4 Nursing care
6.4.1 Big data analytics and AI in nursing care
6.4.2 Health information and records (EHR) and AI in nursing care
6.4.3 Research/clinical trials and AI in nursing care
6.4.4 Blockchain and AI in nursing care
6.4.5 Internet of Things (IoT) and AI in nursing care
6.4.6 Telehealth and AI in nursing care
6.4.7 Chatbots and AI in nursing care
6.4.8 Natural language processing (NLP), and AI in nursing care
6.4.9 Expert systems and AI in nursing care
6.4.10 Robotics and AI in nursing care
6.4.11 Population health (demographics and epidemiology) and AI in nursing care
6.4.12 Precision medicine/health (personalized health) and AI in nursing care
6.4.13 Healthcare analytics and AI in nursing care
6.4.14 Preventive health and AI in nursing care
6.4.15 Public health and AI in nursing care
6.4.16 Access and availability and AI in nursing care
6.5 Home health care, nursing homes and hospice care
6.5.1 Big data analytics and AI in home health, nursing homes, and hospice care
6.5.2 Health information and records (EHR) and AI in home health, nursing homes, and hospice care
6.5.3 Research/clinical trials and AI in home health, nursing homes, and hospice care
6.5.4 Blockchain and AI in home health, nursing homes, and hospice care
6.5.5 Internet of Things (IoT) and AI in home health, nursing homes, and hospice care
6.5.6 Telehealth and AI in home health, nursing homes, and hospice care
6.5.7 Chatbots and AI in home health, nursing homes, and hospice care
6.5.8 Natural language processing (NLP) and AI in home health, nursing homes, and hospice care
6.5.9 Robotics and AI in home health, nursing homes, and hospice care
6.5.10 Population health (demographics and epidemiology) and AI in home health, nursing homes, and hospice care
6.5.11 Precision medicine/health (personalized health) and AI in home health, nursing homes, and hospice care
6.5.12 Healthcare analytics and AI in home health, nursing homes, and hospice care
6.5.13 Preventive health and AI in home health, nursing homes, and hospice care
6.5.14 Public health and AI in home health, nursing homes, and hospice care
6.5.15 Access and availability and AI in home health, nursing homes, and hospice care
6.6 Concurrent medical conditions (“comorbidity,” aka “multimorbidity”)
6.6.1 Big data analytics and AI in concurrent medical conditions (“comorbidity”)
6.6.2 Health information and records (EHR) and AI in concurrent medical conditions (“comorbidity”)
6.6.3 Research/clinical trials and AI in concurrent medical conditions (“comorbidity”)
6.6.4 Blockchain and AI in concurrent medical conditions (“comorbidity”)
6.6.5 Telehealth and AI in concurrent medical conditions (“comorbidity”)
6.6.6 Chatbots and AI in concurrent medical conditions (“comorbidity”)
6.6.7 Natural language processing (NLP) and AI in concurrent medical conditions (“comorbidity”)
6.6.8 Expert systems and AI in concurrent medical conditions (“comorbidity”)
6.6.9 Robotics and AI in concurrent medical conditions (“comorbidity”)
6.6.10 Population health (demographics and epidemiology) and AI in concurrent medical conditions (“comorbidity”)
6.6.11 Precision medicine/health (personalized health) and AI in concurrent medical conditions (“comorbidity”)
6.6.12 Healthcare analytics and AI in concurrent medical conditions (“comorbidity”)
6.6.13 Preventive health and AI in concurrent medical conditions (“comorbidity”)
6.6.14 Public health and AI in concurrent medical conditions (“comorbidity”)
6.6.15 Access and availability and AI in concurrent medical conditions (“comorbidity”)
6.7 Medical/surgical robotics
6.7.1 Big data analytics and AI in medical/surgical robotics
6.7.2 Health information and records (EHR) and AI in medical/surgical robotics
6.7.3 Research/clinical trials and AI in medical/surgical robotics
6.7.4 Blockchain and AI in medical/surgical robotics
6.7.5 Internet of Things (IoT) and AI in medical/surgical robotics
6.7.6 Telehealth and AI in medical/surgical robotics
6.7.7 Chatbots and AI in medical/surgical robotics
6.7.8 Natural language processing (NLP) and AI in medical/surgical robotics
6.7.9 Expert systems and AI in medical/surgical robotics
6.7.10 Precision medicine/health (personalized health) and AI in medical/surgical robotics
6.7.11 Healthcare analytics and AI in medical/surgical robotics
6.7.12 Preventive health and AI in medical/surgical robotics
6.7.13 Public health and AI in medical/surgical robotics
6.7.14 Access and availability and AI in medical/surgical robotics
6.8 Stem cells and regenerative medicine
6.8.1 The basic bioscience of stem cells and regenerative medicine [276]
6.8.2 Big data analytics and AI in stem cells and regenerative medicine
6.8.3 Research/clinical trials and AI in stem cells and regenerative medicine
6.8.4 Blockchain and AI in stem cells and regenerative medicine
6.8.5 Internet of Things (IoT) and AI in stem cells and regenerative medicine
6.8.6 3-D bioprinting and AI in stem cells and regenerative medicine
6.8.7 Chatbots and AI in stem cells and regenerative medicine
6.8.8 Natural language processing (NLP) and AI in stem cells and regenerative medicine
6.8.9 Expert systems and AI in stem cells and regenerative medicine
6.8.10 Robotics and AI in stem cells and regenerative medicine
6.8.11 Precision medicine/health (personalized health) and AI in stem cells and regenerative medicine
6.8.12 Healthcare analytics and AI in stem cells and regenerative medicine
6.8.13 Preventive health and AI in stem cells and regenerative medicine
6.8.14 Public health and AI in stem cells and regenerative medicine
6.8.15 Access and availability and AI in stem cells and regenerative medicine
6.9 Genetics and genomics therapies
6.9.1 Big data analytics and AI in genetics and genomics
6.9.2 Health information and records (EHR) and AI in genetics and genomics therapies
6.9.3 Research/clinical trials and AI in genetics and genomics
6.9.4 Blockchain and AI in genetics and genomics
6.9.5 Internet of Things (IoT) and AI in genetics and genomics
6.9.6 Telehealth and AI in genetics and genomics
6.9.7 Chatbots and AI in genetics and genomics
6.9.8 Natural language processing (NLP) and AI in genetics and genomics
6.9.9 Expert systems and AI in genetics and genomics
6.9.10 Robotics and AI in genetics and genomics
6.9.11 Population health (demographics and epidemiology) and AI in genetics and genomics
6.9.12 Precision medicine/health (personalized health) and AI in genetics and genomics
6.9.13 Healthcare analytics (and bioinformatics) and AI in genetics and genomics
6.9.14 Preventive health and AI in genetics and genomics
6.9.15 Public health and AI in genetics and genomics
6.9.16 Access and availability and AI in genetics and genomics
References
7 AI applications in prevalent diseases and disorders
7.1 Immunology and autoimmune disease
7.1.1 Pathogenesis and etiologies of immunology and autoimmune disease
7.1.2 Clinical presentations in immunology and autoimmune disease
7.1.3 Current treatment approaches and AI applications in immunology and autoimmune disease
7.1.3.1 Stem cell transplantation
7.1.3.2 CRISPR-Cas9 (gene editing)
7.1.3.3 CAR-T cell (gene replacement)
7.1.4 Research and future AI considerations in immunology and autoimmune disease
7.2 Genetic and genomic disorders
7.2.1 Description and etiology of genetic and genomic disorders
7.2.2 Clinical presentations in genetic and genomic disorders
7.2.3 Current treatment approaches and AI applications in genetic and genomic disorders
7.2.4 Research and future AI considerations in genetic and genomic disorders
7.3 Cancers
7.3.1 Description and etiology of cancers
7.3.2 Clinical presentations in cancers
7.3.3 Current treatment approaches and AI applications in cancers
7.3.4 Research and future AI considerations in cancers
7.4 Vascular (cardiovascular and cerebrovascular) disorders
7.4.1 Description and etiology of cardio and cerebrovascular disorders
7.4.1.1 Structures of the cardiovascular systems
7.4.1.2 Structures of the cerebrovascular system
7.4.1.3 Diseases and disorders of the cardiovascular system
7.4.1.4 Diseases and disorders of the cerebrovascular system
7.4.2 Current treatment approaches and AI applications in vascular disorders
7.4.3 Research and future AI considerations in vascular care
7.4.3.1 Diagnostic and screening considerations in vascular care
7.4.3.2 Emerging AI applications in vascular treatment and prevention
7.5 Diabetes (type 1 and 2)
7.5.1 Description and etiology of diabetes (type 1 and 2)
7.5.1.1 Type 1 diabetes
7.5.1.2 Type 2 diabetes (mellitus)
7.5.2 Clinical presentations in diabetes (type 1 and 2)
7.5.2.1 Type 1 diabetes
7.5.2.2 Type 2 diabetes mellitus
7.5.3 Current treatment approaches to diabetes (type 1 and 2)
7.5.3.1 Type 1 diabetes [163]
7.5.3.2 Type 2 diabetes [164]
7.5.4 Research and future AI applications in diabetes (type 1 and 2)
7.5.4.1 Type 1 diabetes
7.5.4.2 Type 2 diabetes
7.6 Neurological and sensory disorders and diseases
7.6.1 Neuroanatomy, etiologies, and clinical considerations associated with neurological and sensory disorders
7.6.1.1 The central nervous system (CNS) neuroanatomy [177]
7.6.1.2 Central nervous system (CNS) clinical considerations (by etiology) [178]
7.6.1.3 Peripheral nervous system (PNS) neuroanatomy [179]
7.6.1.4 Peripheral nervous system (PNS) clinical considerations (by etiology) [180]
7.6.1.5 Sensory systems [181]
7.6.2 Research and AI considerations in neurological and sensory disorders
7.7 Musculoskeletal disorders (MSDs)
7.7.1 Musculoskeletal disorders (MSD) and diseases and associated AI applications
7.8 Integumentary system and exocrine glands
7.8.1 Dermatology
7.8.2 Integumentary system disorders and diseases and associated AI applications
7.9 Endocrine glands
7.9.1 Endocrine disorders and diseases and associated AI applications
7.10 Digestive and excretory systems
7.10.1 Digestive and excretory disorders and diseases and associated AI applications
7.11 Renal system and urinary system
7.11.1 Renal and urinary disorders and diseases and associated AI applications
7.12 Respiratory (pulmonary) system
7.12.1 Respiratory system diseases and disorders and associated AI applications
7.13 Reproductive systems
7.13.1 Female reproductive system [366]
7.13.2 Female reproductive cycle
7.13.2.1 Disease conditions of the female reproductive system with recent, related AI programs
7.13.3 Male reproductive system [366]
7.13.3.1 Male reproductive process
7.13.3.2 Functional disorders of the male reproductive system with recent, related AI programs
7.13.4 Disease conditions of the male reproductive system with recent AI programs
7.14 Physical injuries, wounds and disabilities
7.14.1 Fatal injury data
7.14.2 Nonfatal injury data
7.14.3 Disabilities
7.15 Infectious disease
7.16 Human development, aging, degeneration and death
7.17 Chronic disease
7.18 Mental and behavioral disorders
7.19 Nutrition and exercise (preventive care)
7.19.1 Physical exercise
7.19.2 Nutrition
References
8 SARS-CoV-2 and the COVID-19 pandemic
8.1 Background
8.1.1 Definitions
8.1.2 History of pandemics
8.1.2.1 Historical overview
8.1.2.2 Recent history
8.1.3 Incidence and prevalence of COVID-19
8.2 Pathogenesis and bioscience considerations for SARS-CoV-2
8.2.1 Mechanisms
8.2.2 Theories
8.2.3 Life cycle of SARS-CoV-2
8.2.4 Review of AI regarding the pathogenesis of SARS-CoV-2
8.3 Clinical considerations regarding SARS-CoV-2 infection
8.3.1 Clinical manifestations (signs and symptoms)
8.3.2 Diagnostic testing
8.3.2.1 Antigen testing
8.3.2.2 Molecular genetic test (PCR test)
8.3.2.3 Antibody testing
8.4 Treatment and management strategies
8.4.1 General measures
8.4.1.1 Basic preventive steps
8.4.1.2 Mitigation
8.4.1.3 Contact tracing
8.4.1.4 Modeling
8.4.1.5 Herd immunity and R naught (R0)
8.4.2 Therapeutics
8.4.2.1 Monoclonal antibodies
8.4.2.2 Convalescent plasma (serum)
8.4.2.3 Hydroxychloroquine (Plaquenil®) combined with azithromycin (Zithromax®)
8.4.2.4 Remdesivir
8.4.2.5 Dexamethasone (and corticosteroids)
8.4.2.6 RNA screening
8.4.3 Vaccine (immunization)
8.4.4 CRISPR-Cas13 and RNA screening
8.4.5 Immunoinformatics
8.4.6 Review of AI for clinical considerations for coronavirus infections
8.5 Epidemiology and public health considerations in COVID-19
8.5.1 Current epidemiologic considerations
8.5.2 Review of AI for epidemiology and public health considerations
Conclusion
References
Epilogue
References
Glossary of terminology
Glossary of abbreviations
Index

Foundations of Artificial Intelligence in Healthcare and Bioscience

Foundations of Artificial Intelligence in Healthcare and Bioscience A User Friendly Guide for IT Professionals, Healthcare Providers, Researchers, and Clinicians

Louis J. Catania
Nicolitz Eye Consultants, Jacksonville, FL, United States

Academic Press is an imprint of Elsevier
125 London Wall, London EC2Y 5AS, United Kingdom
525 B Street, Suite 1650, San Diego, CA 92101, United States
50 Hampshire Street, 5th Floor, Cambridge, MA 02139, United States
The Boulevard, Langford Lane, Kidlington, Oxford OX5 1GB, United Kingdom

Copyright © 2021 Elsevier Inc. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher’s permissions policies and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency, can be found at our website: www.elsevier.com/permissions.

This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).

Notices
Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods, professional practices, or medical treatment may become necessary. Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information, methods, compounds, or experiments described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.

To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library

Library of Congress Cataloging-in-Publication Data
A catalog record for this book is available from the Library of Congress

ISBN: 978-0-12-824477-7

For information on all Academic Press publications visit our website at https://www.elsevier.com/books-and-journals

Publisher: Mara Conner
Acquisitions Editor: Chris Katsaropoulos
Editorial Project Manager: Rafael G. Trombaco
Production Project Manager: Niranjan Bhaskaran
Cover Designer: Christian J. Bilbow
Typeset by MPS Limited, Chennai, India

Dedication

To my wife Stephanie,
Thank you for your love and support

To all the health care workers who have served us all during the COVID-19 pandemic and every day,
Thank you

To the families of all those lost to the COVID-19 pandemic,
My deepest sympathies


List of Illustrations

Figure Intro 1.1 Moore’s Law. Processing power for computers will double every two years while costs will be halved.

Figure 1-1 Turing Test. One human functions as the questioner while a second human and a computer function as hidden respondents. The questioner interrogates the respondents using a specified format and context. After a preset length of time or number of questions, the questioner is asked to decide which respondent was human and which was a computer.

Figure 2-1A Basic computer model (input layer). Input data is made up of information or data provided by an external source called an input device (Table 2-3). This external source of data (“user input”) is characterized schematically by a series of red dots representing “data points” (or “data nodes” when they produce a network of data points).

Figure 2-1B Basic computer model (inner/hidden layer). In the computer input layer (red), the hardware (keyboard and monitor) functions as the input and output devices respectively, i.e., an I/O (input/output) process, while the application programming interface (API) and central processing unit (CPU) software (blue) function as the data processing unit of the inner (hidden) layer.

Figure 2-2A Basic computer model (inner layer framework). Target code or binary (machine) code information becomes the information that will populate the inner (hidden) layer (blue) and be used by its software (OS, API, CPU, servers and software apps) for data processing.

Figure 2-2B Basic computer model (inner layer functions). The object code file(s) (red) also directs coded instructions to the API and appropriate servers and/or software apps in the inner (hidden) layer (blue).

Figure 2-3A Basic computer model (output layer). Input data points (red) and the inner (hidden) layer (blue) represent the transfer of the machine code target data to the inner (hidden) layer and output layer (green).

Figure 2-3B Basic computer model (outer layer devices). Each data point (or node) in the input layer (blue) has the potential to be executed by any (or all) software programs in the inner (hidden) layer.

Figure 3-1 Artificial intelligence schematic. The broadest classification of AI includes the subcategories of machine learning (ML) and deep learning (DL), within which artificial neural networks (ANN) and deep learning are subsets of machine learning.

Figure 3-2 The neuron. The neuron is the basic unit of the human neural network that receives and sends trillions of electrical signals throughout the brain and body. It consists of a cell body, an axon, and dendrites.

Figure 3-3 Neurotransmitter chemicals across synaptic cleft. A chemical messenger transmits signals across a synapse (synaptic cleft), such as a neuromuscular junction, from one neuron (nerve cell) to another “target” neuron.

Figure 3-4 Mathematical model of neuron. Schematic diagram of the mathematical model of a neuron, where input “weights (w)” are activated and summed (Σ) by the cell body, producing an output (similar to the computer model).

Figure 3-5 Schematic of the cortical neural network (CNN). A vast network of one hundred billion interconnecting neurons creates a complex in the human brain called the “cortical neural network.” This network has the potential of producing one hundred trillion neural connections.
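The neuron model of Figure 3-4 (inputs multiplied by weights, summed by the cell body, then passed through an activation) can be made concrete with a minimal sketch. The Python below uses illustrative weights and a sigmoid activation; both are assumptions for demonstration, not values from the book.

import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs (the Σ in Figure 3-4), then a sigmoid
    # activation standing in for the cell body's "firing" decision.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative values only: three inputs, three weights, one bias.
print(neuron([0.5, 0.2, 0.9], [0.4, -0.6, 0.3], bias=0.1))  # ~0.61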

Figure 3-6 Neural network with 3 computer layers. This model demonstrates input nodes (red) to inner (hidden) layer nodes (blue), in relationship to the cerebral cortex, and output (green).

Figure 3-7 Deep neural network. The inner (hidden) layer nodes from Fig. 3-6 are distributed throughout the brain as progressive cortical centers or layers (blue) to create the deep neural network (DNN).

Figure 3-8 Convolutional neural network (CNN). The deep neural network (DNN) is graphically represented in a linear dimensional, multilayered distribution of nodes (blue) demonstrating the hundred trillion deep neural network connections that create the “convolutional neural network” (CNN). This neural process is analogous to the deep learning algorithms in AI.

Figure 3-9 The limbic system (hippocampus, amygdala, thalamus). The subcortical limbic system (hippocampus, amygdala, and thalamus) in the human midbrain serves as a relay station between input layer data (red) and the inner (hidden) layers (blue nodes) of the deep neural network (DNN). The cerebral functions of each of these brain nuclei have direct analogous implications in the AI process.

Figure 3-10 Neural layer transmissions from limbic system to higher cortical centers. Signals from the limbic system are transmitted to the higher cortical neural centers (blue arrows, #3) for cognitive interpretation, at which point they are “cataloged” as memory in the limbic system and corresponding progressive cortical layers. Finally, the impulses are generated as output (green, #4) representing some aspect of human intelligence.

Figure 3-11 Natural language processing (NLP) and natural language generation (NLG). Once NLP unlocks the verbal context (red) and translates it into human language (blue), NLG takes the output, analyzes the text in context (blue), and produces audio or text output (green).

Figure 3-12 Expert system. The basic structure of an expert system consists of a human expert (e.g., doctor) and knowledge engineer (e.g., related expert) as input (red); a knowledge base (related database[s]), inference engine (AI algorithm), explainable AI (AI algorithm) and user interface (NLP audio or text) as inner (hidden) layers (blue); and the user (from input, i.e., the doctor) as output recipient (green).

Figure 3-13 Forward and backward chaining. In forward chaining, the inference engine follows the chain of conditions and derivations and finally deduces the outcome. In backward chaining, based on what has already happened, the inference engine tries to find conditions that could have occurred in the past for this result.

Figure 3-14 Artificial intelligent robots. Artificial intelligent robots are simply robots that are controlled by AI programs. This difference is illustrated through a Venn diagram, where the overlap between the two technologies represents the category of “artificial intelligent robots.”

Figure 3-15 Case study (example) of “Why is Mona Lisa smiling?” (input layer). Leonardo da Vinci’s masterpiece, “Mona Lisa,” can serve as the sensory visual input stimulus (red) to ask the AI computer the classic question, “Why is she smiling?”

Figure 3-16 Case study (example) of “Why is Mona Lisa smiling?” (inner layer). Labeled information is “neural networked” (blue number 3 arrows) to relevant, preexisting, higher cortical neural layers and their unlabeled data related directly and indirectly to the user’s question (“Why is she smiling?”).

Figure 3-17 Case study (example) of “Why is Mona Lisa smiling?” (processing). Nerve impulses from the subcortical and cortical levels are transmitted through associated cortical and optic radiations (green number 4 arrows) to the visual cortex (areas V1 and V5).

Figure 3-18 Case study (example) of “Why is Mona Lisa smiling?” (output). Through logical inference rules (green), the brain and an AI computer can evaluate reasonable deductions, probabilities, and conclusions to the question, “Why is she smiling?” And most likely, explainable AI (XAI) will conclude, as have art experts over the years, “Only Leonardo knows for sure.”
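The forward chaining described in Figure 3-13 (the inference engine keeps applying if-then rules to known facts until it deduces an outcome) can be sketched in a few lines of Python. The rules and facts below are hypothetical examples, not taken from the book.

rules = [
    ({"fever", "cough"}, "respiratory_infection"),
    ({"respiratory_infection", "loss_of_smell"}, "suspect_covid19"),
]

def forward_chain(facts, rules):
    # Keep firing rules whose conditions are all known until nothing
    # new can be derived -- the forward-chaining loop of Figure 3-13.
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain({"fever", "cough", "loss_of_smell"}, rules))

Running this derives "respiratory_infection" from the first rule, which in turn lets the second rule fire; backward chaining would instead start from the conclusion and search for supporting conditions.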

Figure 4-1 Sources of waste in American health care. Over $500 billion in health care spending is excess from supplier time wastefulness, waste, fraud and misuse.

Figure 4-2 Population health (independent variables, small scale). Example of a small population model where demographic and epidemiological factors (independent variables) are measured.

Figure 4-3 Population health (dependent variables, small scale). AI algorithms (regression analysis, Bayesian probabilities and inference logic) analyze the statistical results of independent variables against one another as well as against other potential dependent variables (e.g., health and wellness, risks of aging, etc.).

Figure 4-4 Population health (interventions, small scale). AI algorithms measure positive and negative correlations between dependent and independent variables, allowing appropriate professionals and caregivers the ability to introduce corrective interventions.

Figure 4-5 Population health (large scale). The population health concept can be used in large-scale assessments where AI analysis of demographic and epidemiologic independent variables produces the dependent variable outcomes and the intervention, similar to small models.

Figure 4-6 Healthcare analytics. Data mining techniques in health care analytics fall under 4 categories: (1) descriptive (i.e., exploration and discovery of information in the dataset); (2) diagnostic (i.e., why something happened); (3) predictive (i.e., prediction of upcoming events based on historical data); and (4) prescriptive (i.e., utilization of scenarios to provide decision support).

Figure 4-7 The public health system. Public health is a network of vertically integrated and interoperable systems delivering and assessing the provision of public health services.

Figure 5-1 Convolutional neural networks (CNNs). Similar to the convolutional neural networks described in Chapter 3 (Figure 3-5), AI’s deep learning CNN process is used to classify the image for diagnosis. In this example, 5 convolutional layers are followed by 3 fully connected layers, which then output a probability of the image belonging to each class. These probabilities are compared with the known class (stroke in the training example) and can be used to measure how far off the prediction was (cost function), which can then be used to update the weights of the different kernels and fully connected parameters using back-propagation. When the model training is complete and deployed on new images, the process will produce a similar output of probabilities, in which it is hoped that the true diagnosis will have the highest likelihood.

Figure 5-2 Comparison of machine learning vs. deep learning CNNs. AI systems look at specific labeled structures and also learn how to extract image features either visible or invisible to the human eye. The comparison in this figure between classic machine learning and deep learning approaches is applied to a classification task. Both approaches use an artificial neural network organized in the input layer (IL), hidden layer (HL) and output layer (OL). The deep learning approach avoids the design of dedicated feature extractors by using a deep neural network that can represent complex features as a composition of simpler ones.

Figure 5-3 Number of diagnostic clinical algorithms by technology. Diagnostic imaging currently more than doubles all other forms of AI diagnostic algorithms, due primarily to advanced (GPU) image recognition software. However, it is felt that genetic testing will grow significantly in the coming years as a principal diagnostic testing modality.

Figure 5-4 Chromosome. In the nucleus of each cell, the DNA molecule is packaged into thread-like structures called chromosomes. Within each DNA helix are “sequences” (“genetic code”) made up of four nitrogen base compounds, paired as “base pairs” (adenine paired with thymine and guanine paired with cytosine). Together, a base pair along with a sugar and phosphate molecule is called a nucleotide.

Figure 5-5 Normal human karyotype. The overall number and shape of all your chromosomes is called a karyotype.

xxviii List of Illustrations

Figure 5–6 The cellular biology of the human genome. There are a number of elements that make up what is referred to as the human genome, including the cellular biology of genetics: the cell, its nucleus, the chromosomes within the nucleus, the DNA strands within the chromosomes, and the base compounds of the genes within the chromosomes.
Figure 6–1 Comorbidities by age. Combined conditions (“comorbidities”) encompass physical as well as mental disorders in patients, with their greatest frequency of occurrence in the elderly population. This makes the issue of comorbidities a demonstrable public health issue.
Figure 6–2 Nanorobotic technology. Medical robots (micro-bots, nanorobots, nanobots) use near-microscopic mechanical particles to localize a drug or other therapy to a specific target site within the body.
Figure 6–3 Stages of human (and stem cell) development. Stem cells have the potential to develop into many different cell types during early (pluripotent) life and growth.
Figure 7–1 Chronic inflammation. Through its inflammatory mediators and cellular components that damage tissue throughout the body, especially the blood vessel (perivasculitis) walls (with infiltration and diapedesis) supporting virtually every organ system, chronic inflammatory disease is considered the progenitor or originating cause of all (emphasis on all) the major human disease categories.
Figure 7–2 Genetic modification & stem cell therapy. The patient’s own stem cells are used in a procedure known as autologous (from “one’s self”) hematopoietic stem cell transplantation.
Figure 7–3 CRISPR-Cas9. CRISPR guide RNAs target specific spots in the genome for the Cas9 enzyme (“genetic scissors”) to cut, forming a double-strand break. A machine learning algorithm predicts which types of repairs will be made at a site targeted by a specific guide RNA. Possibilities include an insertion of a single base pair, a small deletion, or a larger change known as a microhomology deletion [25].
Figure 7–4 Immunogenics (immunotherapy): chimeric antigen receptor T cells (CAR-T). CAR-T-cell therapy begins by removing a patient’s lymphocytes and transducing them with a DNA plasmid vector (a DNA molecule distinct from the cell’s DNA, used as a tool to clone, transfer, and manipulate genes) that encodes specific tumor antigens. These modified and targeted lymphocytes are then reintroduced to the patient’s body through a single infusion to attack tumor cells.
Figure 7–5 Telomeres. At the ends (“tips”) of chromosomes are stretches of DNA called telomeres, which are chains of chemical code made up of the four nucleic acid bases (guanine, adenine, thymine, and cytosine).
Figure 8–1 SARS-CoV-2 life cycle. The life cycle of the novel coronavirus (SARS-CoV-2) begins when its spike protein attaches to an ACE2 receptor on a cell membrane (1) and penetrates the cell wall, where it replicates a genomic RNA (2–4), then produces “subgenomic RNAs” (5–6) and synthesizes various spike proteins through translation (7), and new genomic RNA becomes the genome of a new virus particle (8). This combines with the genomic RNA strand, merges in the endoplasmic reticulum-Golgi apparatus into a complete virus particle within a vesicle (9), and the new viral particles are released (exocytosis) to the extracellular region (10).

Foreword by Adam Dimitrov

A primary care perspective on “Artificial Intelligence in Health Care”

We are on the brink of a transformative era in medicine. Yes, the health-care system clearly has its flaws. We are reminded daily of the high cost of health care in this country, its suboptimal clinical outcomes, and dissatisfaction on the part of patients and clinicians with their experience of care. Some would argue that we don’t even have a true health-care system in this country, but rather a health-care market which drives expensive and uncoordinated care, leaving patients too often to navigate their own care. The growing reliance on prescription medications to control chronic conditions has led to a culture of sick-care rather than wellness. American health care itself needs healing.

These shortcomings sometimes blind us to the fact that we are experiencing amazing advances in the field of medicine itself. Targeted cancer treatments such as immunotherapy continue to evolve, in some cases making cancer a chronic rather than a terminal disease. 3D printers are being developed to create tissue and organs for transplant patients. Our knowledge of pharmacokinetics is expanding, allowing us to prescribe medications for patients specific to their genetic and metabolic makeup, thereby reducing the number of side effects. Robotic surgery continues to allow physicians to help patients in ways never before possible. Medicine is advancing at an exponential pace. On the forefront of that change comes artificial intelligence, or AI.

Ironically, many doctors in this generation grew up quite familiar with AI years before their medical training. In between the long hours of studying and extracurricular activities, many an aspiring physician spent their downtime trying to defeat an AI opponent . . . sometimes for hours. I am speaking of the immersive world of video games. For the past two or three decades, young men and women have known all too well the challenges of trying to defeat a computer-generated opponent, whether it be chess against the computer or a final battle against a computerized adversary on a video game console. Every one of us “gamers” can recall the satisfaction of getting past a particularly difficult level in a game or finally winning against that monster that seemed to know our every move. The video game industry has grown to become a multibillion-dollar industry as millions of gamers try to “outsmart” their computerized opponents armed with the power of artificial intelligence.

Just as we see artificial intelligence develop in various industries, there is no doubt that AI will play an increasing role in the transformation of health care. Clinical decision support already helps to guide physicians on which drugs to prescribe for hospitalized patients with certain conditions and, perhaps more importantly, to prevent medication errors. Digital imaging tools allow primary care doctors to screen for diabetic retinopathy in their own office, a significant resource in areas where access to eye care may not be as prevalent. Many believe that the specific causes of particular conditions such as autism will be discovered not by


scientists conducting clinical studies, but rather by computer algorithms that identify an environmental or genetic cause.

It just so happens that as I write this foreword, we are in the middle of the coronavirus (COVID-19) pandemic. There is no doubt that when the pandemic dissipates, we will learn of various ways in which AI was called upon to combat this novel pathogen. Due to the lack of available testing, we have already heard how agencies are using Global Positioning Systems on users’ smartphones to predict hot spots and the spread of the disease. Computer models are surely being used to study the genetic makeup of the virus to help develop a vaccine or effective medication. Many of the predictive models that are presented to local governments and health-care systems use AI as a means of tabulating the data.

That is why this book by Dr. Lou Catania comes at such an opportune time. Dr. Catania has compiled a resource that is both introductory and comprehensive for anyone who wishes to explore the world of artificial intelligence, whether a seasoned clinician or a layperson outside of the medical field. This book touches upon all aspects of AI . . . from neural networks to population health management, from the Mona Lisa to the reduction of waste in health care. It is clear that Dr. Catania has poured as much dedication into this book as he did into thousands of patients over his many years of practice in the field of optometry. As a distinguished clinician, author, and speaker, his insights are a welcome addition to the emerging field of AI in the health-care arena.

As I type these last words this evening, I ready myself for a somewhat uncertain day in my primary care practice tomorrow. Due to the COVID-19 pandemic, we have rapidly deployed virtual visits, with more than 80% of our clinical encounters over the past month taking place via video chat. The overnight evolution of telemedicine will certainly be one of the silver linings that comes out of this pandemic. But for now, I think I will put my laptop down and see if I can finally beat the computer tonight in a game of poker.

Adam Dimitrov, MD, FAAFP
Family Medicine Physician, Baptist Health, Jacksonville, FL, United States

Foreword by Ernst Nicolitz

A medical specialist’s perspective on Artificial Intelligence in Health Care

As a medical specialist (specifically, an ophthalmic surgeon) for over 40 years, I find the rate at which artificial intelligence (AI) is evolving challenging. I’d venture a guess that most of my colleagues feel the same. But Dr. Lou Catania, an associate of mine for more than 20 years with Nicolitz Eye Consultants, our multidisciplinary practice in Jacksonville, FL, has challenged me further. He has invited me to write a foreword for the book he has been working on for over a year.

Lou has asked me to provide you, the reader, with a specialist’s perspective on AI. Undoubtedly, there are many different specialists’ perspectives on such a disruptive technology, but thanks to Lou’s efforts in writing this excellent text, I feel I have a bit of a preferential advantage. For me, the first section of the book was invaluable in providing just enough technical information about basic computing as the foundation of AI technology. Lou does go a little deep (at least for most health professionals, I would presume), but he aptly justifies it with Einstein’s famous line, “Everything should be made as simple as possible, but not simpler.” I felt that the computer information he presents does indeed prove necessary to fully appreciate AI’s enormous role in health care as covered in Section 2.

Most medical professionals who follow the progress in AI are familiar with its extraordinary ability to evaluate imagery of any kind through pattern recognition. A “graphic processing unit” studies (and learns) from among millions of patterns (images) and then uses machine learning to differentiate normal from abnormal. This is precisely what we as medical specialists (like ophthalmologists, radiologists, dermatologists, pathologists, and others) do in everyday diagnosis and even during treatment (i.e., surgical) procedures. We “learn” through training and experience and then identify abnormal conditions in our patients. What’s more, as Lou so methodically outlines throughout this book, sensitivity and specificity study results suggest equal or superior comparisons between machine (computer) AI findings and those of the respective specialists in the field.

Given this accuracy and breadth of AI, one can easily see it (as some do) as an existential threat to the human factor in medical specialty care. I don’t quite see it that way, as much as an adjunct to, and verification of, my 40 years of experience and, hopefully, my willingness to always give a second thought to my clinical decisions. Nothing wrong with a “curbside consult,” be it from a colleague or an “intelligent machine.” I don’t know of any medical professional who would ignore a suggestion coming from an analysis of a database of hundreds of millions of sources (“big data analytics”). But again, Lou manages to skillfully demonstrate the immeasurable benefits that AI brings to specialty care through a balance of “big data,” robotics, and efficient technology versus irreplaceable real-world considerations (i.e., the value of increased personal time and interactions with a patient versus strictly quantitative


analysis). Lou’s frequent references to “explainable AI (XAI),” which gives the user a rationale for AI’s conclusions and recommendations, exemplify the synergies of a partnership, if you will, between AI and the medical specialist.

What I find so valuable in this book is the way Lou has assessed AI’s enormous role in the critical issues related to administrative health care and public health, as well as the most prevalent clinical issues we face today. As an active associate in our practice, I am very aware of Lou’s keen interest in immunology and genetics. In these two areas of the text, he shows perhaps AI’s greatest contributions to current and future health and wellness. He skillfully describes AI’s influences and applications in the diagnostic aspects of these two fields (in Chapters 5 and 6) and then expands the discussion (in Chapter 7) to the vital therapeutic role immunology and genetics are playing in health care and how AI is making it all possible.

After 40-plus years as a medical/surgical specialist, I am thrilled to see a technology like AI entering my domain of practice: ophthalmology. I am certain that specialists in all fields feel the same as we reap the benefits of such a powerful and disruptive technology. I am also thrilled to see the book that Lou has produced from his extensive background, expertise, and knowledge base in computers, AI, and health care. It will help us all make the transition to a new level of health and wellness care for our patients.

Ernst Nicolitz, MD, FACS
Senior Ophthalmic Surgeon, Nicolitz Eye Consultants, Jacksonville, FL, United States

Preface

The three principal subjects of this book are artificial intelligence (AI), health care, and the related biosciences: three rather complex topics. Their interrelationship, integration, and interoperability (a word you’ll be hearing a lot more about) add to the complexity of the subject matter. Meanwhile, the audience for which the book is aimed includes multiple disciplines of health care providers, researchers, administrators, scientists, information technologists (IT), educators, students, and interested laypersons. Among each of these groups will be individuals with varying levels of expertise and experience in any or all of the topics to be discussed. And regarding the interrelationships of the three topics, few individuals in any discipline have the full breadth of the evolving and disruptive spheres of AI, health care, and the biosciences. This book hopes to reach readers through understandable explanations of the biosciences, AI, and health, and mostly, their integration and interoperability (there it is again).

Please understand that certain portions of the text in your area of expertise might be developed at an introductory level to help educate the reader with less or no experience in that area. No discussion at any level is intended to be patronizing or condescending, and I hope no one will interpret it as such. Rather, my goal in writing this book is simply to provide each group of readers with a comfortable base of relevant information in all the related areas and their current and evolving synergies that this book strives to address.

The mathematical formulae and equations associated with AI and their applications in current and evolving health care concepts (business and clinical) can be considered, at the very least, challenging to all. The author (moi) is a clinical and academic health care professional with over 50 years’ experience, and a student of AI for the past 10 years. While comfortable with the many levels of business and clinical health care, I will be the first to acknowledge unadulterated terror when confronted with Bayesian theorems, linear algebra, and multivariable calculus, the stuff of which AI algorithms are made. It also took me years to begin to understand the applications of data science and the complex computer programming used in integrating AI into health care. Thus, I will be taking author’s privilege (aka a bit of cowardice) to avoid in-depth coverage of some of these areas in favor of the more practical, less theoretical applications of AI in health care and bioscience.

I have strived to remain acutely aware that the discussions and the text of this book, albeit challenging, should be no more difficult to understand and follow than the contents of a simple traveler’s guide for all interested readers, regardless of their background. I use this analogy of a “traveler’s guide” because that’s the way I try to unfold your journey through AI and its applications in health, bioscience, and wellness. I will never discuss a subject or specific


topic wherein I haven’t first presented its fundamental concepts and elements. And as often as possible, I will refer you back to those basic explanations when necessary (with page references) on any advancing, complex topic. I have even color-coded certain concepts in the text and illustrations to make the journey a little more colorful and easier to follow. While I refer to this book as a “guide book,” I also see it as a reference book to which I hope you will return when reading or hearing other information on AI and health care that may leave you with questions.

Whereas the subject areas of the book (AI, health care, and bioscience) are each relatively complicated topics, the greater goal of this book is to present an appreciation of the increasingly intimate relationship between these sciences and technologies. To accomplish that goal, after descriptions of the basics of computers (Chapter 2) and AI technology (Chapter 3), I spend considerable time describing the business aspects of health care (indeed a “very big” business) in Chapter 4. Then I do a deep dive into AI applications in the clinical and bioscience aspects of health care (especially my passion for immunology and genetics) in Chapters 5–7. Admittedly, I do use considerable medical terminology, which is necessary for a fuller understanding of the condition(s) to be discussed. But hopefully, the glossary at the end of the book will serve as an aid when needed.

For the AI-related aspects of the book, I try to represent the profound relationship AI has with health care (clinical and business) in understandable terms. But the significant basis of all the discussions is my reliance on the most recent AI literature, which I have exhaustively researched in each area, with discussion, excerpts, and references to over 1600 AI, health and bioscience research, and technical papers. And I must admit, and apologize in advance, that some of the information provided in these literature reviews may be challenging at times, with the authors’ and researchers’ use of specific technical terms and descriptions that border on, and admittedly sometimes cross, the line of “simple.”

There are lots of books written on AI and health care, but too many of them presuppose or assume more than the reader’s experience provides. This book aims to provide an understandable picture of AI in health care and the related biosciences, in its current state at the date of publication, and to do so in a manner that accommodates health care and IT professionals, technologists, technicians, and even interested laypersons through an organized, methodical, and simple approach. But let’s remember the words of Albert Einstein: “Everything should be made as simple as possible, but not simpler.”

Postscript

The manuscript for this book took over a year to organize and construct. During the final weeks of those efforts (February 2020), the world was attacked by the novel coronavirus (SARS-CoV-2). Virtually overnight, new needs and applications for AI were introduced into health care. The clinical, scientific, public health, and AI worlds responded. So too have I, by adding a Chapter 8, SARS-CoV-2 and COVID-19 Pandemic, to the completed manuscript. I hope it offers a helpful summary (albeit dated to future readers) of COVID-19 and its associations with AI.

Acknowledgments

To paraphrase an old proverb, “It took a village to write this book.” When you combine the relationship of two subjects, artificial intelligence (AI) and health care, we all know that the magnitude of each fills countless volumes of text and terabytes of databases. It’s quite obvious that completing such a comprehensive analysis can’t be done alone. In spite of only one author’s name appearing on the cover of this book, there should truly be an exhaustive list of: (1) researchers; (2) AI and health care experts; and (3) editors and friends who collectively helped make it a reality. While I can’t enumerate all of the specific entities and individuals in each of these groups, allow me to briefly mention how, together, they made up the “village” that contributed so much to putting my name on the cover.

First, the research needed to even approximate an overview of AI’s influence and applications in health and wellness is proliferating daily and requires enormous amounts of time. But more so, it means diligent review of countless technical, biomedical, and clinical literature papers, books, and journals, as well as weekly and monthly magazines (e.g., Forbes, Harvard Business Review, Time, Newsweek, and others) and daily newspaper articles. Coverage of this abundance of information resources would be impossible without the invaluable search assistance of the PubMed, Medline, and MedlinePlus databases provided by the US National Library of Medicine, the National Institutes of Health, and the Centers for Disease Control and Prevention, and by the powerful search engines Google Scholar, ScienceDirect, and Scopus. Combined, these resources provided far more information than could ever be included in covering any one subject, let alone two.

The second group in “the village” I want to thank are the “behind the scenes” contributors. Specifically, I speak of the experts upon whom I relied indirectly and directly. Those whom I consider to have been among the greatest assets in my ability to complete this book include experts in AI, most of whom don’t even know I exist and very well may never even be aware of me acknowledging them for their contribution. These behind-the-scenes individuals are the AI instructors in MIT’s CSAIL, Johns Hopkins’, and Stanford’s online AI courses, Udacity, edX, and others. Over an 8-year period, they have taught me the science and technology (maybe not as much the math!) that I needed to understand AI and to be able to present it (hopefully) in a cogent manner in this book.

And then there are the professional and personal “villagers” who helped me and supported me editorially, cerebrally, and emotionally through this long editorial journey, as well as over many years of tutelage and friendship. I have always leaned on Craig Percy, the editor of my first book back in 1986, for guidance and often for brutal honesty, but mostly for a deep friendship which has lasted for more than 30 years. Of course, contemporarily, I am


grateful to Chris Katsaropoulos, my Elsevier acquisition editor, for guiding me through the early stages of the publishing process. Chris is a first-class professional with a personal manner that goes a long way during the sometimes stressful, emotional phases of the approval process. And certainly, Rafael Trombaco, the Elsevier Project Manager (EPM), Niranjan Bhaskaran, the Elsevier Production Manager, and Venkateswari Kalaiselvan, copyeditor, were the professionals who transformed my “scattered manuscript” into a beautiful book. They managed the complex pieces of the editing production process as a conductor would lead a symphony orchestra. And we can all thank them for the exceptional literary composition they produced.

You probably noticed two forewords in the front matter of the book, which is not typical. I did so because I wanted the readers to get a real-world perspective on AI from a primary care provider’s and a medical specialist’s viewpoint. Dr. Ernst Nicolitz is the senior surgeon in our multidisciplinary eye care practice, Nicolitz Eye Consultants in Jacksonville, FL, and a respected colleague of mine for over 25 years. World-class surgeons like Ernie are sometimes less than humble. But among Ernie’s greatest strengths are his humility, warmth, and kindness, which reflect in his patient care. I knew he would give an honest and objective clinical and personal specialist’s feeling about AI, and he did. And on the primary care side, I wanted that same humility, warmth, and kindness, and there was only one physician I was certain had it all: my own personal family doctor, Adam Dimitrov. I asked Adam to write his foreword in late 2019, fortuitously just prior to the coronavirus pandemic. Knowing what a difficult and stressful time the following months would turn into for him as the pandemic worsened, I fully expected and understood his need to cancel. Instead, he wrote a beautiful and thoughtful piece on AI’s role in primary care and the view of a primary care physician, literally from the trenches.

Others, who again never even knew they were helping me through the challenge of weaving together this complex of AI, business, medical, and public health information, deserve a shout-out. They include a math genius and good friend, John Paulos, whose lucid and, believe it or not, humorous “simplifications” (his New York Times best-seller, “Innumeracy,” is a must-read) cover some of the most complex mathematics on planet Earth. I relied on his explanations regularly in trying to explain (and often avoid, with apologies to the reader) the multitude of linear algebra, statistics, and regression analyses which are part and parcel of AI. On the business side of health care, I am indebted to a very special longtime friend, advisor, and financial wizard, Bob Jackson, for guiding me through many of the nuances involved in governmental and nongovernmental bureaucracies. And I can’t forget my friend, Brian Armitage, who was the first person to proofread the rough draft of my manuscript (over 900 pages) and gave me a thumbs up, which meant more than he’ll ever know.

I dedicated the book “to all the health care workers who have served us all during the COVID-19 pandemic and every day,” all of whom deserve our enduring thanks. Also, the dedication includes “the families of all those lost to the COVID-19 pandemic.” Their losses should never be forgotten. And finally, I offer a very special thank you and all my love to Stephanie, my wife of 52 years (at the time of this writing).
She was a great mother to our three children, one of whom was disabled and whom, together, we lost to cancer. She is my inspiration, my moral and emotional support, my strength, and my cheerleader. Without her, there would be no book.

SECTION I

Artificial intelligence (AI): Understanding the technology

“It is not the strongest of the species that survives, nor the most intelligent; it is the one most adaptable to change.”
Charles Darwin

Introduction

Change is disruptive. By definition, it never leaves us in the same place we started. It moves us to another place. Sometimes that other place is familiar and comfortable, and sometimes it is new and different. When it is a new place, it requires us to learn and understand it. If it is uncomfortable or disruptive, it leaves us with the choice to reject it and retreat to the same place we started, or to accept the disruption, understand it, and adapt to it.

Health care is a place where change is continuously occurring in the interest of improving the human condition. These changes include new therapies, new technologies, new procedures, and new methods of communication. They take us to a new place in the art and science of caring for people and require health care providers to continually learn and understand these new places where health care is moving. When the changes are disruptive to the status quo, health care providers do not have the option to reject them and retreat to the more comfortable place they started. To do so would be to deprive their patients of new and hopefully better ways and means of providing care and, indeed, a better level of health care. So, adapting to disruptive changes and technologies in health care is not an option but rather a requirement. Disruption and adaptation are an intrinsic part of the process of health care. Clayton Christensen, a professor at the Harvard Business School, introduced the term disruptive innovation. He defined it as “a process (in this case, health care) by which a product or service takes root initially in simple applications at the bottom of a market and then relentlessly moves upmarket, eventually displacing established competitors” [1].

Artificial intelligence (AI) is a disruptive technology. It can be defined simply as a branch of computer science dealing with the simulation of intelligent behavior in computers, the capability of a machine to imitate intelligent human behavior [2]. Albeit simplistic, the implications of that definition nonetheless suggest an existential change in all areas where human intelligence currently controls decision-making. Thus, the possibilities for AI are endless (Table Intro 1.1).


Table Intro 1.1 Limited list of AI applications.

1. Agriculture
2. Aviation
3. Computer science
4. Education
5. Finance
6. Algorithmic trading
7. Market analysis and data mining
8. Personal finance
9. Portfolio management
10. Underwriting
11. History
12. Government and military
13. Heavy industry
14. Hospitals and medicine
15. Human resources and recruiting
16. Job search
17. Marketing and advertising
18. Media and e-commerce
19. Music
20. News, publishing and writing
21. Online and telephone service
22. Power electronics
23. Sensors
24. Telecommunications
25. Toys and games
26. Transportation
27. Artificial life
28. Automated reasoning
29. Bio-inspired computing
30. Concept mining
31. Data mining
32. Knowledge representation
33. Email spam filtering
34. Robotics
35. Automotive
36. Cybernetics
37. Developmental robotics
38. Evolutionary robotics
39. Hybrid intelligent system
40. Intelligent agent and control
41. Video games
42. Litigation

The applications range from models and algorithms that can predict the effects of climate change to those that help researchers improve health care. It should be no surprise that AI will soon become the world’s most disruptive technology [3].

An interesting parlor game might be to have a group of friends identify a time span and list what they would consider to be the top 10 disruptive technologies during that period. Each individual’s selections would probably make for interesting discussion. The common denominators among the listings would undoubtedly produce a compelling list of advances during the identified period that the group of friends could reflect on. A discussion of how long, how difficult, and how comfortable or uncomfortable they felt in adapting to the disruption(s) produced by the changes associated with each technology would reveal interesting personal perspectives.

Just to test the validity of this proposed “game,” presented below is a list (author’s choices) of the top 10 disruptive technologies from the year 1900 to the present. But before reading my list, make your own, and then see if there are any common denominators between the lists and among the different items which you think deserve a place in “the final listing.”

1. Flight (1903)
2. Refrigeration (1920)
3. Antibiotics (the 1930s)
4. Microprocessor computers (1947)
5. Space exploration (the 1960s)
6. The Internet and World Wide Web (1983)
7. The Human Genome (2003)
8. Immunogenetics and immunotherapies (since 2000)
9. The smartphone (2007)
10. Artificial intelligence (since 2007)

Another interesting option in this little parlor game would be to identify a specific topic, with or without a timeframe, and create your top 10 list. Many such listings (top 10, top 5, etc.) are found on the Internet for an array of categories, both general and specific. One such defined listing, as an example relevant to the subject of this book, is the list of the “Top 10 Medical Innovations of 2019” selected and prioritized by a panel of Cleveland Clinic physicians and scientists [4]:

1. Pharmacogenomic testing
2. Artificial intelligence
3. Treatment of acute stroke
4. Immunotherapy for cancer treatment
5. Patient-specific products achieved with 3-D printing
6. Virtual and mixed reality
7. Visor for prehospital stroke diagnosis
8. Innovation in robotic surgery
9. Mitral and tricuspid valve percutaneous replacement and repair
10. RNA-based therapies

To illustrate how the nature of top 10 or top 5 lists can vary with their creators, consider the top 5 list Forbes Magazine published in 2017 of the “Five Technologies That Will Disrupt Healthcare By 2020” [5]:

1. Artificial intelligence
2. Immunotherapies/genetics
3. Big data analytics
4. CRISPR
5. 3D printing

Notwithstanding the predictable parochial nature of the priorities and initiatives of the Cleveland Clinic (medically oriented) compared to the Forbes Magazine (business-oriented) listings, it’s noteworthy that 4 of the top 10 selections (#2, #4, #8, and #10) for the Cleveland Clinic and 4 out of 5 on the Forbes list relate directly to AI. Further, as will be reflected in Section 2 of this book (“AI Applications in Health Care”), most of the other items on both lists also rely heavily on AI. The fact is, a universal presence and high prioritization of AI appear on virtually every top 10 listing on the Internet, be it related to business, the economy, technology, or indeed, health care. We will revisit this “game” of top 5 and top 10 lists in Section 2 of this book when we discuss AI applications in an array of business, administrative, and clinically related health issues.

The amount of AI technology disrupting health care is so vast and expanding so rapidly that being all-inclusive is an impossibility. Research for this book included exhaustive reading and reviews of literally thousands of topic-related articles and papers from an array of respected and recognized sources (Table Intro 1.2). Final selections of materials for inclusion in the book are based on relevancy to the topic, the currency of information (unless historically pertinent), and the authenticity of the source.

Table Intro 1.2 Sources of references (in alphabetical order).
• Academic sites (Johns Hopkins Univ., Stanford, Harvard, MGH)
• Centers for Disease Control and Prevention (CDC)
• Commercial IT industry websites
• Google Scholar
• Government (GOV) and non-government (NGO) health, AI, and IT organizations
• Health and wellness websites
• Institute of Medicine
• Major business journals (Forbes, Fortune)
• Major newspapers (New York Times, Washington Post)
• Medical websites
• Medline
• National Academy of Science
• National Institutes of Health
• PubMed
• Research institutes and foundations (Brookings, RWJ, Kaiser)
• ScienceDirect

So, what we will be doing (in Section 2) is highlighting for discussion and review the top 3 “primary” AI technologies and/or programs in each of the health care categories to be covered. Then we will include a listing of 3 “additional” related AI technologies (with citations) in current use or development. The reader will be able to select any topic from that list that they find interesting, or that may have direct relevance to them personally or professionally, for additional reading and research. And finally, every 2–3 years, we will present subsequent editions of this book wherein we will offer a new list of “primary” top 3 AI technologies for discussion and an updated “additional” list (of 3) for your review and reference.

It is apparent from both retrospective and prospective analysis that AI, as a disruptive technology, will represent a significant factor in most areas of life, and certainly in health care, moving into the future. It is also likely that Moore’s Law will prevail. This law states that processor speeds, or overall processing power for computers, will double every 2 years while costs are halved [6] (e.g., five doublings over a decade yield roughly 32 times the processing power at a fraction of the cost). Given this proven axiom and our general rate of human adaptability, New York Times columnist Thomas Friedman’s projected estimate (graphically illustrated [Fig. Intro 1.1] in his book “Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations” [7]) may prove accurate as to our generally delayed adaptability to change.

FIG. INTRO 1.1 Moore’s Law. Processing power for computers will double every two years while costs will be halved. Adapted from Friedman TL. Thank you for being late: an optimist’s guide to thriving in the age of accelerations. 2016.

There is nothing more powerful than an idea whose time has come (Victor Hugo). AI is a disruptive concept and a science whose time has come. The goal of this book is to first provide a general understanding of AI as a science and technology (Section 1) and then,

its applications in health care’s business and clinical domains (Section 2). Depending on your current level of understanding of AI, you may view the discussions in this book as anywhere from an elementary level through an intermediate or even advanced-level discourse on the subject. But given the exponential rate of progress and applications in this science and technology, especially in health care, in 5 years you may well find the book’s discussions, especially regarding health care applications of AI, “old news.”

You may have derived from the book’s title and descriptor (“Foundations of Artificial Intelligence in Healthcare and Bioscience: A User-Friendly Guide for IT Professionals, Healthcare Providers, Researchers and others interested in AI in health and wellness”) that my goal is to approach this highly complex subject in as “comfortable and casual” a literary manner as possible. As such, I have attempted to limit overly scholarly, academic, technical, or clinical jargon except where necessary. Further, in that effort, as well as a product of the dynamic nature of the subject matter, some of the cited materials used in the book (all cross-referenced and validated) are not all from traditional academic and scientific publications. Instead, they are taken from timely news articles; current monthly, even weekly, periodicals; and commercial and lay media websites and magazines.

And finally, despite the objective nature of the computer-related science and technologies defining AI, the most challenging aspects of adapting to this disruptive technology may well be the subjective and ethical issues it generates. Is AI creating an existential threat to the integrity, uniqueness, even the essence of humankind? Is it creating a “humanoid” existence? [8] And, specifically to health care, will AI replace doctors or make them better? [9] (More on these perplexing questions in the Epilogue.) Some respected scientists and futurists, like the late Stephen Hawking [3] and Elon Musk et al., say “. . . such powerful systems would threaten humanity” [10]. Other equally respected scientists, like Bill Gates (“AI is the holy grail”) [11] and Ray Kurzweil et al., claim, “Artificial intelligence will reach human levels by around 2029” [12]. Meanwhile, scientific institutions like MIT [13], Stanford University [14], the Brookings Institute [15], et al., and expert business sources


like Harvard Business Review [16], Forbes [17], the NY Times [18], McKinsey [19], etc., are enthusiastically supporting and encouraging AI’s continued growth.

This book will provide an objective presentation and an understanding of AI and its applications in health care. However, it will not address or opine on the associated subjective, ethical, and emotional questions regarding AI technology and its effects on health care going forward. That will be for the bioethicists to debate and for you, the reader, to decide for yourself.

References
[1] Christensen C. Disruptive innovation. Harvard Business Review; 2012.
[2] Merriam-Webster’s unabridged dictionary; 2019.
[3] Laura HG. Why AI will be the world’s most disruptive technology. DueDigital.com; March 12, 2018.
[4] Saleh N. Cleveland Clinic unveils top 10 medical innovations for 2019. MDLinx; 2018.
[5] Press G. Top 10 hot artificial intelligence (AI) technologies. Forbes Tech; 2017.
[6] Kenton W. Moore’s Law. Investopedia; 2019.
[7] Friedman TL. Thank you for being late: an optimist’s guide to thriving in the age of accelerations. Farrar, Straus and Giroux; 2016.
[8] Purkayastha P. Artificial intelligence and the threat to humanity. NewsClick; 2017.
[9] Parikh R. AI can’t replace doctors. But it can make them better. MIT Technol Rev; 2018.
[10] Lewis T. A brief history of artificial intelligence. Live Science; 2014.
[11] Ulanoff L. Bill Gates: AI is the holy grail. Mashable Tech; 2016.
[12] Marr B. 28 best quotes about artificial intelligence. Forbes; 2017.
[13] Kiron D. What managers need to know about artificial intelligence. MIT Sloan Manage Rev; 2017.
[14] Parker CB. One hundred year study on artificial intelligence (AI100). Stanford News Service; 2016.
[15] Desouza K, Krishnamurthy R, Dawson GS. Learning from public sector experimentation with artificial intelligence. Brookings Institute Think Tank; 2017.
[16] Brynjolfsson E, McAfee A. The business of artificial intelligence. Harvard Business Review (cover story); 2017.
[17] Columbus L. How artificial intelligence is revolutionizing business in 2017. Forbes Tech Big Data; 2017.
[18] Lewis-Kraus G. The great AI awakening. The New York Times Magazine; 2016.
[19] Burkhardt R, Hohn N, Wigley C. Leading your organization to responsible AI. McKinsey and Comp; 2019.

1 The evolution of artificial intelligence (AI)

1.1 Human intelligence

“I don’t think there’s anything unique about human intelligence.”
Bill Gates

To understand “artificial” intelligence, a review of “bona fide” human intelligence is necessary. The definitions of human intelligence are extraordinarily complex and range from evolutionary to biological to neurological to psychological explanations. Large bodies of research and literature exist from studies of each type of intelligence and their relationships to human intelligence. A 2006 paper entitled “A Collection of Definitions of Intelligence” [1] lists an astounding 71 definitions of “intelligence.” The authors conclude that “. . . scan(ning) through the definitions pulling out commonly occurring features, we find that intelligence is:

• A property that an individual agent has as it interacts with its environment or environments;
• Intelligence is related to the agent’s ability to succeed or profit concerning some goal or objective; and
• It depends on how able the agent is to adapt to different objectives and environments.”

Putting these key attributes together, they adopted their informal definition of intelligence to be: “Intelligence measures an agent’s ability to achieve goals in a wide range of environments.”

After extensive research on the meaning of human intelligence for this discussion, I settled on a definition found in an article in a neuroscience journal (Nature Reviews Neuroscience) entitled “The Neuroscience of Human Intelligence Differences.” In it, a study was cited [2] wherein 52 prominent researchers agreed on a broad definition of intelligence. The reason I have selected this neuroscience-related definition is the intimate relationship between AI and neuroscience, which will be discussed in detail in Chapter 3’s coverage of neural networking and “deep learning.” Thus, the definition of human intelligence to be used in comparison with AI is as follows: “Intelligence is a very general capability that, among other things, involves the ability to reason, plan, solve problems, think abstractly, comprehend complex ideas, learn quickly and learn from experience . . . it reflects a broader and more profound capability for comprehending our surroundings, ‘catching on,’ ‘making sense’ of things, or ‘figuring out’ what to do” [3].

The structural elements outlining such a definition of intelligence include the general categories of reasoning, learning, perception, problem-solving, and communication.


Table 1–1 The general categories of human intelligence represent the array of cognitive and noncognitive features of human thinking and acting. These features are emulated and reproduced through algorithms in artificial intelligence.

Intelligence category: Category features
• Reasoning: inductive; deductive
• Learning: auditory; experience; motor; sensory; observational; memory; perceptual; pattern recognition
• Perception: visual; hearing; cerebral; relational; spatial; stimuli
• Problem solving: assessment; decision-making; solutions; alternatives
• Communication: to speak; and to recognize and understand spoken and written language

Each of these categories includes subcategories that define their functional attributes (Table 1–1). These categories and subcategories are revisited in Chapter 3, “The Science and Technologies of Artificial Intelligence,” where their analogs in AI are identified and compared.
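As a rough preview of how those analogs line up, the short sketch below expresses Table 1–1’s categories as a simple data structure. The pairings are common examples of my own choosing (not an exhaustive mapping from the text), though NLP, NLG, and expert systems are all technologies covered later in this book.

```python
# Illustrative only: common AI analogs for the human-intelligence
# categories in Table 1-1. The pairings are typical examples, not
# an exhaustive list from the text.
INTELLIGENCE_TO_AI = {
    "Reasoning":       ["inference engines", "expert systems"],
    "Learning":        ["machine learning", "deep learning"],
    "Perception":      ["computer vision", "speech recognition"],
    "Problem solving": ["search and optimization algorithms"],
    "Communication":   ["natural language processing (NLP)",
                        "natural language generation (NLG)"],
}

for category, analogs in INTELLIGENCE_TO_AI.items():
    print(f"{category}: {', '.join(analogs)}")
```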

1.2 Defining artificial intelligence (AI)

Given the complexity of defining human intelligence, it’s no wonder that the definition of artificial intelligence (AI) presents its own set of intricacies. The modern dictionary definitions of AI also focus on the technology as being a sub-field of computer science and on how machines can imitate human intelligence (being human-like rather than becoming human) [4]. This feature of AI is demonstrated by a test developed by an early pioneer in computer science, Alan Turing (Fig. 1–1) [5]. As early as 1936, this British mathematician had a profound influence on the science of computing [6] through his writings, research, and code-breaking activities during World War II (the excellent movie “The Imitation Game” chronicles his enormous achievements and his tragic life). He is also considered a founding father of AI for his theorizing that the human brain is, in large part, a digital computing machine [7] (see the neural networking discussion in Chapter 3, page 31) and that its behavior is measurable by the “Turing Test” that he introduced in 1950.

The Turing Test is a method of testing a machine’s (computer’s) ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. During the test, one human functions as the questioner while a second human and a computer function as hidden respondents (Fig. 1–1). The questioner interrogates the respondents using a specified format and context. After a preset length of time or number of questions, the questioner is asked to decide which respondent was human and which was a computer. The test is repeated numerous times, and if the questioner makes the correct determination in half of the test runs or less, the computer is considered to have artificial intelligence. That is, the questioner regards it as “just as human” as the human respondent [8].

FIGURE 1–1 Turing Test. One human functions as the questioner while a second human and a computer function as hidden respondents. The questioner interrogates the respondents using a specified format and context. After a preset length of time or number of questions, the questioner is asked to decide which respondent was human and which was a computer.

The challenges of defining AI began to be addressed in earnest at a conference at Dartmouth College (the “Dartmouth Summer Research Project on Artificial Intelligence”) in 1956, when researchers John McCarthy and Marvin Minsky coined the term artificial intelligence [9]. The studies from that conference concluded, “. . . every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it.” That conclusion evolved into the modern Merriam-Webster’s definition of AI (stated previously in the Introduction) as “a branch of computer science dealing with the simulation of intelligent behavior in computers; the capability of a machine to imitate intelligent human behavior” [10].

The concept of AI, as envisioned and articulated by the genius of Turing, McCarthy, and Minsky, seemed to have preceded the actual computer hardware and software necessary for its full maturation, especially its cornerstone and most profound component, deep learning. Slow but steady progress was indirectly stimulated in 1982, when the Japanese government invested $400,000,000 in the “Fifth Generation Computer Project” (FGCP), which did not meet its ambitious goals of revolutionizing computer processing and improving AI through logic programming over the following 10 years [11]. But it did inspire a new generation of computer engineers and scientists [12]. As a result, through the 1990s and 2000s, even in the absence of major funding, AI development thrived and began to meet the original goals identified by the FGCP. Increasing international interest grew exponentially with AI’s dramatically increasing memory capacity, processing speeds, natural language processing (NLP), and machine learning (ML).
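The pass/fail rule of the Turing Test described above lends itself to a tiny simulation. The sketch below is illustrative only; the trial count and identification probability are arbitrary assumptions of mine, not values from the text.

```python
# Hedged sketch of the Turing Test scoring rule described above:
# if the questioner correctly identifies the computer in half of
# the test runs or less, the machine is regarded as "just as human"
# as the human respondent. Trial count and probability are arbitrary.
import random

def run_turing_trials(n_trials=100, p_correct=0.5):
    # Each trial: the questioner picks out the machine correctly
    # with probability p_correct (0.5 would be pure guessing).
    correct = sum(random.random() < p_correct for _ in range(n_trials))
    passed = correct <= n_trials / 2
    return correct, passed

correct, passed = run_turing_trials()
print(f"Correct identifications: {correct}/100 -> machine passes: {passed}")
```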


Applications in banking and the financial market vagaries of the 1990s and early 2000s began to produce mainstream and serendipitous press attention for AI [12]. In 1997, the reigning world chess champion and grandmaster Garry Kasparov was defeated by IBM’s Deep Blue, a chess-playing computer program [13]. AI computers continued to become “smarter,” proving it to the world with IBM’s Watson defeating “Jeopardy!” champions Ken Jennings and Brad Rutter in 2011 [14], and with Google DeepMind’s AlphaGo program defeating Korean Go champion Lee Sedol in 2016 and Chinese champion Ke Jie [13]. After these events, appreciation of AI grew from curiosity to excitement as human adaptability converged with Moore’s Law.

The introduction of AI specifically into health care experienced resistance in the 1970s and 1980s, due in large part to practitioner conservatism and uncertainties about this evolving technology. Among those concerns was AI systems’ inability to explain how decisions are reached. A cognitive scientist at Tufts University, Daniel Dennett, put it best: “Since there are no perfect answers, we should be as cautious of AI explanations as we are of each other’s. No matter how clever a machine seems, if it can’t do better than us at explaining what it’s doing, then don’t trust it” [15]. Indeed, early iterations of AI computing did not provide software “explanation facilities” for computed conclusions. Since then, however, algorithms have grown to include explanation facilities (now referred to as “explainable AI,” or XAI) in their output information. Mass General Hospital has already developed and tested explanation facilities that aim to give AI systems used in medical diagnoses the ability to explain their reasoning rather than leaving the algorithmic conclusions a mystery [16].

Between inherent human reticence toward disruptive technologies and the limited funding that followed the FGCP experience through the ’80s, it wasn’t until the 1990s that AI was seriously incorporated into health care. Even then, and to this day, concerns regarding professional liability in diagnosis and treatment continue to limit AI’s use to augmenting clinicians’ decisions in clinical health care rather than providing the decision-making itself [17].

Notwithstanding the perceived limitations of AI’s applications in clinical health care, there was an early awareness of the value of AI’s big data analytics and blockchain technologies in the management of health care information and data. The processing speeds and volumes of data manageable with evolving AI hardware and software now provide dramatic improvements in areas of health care information previously considered unattainable. These big data analytics capabilities have provided enormous benefits in the areas of “precision health care,” “population health,” medical research (especially in immunology and genetics), medical imaging analysis, and robotics. AI’s applications in each of these areas of health care are discussed in depth in Section 2 of this book.

In short, the introduction and evolving applications of AI in health care information management and clinical care can only be understood and appreciated through a comfortable understanding of what AI is and how it works. That is the goal of Chapter 2 (“The Basic Computer”) and Chapter 3 (“The Science and Technologies of Artificial Intelligence”) of this book.


References
[1] Legg S, Hutter M. A collection of definitions of intelligence. Cornell University; 2007. arXiv:0706.3639v1.
[2] Gottfredson LS. Mainstream science on intelligence: an editorial with 52 signatories, history, and bibliography. Intelligence 1997;24:13–23.
[3] Deary IJ, Penke L, Johnson W. The neuroscience of human intelligence differences. Nat Rev Neurosci 2010;11:201–11.
[4] Stanton A. AI develops human-like number sense, taking us a step closer to building machines with general intelligence. Phys.org; May 10, 2019.
[5] Turing A. Can machines think? Mind 1950;59:433–60.
[6] Copeland BJ. Alan Turing: British mathematician and logician. Encyclopedia Britannica; updated January 23, 2019.
[7] Aggarwal A. Genesis of AI: the first hype cycle. scryanalytics.com/articles; January 2018.
[8] Rouse M. Turing test. Search Enterprise AI; December 2017.
[9] Marr B. The key definitions of artificial intelligence (AI) that explain its importance. Forbes Magazine; February 14, 2018.
[10] Merriam-Webster Dictionary. Merriam-Webster Inc.; 2019.
[11] Pollack A. ‘Fifth generation’ became Japan’s lost generation. New York Times Archives; June 5, 1992.
[12] Anyoha R. The history of artificial intelligence. Harvard University SITNBoston; August 28, 2017.
[13] Press G. The brute force of IBM Deep Blue and Google DeepMind. Forbes; February 7, 2018.
[14] http://www.nytimes.com/2011/02/17/science/17jeopardy-watson.html
[15] Knight W. The dark secret at the heart of AI. MIT Technology Review; April 11, 2017.
[16] Kreps GL, Neuhauser L. Artificial intelligence and immediacy: designing health communication to personally engage consumers and providers. Patient Educ Couns 2013;92:205–10. https://doi.org/10.1016/j.pec.2013.04.014
[17] Uckun S. AI in medicine: an historical perspective. Rowanalytics; February 8, 2018.

2 The basic computer This Chapter (Two) on the basic computer, as with the next Chapter (Three) are meant to give you a practical understanding of computer systems from their essential functions through the more complex algorithmic driven AI computer functions. They are not meant to present an exhaustive analysis of computer science and technology, but rather to provide a “user-friendly,” general explanation with as little technical computer jargon as possible in this very complex topic. In other words, the goal of these next 2 chapters is to create a comfortable level of knowledge of the computer process, specifically AI computing, so that you fully understand the relevance, reasons, and usefulness of AI in health care. The word “algorithm” has already been mentioned numerous times up to this point, and it takes center stage in most of the AI descriptions going forward. Thus, an up-front definition of its literal meaning, as well as its complex relationship to AI, is valuable at this point. Simply defined, an algorithm is a procedure or formula for solving a mathematical problem, based on conducting a sequence of specified actions or steps that frequently involves repetition of an operation [1]. The fundamental framework (a platform for developing software applications [2]) of all AI computing is mathematics-based, including linear algebra, multivariable calculus, Bayesian logic, statistics, and probabilities, all of which algorithms utilize extensively [3]. As will be discussed, AI deals with enormous volumes of data (billions of data points) requiring incredible amounts of calculations, computations, and iterations (in the trillions) [4] done at almost instantaneous speeds. The development of sophisticated algorithms (to be discussed in Chapter 3, “AI Software”) in conjunction with the rapid development of powerful AI hardware (Chapter 3, “AI hardware”) capable of generating and delivering these incomprehensible amounts of calculations at near-instantaneous speeds is responsible for the explosion of AI computing [5]. The hardware and software of both basic and AI computer systems operate within 3 functional categories: input layer(s) and their related hardware; inner (hidden) layers and their related software; and output layer(s) and their related software and hardware. This book, as its title (“Foundations of Artificial Intelligence in Healthcare and Bioscience”) implies, attempts to provide an “illustrated guide” to the components of basic and AI computer science and technologies and their applications in health care. To achieve this goal, the illustrations throughout Chapters 2 (“The basic computer”) and 3 (“The science and technologies of artificial intelligence”) are color-coded (Table 2 1) for clarity, cross-referencing and ease in following the explanations of each computing system. Next, we describe how each of the 3 layers of computing (input layer, inner [hidden]) layer, and output layer) operate and interrelate to produce the basic computer process. After that, we discuss the key elements (language, programming, hardware, and software) of basic



Table 2–1 Color coding guide for multicolored illustrations in Chapters 2 and 3.


2.1 Layers of basic computers

2.1.1 Input layer

Input data to the input layer of a computer system is made up of information or data provided by an external source called an input device (Table 2–2). This external source of data ("user input") may be a single command or a complex of information and facts. This input is characterized schematically in Fig. 2–1A by a series of red dots representing "data points" (or "data nodes" when they produce a network of data points). Collectively, these data points (or nodes) introduced by the input hardware device constitute the input layer(s) of the computer process (Fig. 2–1A). This illustrated model of the input layer uses a text command to a standard keyboard as its user input. This input process activates the operating system (OS) compiler, which translates the user input into computational digital code (machine code). This code is sent to the RAM drive.


Table 2–2 Common computer input devices (alphabetical).
• Barcode reader
• Cameras
• Digital camera
• Gamepad
• Graphics tablets
• Joystick
• Keyboard
• Microphone
• Mouse (pointing device)
• Optical character reader (OCR)
• Scanner
• Touchpads
• Trackballs
• Video capture hardware
• Webcam, USB

FIGURE 2–1A Basic computer model (input layer). Input data is made up of information or data provided by an external source called an input device (Table 2–2). This external source of data ("user input") is characterized schematically by a series of red dots representing "data points" (or "data nodes" when they produce a network of data points).

Then the keyboard's device driver (in the control unit [CU] of the central processing unit [CPU]; more on the CPU below and in Chapter 3) activates the monitor's I/O (Input/Output, all discussed further under "Basic computer hardware") device controller (in the CU of the CPU), which transmits the machine code to the monitor (and the Application Programming Interface [API], if applicable). So, in this particular computer process of the input layer, the hardware (keyboard and monitor) are functioning as the input and output devices, respectively, i.e., an I/O (input/output) process, as illustrated in Fig. 2–1B. The API and CPU software are functioning as a data processing unit of the inner (hidden) layer.


FIGURE 2–1B Basic computer model (inner/hidden layer). In the computer input layer (red), the hardware (keyboard and monitor) are functioning as the input and output devices, respectively, i.e., an I/O (input/output) process, and the application programming interface (API) and central processing unit (CPU) software (blue) are functioning as the data processing unit of the inner (hidden) layer.

This I/O, API, and CPU data processing unit explains the meaning of “hidden” as an alternate description of the inner (hidden) layer. The “inner” label explains the internal operational aspects of the software framework relative to the computer system at large. The “hidden” label implies computer activity removed from direct observation by the user. This hidden aspect also explains the concept of “back-end programming,” which is described ahead under “Basic computer language and programming.”

2.1.2 Inner (hidden) layer

The source code (user input) from the input layer is "translated" by the compiler software into target code, and, as part of the I/O process, it is transmitted to the output device (the monitor in this example). This target code or binary (machine) code information also becomes the information that populates the inner (hidden) layer and is used by its software (OS, API, CPU, servers, and software apps) for data processing (Fig. 2–2A). The arrays of arrows in Figs. 2–3A and 2–3B between the input data points and the inner (hidden) layer represent the transfer of the machine code target data to the inner (hidden) layer. Each data point (or node) in the input layer has the potential to be executed by any (or all) software programs in the inner (hidden) layer. This information transfer is referred to as "compilation." In this compilation process, the target data is assembled (by "assembler" software) into an executable object file that is transmitted from the input layer to the inner (hidden) layer through the RAM drive (a hardware component).


FIGURE 2–2A Basic computer model (inner layer framework). Target code or binary (machine) code information becomes the information that will populate the inner (hidden) layer (blue) and be used by its software (OS, API, CPU, servers and software apps) for data processing.

This process occurs in milliseconds, followed instantaneously by the execution of object code instructions to the CPU (microprocessor). This execution produces a series of stored, programmed instructions to device controllers and device drivers to perform specific tasks, and to the CPU's ALU (Arithmetic Logic Unit) for mathematical and logical operations on the information in the RAM drive. The object code file(s) also direct coded instructions to the API and appropriate servers and/or software apps in the inner (hidden) layer (Fig. 2–2B).

FIGURE 2–2B Basic computer model (inner layer functions). The object code file(s) (red) also directs coded instructions to the API and appropriate servers and/or software apps in the inner (hidden) layer (blue).


Table 2–3 Common computer output devices (alphabetical).
• Braille embosser
• Braille reader
• CPU (Central Processing Unit)
• Flat panel
• GPS (Global Positioning System)
• Headphones
• Monitor
• Plotter
• Printers
  a. Dot matrix printer
  b. Inkjet printer
  c. Laser printer
  d. 3-D printer
• Projector
• Sound card
• Speakers
• SGD (speech-generating device)
• TV
• Video card

2.1.3 Output layer

The function of the inner (hidden) layer software is to interpret the user input and generate the data, information, and answers to the user's queries and commands. This generated data and information is disseminated and displayed through multiple output devices (Table 2–3) constituting the output layer. The most common output device in basic computing is the monitor (used in this example, Fig. 2–3A). In the expansion to AI computing, the output becomes far more complex relative to the information provided and to the autonomous devices in robotics, where primary computer output serves as input for a robotic operating system (ROS). There is lots more on robotics in the Chapter 3 AI discussion.

FIGURE 2–3A Basic computer model (output layer). The arrows between the input data points (red) and the inner (hidden) layer (blue) represent the transfer of the machine code target data through the inner (hidden) layer to the output layer (green).


FIGURE 2–3B Basic computer model (output layer devices). Each data point (or node) in the input layer (red) has the potential to be executed by any (or all) software programs in the inner (hidden) layer (blue).

The output produced from data processing takes the form of a software response, such as the result of a mathematical calculation, an electronically delivered text message, or audio or visual information. It also can be provided in a physical form, such as a printout or graphics. It can be text, graphics, tactile, audio, or video (Fig. 2–3B). Virtually all data can be parsed by a suitably equipped and API-programmed computer into human-readable format. The alternative to a human-readable representation is a machine-readable format, a medium of data primarily designed for reading by electronic, mechanical, or optical devices or by computers. Examples include barcodes, HTML code, graphics, and other data forms. Beyond traditional output devices like monitors and printers, audio and speech synthesizers have become increasingly popular and are considered the preferable output option by many users and systems. Natural language processing (NLP) and generation (NLG), to be discussed in Chapter 3, are significantly advanced beyond basic computing's sound synthesis and are now an integral part of AI computing.

2.2 Basic computer language and programming

Computer languages are called code, and it is through this code that the computer receives instructions (is "programmed") to solve problems or perform tasks. Because the goals of computer programs are so diverse, so too are the languages used (Table 2–4) [6]. Programmers ("coders") select the languages that are best suited to the goals of the program in use. Expert coders are skilled in multiple languages, and after they assess the problem or task the program is attempting to answer, they can choose the most appropriate code to use [7].


Table 2–4 10 most popular programming languages (in order of popularity).
• Python
• Java
• C/C++
• JavaScript
• Golang
• R
• Swift
• PHP
• C#
• MATLAB

Computers are machines and thus require a means of converting human language and commands (source code) from keyboards, spoken word (audio), or visual imagery into machine code (or computational language) for the input layer. This conversion is referred to as analog-to-digital conversion (ADC) and is accomplished by hardware and software called compilers. All computers understand only binary numbers [8]. The digital language the basic computer uses is called "binary code," which has 2 discrete values: zero and 1. Each of these discrete values is represented by the OFF and ON status of an electronic switch called a transistor. The compiler, through complex electronic processes, converts the human-readable format into the machine language digital format called machine code. It can also convert audio, sampled from sound wavelength frequencies at precise intervals, into machine language digital format. And compilers can also convert visual input by digitizing its matrix of color and light pixels. Other human senses, including hearing, sight, taste, and touch, are potential sources of input capable of being converted to digital code. There are numerous types of compilers with the capacity to translate any source code into executable mathematical, computational target code [9].

Computer programming is simply computer code (language) written electronically (by a programmer or coder) onto a semiconductor integrated circuit (a microchip), producing a set of instructions for an operating system (e.g., Windows, MacOS, Linux) or a specific software application. These instructions are written to allow the program to perform an endless variety of functions repeatedly and efficiently. When a microchip is programmed, it is called a microprocessor (e.g., the CPU and GPU discussed below), the central functional unit of computer software. Semiconductor companies (Table 2–5) [10] manufacture the microchips that computer hardware and software companies use to program their operating systems and applications.

Program code can be developed as "front-end" programming, where the program code is visible to the user. Front-end programming is typical in websites and browsers. "Back-end" programming is associated more with databases and servers. Other types of frameworks include (but are not limited to) programs such as Bootstrap, AngularJS, and EmberJS. These types of programs control how content looks on different devices such as smartphones and tablets [11].
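As a small illustration of binary code (my example, not the book's), the snippet below shows how keyboard characters are represented as the zeros and 1s that a transistor-based machine stores:

# Text characters to their binary machine representation (illustrative only).
text = "AI"
for ch in text:
    code_point = ord(ch)              # the character's numeric code ('A' -> 65)
    bits = format(code_point, "08b")  # that number as 8 binary digits (OFF/ON states)
    print(ch, code_point, bits)       # A 65 01000001, then I 73 01001001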


Table 2–5 The world's top 10 semiconductor companies.
1. Intel
2. Samsung
3. Taiwan Semiconductor
4. Qualcomm
5. Broadcom
6. SK Hynix
7. Micron Technology
8. Texas Instruments
9. Toshiba
10. NXP
Data from: Walton J. The world's top 10 semiconductor companies. Investopedia; May 13, 2019.

2.3 Basic computer hardware

Basic computer hardware consists of the physical elements of a computer system. All computers have different hardware specifications, but there are some essential components common to all of them. They include the devices and peripherals that are used for input, processing, output, and storage of data.

One of the most critical pieces of hardware in a basic computer is the RAM drive, or random-access memory drive. It is a high-speed type of computer memory that temporarily stores all input information as well as the application software instructions the computer needs immediately and soon after. RAM is read from any layer of the computer at almost the same speed. As a form of dynamic, short-term memory, it loses all of its contents when the computer shuts down [12]. RAM should not be confused with ROM (read-only memory), which stores the program required to initially boot the computer and retains its content even when the device is powered off [13].

There are 4 categories of computer hardware [14]:

1. Input devices (Table 2–2) [15] transfer raw data into the input layer. Devices include (but are not limited to) the keyboard, touchpad, mouse, joystick, microphone, digital camera, video, and webcam. The raw (or source) data includes anything that provides information and direction to a computer system: information, commands, questions, audio, video, barcodes, imagery, facts, material, figures, details, data, documents, statistics, and sensory stimuli.
2. Processing devices, or microprocessors (integrated circuits, e.g., the Intel Pentium chip), participate in all 3 layers of the computing process (sometimes as the I/O devices explained below). They use a central processing unit (CPU) and, for AI, a graphic processing unit (GPU). The CPU or GPU is considered "the brain of the computer" (lots more on CPUs and GPUs in Chapter 3).


Table 2–6 Computer data storage devices.
• Hard drive
• Magnetic strip
• SuperDisk
• Tape cassette
• Zip diskette
• Blu-ray disc
• CD-ROM disc
• CD disc
• DVD
• USB flash drive
• CF (CompactFlash)
• Memory card
• MMC
• NVMe
• SDHC card
• SmartMedia card
• SD card
• SSD
• Cloud storage
• Network media

The CPU consists of 4 main parts: (1) it receives a set of instructions as digital code from RAM; (2) the control unit (CU) decodes and executes instructions; (3) the Arithmetic Logic Unit (ALU) performs calculations and logical operations from which it makes decisions; and (4) it sends output to the RAM drive and the hard drive for storage [16]. Using software called compilers (or translators) [17], the CPU transforms the raw data into binary code information (described above in the "Basic computer language and programming" section) that can be manipulated and stored as memory (caches) in RAM. This coded information is retained as memory only while the computer power is on. It is removed from RAM and stored on the hard drive when the computer is shut down.
3. Output devices (Table 2–3) [18] are part of the output layer(s) and disseminate and display data and information through multiple means, including (but not limited to) digital monitor text, images, video cards, sound cards, printouts (2-D and 3-D), and robotics.
4. Storage devices (Table 2–6) [19] are retention and memory storage devices used for permanent storage (<SAVE> on the keyboard). They include (but are not limited to) peripheral devices such as hard drives, optical disk drives, and thumb drives.

Another essential addition to hardware and software discussions is a group of hardware found in the CU of the CPU that controls communications between the multiple computer layers as well as between the computer input layer and external devices. These communicating hardware devices are called I/O (Input/Output) units. They include (but are not limited to) the mouse, keyboards, monitors, touchpads, disk drives, display adapters, USB devices, bit-mapped screens, LEDs, analog-to-digital converters, on/off switches, network connections, audio I/O, and printers. I/O units typically consist of a mechanical component and an electronic component, where the electronic component is called the device controller. The device controller serves as an interface between a device and its device driver (software in the CPU that communicates with a specific device). As an example, input to a keyboard (mechanical component) is translated by a compiler into digital code that communicates with a specific device driver in the CPU, which activates a specific I/O device controller for a selected output device (e.g., a monitor or printer; see Table 2–3) [20].

2.4 Basic computer software

The term "software" refers to the set of electronic program instructions or data a computer's microprocessor reads in order to perform a task or operation.


Based on what the instructions accomplish, software is categorized into 2 main types [21]. "Systems software" includes the programs that are dedicated to managing the applications and integrating data into the computer itself and with its associated hardware. These programs include (but are not limited to) the operating system or "OS" (e.g., MacOS and Windows), compilers, application programming interfaces (APIs), device drivers, file management utilities, and servers.

The second type of software is called "application software" or, more commonly, "applications" or "apps." These user-specific programs are installed and managed through hardware input devices and microprocessors and enable the user to complete an enormous array of tasks ranging from basic educational and recreational computing to business and administrative health care computing and beyond. Examples include (but are not limited to) creating documents, spreadsheets, databases, and publications; doing online research; sending email; designing graphics; running business and administrative functions; and even playing games.

Software applications in the inner (hidden) layer of a computer are organized in a somewhat abstract functional collection called a "framework." The machine learning and deep learning categories of AI referred to frequently in Chapter 3 can be considered frameworks [2]. These platforms have their own specific (but changeable) languages to communicate, interrelate, and interface with other general, specific, and application software programs. They can also be used to develop and deploy specific (e.g., research theories, financial models) or general (e.g., art, music, ecological) software applications.

Finally, the last software program discussed here, the "application programming interface" or API, is perhaps the most essential part of any computer system. It is a software intermediary (i.e., a "software to software" program) whose function is to specify how software components should interact or "integrate" with databases and other programs [22]. Most operating environments, such as MS-Windows, Google Maps, YouTube, Flickr, Twitter, and Amazon, and many software applications and websites (millions) provide their own APIs, allowing programmers to write applications consistent with their operating environment. A simple analogy for understanding how an API operates is to think of the API as a restaurant waiter taking your order (your input), communicating the order(s) to the kitchen (processing), and then delivering the food to your table (output). Effectively, the API software communicates, interacts, and integrates the entire computing process.
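To ground the waiter analogy in code, here is a minimal, hedged sketch of one program calling another through an API; the web address and the "temperature" field are hypothetical placeholders for illustration, not a real service:

import json
import urllib.request

# The "order": a structured request handed to the API (hypothetical endpoint).
url = "https://api.example.com/weather?city=Boston"

with urllib.request.urlopen(url) as response:  # the "waiter" carries the request
    payload = json.load(response)              # the "kitchen's" structured reply

print(payload["temperature"])  # hypothetical field name in the reply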

2.5 Servers, internet and world wide web (www)

2.5.1 Servers

The term "server" simply means a machine (computer) that "serves" other machines; thus the name "server machine" [23]. Whereas processors and CPUs are "the brain of the computer," servers are its "heart." They are computers themselves (hardware supporting software) designed to process requests and deliver data to another computer, to a local network, or over the Internet. There are multiple types of servers (Table 2–7) [24], including (but not limited to) file servers (serving local data to an intranet), printer servers, email servers, database servers, and the web servers supporting web browsers (also called "clients") [25].


Table 2–7 Types of servers.
• Server platform: Hardware or software system that acts as an engine that drives the server; used synonymously with operating system (OS).
• Application server: Computing and connecting database servers and the end user.
• Audio/video server: Provides multimedia capabilities to websites.
• Chat server: Comprises different network servers that enable users to connect to each other.
• FTP server: File transfer protocol server; provides secure file transfer between computers.
• Groupware server: Enables users to work together in a virtual atmosphere, irrespective of location, through the Internet or a corporate intranet.
• IRC server: Comprises different network servers that enable users to connect to each other through an IRC network.
• List server: Manages mailing lists, whether open interactive lists or one-way lists.
• Mail server: Transfers and stores mail over corporate networks through LANs and WANs and across the Internet.
• News server: Distribution and delivery source for many public news groups, approachable over the USENET news network.
• Proxy server: Mediator between a client program and an external server to filter requests, improve performance, and share connections.
• Telnet server: Enables users to log on to a host computer and execute tasks as if they were working on a remote computer.
• Web server: Provides static content to a web browser by loading a file from disk and transferring it across the network to the user's web browser, intermediated by the browser and the server using HTTP.
Data from: Breylan Communications, 2020.

By definition, a server refers to a large, powerful computer that stores (hosts) websites and communicates with other computers, internet resources, software applications, databases, and printers. When you type (input) a URL (uniform resource locator) address into your computer browser (e.g., Safari, Chrome, Explorer), your computer communicates with the server hosting that website. This process allows the website data to be transmitted to and displayed on your computer. The communication between a computer and a server, "the heart" of electronic computing, includes the following components [23]:

1. The hypertext transfer protocol, or HTTP: This is the language that browsers and web servers use to speak to each other.
2. The server name ("www.DomainName.com"): The Domain Name System, or DNS, translates the "human-readable" domain name that you provide into a numerical internet protocol (IP) address. The browser uses this IP address to connect to the web server.
3. The filename ("web-server.html"): The filename relates to all of the files, like images, computer language, fonts, and more, that are relevant to the particular website being visited.


An IP address is assigned to your computer by your internet service provider (ISP) each time you log on, whereas a server's IP address remains the same (static). All servers have an IP address. This IP address is how the browser, using a domain name (a human-readable IP address), communicates with a web server's IP address to access the website's specific HTML code (HyperText Markup Language, the universal website language) and pull up the site. Once connected, the browser uses the URL to request a specific file or page on the server's website. The server then sends all the HTML text for the web page you requested to your browser. From here, using the computer's compiler, the browser converts the data from the web page and formats it onto the computer screen. All this happens in milliseconds. In the example above, a URL from a browser was used to demonstrate the relationship of the input source to a web server. However, any input source (other computers, internet resources, software applications, databases, or printers, as mentioned above) can be used to communicate with an appropriate server. There are numerous types of servers (Table 2–7), and over 80 million servers are connected to the Internet [26], most of them now cloud-based.
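The DNS-lookup and HTTP-request sequence just described can be sketched in a few lines of standard-library Python (my illustration; example.com is simply a reachable demonstration domain):

import socket
import urllib.request

# Step 1: DNS translates the human-readable name into a numerical IP address.
ip_address = socket.gethostbyname("example.com")
print("Server IP:", ip_address)

# Step 2: the browser-like client then uses HTTP to request a page from that server.
with urllib.request.urlopen("http://example.com/") as response:
    html = response.read()  # the HTML text the web server sends back

print(html[:60])  # the first bytes of the page's HTML code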

2.5.2 Internet

The Internet, sometimes called "The Net," is a global, decentralized network connecting millions of computers. It is a computer network infrastructure, or "a network of networks," with each computer capable of communicating with all the other computers connected to the Internet. These connections are made through languages known as protocols, such as HTTP (mentioned above), and implemented through user computer hardware or software [27]. The amount of information transferred over the Internet per day (as of 2019) exceeds 5 exabytes [28].

The Internet is seen as having 2 major components. First is its software, or network protocols, responsible for translating the alphabetic text of a message into electronic signals that are transmitted over the Internet, and then back again into legible, alphabetic text. The protocols, such as TCP/IP (Transmission Control Protocol/Internet Protocol), present sets of rules that devices must follow in order to complete tasks. The second principal component, the hardware of the Internet, includes everything from the computer or smartphone that is used to access the Internet to the cables that carry information from 1 device to another. These various types of hardware are the connections within the network. Devices such as computers, smartphones, and laptops are endpoints, or clients, while the machines that store the information are the servers. The transmission lines that exchange the data can be either wireless signals from satellites or 4G (soon 5G broadband) and cell phone towers, or physical lines, such as routers, cables, and fiber optics.

Each computer connected to the Internet is assigned a unique IP address that allows it to be recognized. When 1 computer attempts to send a message to another, the data is sent as a digital "packet" with an assigned IP address and port number (the address for the endpoint computer). A router will identify the port number and send the packet to the appropriate computer, where it is translated back into alphabetic text [28].
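A minimal sketch (mine, using the loopback interface so it runs on a single machine) shows the IP-address-plus-port-number addressing just described:

import socket

# One socket plays the receiving endpoint; the OS assigns it a free port number.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
host, port = server.getsockname()

# Another socket sends a digital "packet" addressed by IP address and port.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"hello", (host, port))

data, sender = server.recvfrom(1024)   # the delivery step a router performs remotely
print(data, "from", sender)            # b'hello' from ('127.0.0.1', <port>)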


2.5.3 World wide web (www)

Sometimes confused with the Internet, the World Wide Web (WWW, or "the web") is a way of accessing, viewing, and sharing information over the Internet. That information, be it text, music, photos, videos, or whatever, is written on web pages (numbering in the billions) served up by a web browser (a software program to present and explore content on the web) [29]. Browsing the web remains a primary source of obtaining information. However, as will be demonstrated in Section 2 of this book ("AI applications in diagnostic and treatment technologies"), software apps have become increasingly popular. Computer users increasingly browse www sites and their hyperlinks to obtain news, messages, weather forecasts, videos, health information, and the like. These www sites now number greater than 1,275,000,000 [23].

Given the basics of computers that you have been introduced to in this chapter, you are now ready to examine their relationship to, and their extraordinary evolution into, the science and technologies of artificial intelligence. I think you will quickly begin to understand the meaning of "disruptive technologies" as you appreciate the incredible elevation of 1 level of computing to the transitional horizon of AI's immeasurable potential.

References
[1] WhatIs.com. Algorithm. TechTarget; March 2019.
[2] Software terms: framework definition. TechTerms; 2013.
[3] Prabhakar A. Mathematics for AI: all the essential math topics you need. Medium; August 9, 2018.
[4] Paruthi A. Artificial intelligence hardware. Medium; December 16, 2018.
[5] Singh H. Hardware requirements for machine learning. eInfochips; February 24, 2019.
[6] Goel A. 10 best programming languages to learn in 2019. Hackr.IO; March 18, 2019.
[7] McCandless K. What is computer programming? Code Academy Insights; June 13, 2018.
[8] Rafiquzzaman M. Chapter 1, Introduction to digital systems. In: Fundamentals of digital logic and microcontrollers. 6th ed. Safari Books Online; 2019.
[9] Gedikli A. Development of programming learning environment with o techniques. J Invest Eng Technol 2018;1(2):14–18.
[10] Walton J. The world's top 10 semiconductor companies. Investopedia; May 13, 2019.
[11] Park JS. The status of JavaScript libraries & frameworks: 2018 & beyond. Medium; March 29, 2018.
[12] Martindale J. What is RAM? Digital Trends; February 19, 2019.
[13] RAM vs. ROM. Diffen; 2019.
[14] Amuno A. The four categories of computer hardware. TurboFuture; January 28, 2019.
[15] Input. Computer Hope; November 13, 2018.
[16] Durden O. The importance of a computer CPU. Chron; March 20, 2019.
[17] Bolton D. What is a programming compiler? ThoughtCo; May 8, 2018.
[18] Output devices. Computer Hope; April 1, 2018.


[19] Storage devices. Computer Hope; January 31, 2019.
[20] Silberschatz A, Gagne G, Galvin PB. Operating system concepts. Google Books.com; 2018. p. 7–11.
[21] Kabir J. What is software and its types? Quora.com; July 12, 2018.
[22] Beal V. API - application program interface. Webopedia; 2019.
[23] Brain M. How web servers work. How Stuff Works; 2019.
[24] What are some of the different kinds of servers? Breylan Communications; 2019.
[25] Mitchell B. Servers are the heart of the internet. Lifewire; December 14, 2018.
[26] Kučera L. How many servers exist in the world? Quora; 2016.
[27] Beal V. The difference between the Internet and the World Wide Web. Webopedia; August 7, 2018.
[28] Sample I. What is the Internet? The Guardian; October 22, 2018.
[29] Browser. Computer Hope; October 2, 2018.


3 The science and technologies of artificial intelligence (AI)

"Artificial intelligence is the new electricity."
Andrew Ng

Let's start this discussion about AI with an analogy (somewhat prosaic perhaps) that should set the tone for the chapter. Most everyone is somewhat familiar with the historic Model T Ford automobile and the contemporary, arguably disruptive Tesla automobile of the 21st century. Both are automobiles. Both have 4 tires, a steering wheel, brakes, axles, internal combustion engines (oops, make that just the Model T), and so on. Despite the numerous features they share, nobody would deny that the dramatic differences beyond their generic similarities make them uniquely different technologies. So too is the case to be made between the generic (basic) computer technology used today (discussed in Chapter 2) and the "disruptive" AI computer technology to be addressed in this chapter.

Notwithstanding their unique differences, the information in Chapter 2 ("The basic computer") is essential because there are some common denominators between basic computing and AI computing that must be understood. Most relevant among them are the 3 fundamental categories of the input layer, the inner (hidden) layer, and the output layer. The devices associated with these layers remain similar between basic computing and AI computing. AI, however, introduces advances in some, expansion of others, and an array of new, additional hardware and software devices, from natural language processing (NLP) to neural networks to AI robotics. The functions and goals of the input layer and output layer remain relatively similar in both computing technologies (with some additions in the output layer). Still, the inner (hidden) layer presents substantial differences between basic and AI computing in structure (hardware) and function (software). Those significant modifications in the inner (hidden) layer effectively render AI a unique technology. Please also note that the color-coding introduced in Chapter 2 (Table 2–1) will continue to be used throughout this chapter as well.

3.1 The theory and science of artificial intelligence (AI)


The definition of AI presented in this book's introduction stated: "(AI) can be defined simply as a branch of computer science dealing with the simulation of intelligent behavior in computers; the capability of a machine to imitate intelligent human behavior" [1]. The breadth and depth of the words in this simple definition bespeak the magnitude of the science of AI. For ". . . a machine to imitate intelligent human behavior" is to effectively "mimic" the functions of the human brain or, restated in biologic terms, to mimic neuroscience. These neurological functions are related to the control centers, progressive cortical layers, and the neural networking of the human brain. Thus, AI must simulate the structures of these layers and neural centers and their function of neural networking [2]. That neural networking is precisely what the science of AI attempts to do. It means that, at the very least, AI computing must simulate the neurologic (brain) functions responsible for (but not limited to):

• Speech recognition and natural language processing;
• Visual perception and pattern recognition;
• Analysis and planning;
• Problem-solving and decision-making;
• Learning, reasoning, and remembering.

In Chapter 1 ("The evolution of artificial intelligence"), a discussion of human intelligence resolved a set of categories and subcategories that define the features of human intelligence (see Chapter 1, Table 1–1). Each of these features must be assimilated and demonstrated in an electronic system for it to accurately be called "artificial intelligence." This goal is attained in AI through multiple algorithms achieving higher levels of learning, reasoning, problem-solving, and communicating. These levels are classified into 3 subcategories of AI: "machine learning," "neural networking," and "deep learning." So, schematically, the broadest classification is AI, and the associated subcategories of AI are machine learning, neural networking, and deep learning (Fig. 3–1).

These 3 subcategories of AI introduce unique aspects to the computing process that must be achieved electronically to reproduce the qualities of human intelligence (enumerated in Table 1–1) produced by the human brain. To accomplish this overwhelming task, the science of AI uses "algorithms" (defined previously in Chapter 2) to simulate the progressive layers of neuronal functions and neural networking in the human brain.

The best way to "get one's head around" the extraordinarily complex discussion of the neuroscience model of AI is to present a basic (illustrated and color-coded) model. This model can show how the AI process simulates the neurobiology of the human brain in the AI subcategories of machine learning and deep learning (from Fig. 3–1). Such an AI neural network model incorporates an array of analogous, compelling hardware components and highly sophisticated software (algorithm) programs. A preliminary discussion of these critical elements allows a better understanding of how AI utilizes the human neural model in what is referred to in AI as the "artificial neural network (ANN)" [3]. With this understanding, an analysis of each step of the machine learning, neural networking, and deep learning AI subcategories, with their relevant software and supporting hardware, can then be presented in conjunction with their neurobiological analogs. These correlations help to make the complex science of AI more understandable (I hope!).


FIGURE 3–1 Artificial intelligence schematic. The broadest classification, AI, includes the subcategories of machine learning (ML) and deep learning (DL), within which artificial neural networks (ANN) and deep learning are subsets of machine learning.

3.2 Artificial neural network (ANN) model of artificial intelligence (AI)

The fundamental neuroanatomical component of the brain that dictates neural functioning is appropriately called "the neuron" (Fig. 3–2). Scientists estimate that there are approximately 100 billion neurons in the human brain [4]. The neuron is the basic unit of the central nervous system, responsible for the transmission of nerve impulses through several threadlike "arms" called axons and dendrites.

FIGURE 3–2 The neuron. The neuron is the basic unit of the human neural network that receives and sends trillions of electrical signals throughout the brain and body. It consists of a cell body, an axon, and dendrites.


FIGURE 3–3 Neurotransmitter chemicals across the synaptic cleft. A chemical messenger transmits signals across a synapse (synaptic cleft), such as a neuromuscular junction, from one neuron (nerve cell) to another "target" neuron.

The nerve impulses travel down axons, reaching junctions called synapses, where neurotransmitter chemicals are released across the synaptic cleft, activating other neurons (Fig. 3–3). All of this activity can be reduced to a mathematical model [5,6] (Fig. 3–4).

This vast network of 100 billion interconnecting neurons creates a complex in the human brain called the "neural network" (Fig. 3–5). This network has the potential of producing 100 trillion neural connections [7] between input data (information to the brain), the inner (hidden) layers, and the resulting output (human intelligence) (Fig. 3–6). Consistent with the neuroscience model, Fig. 3–6 can be restated (re-illustrated) with the inner (hidden) layer nodes distributed through the cerebral cortex as progressive cortical centers or layers (Fig. 3–7). These distributed cortical layers are called the "deep neural network" [8]. This deep neural network can also be graphically represented (Fig. 3–8) in a linear dimensional, multilayered distribution of nodes demonstrating the hundred trillion deep neural network connections that create the "convolutional neural network (CNN)" [9]. This neural process is analogous to the deep learning algorithms in AI discussed below.

The subcortical limbic system (hippocampus, amygdala, and thalamus) in the midbrain serves as a set of relay stations between input layer data and the inner (hidden) layers of the deep neural network (Fig. 3–9).


FIGURE 3–4 Mathematical model of a neuron. Schematic diagram of the mathematical model of a neuron, where input "weights (w)" are activated and summed (Σ) by the cell body, producing an output (similar to the computer model).
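The weighted-sum model in Fig. 3–4 is easy to express in code. Here is a minimal sketch of mine (the inputs, weights, and firing threshold are invented for illustration):

# A minimal artificial neuron; all numbers are illustrative only.
inputs = [0.5, 0.3, 0.9]    # incoming signals (the dendrites)
weights = [0.8, -0.2, 0.4]  # synaptic strengths (the "weights (w)")

# The cell body sums the weighted inputs (the Σ in Fig. 3–4)...
weighted_sum = sum(x * w for x, w in zip(inputs, weights))

# ...and an activation step decides whether the neuron "fires" an output.
output = 1 if weighted_sum > 0.5 else 0
print(round(weighted_sum, 2), output)  # 0.7 -> fires (1)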

FIGURE 3–5 Schematic of the cortical neural network (CNN). A vast network of one hundred billion interconnecting neurons creates a complex in the human brain called the “cortical neural network.” This network has the potential of producing one hundred trillion neural connections.

The cerebral functions of each of these brain nuclei have direct implications on the AI process, identified in the AI software discussion that follows. Signals from these limbic cortical centers are transmitted to the higher neural layers for cognitive interpretations, at which point they are "cataloged" as memory in the limbic system as well as in the corresponding progressive cortical layers. Then, they are subsequently generated as output in some aspect of human intelligence (Fig. 3–10) [10].


FIGURE 3–6 Neural network with 3 computer layers. This model demonstrates input nodes (red) and inner (hidden) layer nodes (blue), in relationship to the cerebral cortex, and output (green).

FIGURE 3–7 Deep neural network. The inner (hidden) layer nodes from Fig. 3–6 are distributed throughout the brain as progressive cortical centers or layers (blue) to create the deep neural network (DNN).

Similar to the basic computer model from Chapter 2 ("Layers of basic computers"), each input data point (or node) in an ANN undergoes the "compilation process" (Chapter 2) as it connects with each analogous inner (hidden) layer (algorithms, servers, and databases). It is at this point in the process that AI computing begins to diverge from the basic computing process.


FIGURE 3–8 Convolutional neural network (CNN). The deep neural network (DNN) is graphically represented in a linear dimensional, multilayered distribution of nodes (blue) demonstrating the hundred trillion deep neural network connections that create the “convolutional neural network (CNN).” This neural process is analogous to the deep learning algorithms in AI.

FIGURE 3–9 The limbic system (hippocampus, amygdala, thalamus). The subcortical limbic system (hippocampus, amygdala, and thalamus) in the human midbrain serves as relay stations between input layer data (red) and the inner (hidden) layers (blue nodes) of the deep neural network (DNN). The cerebral functions of each of these brain nuclei have direct analogous implications in the AI process.


FIGURE 3–10 Neural layer transmissions from limbic system to higher cortical centers. Signals from the limbic system are transmitted to the higher cortical neural centers (blue arrows, #3) for cognitive interpretations at which point they are “cataloged” as memory in the limbic system and corresponding progressive cortical layers. Finally, the impulses are generated as output (green, #4) representing some aspect of human intelligence.

Here the mathematical modeling of each neuron (from Fig. 3–4) begins to undergo computations and analysis by AI algorithms. These numerous algorithms (Table 3–1) [11–13] are classified (from Fig. 3–1) as machine learning, neural networking, and deep learning. Machine learning is the critical learning process in AI (analogous to the neural process in Fig. 3–6). In contrast, neural networking and deep learning (comparable to the CNN process in Fig. 3–8) are branches of machine learning with some more highly sophisticated, unique characteristics.

3.3 AI software (algorithms)

As defined in Chapter 2, algorithms are procedures or formulae for solving a mathematical problem, based on conducting a sequence of specified actions or steps that frequently involves repetition of an operation. The fundamental framework of all AI computing is mathematics-based, including linear algebra, multivariable calculus, Bayesian logic, statistics, and probabilities, all of which algorithms utilize extensively [14]. The algorithm is the computational process using the language of mathematics. When applied in computer science, it expresses maximal, practical, efficient solutions to mathematical problems and questions. From the brief neuroscience outline of the human neural network described above, algorithms can now be identified as some specific to the ANN, some specific to the CNN, and some common to both (Table 3–1).

Table 3–1 Main algorithms used in artificial intelligence.
• Artificial neural network (ANN): Nonlinear statistical data modeling tools where the complex relationships between inputs and outputs are modeled or patterns are found.
• Application programming interface (API): Software intermediary whose function is to specify how software components should interact or "integrate" with databases and other programs.
• Regression analysis: Statistical methods that examine the relationship and influence between one or more independent variables on a dependent variable (y = f[x]).
  – Linear: An approach to modeling the relationship between a scalar response (or dependent variable) and one or more explanatory variables (or independent variables).
  – Polynomial: The relationship between the independent variable x and the dependent variable y is modeled as an nth-degree polynomial in x.
  – Logistic: Explains the relationship between one dependent binary variable and one or more nominal, ordinal, interval, or ratio-level independent variables.
• Support vector machine (SVM): Analyzes data used for classification and regression analysis.
• k-Nearest neighbors: A plurality vote of an object's neighbors, with the object being assigned to the class most common among its k nearest neighbors (k is a positive integer, typically small).
• Random forest: An additive model that makes predictions by combining decisions from a sequence of base models.
• Decision trees: Creates a training model that can be used to predict the class or value of target variables by learning decision rules inferred from prior data (training data).
• Naive Bayes: Provides a statistical method to analyze the ideas and possibilities of user input to update their probability as more evidence or information becomes available at each neural layer.
• Inference engine (If/Then deduction): Applies logical rules to the knowledge base to deduce new information. A component of expert systems.
• Natural language generation (NLG): Focuses on generating natural language from structured data such as a knowledge base or logical form (linguistics).
• Natural language processing (NLP): Concerned with the interactions between computers and human (natural) languages, in particular how to program computers to process and analyze large amounts of natural language data.
Applications in deep learning only:
• Convolutional neural network (ConvNet/CNN): A class of deep neural networks, most commonly applied to analyzing visual imagery.
• k-Means clustering: Identifies k number of centroids, and then allocates every data point to the nearest cluster, while keeping the centroids as small as possible.
• Association rule: If-then statements that help to show the probability of relationships between data items within large data sets in various types of databases.
• Hierarchical clustering: A technique which groups similar data points such that the points in the same group are more similar to each other than points in the other groups.
• Hidden Markov models (HMM): A class of probabilistic graphical model that predicts a sequence of unknown (hidden) variables from a set of observed variables.
• Markov decision processes (MDPs): A mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker.
• Q-learning: Uses Q-values ("action values") to iteratively improve the behavior of the learning agent.
• SARSA (State-Action-Reward-State-Action): In the current state S, an action A is taken, and the agent gets a reward R and ends up in the next state.
• Deep Q-networks: Takes as input the state of the environment and outputs a Q value for each possible action.
• DDPG (deep deterministic policy gradient): Uses a stochastic (random probability distribution) behavior policy for good exploration but estimates a deterministic (determined by cause) target policy.


As previously described in Chapter 2, APIs (application programming interfaces) are software intermediaries (i.e., "software to software" programs) whose functions are to specify how software components should interact or "integrate" with databases and other programs [15]. These API software algorithms interpret user input (e.g., text, data, audio, graphics, natural-language-processed input, video, GPU [graphic processing unit] output, knowledge engineers [experts]) to allow 1 computer to be used ("user input") by other computer programs (databases). Thus, APIs enable the AI computer algorithms to analyze the main factor(s) of user input that one is attempting to understand or predict [13].

The frameworks within which algorithms operate are the 3 subcategories of AI described previously (Fig. 3–1): machine learning, ANN (in our neuroscience analog), and deep learning (CNN in our neuroscience analog). Descriptions of their respective algorithms and associated hardware that follow offer a complete picture of the science of AI. And once again, as mentioned early in Chapter 2, the scope of this book (and, ergo, of the author) limits in-depth descriptions to only brief samples of the complex mathematical equations, formulae, and technical jargon (i.e., no binary code) related to each algorithm. For those readers interested, detailed explanations and applications of each algorithm are available in multiple mathematical textbooks [17–19].

3.3.1 Machine learning

Machine learning is a framework (a platform for developing software applications [20]) that allows computers to learn directly from examples and experience in the form of data. Given a large amount of data to use as examples (input) in which to detect patterns that determine how a task is achieved, the system "learns" how best to produce the desired output [21]. As with the human neural network (Fig. 3–7), in machine learning there are hundreds of layers, each with thousands of nodes, trained upon millions of data points [22].

There are 3 branches of machine learning: supervised (labeled data), unsupervised (unlabeled data), and reinforcement learning. Supervised learning (labeled data) is usually associated with machine learning, while unsupervised learning (unlabeled data) and reinforcement learning refer more to neural networks and deep learning. These distinctions are not absolute, in that neural networks and deep learning are subcategories of machine learning, yet still part of that category. Thus, certain forms of unsupervised and reinforcement learning are also considered part of machine learning as well as of ANN and deep learning. So too, supervised learning, part of the machine learning framework, may also be regarded as deep learning in certain forms. Often in the literature, algorithms are classified as both machine and deep learning.

Each branch of machine learning is driven by mathematical formulae (algorithms) that analyze data to produce answers, predictions, and insights from which the user can make decisions. Each branch conducts its analysis through different algorithms and data processing functions. A brief description of all 3 branches of machine learning, ANN, and deep learning, with practical examples, helps to clarify their meaning [22].

3.3.1.1 Supervised (labeled) data

The majority of practical machine learning uses supervised learning [23]. It is a form of learning where the user has a dataset of "labeled" input variables (x) and an output variable (Y).


The algorithm learns the mapping function from the input to the output by "supervising" the process. It develops "training data" from the variables' structure or pattern recognition (e.g., a dataset of apples, round and red, versus bananas, long and yellow). As it learns the correct answers, the algorithm iteratively makes predictions on the training data. The algorithm's self-modifications are called "training." This process of the ANN is analogous to the memory functions of the human brain's progressive cortical layers. The machine (computer) uses its AI software to process keyboard, visual, or natural language "processed" input (NLP); web browsers to communicate with databases; and regression analysis software to analyze, interpret, and synthesize these layers of information, layer by layer. This is similar to the neural layers of the brain (see Fig. 3–7). From this process, the computer arrives at ("machine learns") relationships and conclusions [24].

The goal in supervised learning is to approximate the mapping function so well that when you have new input data (x), you can predict the output variables (Y) for that data. Learning stops when the algorithm achieves an acceptable level of performance [25]. Supervised learning problems are grouped into regression and classification problems [26]. A classification problem is when the output variable (y) is a category, such as "disease" and "no disease." A regression problem is when the output variable (y) is a real value, such as "dollars" or "weight." The process is mathematically stated as $y = f(x)$. All classification and regression algorithms come under supervised learning. They include (with limited thumbnail descriptions for each, due to their mathematical complexities beyond the scope of this book; a short code sketch of supervised training follows this list):

• Regression analysis includes linear, logistic, support vector machine (SVM), k-nearest neighbors, random forest, and polynomial regression. These powerful statistical methods examine the relationship and influence between 1 or more independent variables (x) and a dependent variable ($y = f(x)$, as described above). The regression coefficient is mathematically stated as

$$\beta = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2}$$

where $\bar{x}$ and $\bar{y}$ are the means of the observed x and y values, so that the predicted y for a particular x is a linear function of x [27].

• Decision trees: This algorithm is used for solving regression and classification problems. It can create a training model for predicting the class or value of target variables by learning decision rules inferred from prior data (training data). Mathematically stated (as the entropy used to choose splits):

$$H(T) = I_E(p_1, p_2, \ldots, p_J) = -\sum_{i=1}^{J} p_i \log_2 p_i$$

where $p_1, p_2, \ldots$ are fractions that add up to 1 and represent the percentage of each class present in the node that results from a split in the tree [28].


• Naive Bayes: A straightforward and powerful algorithm for classification problems. Thomas Bayes, an 18th-century mathematician, developed a mathematical formula for determining conditional probability. AI Bayesian algorithms provide a statistical method to analyze the ideas and possibilities of user input and to update their probability as more evidence or information becomes available at each neural layer. Mathematically stated:

$$P(H \mid E) = \frac{P(E \mid H) \cdot P(H)}{P(E)}$$

where P(H) is the probability of hypothesis H being true (known as the prior probability); P(E) is the probability of the evidence (regardless of the hypothesis); P(E|H) is the probability of the evidence given that the hypothesis is true; and P(H|E) is the probability of the hypothesis given that the evidence is there [27].
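As promised above, here is a minimal sketch of supervised (labeled) learning, mapping labeled inputs (x) to an output (Y). It is my illustration, assuming the scikit-learn library is installed, with invented toy data echoing the apples-versus-bananas example:

# Supervised learning on labeled data (toy example; numbers are invented).
from sklearn.naive_bayes import GaussianNB

# Labeled training data: [roundness, yellowness] -> fruit class.
X_train = [[0.9, 0.1], [0.8, 0.2], [0.2, 0.9], [0.1, 0.8]]
y_train = ["apple", "apple", "banana", "banana"]

model = GaussianNB()         # Bayes' theorem with a "naive" independence assumption
model.fit(X_train, y_train)  # "training": learning the input-to-output mapping

print(model.predict([[0.85, 0.15]]))  # new input x -> predicted label: ['apple']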

3.3.2 Neural networking and deep learning

3.3.2.1 Unsupervised (unlabeled) data

Unsupervised learning and semi-supervised learning (limited labeled data, but a large dataset) are generally referred to as "deep learning," a subcategory of machine learning. This process utilizes an algorithm where only the input data (x) is known, with no corresponding output variables ("unlabeled"). The goal is to group ("cluster") the structure or distribution of the data in order to learn more about the data. This deep learning is called unsupervised learning because there are no correct answers. Algorithms are left to their own devices to discover and present the relevant structure in the data [9]. This deep learning process in the ANN maximizes the analogous deep neural network, the hundreds of millions of progressive cortical layer functions (Figs. 3–7 and 3–8) of the human brain. As mentioned in the previous neuroscience discussion, this process is called the convolutional neural network (CNN) [9].

The computer's software interacts with specific knowledge database(s) ("unlabeled data"), which include stored data from previously labeled experiences and other preexisting, directly and indirectly related, digitally stored knowledge bases on the worldwide web. This process is similar to the brain's limbic system (amygdala and hippocampus) and higher cortical neural layers storing past quantified and qualified human experiences (personal or observed: behavioral, educational, sensory, emotional). Like the human brain's hippocampus (storage of long-term memory, including all past knowledge and experiences), the computer learns and stores new information into long-term memory. The stored limbic information from the supervised, labeled data is networked ("neural networking") at the higher cortical layers with the unlabeled data. This aggregate of information is analyzed, integrated, and transmitted for interpretations, decision-making, and interactions, and stored within the limbic system and related cortical centers in the brain. In the corresponding AI system, this information is stored by inference engine software for future deep learning experiences. The information storage allows for AI's continued and expanding learning potential, similar to that of the human brain.


The unsupervised learning information (active and stored) is programmed into AI software, allowing the computer, using Bayesian deductive reasoning, to employ it in an active, collective, progressive analytical process that extracts synergies, improves performance(s), and continues learning [29]. When humans learn, they alter the way they relate to the world. When algorithms learn, they change the way they process the data. They assess variables and pattern recognition and alter themselves as they are exposed to the data. Through an extensive, iterative analytical process, with minimal human intervention, the algorithm learns from previous computations to produce reliable, repeatable decisions and results [30].

Unsupervised learning problems are grouped into clustering and association problems [31]. A clustering problem is one where you want to discover the inherent groupings in the data (e.g., grouping clinical conditions by their symptom pattern). An association rule learning problem is one where you want to find rules that describe large portions of your data, such as conditions with symptom pattern x also tending to demonstrate y (thus, y = f(x), as described previously). Association Rules work based on "if/then" statements in supervised and unsupervised learning [32]. These statements help to reveal associations between independent data in a database, relational database, or other information repositories. The main applications of Association Rules are in data analysis, classification, clustering, and many others [33].

All clustering algorithms come under unsupervised learning algorithms. They include (with limited thumbnail descriptions for each due to their mathematical complexities beyond the scope of this book) [30]:
• K-means clustering: The K-means algorithm identifies k centroids and then allocates every data point to the nearest cluster while keeping the centroids as small as possible (a minimal code sketch of this loop appears after this list). Mathematically stated: Given a training set $x^{(1)}, \dots, x^{(m)}$, we want to group the data into a few cohesive "clusters." Here, we are given feature vectors for each data point $x^{(i)} \in \mathbb{R}^n$ as usual, but no labels $y^{(i)}$ (making this an unsupervised learning problem). The goal is to predict k centroids and a label $c^{(i)}$ for each data point. The k-means clustering algorithm is as follows [34]:

1. Initialize the cluster centroids $\mu_1, \mu_2, \dots, \mu_k \in \mathbb{R}^n$ randomly.
2. Repeat until convergence:
   For every $i$, set $c^{(i)} := \arg\min_j \|x^{(i)} - \mu_j\|^2$.
   For each $j$, set $\mu_j := \dfrac{\sum_{i=1}^{m} 1\{c^{(i)} = j\}\, x^{(i)}}{\sum_{i=1}^{m} 1\{c^{(i)} = j\}}$.

• Hierarchical clustering: A technique that groups similar data points such that the points in the same group are more similar to each other than to the points in the other groups. The group of related data points is called a cluster. Mathematically stated: $\mathrm{Sim}(C_1, C_2) = \min \mathrm{Sim}(P_i, P_j)$ such that $P_i \in C_1$ and $P_j \in C_2$. For the single-linkage algorithm (MIN), the similarity of 2 clusters $C_1$ and $C_2$ is defined as the minimum of the similarity between points $P_i$ and $P_j$ such that $P_i$ belongs to $C_1$ and $P_j$ belongs to $C_2$ [35].
• Hidden Markov models (HMM): A class of probabilistic graphical models that predicts a sequence of unknown (hidden) variables from a set of observed variables. The Markov process assumption is that the "future is independent of the past given that we know the present." A simple example of an HMM is predicting the clinical condition (hidden variable) based on the symptom pattern the patient demonstrates (observed); a filtering sketch also follows after this list. An HMM is viewed as a Bayes Net unrolled through time, with observations made at a sequence of time steps being used to predict the best series of hidden states [36]. Mathematically stated: Consider the situation where you have no knowledge of the outcome when you are examining the patient. The only way for you to know what the outcome (diagnosis) might be is to recognize the symptom pattern during examination. Here, the evidence variable is the symptom pattern, while the hidden variable is the diagnosis. HMM representation:

$P(R_0, R_1, \dots, R_t, U_0, U_1, \dots, U_t) = P(R_0) \prod_{i=1}^{t} P(R_i \mid R_{i-1})\, P(U_i \mid R_i)$
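To make the clustering math concrete, here is a minimal pure-Python sketch of the K-means loop above (the two-dimensional data points are hypothetical, chosen only to show two obvious clusters):

import random

def kmeans(points, k, iterations=20):
    """Alternate the two steps above: assign each point to its nearest
    centroid, then recompute each centroid as the mean of its cluster."""
    centroids = random.sample(points, k)           # random initialization
    for _ in range(iterations):                    # repeat until convergence
        clusters = [[] for _ in range(k)]
        for x in points:                           # c(i) := arg min_j ||x(i) - mu_j||^2
            j = min(range(k), key=lambda m: sum((a - b) ** 2 for a, b in zip(x, centroids[m])))
            clusters[j].append(x)
        for j, cluster in enumerate(clusters):     # mu_j := mean of assigned points
            if cluster:
                centroids[j] = tuple(sum(c) / len(cluster) for c in zip(*cluster))
    return centroids

data = [(1, 1), (1.2, 0.8), (5, 5), (5.1, 4.9)]
print(kmeans(data, k=2))

And for the HMM, a minimal sketch of one filtering step in the diagnosis example above (the forward-algorithm update; the hidden states, transition model, and observation probabilities are hypothetical, illustrative numbers):

def forward_step(belief, transition, observation_prob):
    """One step of HMM filtering: propagate the belief over hidden states
    through the transition model, then weight by the observed evidence."""
    states = list(belief)
    predicted = {s: sum(belief[r] * transition[r][s] for r in states) for s in states}
    unnormalized = {s: predicted[s] * observation_prob[s] for s in states}
    total = sum(unnormalized.values())
    return {s: p / total for s, p in unnormalized.items()}

# Hypothetical model: hidden state is 'flu' vs 'healthy'; the evidence is a fever today.
transition = {'flu': {'flu': 0.7, 'healthy': 0.3},
              'healthy': {'flu': 0.1, 'healthy': 0.9}}
p_fever = {'flu': 0.8, 'healthy': 0.05}           # P(observation | hidden state)
belief = {'flu': 0.5, 'healthy': 0.5}             # prior over the hidden state
print(forward_step(belief, transition, p_fever))  # belief after observing a fever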

3.3.2.2 Reinforcement learning
Falling between supervised and unsupervised learning is reinforcement learning, which focuses on learning from experience. It enables an agent to learn in an interactive environment by trial and error, using feedback from its actions and experiences, and provides a reward function that it tries to optimize. Supervised and reinforcement learning both use mapping between input and output. With supervised learning, the feedback provided is a correct set of actions given to the learner for performing a task. Reinforcement learning uses rewards and punishment as signals for positive and negative results. The goal of the agent is to learn the consequences of its decisions [37]. The AI software (as with the human cortical centers) tries to maximize the most significant benefit(s) it can receive when interacting with an uncertain environment [38]. It uses a process called "memory networking," an artificial neural networking (ANN) technique using RAM (random access memory, analogous to the hippocampus) to differentiate and adjust the connections between unsupervised and supervised learned information [39]. This form of learning in the CNN process (in large part, as with supervised and unsupervised learning) accentuates the analogy to the human brain's limbic system. This highly sophisticated AI process mimics the plasticity of the human brain's limbic system and cortex. A comparison of limbic centers of the brain and AI computer software analogs in Table 3–2 [40–42] identifies the ways AI "mimics" human neurological functions.

Table 3–2 Limbic centers of the brain and AI computer functional analogs.

Neurological structure / AI computer functional analog
• Progressive cortical layers: Supervised learning; Unsupervised learning; Reinforcement learning
• Hippocampus: RAM; Memory networking; Plasticity
• Amygdala: Reinforcement learning; Robotics
• Thalamus: Artificial neural network; Unsupervised learning

As compared to unsupervised learning, reinforcement learning is different in terms of goals. While the goal in unsupervised learning is to find similarities and differences between data points, the goal in reinforcement learning is to find a suitable action model that maximizes the total cumulative reward of the agent. Since reinforcement learning requires a lot of data, it is most applicable in domains where simulated data is readily available, like games. It is used widely in AI for playing computer games (e.g., AlphaGo Zero, the first computer program to defeat a world champion in the ancient Chinese game of Go [43]). Another example of reinforcement learning is Google DeepMind's work on deep reinforcement learning for robotic manipulation with asynchronous off-policy updates (more on robotics is discussed below). But the simple game of PacMan is an ideal example of reinforcement learning. The goal for PacMan is to eat the food in the grid while avoiding the ghosts on its way. The grid world is an interactive environment. PacMan receives a reward for eating food and punishment if he gets killed by the ghost (and loses the game). To build an optimal policy, PacMan faces the dilemma of exploring new states while maximizing its reward at the same time. This is called the "exploration versus exploitation trade-off." The total cumulative reward is PacMan winning the game [44].

As with supervised, semi-supervised, and unsupervised learning, reinforcement learning has a set of unique mathematical algorithms. They include (with limited thumbnail descriptions for each due to their mathematical complexities beyond the scope of this book) [40]:
• Markov Decision Processes (MDPs): A mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision-maker. The Markov property states that "the future is independent of the past given the present."
• Q-learning: Uses Q-values ("action values") to improve the behavior of the learning agent iteratively (a minimal code sketch appears at the end of this subsection). Mathematically stated:

$Q^{\text{new}}(s_t, a_t) \leftarrow (1 - \alpha) \cdot \underbrace{Q(s_t, a_t)}_{\text{old value}} + \underbrace{\alpha}_{\text{learning rate}} \cdot \Big( \underbrace{r_t}_{\text{reward}} + \underbrace{\gamma}_{\text{discount factor}} \cdot \underbrace{\max_a Q(s_{t+1}, a)}_{\text{estimate of future value}} \Big)$


Where $r_t$ is the reward received when moving from the state $s_t$ to the state $s_{t+1}$, and $\alpha$ is the learning rate ($0 < \alpha \le 1$) [45].
• SARSA (State-Action-Reward-State-Action): An algorithm for learning a Markov decision process policy, used in reinforcement learning. Mathematically stated:

$Q(S_t, A_t) := Q(S_t, A_t) + \alpha\, [R_{t+1} + \gamma\, Q(S_{t+1}, A_{t+1}) - Q(S_t, A_t)]$

Where, in the current state $S_t$, an action $A_t$ is taken and the agent gets a reward $R_{t+1}$ and ends up in the next state $S_{t+1}$; the learning rate $\alpha$ determines to what extent the newly acquired information overrides the old information, and the discount factor $\gamma$ determines the importance of future rewards [46].
• Deep Q-Networks (DQN): Take as input the state of the environment and output a Q value for each possible action. Deep Q-Networks combine deep learning and reinforcement learning, learning to play video games at superhuman levels [47].

• DDPG (Deep Deterministic Policy Gradient): A policy gradient algorithm that uses a stochastic (random probability distribution) behavior policy for good exploration but estimates a deterministic (determined by cause) target policy.

Finally, algorithms apply Bayesian probability logic (statistical analysis of increasing available information to confirm the probability of a hypothesis [48] through "inference engine" software ["if/then" inference rules logic [49]]) that can be applied to interpret reasonable expectations. Applications of each of these machine learning and deep learning processes are demonstrated directly in many of the topics addressed in the business, administrative, and clinical aspects of health care in Section 2 of this book. And finally, having provided analogies between these AI algorithms and neuroscience, they can also be categorized in general with the human intelligence categories presented in Table 1–1 and compared in Table 3–3 relative to associated neurological analogs [50].

In summary, the universal AI process can be categorized into roughly 7 activities (Table 3–4), all distinct, yet all directly or indirectly interrelated. Together, they provide the matrix of electronic, mathematical logic, reasoning, and decision-making, which constitutes the "art and science" of artificial intelligence.
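Returning to the Q-learning update promised above, here is a minimal tabular sketch in Python (the two-state "world," its actions, and its rewards are hypothetical, purely for illustration):

import random

def q_learning(states, actions, step, episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning: repeatedly act, observe (reward, next state), and
    apply Q(s,a) <- (1-alpha)*Q(s,a) + alpha*(r + gamma*max_a' Q(s',a'))."""
    q = {(s, a): 0.0 for s in states for a in actions}
    for _ in range(episodes):
        s = random.choice(states)
        for _ in range(20):                        # bounded episode length
            if random.random() < epsilon:          # exploration vs exploitation
                a = random.choice(actions)
            else:
                a = max(actions, key=lambda a2: q[(s, a2)])
            r, s_next = step(s, a)
            best_next = max(q[(s_next, a2)] for a2 in actions)
            q[(s, a)] = (1 - alpha) * q[(s, a)] + alpha * (r + gamma * best_next)
            s = s_next
    return q

# Hypothetical 2-state world: moving 'right' from 'start' reaches 'goal' (+1 reward).
def step(state, action):
    if state == 'start' and action == 'right':
        return 1.0, 'goal'
    return 0.0, 'start'

print(q_learning(['start', 'goal'], ['left', 'right'], step))

The epsilon parameter implements the exploration-versus-exploitation trade-off described in the PacMan example: most of the time the agent exploits its current Q-values, but occasionally it explores a random action.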


Table 3–3 Comparison of human intelligence categories (from Chapter 1, Table 1–1) and neurological analogs.

Intelligence category / Category features / Neurological analogs
• Reasoning. Features: Inductive; Deductive. Neurological analogs: Frontal lobe.
• Learning. Features: Auditory; Experience; Motor; Observational; Memory; Perceptual; Pattern recognition; Relational; Spatial; Stimuli. Neurological analogs: Frontal lobe; Temporal lobe; Limbic system.
• Perception. Features: Sensory (a. Visual, b. Hearing, c. Cerebral). Neurological analogs: Parietal lobe; Occipital lobe; Thalamus.
• Problem solving. Features: Assessment; Decision-making; Solutions; Alternatives. Neurological analogs: Frontal lobe.
• Communication. Features: To speak; and to recognize and understand spoken and written language. Neurological analogs: Amygdala; Broca's area; Wernicke's area.

Table 3–4

Summary of AI categories.

Artificial intelligence (AI): A broad definition which describes the ability of a machine to demonstrate intelligent behavior. AI algorithms exhibit either nonadaptive (e.g., rule-based) or adaptive intelligence (e.g., machine learning).
Machine learning (ML): A type of AI that uses mathematical models to automatically map input to desired outputs in a way that does not rely on explicit rule-based programming.
Algorithm: The computational process formulated using the language of mathematics. When applied in AI, it expresses maximally practical, efficient solutions to mathematical problems and questions.
Supervised machine learning: An approach to training ML algorithms in which a model is provided input (e.g., digital images) that is classified with a label. For example, a model used to distinguish images of cats from dogs would be shown many images of cats, labeled as cats, and many images of dogs, labeled as dogs, a process denoted as training. Following training, a user could show this model an image of a cat or dog without a label, and the model on its own should be able to classify the image.
Unsupervised machine learning (deep learning or CNN): An approach to training ML models in which the input data has not been labeled, and the algorithm identifies patterns or regularities in the data.
Semi-supervised machine learning (deep learning or CNN): An approach to training ML models when the number of labeled input data is limited, but the available input dataset is large. A combination of supervised and unsupervised techniques uses both labeled and unlabeled input data, respectively.
Reinforcement learning (deep learning or CNN): Falling between supervised and unsupervised learning, reinforcement learning focuses on learning from experience. It enables an agent to learn in an interactive environment by trial and error, using feedback from its own actions and experiences, and provides a reward function that it tries to optimize.


Table 3–5 AI hardware.

• Computer servers (file, mail, print, web, game, application)
• Computer processing units (CPUs)
• Graphic processing units (GPUs)
• Accelerators
• Quantum processors using "qubits" (vs digital binary code)
• Neuromorphic chips ("self-learning" microchips)
• Application-specific integrated circuit (ASIC)
• Field-programmable gate array (FPGA) integrated circuit with hardware description language (HDL)

3.4 AI hardware
As described above in "AI software (algorithms)," the fundamental framework of all AI computing is mathematics-based, including linear algebra, multivariable calculus, Bayesian logic, statistics, and probabilities [14]. These mathematical sciences conduct incredible amounts of calculations, computations, and iterations (in the trillions) at almost instantaneous speeds [51]. The guiding principle and goal of AI hardware technology is to support this enormous volume of data processing and the calculations and computations the AI software algorithms must execute.

The input, output, and storage device hardware listings (Tables 2–3, 2–4, and 2–5) from the "Basic hardware" discussion in Chapter 2, plus the microprocessing devices (RAM and CPU), are all hardware used in AI computing. But the data processing and millisecond speeds associated with basic computer hardware (especially the CPU) are woefully inadequate to accommodate the volume and speed needed for AI's machine learning and deep learning data processing. As such, the most active area of development in the AI industry lies in the dramatic upsurge and advances in AI hardware (Table 3–5). Currently, numerous prominent companies are producing AI-specific hardware (Google, Microsoft, Intel, IBM, Amazon, Alibaba, Baidu, Nvidia), as are many startups [2]. A review of each of the significant forms of current and evolving hardware provides an understanding of the physical resources that drive the AI computing process and software algorithms.

3.4.1 RAM (random access memory)
As described in Chapter 2, a RAM microchip is a high-speed type of computer memory device that temporarily stores all input information as well as the application software instructions the computer needs immediately and in the near future. RAM is read from any layer of the computer at almost the same speed. As a form of dynamic, short-term memory, it loses all of its contents when the computer shuts down [52]. Sometimes referred to as DRAM (dynamic random-access memory) or SDRAM (synchronous dynamic random-access memory), these terms are generally interchangeable. Another common term, especially in the video game space, is VRAM, or video RAM [52]. It is used to denote the memory available to a graphics chip and GPU (graphic processing unit) processors. As described below, GPUs function at rapid rates and thus need a memory chip that can accommodate their throughput.


RAM is available in chips with storage capacity as high as 256 GB, although 8–32 GB is more common [53]. RAM speeds range from 800 to 4200 MHz (one megahertz corresponds to 1 million commands per second sent to or received from the CPU and GPU, a rate referred to as bandwidth). So, an example of a good system for gaming and computing might be 32 GB of RAM storage at a processing speed of 2400 MHz.

3.4.2 Computer servers (file, mail, print, web, game, apps)
In Chapter 2, computer servers were described as "the heart" of the computer. As computers themselves (hardware supporting software), they are designed to process requests and deliver data to another computer, a local network, or over the internet. Table 2–7 lists the multiple server types [50].

3.4.3 Central processing unit (CPU)
The central processing unit (CPU), described in Chapter 2 as the "brain" of the computer, consists of 4 main parts:
1. It receives a set of instructions as digital code from RAM;
2. The control unit (CU) decodes and executes instructions;
3. The arithmetic logic unit (ALU) performs calculations and logical operations from which it makes decisions; and
4. It sends output to the RAM drive and the hard drive for storage [53].

CPUs are built by placing billions of microscopic transistors onto a single computer chip [54]. Those transistors allow it to make the calculations it needs to run programs stored on the system's memory. Additional CPU chips can be added to a computer to increase operating speeds. The most common advancements in CPU technology have been in making transistors smaller and smaller, which has driven the improvement in CPU speed over the decades. A multi-core CPU can have anywhere from 2 to greater than 32 core processors, which can execute multiple instructions simultaneously at speeds of 5 GHz or higher. (One hertz is the speed of 1 operation per second; 1 GHz equals 1 billion operations per second) [54].

The CPU is the analog to the human brain at large when using our neuroscience analogy to AI computing. It is the processor for all the operations the computer conducts. How your computer operates is based on mathematical operations, and the CPU controls all of them. Information sent from an input device is compiled (from Chapter 2, page 14) and then transferred to the CPU. The CPU's ALU is then responsible for all mathematical and logical operations. It processes instructions using 4 basic functions [55]:
1. Fetch: Each instruction is stored in memory and has its own address. The processor takes the address number from the program counter to track which instructions should execute next.
2. Decode: All programs executed are translated into Assembly instructions. Assembly code is decoded into binary instructions.


3. Execute: a. ALU calculations; b. move data from 1 memory location to another; or c. jump to a different address.
4. Store: Gives output data feedback to RAM.
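As a toy illustration of this fetch-decode-execute-store cycle (the three-instruction "program" and its opcodes are invented for the example, not a real instruction set):

# Toy fetch-decode-execute loop over a 3-instruction 'program' in memory.
memory = [('LOAD', 5), ('ADD', 3), ('STORE', 0)]     # hypothetical instruction set
accumulator, program_counter, output = 0, 0, {}
while program_counter < len(memory):
    opcode, operand = memory[program_counter]        # fetch (and trivially decode)
    if opcode == 'LOAD':
        accumulator = operand                        # execute: move data
    elif opcode == 'ADD':
        accumulator += operand                       # execute: ALU calculation
    elif opcode == 'STORE':
        output[operand] = accumulator                # store: write the result back
    program_counter += 1                             # track the next instruction
print(output)                                        # {0: 8}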

3.4.4 Graphic processing unit (GPU)
If CPUs are the "brains" of computing, GPUs are the "brawn." A CPU can do anything a computer requires, whereas a GPU is a specialized microprocessor optimized for displaying graphics and doing very specific computing tasks. CPUs and GPUs are both made from hundreds of millions of transistors and can process thousands of operations per second. The difference between them is that a CPU uses between 1 and 4 processing cores clocked anywhere from 1 to 4 GHz, while a GPU uses thousands of smaller and more efficient cores and can handle many parallel streams of data at the same time. GPUs process tasks in different ways and are best at focusing all their computing abilities on a specific task. An analogy might be moving a large number of boxes from one point to another: a sports car would be fast on each trip but could carry only a few boxes, whereas a large truck, albeit slower per trip, would move the full load in far fewer trips and so complete the job considerably faster. GPUs are 50–100 times faster in tasks that require multiple parallel processes, such as computer graphics and gaming (for which they were initially developed). But their most significant value lies in their ability to conduct massive loads of iterative computations in machine learning, deep learning, and big data analytics. GPUs process data at rates in the billions of calculations per second, whereas central processing units (CPUs) compute at only millions of processes per second [56].

The most effective data sources utilized in machine learning are graphics (video and images). Thus, the development of the GPU by the AI technology company Nvidia in 1999 is responsible for a breakthrough advancement in AI computing [57]. Then, in 2006, "GPU accelerators" were introduced, supercharging AI computing's capabilities and growth [58]. The microprocessor industry continues to grow with increasingly powerful chip technology and combined technologies. The APU (Accelerated Processing Unit) chip combines the best features of gaming graphic cards and processors [59]. Powerful cloud-based processors (e.g., the TPU, or Tensor Processing Unit, by Google) are now available to researchers and developers (and beta downloads) in need of high-speed machine learning and training of AI models [60].

3.4.5 Accelerators [61]
An AI accelerator is a microchip designed to enable faster processing of AI tasks. Like other dedicated-use accelerators, such as graphics processing units (GPUs), auxiliary power units (APUs), and power processing units (PPUs), AI accelerators are designed to perform their tasks in a way impossible for traditional CPUs to achieve. A purpose-made accelerator delivers greater performance, more features, and greater power efficiency to facilitate its given task.


An example of the value of AI accelerators is demonstrated in Google DeepMind’s AlphaGo project, where the number of possible piece positions in the game made processing the task impossible with a “brute force” approach. Accelerators focus on multicore, simple AI arithmetic functions done in mass quantities and computed through specialized algorithms. The number of such functions required for a task would render traditional computing approaches impossible.

3.4.6 Quantum processors using "qubits" (vs digital binary code)
Quantum computing is an industry goal (i.e., for IBM, Microsoft, Google, Intel, and other tech heavyweights) over the next 5–10 years. Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. Transistors, the basic unit of computers, are now approaching the point where they'll soon be as small as individual atoms. At this level, wave-particle duality exists, wherein things can exist simultaneously in 2 states of matter and energy (e.g., Schrödinger's cat). In the 1970s and 80s, physicists (Landauer, Bennett, Benioff, Feynman, and Deutsch) outlined the theoretical basis of a quantum computer. When the duality theory is applied to computing, instead of binary bits of 0s and 1s, a quantum computer has quantum bits, or "qubits," that can store both 0 and 1 simultaneously, or an infinite number of values in between, in multiple states (i.e., store multiple values) "at the same time!" This concept of a qubit holding multiple states at once is called "superposition." A quantum computer can store multiple numbers at once and process them simultaneously. Thus, instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Finally, upon a command, the superposition "collapses" into one of its states and gives the answer to your question. This kind of parallel computing would make computing millions of times faster than conventional computing [62].

Intel has already developed a simulation of a quantum computer on conventional supercomputing hardware. The next step is to make the qubits. It takes something like 5 trillion transistors to simulate 42 qubits. It likely requires 1 million or more qubits to achieve commercial relevance. But starting with a simulator, you can build the underlying architecture, compilers, and algorithms. Until there are physical systems with a few hundred to a thousand qubits, it's unclear exactly what types of software or applications a quantum computer will be able to run. But let's remember this: the first transistor was introduced in 1947. The first integrated circuit followed in 1958. Intel's first microprocessor, which had only about 2500 transistors, didn't arrive until 1971. Each of those milestones was more than a decade apart. If, 10 years from now, there are quantum computers with a few thousand qubits, it would undoubtedly change the world in the same way the first microprocessor did [63].
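As a toy illustration of superposition and measurement collapse (this merely samples outcomes from the squared-amplitude probabilities; it does not simulate real quantum hardware):

import math, random

# Toy qubit: a superposition a|0> + b|1> stored as two amplitudes; "measuring"
# collapses it to one classical state with probability |amplitude|^2.
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)            # equal superposition; a^2 + b^2 = 1
counts = {'0': 0, '1': 0}
for _ in range(1000):
    counts['0' if random.random() < a ** 2 else '1'] += 1
print(counts)                                        # roughly 500/500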

3.4.7 Neuromorphic chips ("self-learning" microchips)
Neuromorphic computing includes the production and use of neural networks to reproduce how the brain performs its functions, not just reaching decisions, but memorizing information and even deducing facts.


The engineering of a neuromorphic device involves the development of components, hardware, and software whose functions are analogous to parts of the brain and what such parts are believed to do. This concept represents a comprehensive analogy of the input, compilation, and inner (hidden) layers that constitute the ANN discussed previously. The goal is an entirely new class of computers capable of being "trained" (machine learning) to recognize patterns using far fewer inputs (e.g., big data and blockchain, discussed below) than a digital neural network would require [64]. Neuromorphic chips do all the processing and functioning without having to send messages back and forth to the cloud. They function similarly to the human brain, conserving energy by only operating when needed. In recent years there has been more emphasis on developing software than hardware. But Prof. Irving Wladawsky-Berger, a Research Affiliate at MIT Sloan School of Management, says, "Neuromorphic computing and chips bring the much-needed evolution in computer hardware" [65].

3.4.8 Application specific integrated circuit (ASIC)
This semiconductor microchip is designed to be customized for specialized use, as opposed to a generic processor. ASIC chips are part of any product that requires specific features that cannot be achieved by off-the-shelf ICs. Examples of ASICs include chips that are designed to run particular devices, such as hand-held devices (e.g., cell phones). They are used primarily to add specific features to a product to gain a competitive edge. As such, many companies are using these customized chips for their products [66]. The ASIC, however, can be used with or replaced by the FPGA.

3.4.9 Field-programmable gate array (FPGA) integrated circuit with hardware description language (HDL)
FPGAs are hardware circuits that a user can program to carry out 1 or more logical operations. They are integrated circuits that are sets of circuits on a chip (an "array"). Those circuits, or arrays, are groups of programmable logic gates, memory, or other elements. With these arrays, a user can write software that loads onto a chip and executes functions. That software can later be replaced or deleted, but the hardware chip remains unchanged. FPGAs are useful for prototyping application-specific integrated circuits (ASICs) or processors. The FPGA is reprogrammed until the ASIC or processor design is bug-free, and then the actual manufacturing of the final ASIC can begin. Intel uses this FPGA method to prototype new ASIC chips. In some cases, high-performance FPGAs outperform the GPUs used to accelerate inference processing in analyzing large amounts of data for machine learning [67].

Hardware Description Language (HDL) is specialized software that can be used to describe and program FPGA digital circuits in a textual manner. Two types of HDL are Verilog and VHDL (VHSIC Hardware Description Language), both used to describe the structure and behavior of electronic circuits, and most commonly, FPGA digital logic circuits. HDL enables a precise, formal description of an electronic circuit that allows for the automated analysis and simulation of the circuit [68].

Table 3–6 Natural language processing (NLP) and natural language generation (NLG).

• Speech recognition: Software that understands or transcribes spoken language (e.g., Dragon Naturally Speaking). In simple terms: speech-to-text.
• Natural language understanding (NLU): Software that extracts information from written text (e.g., portions of IBM Watson). In simple terms: text mining, text analytics.
• Natural language generation (NLG): Software that produces narratives and reports in easy-to-read language (e.g., Arria's NLG Platform). In simple terms: data in, language out.
• Speech synthesis: Software that speaks or reads out text (e.g., CereProc, CereVoice). In simple terms: text-to-speech.

3.5 Specialized AI systems

3.5.1 Natural language processing (NLP)
Among the technologies of AI that make it truly more user-friendly and are having a profound effect on practical AI applications (in Section 2) are Natural Language Processing (NLP) and its associated Natural Language Generation (NLG) (discussed below and in Table 3–6). NLP is a specialized software application using machine learning (ANN) and computational linguistics, enabling computers to understand and process human languages and to get computers closer to a human-level understanding of language. Recent advances in machine learning and ANNs have allowed computers to do quite a lot of useful things with natural language. Deep learning (CNN) has also enabled the development of programs to perform things like language translation, semantic understanding, text summarization, and chatbots [69]. The general NLP process includes the following steps:

1. A user (human) talks to the machine (this input can be audio, text, or video);
2. The machine captures the audio;
3. NLP algorithms convert the audio to binary code;
4. The code data is processed (compilation process; see Chapter 2, page 16);
5. Algorithms convert the data back to audio (this is part of the NLG process); and
6. The machine responds to the human by outputting the data as audio or text.

The NLP algorithms apply language-specific syntactic and semantic rules to parse the input source and convert it to computer code. Syntactic analysis assesses how the natural language input aligns with grammatical rules to derive meaning from them. Semantic rules analyze the meaning conveyed by a text through the interpretation of words and how sentences are structured. Here, NLP also uses NLG algorithms to access databases to derive semantic intentions and convert them into human language output (Fig. 3–11). This complex, subjective process is one of the problematic aspects of NLP that is being refined. This challenging process is referred to as "natural language understanding (NLU)" and differentiates NLP from basic computing speech recognition (see Chapter 2, page 19) [70].
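As a drastically simplified sketch of steps 1–6 above (real NLP systems use trained statistical or neural language models rather than keyword rules; the intents and responses here are hypothetical):

def nlp_pipeline(utterance):
    """Toy NLP/NLG loop: capture input text, 'understand' it with a
    keyword rule, then generate a natural-language response."""
    tokens = utterance.lower().split()                 # crude syntactic analysis
    intents = {'appointment': 'book_visit', 'refill': 'refill_rx'}
    intent = next((intents[t] for t in tokens if t in intents), 'unknown')
    responses = {'book_visit': 'Sure, which day works for your appointment?',
                 'refill_rx': 'I can send the refill request to your pharmacy.',
                 'unknown': "Sorry, I didn't understand that."}
    return responses[intent]                           # NLG step: data in, language out

print(nlp_pipeline('I need a refill for my prescription'))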


FIGURE 3–11 Natural language processing (NLP) and natural language generation (NLG). Once NLP unlocks the verbal context (red) and translates it into human language (blue), NLG takes the output, analyzes the text in context (blue), and produces audio or text output (green).

Besides its powerful ability to integrate with big data, NLP provides some of the most common AI applications we all use, including:
• Language translation applications (e.g., Google Translate);
• Word processors and writing tools (e.g., Microsoft Word, Grammarly) that check the grammatical accuracy of texts;
• Interactive Voice Response (IVR) applications used in call centers to respond to specific users' requests;
• Personal assistant applications and chatbots (e.g., Google Assistant, Siri, Cortana, Amazon's Echo and Alexa).

3.5.2 Natural language generation (NLG)
Natural Language Generation (NLG) is the AI technology that analyzes, interprets, and organizes data into plain, written text or audio output (Fig. 3–11). It converts data into natural-sounding language, the way it is spoken or written by a human. Once NLP unlocks the context hidden in data and translates it into human language, NLG takes the output and analyzes the text in context. You can think of NLG and NLP as engaged in a joint endeavor to provide ready-made conversation [71]. NLG aids the machine in sorting through many variables and putting "text into context," thus delivering natural-sounding sentences and paragraphs that observe the rules of English grammar. While AI machine learning can memorize all the words and grammatical rules of individual languages (most of which keep evolving), there is still a multitude of other things it must factor in when attempting to produce natural language generation.


FIGURE 3–12 Expert system. The basic structure of an expert system consists of a human expert (e.g., doctor) and knowledge engineer (e.g., related expert) as input (red); a knowledge base (related database[s]), inference engine (AI algorithm), explainable AI (AI algorithm) and user interface (NLP audio or text) as inner (hidden) layers (blue); and the user (from input, i.e., the doctor) as output recipient (green).

Thus, as stated for NLP, NLG (and NLU), albeit already of significant benefit in AI, continue to be refined and improved from their current state [72].

3.5.3 Expert systems
One of the most significant applications of AI in the clinical aspects of health care delivery is the domain of "expert systems." These are AI computer programs utilizing the deep learning process to analyze stored knowledge base(s) to deduce and provide options, alternatives, suggestions, and advice to health care providers through "if/then" rules, inference reasoning, and forward and backward chaining in response to a question, problem, or strategy. This human interface activity is communicated "provider to computer" through NLP processing and "computer to provider" through NLG [73].

It is valuable to dive a little more deeply into expert systems, as well as the following 2 items (big data analytics and blockchain), than we have into some other specialized systems. The applications of these 3 AI systems are of enormous importance in the business, administrative, and delivery aspects of health care. You'll be able to reference back to this section for review when we discuss them in Section 2, but an initial foundation in their structure and functions is worthwhile. (Additional discussion on expert systems related directly to health care is also found in Chapter 5, page 187.)

The basic structure of an expert system consists of the following parts (Fig. 3–12) [74]:
• The knowledge base is used to store expert system expertise, including facts and rules. In the process of building a knowledge base, the knowledge base should be able to acquire new knowledge, expressing and storing knowledge in a way that the computer can process.
• The working memory is responsible for storing the input facts;
• The reasoning machine matches the facts in the working memory with the knowledge and derives new information. The intermediate information obtained during processing is also stored in a storage unit;


FIGURE 3–13 Forward and backward chaining. In forward chaining the inference engine follows the chain of conditions and derivations and finally deduces the outcome. In Backward Chaining, based on what has already happened, the inference engine tries to find conditions that could have occurred in the past for this result. Reproduced with permission of Data Flair, 2020.

• The interpreter is responsible for interpreting the results of the inference engine output, including explaining the correctness of and reasons for the conclusion; and
• Finally, the human-computer interaction interface responds to the user's input.

The "rule-based expert system," the most common form of an expert system, starts with human experts working with "knowledge engineers" to create a "knowledge database." This database stores both factual, exact information on the given subject matter as well as heuristic (trial-and-error, intuitive, or "rule of thumb") knowledge. The knowledge engineer then categorizes, organizes, and stores the information in the form of IF-THEN-ELSE rules, to be used by the "inference engine" (an algorithm). A potentially more powerful expert system can provide knowledge in a neural network (ANN). The weakness of such a deep learning approach is that the ANN is limited by its "training set" of stored knowledge and its inability to provide reasoning in an "explanation facility" or "explainable AI (XAI)" (see the discussion in Chapter 1, page 10).

The inference engine manipulates the knowledge database to arrive at a solution through forward and backward chaining. Forward chaining answers the question, "What can happen next?" Here, the inference engine follows the chain of conditions and derivations and finally deduces the outcome. In backward chaining, the expert system finds the answer to the question, "Why did this happen?" Based on what has already happened, the inference engine tries to find conditions that could have occurred in the past for this result (Fig. 3–13) [74]. (A minimal forward-chaining sketch follows below.)

One other interesting related system is the fuzzy logic-based expert system, which addresses the indiscriminate nature of real things, with a series of transitional states between them and no clear dividing lines. This occurrence is common in medical diagnosis, in contrast to Boolean logic, wherein things are entirely true (having a degree of truth of 1.0) or completely false (having a 0.0


degree of truth). Fuzzy logic uses reasoning about inherently vague concepts, such as subjective or qualitative descriptions of disease (e.g., a complicated medical condition) [75].

Finally, the "user interface" provides a response to the user (usually not the original expert) through NLG or screen text. The responses in expert systems used in health care generally include an "explanation facility" (XAI) to justify the logic used in diagnostic decision-making. Of course, in medical expert systems, their value is directly proportional to the quality of the knowledge database (created by "human experts," and thus subject to the risk of "garbage in, garbage out"). Only an XAI can objectively defend the information provided through the user interface. This approach is required in expert systems being used in health care today. Significant progress is being made in the development of "explainable AI (XAI)" algorithms for all forms of AI applications that address this "black box" issue, or "how was the outcome determined?" [76].
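A minimal sketch of the forward chaining described above, over IF-THEN rules (the two clinical rules are invented purely for illustration and are not medical advice):

def forward_chain(facts, rules):
    """Forward chaining: repeatedly fire IF-THEN rules whose conditions are
    satisfied, adding conclusions until nothing new can be deduced."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and conditions <= facts:
                facts.add(conclusion)                 # answers "What can happen next?"
                changed = True
    return facts

# Hypothetical knowledge base: two IF-THEN rules a knowledge engineer might encode.
rules = [({'fever', 'cough'}, 'suspect respiratory infection'),
         ({'suspect respiratory infection', 'low oxygen'}, 'recommend chest imaging')]
print(forward_chain({'fever', 'cough', 'low oxygen'}, rules))

Backward chaining would run the same rules in reverse, starting from a conclusion and searching for the conditions that could have produced it.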

3.5.4 "Internet of things" (IoT)
Beyond the universal Internet network is an additional "hybrid" system known as the "Internet of Things" (IoT), a system of interrelated computing devices, mechanical and digital machines, objects, and even people and animals. These objects are provided with unique identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction [77]. All categories of industries (particularly health care, as discussed frequently in Section 2) are using IoT to operate more efficiently, better understand customers to deliver enhanced customer (patient) service, improve decision-making, and increase the value of the business (clinical care).

A "thing" in the Internet of Things (IoT) can be a person with a heart monitor implant, an automobile that has built-in sensors to alert the driver when tire pressure is low, or any other natural or human-made object that can be assigned an IP address and can transfer data over a network. The next level of IoT is a sensor network of billions of smart devices (e.g., smartphones) that connect people, systems, and other applications to collect and share data. In health care, IoT offers many benefits, including the ability to monitor patients more closely and to use and analyze the data that is generated. Hospitals often use IoT systems to complete tasks such as inventory management for both pharmaceuticals and medical instruments. There is much more information on the applications of IoT in Section 2 of this book.

3.5.5 Cyber-physical system (CPS)
An addition to, or perhaps more appropriately an enhancement of, the IoT is a technology called the cyber-physical system (CPS) [78]. A CPS is a system of collaborating computer entities that are in comprehensive and accelerated connection with the surrounding physical world and its ongoing processes. Furthermore, these systems provide and use data-accessing and data-processing services available on the Internet to achieve the ends mentioned above. CPS integrates with IoT (and cloud computing, plus an array of other technologies) to create what is now called "Industry 4.0" [79].


During the First Industrial Revolution, manufacturing production facilities were developed with the help of water and steam power. During the Second Industrial Revolution, mass production was realized with the help of electrical energy. During the Third Industrial Revolution, electronic and information technologies were introduced that further increased production automation. The Fourth Revolution, or Industry 4.0, represents the current trend of automation technologies in the manufacturing industry and mainly includes enabling technologies such as cyber-physical systems (CPS), the Internet of Things (IoT), and cloud computing [80].

The health care counterpart to Industry 4.0 is called Health 4.0 [81]. It, too, uses cyber-physical systems (CPS), cloud computing, the internet of "everything" (including things, services, and people), and evolving mobile communication networks (5G). These all support the personalization of health care, a goal of modern health care. Much more about Health 4.0 and "personal health" is discussed in Section 2.

3.5.6 Big data analytics
In the Introduction to this Section 1, I presented a 2017 listing by Forbes Magazine of the "Five Technologies that will Disrupt Healthcare By 2020" [43]. Big data analytics was number 3 on the list. While we have touched on some of the others on the list and will expand on them in Section 2 of this book, big data analytics may have the most profound influence on the business, administrative, and clinical aspects of health and wellness and, thus, requires an in-depth description here. This full explanation will make you more comfortable and help you understand the magnitude of the effects of this special category of AI. Indeed, big data analytics is "disruptive" in a very positive way.

For any data to be useful, it must be analyzed, interpreted, and then addressed. Thus algorithms, not databases and datasets, become the critical factor in utilizing data most effectively. As we mentioned in the previous discussion on expert systems, computer-based algorithms in medicine are "expert systems" using rules, encoded knowledge, and general principles (heuristics and deterministics), applying them to diagnosis and treatment. Conversely, in machine learning, algorithms analyze vast numbers of variables, looking for combinations that reliably predict outcomes (e.g., regression analysis). Machine learning is most effective in handling enormous numbers of predictors and combining them in interactive ways. It allows the use of enormous amounts of new data whose volume or complexity would previously have made analyzing them unimaginable [82].

"Big data" is an evolving term that describes a large volume of structured, semi-structured, and unstructured data that has the potential to be mined for information and used in machine learning projects and other advanced analytics applications. It is characterized by "The 10 [sometimes contracted to the first 5 or 3] Vs of Big Data" (Table 3–7), each of which describes a specific property of big data analytics that must be understood to capture the essence of the technology [83]:

Table 3–7 The 10 Vs of Big Data.

1. VOLUME: How much data is there?
2. VELOCITY: How quickly is the data being created, moved, or accessed?
3. VARIETY: How many different types of sources are there?
4. VERACITY: Can we trust the data?
5. VALIDITY: Is the data accurate and correct?
6. VIABILITY: Is the data relevant to the use case at hand?
7. VOLATILITY: How often does the data change?
8. VULNERABILITY: Can we keep the data secure?
9. VISUALIZATION: How can the data be presented to the user?
10. VALUE: Can this data produce a meaningful return on investment?

1. VOLUME: HOW MUCH DATA IS THERE? It is commonly cited that 4.4 zettabytes of data existed globally in 2013. That number is set to grow exponentially to 44 zettabytes (44 trillion gigabytes) by 2020 as it more than doubles each year. Most of this data is transient, but health care data such as clinical notes, claims data, lab results, gene sequences, medical device data, and imaging studies are information that must be retained. It becomes even more useful when combined in multiple ways to produce brand new insights. Thus, in health care, storage techniques are necessary, on-premises or in the cloud, to handle and retain large amounts of data. They must also ensure that the infrastructure can keep up with the next V on the list without slowing down other critical functions like EHR access or provider communications.

2. VELOCITY: HOW QUICKLY IS THE DATA BEING CREATED, MOVED, OR ACCESSED? Healthcare information accounts for a respectable proportion of the data transmitted in the world, and the figures continue to rise as the Internet of Things (IoT), medical devices, genomic testing, machine learning, natural language processing, and other data generation and processing techniques continue to evolve. Some of this data must update in real-time at the point of care (e.g., the ICU) and be displayed immediately. Thus, system response time is a critical metric in health care. Defining which data sources require immediate access versus days, weeks, or months becomes a necessity.

3. VARIETY: HOW MANY DIFFERENT TYPES OF SOURCES ARE THERE? The types of health care information vary widely, but the more they integrate, the more insights can be garnered. Big data analytics addresses the integration of multiple datasets, or ones too complex to be handled through traditional processing techniques. One of the most significant barriers to effective data management is the variety of incompatible data formats, non-aligned data structures, and inconsistent data semantics used throughout the health care system. Health IT developers are starting to address the problem by enlisting the help of application programming interfaces (APIs) and new standards such as Fast Healthcare Interoperability Resources (FHIR) technologies.

4. VERACITY: CAN WE TRUST THE DATA? Providers cannot utilize insights that may have been derived from data that is incomplete, biased, or filled with noise. A 2014 New York Times study [84] showed that data scientists in health care spend more than 60% of their time cleaning up data before it can be used. Data governance and information governance are vital strategies that healthcare organizations must employ to ensure that their data is clean, complete, standardized, and always ready for use.

5. VALIDITY: IS THE DATA ACCURATE AND CORRECT? Similar to veracity, the validity of data is a critical concern for clinicians and researchers. A dataset may be complete, but is its content correct and up to date, and was the information generated using accepted scientific protocols and methods? Who is responsible for curating and stewarding the data? Healthcare datasets must include accurate metadata that describes when, how, and by whom the data was created. This metadata helps to ensure that analysts understand one another, that their analytics are repeatable, and that future data scientists can query the data and find what they are looking for.

6. VIABILITY: IS THE DATA RELEVANT TO THE USE CASE AT HAND? Understanding which elements of the data are tied to predicting or measuring the desired outcome is essential for producing reliable results. To do this, organizations must understand what elements they have, whether those elements are robust enough to use for analysis, and whether the results are genuinely informative or just an interesting diversion. Many predictive analytics projects (Chapter 4, page 99) focus on identifying innovative variables for detailing certain patient behaviors or clinical outcomes. Identifying these variables will no doubt be an ongoing process as more datasets become available.

7. VOLATILITY: HOW OFTEN DOES THE DATA CHANGE? Healthcare data changes quickly, by the second in some cases, which raises the questions of how long certain data is relevant, which historical metrics to include in the analysis, and how long to store the data before archiving or deleting it. As the volume of data continues to increase, these decisions become increasingly important based on the cost of data storage, and they are complicated by the fact that HIPAA requires providers to retain specific patient data for at least 6 years.

8. VULNERABILITY: CAN WE KEEP THE DATA SECURE? Speaking of HIPAA, data vulnerability has skyrocketed in the wake of multiple ransomware attacks and a litany of data breaches. Security is a priority in the healthcare industry, especially as storage moves to the cloud and data starts to travel between organizations as a result of improved interoperability. In 2016, close to a third of hospitals said they were spending more money on data security than in the previous year [85].

9. VISUALIZATION: HOW CAN THE DATA BE PRESENTED TO THE USER? Clinicians struggle with their electronic health record interfaces, complaining about too many clicks, too many alerts, and not enough time to get everything done. These issues add to the complexity of information processing that is part of every clinician's daily workflow and sour users further on the potential of health IT. Filtering clinical data intuitively helps to prevent information overload and may help to mitigate feelings of burnout among overworked clinicians. Other valuable tools include interactive dashboards for reporting financial, operational, or clinical metrics to end-users, and online mapping tools to visualize public health concerns. Also available are a variety of new apps for desktops, tablets, and even smartphones, giving users ways to interact with data more meaningfully.


10. VALUE: CAN THIS DATA PRODUCE A MEANINGFUL RETURN ON INVESTMENT? Ultimately, the only reason to engage in big data analytics is to extract some value from the information at hand. Many healthcare organizations are still in the early phases of developing the competencies that allow them to achieve these goals and generate actionable insights that apply to real-world problems. But the value is there for those who adhere to strong data governance principles and take a creative approach to disseminating insights to end-users across organizations.

In summary, big data is a term used to describe a collection of data that is huge and yet growing exponentially with time. Such data is so large and complex that none of the traditional data management tools can store or process it efficiently. Big data analytics is the process of using advanced algorithms and machine learning to examine, filter, aggregate, and model large datasets (big data). This allows for the discovery of hidden patterns, trends, conclusions, meaningful correlations, and preferences between variables, retrieving intelligent insights from the data not achievable through human analysis and using them to drive decisions. Applied to health care, this concept is advancing the areas of diagnostics, preventive medicine, precision medicine, population health, research, and cost controls [86]. All of these areas are discussed in detail in Section 2, Chapter 4 of this book.

3.5.7 Blockchain
While different from both big data analytics and AI machine learning, the intersection and convergence of blockchain technology with AI and big data may prove to be one of the strongest (and most disruptive) advances in the future of health care. An understanding of blockchain technology, combined with its applications with AI and big data, quickly demonstrates the potential of these combined technologies.

Blockchain is a distributed database existing on multiple computers at the same time. It is continually growing as new sets of recordings, or "blocks," are added to it. Each block contains a timestamp and a link to the previous block, so they form a chain (ergo, "blockchain"). No particular entity manages the database; rather, everyone in the network gets a copy of the whole database. Old blocks are preserved forever, and new blocks are added to the ledger irreversibly, making it impossible to manipulate the ledger by faking documents, transactions, and other information. All blocks are encrypted uniquely so that everyone can have access to all the information, but only a user who owns a special cryptographic (coded) key can add a new record to a particular chain. As long as you remain the only person who knows the key, no one can manipulate your transactions. Also, cryptography (writing or solving codes) is used to guarantee the synchronization of copies of the blockchain on each computer (or node) in the network.

Blockchain is commonly associated with cryptocurrencies and bitcoin, but in the context of health care, we can think of blockchain as a digital medical record. Every record is a block which has a label stating the date and time when the document was entered. Neither the doctor nor the patient should be able to modify the records already made.


Nevertheless, the doctor owns a private key that allows him/her to make new records, and the patient holds a public key that enables them to access the files anytime. This method makes the data both accessible and secure [87].

Blockchain can create a mechanism to manage large quantities of medical information (e.g., medical records, EHRs, insurance data) stored in the cloud. This method of managing big data will increase interoperability while maintaining the privacy and security of health care data. It contains inherent integrity and conforms to strict legal regulations [88]. Increased interoperability is beneficial in health care management and health outcomes [89]. For perspective, health-related blockchain spending was estimated at about $177 million in 2018 but is expected to rise to more than $5.6 billion by 2025. Blockchains are causing old systems to evolve, increasing efficiency while cutting down on costs across the healthcare industry [90].
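A minimal hash-chain sketch of the block structure described above (this toy omits distribution across nodes, consensus, and key-based access control; the record contents are hypothetical):

import hashlib, json, time

def make_block(record, prev_hash):
    """A block: timestamped data plus the hash of the previous block,
    so altering any earlier record breaks every later link."""
    block = {'time': time.time(), 'record': record, 'prev_hash': prev_hash}
    block['hash'] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

chain = [make_block('genesis', '0' * 64)]
chain.append(make_block('visit note: annual exam', chain[-1]['hash']))
chain.append(make_block('lab result: HbA1c 5.4%', chain[-1]['hash']))

# Verify integrity: each block must reference the previous block's hash.
print(all(chain[i]['prev_hash'] == chain[i - 1]['hash'] for i in range(1, len(chain))))

Tampering with any earlier record changes its hash, so the verification check at the end immediately fails; this is the property that makes a ledger of medical records effectively immutable.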

3.5.8 Robotics
Last but certainly not least among the "specialized AI systems" is one of AI's most exciting and disruptive applications in all areas of life, from games to motor vehicles to health care. However, it is important to note upfront that robotics and artificial intelligence are not the same thing at all. The 2 fields are almost entirely separate. This difference is classically illustrated in textbooks on both technologies through a Venn diagram (Fig. 3–14). The overlap between the 2 technologies represents the category of "artificially intelligent robots." In other words, artificially intelligent robots are simply robots that are controlled by AI programs [91].

As with any complex topic, definitions can struggle to capture the full meaning of specific terms and concepts. Nonetheless, a few words do require general definitions in the field of robotics. A good "general" definition of a robot is a programmable machine that physically interacts with the world around it and is capable of carrying out a complex series of actions autonomously (self-governing) or semi-autonomously [92]. The "general" definition of robotics is an interdisciplinary branch of engineering and computer science (AI) that deals with the design, construction, operation, and use of robots, as well as computer systems for their control ("artificially intelligent robots"), sensory feedback, and information processing [93].

FIGURE 3–14 Artificial intelligent robots. Artificial intelligent robots are simply robots that are controlled by AI programs. This difference is illustrated through this Venn diagram. The overlap between the two technologies represents the category of “Artificial Intelligent Robots.”


processing [93]. And one more relevant term: "robotic process automation" (RPA) is defined as the use of software to mimic human actions to perform a sequence of steps, leading to meaningful activity, without any human intervention [94]. Robots are loosely defined as electromechanical devices that are actuated by specific input agents like sensors or transducers and are guided to work by computer circuitry with the standard input layer, inner layer (the processor in this case), and output layer. The inputs to the robots are via sensors; the CPU does the processing; then, the desired mechanical action output is obtained. The sensory inputs that the robot takes can be anything from smell, touch, and visual differences to voice (NLP). The central processing unit is the microprocessor or microcontroller that processes this input quantity, searches for the corresponding function to perform from the previously programmed instruction set, and then sends the signal on to the output port. Upon reception of this signal, the robot will perform the desired action. The input to the robot will be via sensors and transducers, which might include (but not be limited to):

• Contact/touch sensors
• Temperature sensors
• Light sensors
• Sound sensor
• Proximity sensor
• Distance sensor
• Pressure sensor

The processing unit is a microcontroller or microprocessor (described earlier in "AI Software," page 36). This choice depends on the driving load for the robot, as does the choice of the output unit. The most common output units include (but are not limited to):

• Actuators
• Relays
• Speakers
• LCD screens

Based on the robotic application, the actuating sources change. For wired applications, cables and wires are used, while for wireless robots, RF (radio frequency), RFID (radio frequency identification), Wi-Fi, and DTMF (dual-tone multi-frequency) technologies are used. Wireless technology is used in most robots today, from autonomous and semi-autonomous to humanoid robots. The working of the wireless robot is similar to the wired robot, except for the change in the circuitry, usually a transmitter-receiver pair for wireless technology [95]. The overlap in the Venn diagram in Fig. 3–14 represents the category of "Artificial Intelligent Robots." A traditional robot is a machine able to perform repetitive, highly specialized tasks (e.g., industrial robots). However, an "AI Intelligent Robot" using Machine Learning can extract information from the surrounding environment to make meaningful decisions (a "behavior-based" robot).
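As a toy illustration of such behavior-based learning (and of the Reinforcement Learning approach discussed next), here is a hedged sketch of tabular Q-learning. The task itself is invented for demonstration: a robot in a 1-D corridor learns, purely from trial-and-error reward, to move toward its charging dock.

```python
import random

# Hypothetical toy task: a robot in a 1-D corridor of 6 cells learns to reach
# the charging dock at cell 5. States are cell indices; actions move left/right.
N_STATES, GOAL = 6, 5
ACTIONS = [-1, +1]                      # move left, move right
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, epsilon = 0.1, 0.9, 0.2   # learning rate, discount, exploration

for episode in range(500):
    s = 0
    while s != GOAL:
        # Epsilon-greedy: usually exploit the best-known action, sometimes explore
        a = random.randrange(2) if random.random() < epsilon else Q[s].index(max(Q[s]))
        s_next = min(max(s + ACTIONS[a], 0), N_STATES - 1)
        r = 1.0 if s_next == GOAL else -0.01     # reward only for reaching the dock
        # Q-learning update: nudge Q toward reward plus discounted best future value
        Q[s][a] += alpha * (r + gamma * max(Q[s_next]) - Q[s][a])
        s = s_next

# After training, the greedy policy moves right from every cell toward the dock.
print([["left", "right"][q.index(max(q))] for q in Q[:GOAL]])
```

No behavior was explicitly programmed here; the movement policy emerges from interaction with the environment, which is the point of the "behavior-based" robot described above.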


Two examples of how Machine Learning can improve the performance of robotics systems are Multi-Robot Systems and Swarms. Programming collective behaviors with traditional programming can be an extremely challenging task. Using Machine Learning (Reinforcement Learning) makes it easier and can lead to more innovative solutions not previously considered. The main difference between Multi-Robot Systems and Swarms is that the former have global knowledge of the environment and can have a centralized architecture, while the latter, swarms, don't have a global understanding of the environment and use a decentralized architecture. Deep Learning algorithms can provide even more significant benefits in robotics by using Multi-Layer Artificial Neural Networks (ANNs). They can perform incredibly well in tasks such as image recognition, which have critical applications in the robotic system's vision. One problem area for using Deep Learning algorithms in robotics is the current inability to track the algorithm's decision-making process fully. This problem is addressed and corrected by "explanation facilities" [96].

Another variant of robotics is the AI "creation" of "chatbots." (Lots more about chatbots in Chapter 5, page 185.) These are computer programs that simulate human conversation through voice commands, text chats, or both. A chatbot (short for chatterbot) is an AI feature that can be embedded and used through any major messaging application. There are several synonyms for the chatbot, including "talkbot," "bot," "IM bot," "interactive agent," "conversational or virtual assistant," or "artificial conversation entity" [97]. A chatbot that functions through machine learning is programmed to self-learn as it is introduced to new dialogues and words. In effect, as a chatbot receives new voice or textual conversations through NLP, the number of inquiries that it can reply to (NLG) and the accuracy of each response it gives increase. Some examples of chatbot technology are virtual assistants like Amazon's Alexa, Google (Home) Assistant, and Apple's Siri, and messaging apps such as WeChat and Facebook Messenger.

A discussion of robotics in this day and age would not be complete without mention of the controversial yet compelling subject of "self-driving (driverless or autonomous) vehicles." The relevance of this topic to the theme of this book, i.e., AI and health care, can only be justified in 2 ways. First, it is highly likely that all of your interactions with the health care system in the future will include your need for transportation, and undoubtedly that will include the use of a hybrid, robotic vehicle. The second justification for a discussion on autonomous vehicles (the first being pretty "thin") is simply that the subject is "pretty cool," especially when talking about disruptive technologies.

By definition, self-driving (autonomous) vehicles have been classified into 6 levels published in 2014 by the Society of Automotive Engineers International and officially adopted as the standard for autonomous vehicle technology in 2016 by the National Highway Transportation Safety Administration [98]. The 6 levels include [99]:

1. Self-Driving Car Automation Level 0: No automation;
2. Self-Driving Car Automation Level 1: Some autonomous functions (e.g., cruise control, automatic braking), but driver assistance always required;


3. Self-Driving Car Automation Level 2: Partial automation that includes pre-programmed or fixed scenarios, but the driver must monitor the environment and keep hands on the wheel at all times;
4. Self-Driving Car Automation Level 3: Conditional autonomy, where a car can safely control all aspects of driving in a mapped environment, but a human is still required to be present, monitoring and managing changes in road environments or unforeseen scenarios;
5. Self-Driving Car Automation Level 4: High automation, where the car has self-driving automation with no driver interaction needed, and the car stops itself if the systems fail;
6. Self-Driving Car Automation Level 5: Fully autonomous, where the program controls the destination with no human involvement nor the ability to intervene. Vehicles will have no steering wheels nor gas and brake pedals.

So how will these increasing levels of automation be achieved? Autonomous vehicles are fitted with numerous sensors, radars, and cameras to generate massive amounts of environmental data. All of these data form a complex similar to the human brain's "sensorium," or totality of neurological centers that receive, process, and interpret sensory stimuli (analogous to the neural networks mimicking neuroscience described previously in this Chapter). These stimuli allow the autonomous vehicle to "see, hear and feel" the road, road infrastructure, other vehicles, and every other object on/near the road, just like a human driver. These data are processed with on-board computers, and data communication systems are used to securely communicate valuable information (input) to the autonomous driving cloud platform (e.g., Tesla's "Autopilot" [100], Waymo's Lidar et al. [101]). The autonomous vehicle communicates the driving environment and/or the particular driving situation to the Autonomous Driving Platform. The Autonomous Driving Platform uses AI algorithms as an "intelligent agent" to make meaningful decisions. It acts as the control policy or the brain of the autonomous vehicle. This intelligent agent connects to a database that acts as a memory where past driving experiences are stored. These data, along with the real-time input coming in through the autonomous vehicle and the immediate environment around it, help the intelligent agent make accurate driving decisions. The autonomous vehicle now knows precisely what to do in this driving environment and/or particular driving situation. Based on the decisions made by the intelligent agent, the autonomous vehicle detects objects on the road, maneuvers through the traffic without human intervention, and gets to the destination safely. Autonomous vehicles are also being equipped with AI-based NLP, NLG, gesture controls, eye tracking, virtual assistance, mapping, and safety systems, to name a few. These functions are also carried out based on the decisions made by the intelligent agent in the Autonomous Driving Platform. The driving experiences generated from every ride are recorded and stored in the database to help the intelligent agent make much more accurate decisions in the future. This data loop, called the Perception-Action Cycle, takes place repetitively. The more Perception-Action Cycles take place, the more intelligent the intelligent agent


becomes, resulting in higher accuracy in decision-making, especially in complex driving situations. AI, especially neural networks and deep learning, has become an absolute necessity to make autonomous vehicles function safely and adequately. AI is leading the way for the launch of Level 5 autonomous vehicles, where there is no need for a steering wheel, accelerator, or brakes [102].
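The Perception-Action Cycle just described can be summarized in a few lines of code. The sketch below is schematic and vendor-neutral; the helper functions (`read_sensors`, `plan_action`, `execute`) are hypothetical stand-ins for a real vehicle's sensor-fusion, planning, and actuation subsystems.

```python
import random

def read_sensors():
    """Stand-in for fused camera/radar/lidar input (hypothetical values)."""
    return {"obstacle_distance_m": random.uniform(0.0, 100.0)}

def plan_action(env, memory):
    """The 'intelligent agent': brake if an obstacle is near, otherwise cruise.
    A real agent would also weigh the stored experiences in `memory`."""
    return "brake" if env["obstacle_distance_m"] < 20.0 else "cruise"

def execute(action):
    """Stand-in for the actuation layer (steering, braking, acceleration)."""
    return {"action": action, "ok": True}

experience_db = []  # persistent memory of past driving experiences

for _ in range(5):                              # each pass = one Perception-Action Cycle
    env = read_sensors()                        # perceive the driving environment
    act = plan_action(env, experience_db)       # decide, consulting live input + memory
    outcome = execute(act)                      # act on the decision
    experience_db.append((env, act, outcome))   # record the experience for future cycles
```

The essential point is the final line: every cycle appends to the experience store, which is what lets the agent's decisions improve with accumulated driving experience.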

3.6 Sample AI scenarios

Given the enormous amount of information presented in this Chapter, it may be time to summarize (at least some of it) in a practical (and maybe somewhat fun) manner. So, to illustrate the concepts and workings of AI, let's present 2 sample case studies. The first case study will be the application of AI in answering a profound dilemma of the ages, "Why is the Mona Lisa smiling?"; the second will be a personal struggle and resolution in "The 'Great Steak' experience."

3.6.1 "Why is the Mona Lisa smiling?" [103]

One of the more elegant examples of how AI artificial neural networking (ANN) replicates the human brain is the human visual system [104] and the sensory domain of vision and sight. This case study (Fig. 3–15) uses Leonardo da Vinci's masterpiece, "The Mona Lisa," as the sensory visual stimulus (supervised, labeled information) and asks the AI computer the classic question, "Why is she smiling?" Relevant information regarding the question is provided through a keyboard (or audio NLP) input. An image of the painting allows the retina to collect wavelengths of light (in "pattern recognition" from the painting).

FIGURE 3–15 Case study (example) of "Why is Mona Lisa smiling?" (input layer). Leonardo da Vinci's masterpiece, "Mona Lisa," can serve as the sensory visual input stimulus (red) to ask the AI computer the classic question, "Why is she smiling?"


It transmits them through optic radiations (#1 [wavelength-colored] arrows in Fig. 3–15) to the lateral geniculate nucleus (LGN). This process is analogous to the supervised user input portion (GPU pattern recognition) of an AI software platform in machine learning. Acting as a relay station, neurons from the LGN send signals (#2 [blue] arrows in Fig. 3–16) to the subcortical limbic hippocampus and the amygdala (the neural center for emotions and memory), creating neural layers of data (light frequency, intensity, patterns, configuration). This labeled information is "neural networked" (#3 [blue] arrows in Fig. 3–16) to relevant, preexisting, higher cortical neural layers and their unsupervised, unlabeled data related directly and indirectly to the user's question(s), in this case, vision, art and, specifically, the Mona Lisa. This neural networking is analogous to AI's API computer software probing knowledge databases to address the "smile" dilemma. These collective stimuli are analyzed and compared (by inference deductions, heuristics, probabilities) with the supervised, first-level, labeled light wavelength data. Finally, the nerve impulses from these subcortical and cortical levels are transmitted through associated cortical and optic radiations (#4 [green] arrows in Fig. 3–17) to the visual cortex (areas V1 and V5). There, they are cognitively interpreted (pattern recognition) through cerebral reinforcement and "memory networking." This process corresponds to the inference engine "deep learning" level in AI, which yields logical decisions and behavioral, situational, emotional, environmental, and all-inclusive interactions [105]. Through these logical inference rules, the brain and an AI computer can evaluate reasonable deductions, probabilities, and conclusions to the question, "Why is she smiling?" (Fig. 3–18). And AI's answer would probably be the same as that of all the art experts over the centuries: "Only Leonardo knows for sure."
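The layered flow just described (retina → LGN → cortex) is exactly the structure of an ANN forward pass: each layer re-represents the signal handed up from the layer below. The sketch below is purely illustrative; the layer sizes and random weights are assumptions for demonstration, not a model of any actual visual system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes only: a "retina" input layer, two "relay/cortical"
# hidden layers, and an output layer scoring candidate interpretations.
sizes = [64, 32, 16, 4]
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x):
    """Propagate a stimulus through successive layers (ReLU activations):
    each layer re-represents the signal passed up from the layer below."""
    for W in weights[:-1]:
        x = np.maximum(0.0, x @ W)
    return x @ weights[-1]          # final layer: unnormalized scores

stimulus = rng.random(64)           # stand-in for an encoded image patch
print(forward(stimulus))            # 4 scores over possible "interpretations"
```

In a trained network, those final scores would be the system's graded "conclusions"; here the weights are untrained, so the output is only structural, mirroring the input-to-inference flow in the case study.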

FIGURE 3–16 Case study (example) of “Why is Mona Lisa smiling?” (inner layer). Labeled information is “neural networked” (blue number 3 arrows) to relevant, preexisting, higher cortical neural layers and their unlabeled data related directly and indirectly to the user’s question (“Why is she smiling?”).


FIGURE 3–17 Case study (example) of "Why is Mona Lisa smiling?" (processing). Nerve impulses from the subcortical and cortical levels are transmitted through associated cortical and optic radiations (green number 4 arrows) to the visual cortex (areas V1 and V5).

FIGURE 3–18 Case study (example) of "Why is Mona Lisa smiling?" (output). Through logical inference rules (green), the brain and an AI computer can evaluate reasonable deductions, probabilities, and conclusions to the question, "Why is she smiling?" And most likely, Explainable AI (XAI) will conclude, as have art experts over the years, "Only Leonardo knows for sure."

Beyond this example of sensory experience (vision and sight) correlated with AI machine learning and neural networking (ANN), an infinite number of scenarios exists, from the most serious to the sublime, as in the next case study.


3.6.2 The "great steak" experience [106]

AI can reach all provinces of human behavior, from the practical to the theoretical to the intellectual to the behavioral, and even to the sensory and emotional. This next case study presents a simple illustration of an AI expert system involving multiple domains. The example also demonstrates the 3 levels of machine learning: supervised, unsupervised, and reinforcement learning, operating in a scenario of simultaneous sensory, emotional, and practical applications with logical conclusions and results.

The first time you taste a great steak and appreciate its flavor, your brain labels "eating a great steak" as a positive gastronomic experience (let's use the acronym PGE). Your brain has just developed a neural layer of "labeled data" through a "supervised learning" experience (you tasting the steak). Similarly, AI computer input software can create a simulated digital program of your brain's supervised neural layer by analyzing the texture, characteristics, appearance, chemical compounds, fats, and protein that constitute a "great steak." The software program "labels" and remembers this data as a PGE.

After a couple of "great steak PGEs," your brain starts asking you, "As much as I enjoy this PGE, can I afford it?" At this stage, the process called "neural networking" begins to develop in your brain using the hippocampus, amygdala, and other preexisting cortical neural layers containing stored financial and economic information and knowledge. This neural networking isn't adding new, labeled data (as in supervised learning) but rather interconnecting (analogous to API software) the previously labeled, supervised (PGE) neural layer with other preexisting "unlabeled data" in limbic and cortical neural layer(s). This networking is considered "unsupervised learning" (also referred to as neuroplasticity). Computer API software simulates this unsupervised learning experience by creating (and using preexisting) decision-making economic and financial datasets that can analyze the "great steak PGE" against traditional and variable economic norms. With each additional "great steak PGE," similar to your brain, inference engine computer algorithm(s) begin to probe (using Bayesian logic) additional directly and indirectly related economic considerations and datasets. This computer activity ("neural training") is the actual beginning of the deep learning AI process.

Finally, the dilemma created by the unsupervised PGE learning experience (". . .can I afford it?") causes your brain's hippocampus, amygdala, and cortical layers to assess your budgetary priorities. This assessment leads to a "reinforcement learning" experience with a positive emotional and practical result. The question, "Can I afford it (PGE)?" leads you to logical "if/then" calculations (by inference, heuristics, forward and backward chaining). "If I reorganize my budgetary efficiencies (neuroplasticity) to allow me to enjoy an occasional PGE, then I can enjoy more 'great steak PGEs.'" Using random access memory (RAM), the computer's sensory input and inference engine software can utilize multiple directly and indirectly related (and unrelated) datasets. That process enables analysis and adjustments ("backward chaining") of the complex economic issues needed to answer the question. The result of such AI reinforcement learning leads to the most significant benefits ("reward") to you, including an organized, balanced budget, and more "great steak PGEs."
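The "if/then" chaining in this story is what a classic rule-based inference engine does. Below is a minimal forward-chaining sketch in Python; the facts and rules are invented to mirror the PGE example, not drawn from any real expert system.

```python
# Invented facts and rules that mirror the PGE story; not a real expert system.
facts = {"enjoys_PGE", "budget_reorganized"}
rules = [
    ({"enjoys_PGE"}, "wants_more_PGE"),
    ({"wants_more_PGE", "budget_reorganized"}, "can_afford_occasional_PGE"),
    ({"can_afford_occasional_PGE"}, "schedule_steak_dinner"),
]

# Forward chaining: fire any rule whose premises are all satisfied, add its
# conclusion to the fact base, and repeat until nothing new can be derived.
changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print("schedule_steak_dinner" in facts)  # True: the "reward" is derived
```

Backward chaining, mentioned in the passage, simply runs the same rules in reverse: start from the goal ("more PGEs") and work back through the premises that must hold for it to be true.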


These are just 2 scenarios, albeit lighthearted, from among a virtually infinite number of AI storylines. Machine and deep learning may be able to "train," assimilate, and continue to expand their knowledge base to address virtually all of life's issues and questions. Such possibilities also include any subjective and objective (serious) areas, such as health and wellness. Section 2 of this book concentrates on health care scenarios. It provides a plethora of examples and applications of current and future AI methods and strategies in the areas of bioscience, health, and wellness. And so, we come to the end of Section 1. I hope it has helped your understanding of the science and technologies of AI. All of the discussions have been in general terms to provide you with a basis for the more specific health-related discussions in Section 2. In that Section, we become rather specific in applying the numerous AI concepts and programs that are changing ("disrupting") the business and administration of health care. Perhaps of more value to some will be an expansion of your understanding of, and benefits from, the clinical aspects of health and wellness care. I hope you enjoyed Section 1, and I hope Section 2 becomes your AI health and wellness guide and reference text for years to come. Thank you.

References
[1] Merriam-Webster Dictionary. Merriam-Webster Inc.; 2019.
[2] Shapshak P. Artificial intelligence and brain. Bioinformation 2018;14(1):38–41. https://doi.org/10.6026/97320630014038.
[3] Dormehl L. What is an artificial neural network? Here's everything you need to know. Digital Trends; 2019.
[4] Grafman J. A glossary of key brain science terms. The Dana Foundation; 2016.
[5] Karpathy A. Convolutional neural networks for visual recognition. Stanford.edu. cs231n.github.io; 2019.
[6] Diaa AMA. The mathematical model of the biological neuron. The Bridge: Experiments in Science and Art, Virtual Residence, Stefanos & Diaa group, 11th week; posted November 24, 2018.
[7] Zimmer C. 100 trillion connections: new efforts probe and map the brain's detailed architecture. Sci Am 2011.
[8] Miikkulainen R, Liang J, Meyerson E, et al. Artificial intelligence in the age of neural networks and brain computing. Chapter 15 - Evolving deep neural networks. Academic Press; 2019. p. 293–312.
[9] Saha S. A comprehensive guide to convolutional neural networks — the ELI5 way. Data Science; December 15, 2018.
[10] Dormehl L. What is an artificial neural network? Here's everything you need to know. Digital Trends; 2019.
[11] Algorithms. Wikipedia; edited April 25, 2019.
[12] Li H. Which machine learning algorithm should I use? SAS Blog; April 12, 2017.
[13] Dasgupta A, Nath A. Classification of machine learning algorithms. Int J Innovative Res Adv Eng (IJIRAE) 2016;3(03). ISSN: 2349–2763.
[14] Prabhakar A. The merging of humans and machines is happening now. DARPA; January 27, 2017.


[15] Beal V. The difference between the internet and the World Wide Web. Webopedia; August 7, 2018.
[16] Gallo A. A refresher on regression analysis. Harvard Business Review. No. 4; 2015.
[17] McElwee K. From math to meaning: artificial intelligence blends algorithms and applications. Princeton University; January 2, 2019.
[18] Prabhakar A. Mathematics for AI: all the essential math topics you need. Essential list of math topics for Machine Learning and Deep Learning. Towards Data Science; August 9, 2018.
[19] Deisenroth MP, Faisal AA, Ong CS. Mathematics for machine learning. Cambridge University Press; July 3, 2020.
[20] Software terms: framework definition. TechTerms; 2013.
[21] Shalev-Shwartz S, Ben-David S. Understanding machine learning: from theory to algorithms. Cambridge: Cambridge University Press; 2014.
[22] Paruthi A. Artificial intelligence hardware. Medium; December 16, 2018.
[23] Brownlee J. Supervised and unsupervised machine learning algorithms. Machine Learning Mastery; March 16, 2016.
[24] LeCun Y, Bengio Y, Hinton G. Deep learning. Nature 2015;521:436–44.
[25] Brownlee J. Understanding machine learning algorithms. Machine Learning Mastery; March 16, 2016.
[26] Pereira JA, Martin H, Acher M, et al. Learning software configuration spaces: a systematic literature review. GroundAI. arXiv:1906.03018v1; June 7, 2019.
[27] Polamuri S. Supervised and unsupervised learning. Datasprint; September 19, 2014.
[28] Witten I, Eibe F, Hall M. Data mining. Burlington, MA: Morgan Kaufmann; 2011. p. 10–23. ISBN 978-0-12-374856-0.
[29] Liang P. Semi-supervised learning for natural language. MIT Press; 2005. p. 44–52.
[30] Soni D. Supervised vs. unsupervised learning. Medium; March 22, 2018.
[31] Wen I. Data mining process in R language. Computer Language; October 19, 2018.
[32] Sarmah H. Understanding association rule learning & its role in data mining. Analytics India Mag; February 18, 2019.
[33] Sarmah H. Understanding association rule learning & its role in data mining. ProLearning; February 18, 2019.
[34] Piech C. K-Means. Stanford University CS221; 2013.
[35] Reddy C. Understanding the concept of hierarchical clustering technique. DataScience; December 10, 2018.
[36] Dorairaj S. Hidden Markov models simplified. Medium; March 20, 2018.
[37] Mayo M. 5 things to know about machine learning (18:n11). KDnuggets; 2018.
[38] Sutton RS, Barto AG. Reinforcement learning: an introduction. The MIT Press; 2012.
[39] Fan S. Google's new A.I. gets smarter thanks to a working memory. Singularity Hub; November 01, 2016.
[40] Fan S. DeepMind's new research on linking memories, and how it applies to AI. SingularityHub; September 26, 2018.
[41] Admin. Meet the amygdala of the self-driving car. AutoSens; September 12, 2017.
[42] Trafton A. How the brain switches between different sets of rules. MIT News; November 19, 2018.
[43] Press G. The brute force of IBM deep blue and Google DeepMind. Forbes; February 7, 2018.
[44] Bhatt S. Things you need to know about reinforcement learning. KDNuggets; March 2019.


[45] ADL. An introduction to Q-Learning: reinforcement learning. Medium; September 3, 2018.
[46] Bhartendu. SARSA reinforcement learning. MathWorks; version 1.0.0.0 (117 KB).
[47] Hinzman L. Deep-Q-Networks explained. Toward Data Science; February 15, 2019.
[48] Lin H. Chapter 11 - Bridging the logic-based and probability-based approaches to artificial intelligence. Academic Press; 2017. p. 215–25. https://doi.org/10.1016/B978-0-12-804600-5.00011-8.
[49] Suchow JW, Bourgin DD, Griffiths TL. Evolution in mind: evolutionary dynamics, cognitive processes, and Bayesian inference. Science Direct. Elsevier 2017;21(7):522–30. https://doi.org/10.1016/j.tics.2017.04.005.
[50] Kinser PA. Brain structures and their functions. Serendip; May 2, 2018.
[51] Singh H. Hardware requirements for machine learning. eInfochips; February 24, 2019.
[52] Martindale J. What is RAM? Digital Trends; February 3, 2019.
[53] RAM. Computer Hope; April 2, 2019.
[54] Martindale J. What is a CPU? Digital Trends; March 8, 2018.
[55] Silwen. What are the main functions of a CPU? TurboFuture; January 21, 2019.
[56] Alena. GPU vs. CPU computing: what to choose? Medium; February 8, 2018.
[57] Litjens G, Kooi T, Bejnordi BE, et al. A survey on deep learning in medical image analysis. Med Image Anal 2017;42:60–88.
[58] Graphics Processing Unit (GPU). Nvidia; March 26, 2016.
[59] Leaton R. APU vs. CPU vs. GPU. Which one is best for gaming? WePC; January 4, 2019.
[60] Johnson K. Google cloud makes pods with 1,000 TPU chips available in public beta. LogMeIn; May 7, 2019.
[61] Rouse M. AI accelerators. SearchEnterpriseAI; April 2018.
[62] Woodford C. Quantum computing. ExplainThatStuff; March 26, 2019.
[63] Greenemeier L. How close are we - really - to building a quantum computer? Scientific American; May 30, 2018.
[64] Fulton S. What neuromorphic engineering is, and why it's triggered an analog revolution. ZDNet; February 8, 2019.
[65] Muslimi M. Are neuromorphic chips the future of AI and blockchain? Hackernoon; April 2, 2019.
[66] Understanding ASIC development. AnySilicon; October 23, 2017.
[67] Touger E. What is an FPGA and why is it a big deal? Prowess; September 24, 2018.
[68] Wood AM. Get started with VHDL programming: design your own hardware. Who Is Hosting This; February 12, 2019.
[69] Seif G. An easy introduction to natural language processing. Data Science; October 1, 2018.
[70] Garbade MJ. A simple introduction to natural language processing. Medium; October 15, 2018.
[71] Ghosh P. The fundamentals of natural language processing and natural language generation. Dataversity; August 2, 2018.
[72] Joshi N. The state of the art in natural language generation. Allerin; April 8, 2019.
[73] Arsene O, Dumitrache I. Expert system for medical diagnosis using software agents. Science Direct. https://doi.org/10.1016/j.eswa.2014.10.026.
[74] Team. What is expert system in artificial intelligence - how it solves problems. Dataflair; November 15, 2018.


[75] Tan H. A brief history and technical review of the expert system research. IOP Conf Ser: Mater Sci Eng 2017;242:012111.
[76] Tjoa E, Guan C. A survey on explainable artificial intelligence (XAI): towards medical XAI. Cornell University. arXiv:1907.07374 [cs.LG]; October 15, 2019.
[77] Rouse M. The Internet of Things (IoT). TechTarget; March 2019.
[78] Hermann M, Pentek T, Otto B. Design principles for industrie 4.0 scenarios. In: Proceedings of 2016 49th Hawaii International Conference on Systems Science, January 5–8, Maui, Hawaii; 2016. doi:10.1109/HICSS.2016.488.
[79] Li DX, Eric LX, Ling L. Industry 4.0: state of the art and future trends. Int J Prod Res 2018;56(8):2941–62. https://doi.org/10.1080/00207543.2018.1444806.
[80] GTAI (Germany Trade & Invest). Industries 4.0-smart manufacturing for the future. Berlin: GTAI; 2014.
[81] Thuemmler C, Bai C. Health 4.0: how virtualization and big data are revolutionizing healthcare. Cham: Springer; 2017.
[82] Mullainathan S, Spiess J. Machine learning: an applied econometric approach. J Economic Perspect 2017;31(2):87–106. https://doi.org/10.1257/jep.31.2.87.
[83] Bresnick J. Understanding the many V's of healthcare big data analytics. HealthITAnalytics; June 05, 2017.
[84] Lohr S. For big-data scientists, 'Janitor Work' is key hurdle to insights. New York Times; 2014.
[85] Heath S. More hospitals invest spending in healthcare data security. HealthITSecurity; February 1, 2016.
[86] Catalyst. Healthcare big data and the promise of value-based care. NEJM; January 1, 2018.
[87] Omelchenko D. What is blockchain in layman's terms. ihodl.com; September 4, 2017.
[88] Zaria A, Ekblaw A, Vieira T, Lippman A. Using blockchain for medical data access and permission management. In: 2nd International Conference on Open and Big Data (OBD); August 22–24, 2016.
[89] Anuraag A, Vazirani AA, Odhran O'Donoghue O, Brindley D. Implementing blockchains for efficient health care: systematic review. J Med Internet Res 2019.
[90] Morrissey D. How blockchain technology is helping the healthcare industry evolve. Electronic Communications Network (ECN) Magazine; 2019.
[91] Owen-Hill A. What's the difference between robotics and artificial intelligence? Robotiq; July 19, 2017.
[92] Nichols G. Robotics in business: everything humans need to know. Robotics; July 18, 2018.
[93] https://en.wikipedia.org/wiki/Robotics; 2019.
[94] Kappagantula S. Robotic process automation - all you need to know about RPA. Edureka; May 22, 2019.
[95] Swetha B. A simple explanation of how robots work. TechSpirited; December 17, 2018.
[96] Ippolito PP. Need for explainability in AI and robotics. Toward Data Science; April 18, 2019.
[97] Frankenfield J. Chatbot. Investopedia; June 26, 2019.
[98] National Highway Transportation Safety Administration (NHTSA). Automated driving systems. Federal Registry. https://transportation.gov/; November 23, 2018.
[99] Rhodes MG. Self-driving cars explained. Dryve; November 28, 2017.
[100] https://www.tesla.com/support/autopilot.
[101] Ohnsman A. Self-driving unicorn Aurora, backed by Amazon, is buying a laser lidar maker. Forbes; May 23, 2019.
[102] Gadam S. Artificial intelligence and autonomous vehicles. Medium; April 19, 2018.


[103] Catania LJ, Nicolitz E. Artificial intelligence and its applications in vision and eye care (Chapter 2). In: Advances in optometry and ophthalmology 2018 textbook. pp. 9–10. Elsevier Inc.; 2018. https://doi.org/10.1016/j.yaoo.2018.04.001.
[104] Garvert MM, Friston KJ, Dolan RJ, et al. Part 2: Subcortical amygdala pathways enable rapid face processing. Science Direct, Vol. 102. Elsevier; 2014. p. 309–16.
[105] Mujica-Parodi LR, Cha J, Gao J. From anxious to reckless: a control systems approach unifies prefrontal-limbic regulation across the spectrum of threat detection. Frontiers in Systems Neuroscience, Vol. 11, Article 18; April 2017.
[106] Catania LJ, Nicolitz E. Artificial intelligence and its applications in vision and eye care (Chapter 2). In: Advances in optometry and ophthalmology 2018 textbook. pp. 11–13. Elsevier Inc.; 2018. https://doi.org/10.1016/j.yaoo.2018.04.001.

SECTION II

Artificial intelligence (AI): Applications in health and wellness

"Nothing in medicine ever comes easy, and all of the intelligence in the world, artificial or not, won't change that."
Jason Moore, Director of the University of Pennsylvania's Institute for Biomedical Informatics, Perelman School of Medicine

Introduction

We are all interested in health and wellness for ourselves personally, our loved ones, and for humanity. We tend to think of health as defined by the World Health Organization (WHO) since 1948 as ". . .a state of complete physical, mental, and social well-being and not merely the absence of disease or infirmity" [1]. Although there is no consensus on the concept of wellness and well-being, there is general agreement that, at a minimum, ". . .well-being includes the presence of positive emotions and moods (e.g., contentment, happiness), the absence of negative emotions (e.g., depression, anxiety), satisfaction with life, fulfillment and positive functioning" [2].

Beyond a personal interest in health and wellness, we must also consider the concept of "public health." It is "the art and science of preventing disease, prolonging life, and promoting health through the organized efforts and informed choices of society, organizations, public and private communities, and individuals" [3]. Sadly and acutely, the COVID-19 pandemic (Chapter 8) has made the world painfully aware of the devastating implications and direct influence the science of epidemiology and the discipline of public health have on our lives. By this definition, we begin to understand that our personal health and wellness is dependent on a system greater than any one individual. Indeed, it is a system in which the whole is greater than the sum of its parts. It is a public system whose goal is to deliver the best care to each individual it serves. That enormous collective goal has created the largest "business" in the world, the business of health care. In the United States alone, health care expenditures in 2018 exceeded $3.65 trillion [4]. Projecting what that expenditure may rise to over the next ten years is not an exercise about which anyone wants to think. But we must.

Governmental agencies (GOV) and non-governmental organizations (NGOs) that manage the business and administration of health care in the United States and globally face the formidable challenge of dealing with the economics, financing, access, availability, and delivery of


health care. They also must consider the quality of personal and public health and wellness care we all seek and expect. These agencies' toolkits vary widely, including ever-increasing technologies and frameworks, programs, and systems. But the vast array of demographics (e.g., geographic, age, socioeconomics, cultural) and epidemiologic factors (e.g., etiologies of diseases, risk factors, patterns, frequencies) related to health and wellness is overwhelming. Without expanding and "disruptive" technologies, these dedicated and hardworking GOV and NGO agencies will continue to lose ground to the ever-increasing challenges they face.

Enter artificial intelligence! Certainly not a "turnkey solution" to the immense task of managing the business of health care, but AI ". . .promises to be truly life-changing, i.e., disruptive. From personal care to hospital care to clinical research, drug development, and insuring care, AI applications are revolutionizing how the health sector works to reduce spending and improve patient outcomes" [5]. We are entering a window in time where the landscape of health will start to be redefined, from treatment to cures, to prevention. This transformation will encompass sweeping changes in the pools of data we rely on; in the functional building blocks of the "work" of healthcare, such as doctors, hospitals, and emergency rooms; and in the economic undercurrents and data streams that will reset how the marketplace rewards value [6].

Section 2 of this book, "AI Applications in Health and Wellness," will deal with the business and administrative aspects of health care (Chapter 4) as well as the associated clinical diagnostic and treatment technologies (Chapters 5–8). Both the business and clinical categories are "vertically integrated," one of two terms we will be using multiple times, especially in Chapter 4, "AI Applications in the Business and Administration of Health Care." A generic definition of "integration, or to integrate" is simply "to form, coordinate, or blend into a functioning or unified whole" [7]. Vertical integration is more a business term than a health care term, but as established above, health care is indeed a business. Thus, we must examine it as such, that is, as a business providing health and wellness services. Therefore, vertical integration is the coordination of (health) services among operating units (i.e., providers, facilities [hospitals, etc.], insurers) that are at different stages or levels of the process of delivery of patient services [8]. By addressing the different stages and levels, vertically integrated systems intend to address the following:

• Efficiency goals
  • manage global capitation;
  • form large patient and provider pools to diversify risk;
  • reduce the cost of payer contracting.
• Access goals
  • offer a seamless continuum of care;
  • respond to state legislation.
• Quality goals

For clarity purposes, by definition, horizontal integration means the coordination of activities across operating units that are at the same stage or level in the process of delivering


services [8]. Our discussion on the business and administration of health care will focus on the broader, more comprehensive issues of vertical integration.

The second term and critical concept we will be discussing regarding the business and administration of health care is "interoperability." If you recall, in the first paragraph of the Preface of this book, I mention "interoperability" as "a word you'll be hearing a lot more about." Well, this is the beginning. It can be defined as the ability of different information systems, devices, or applications to connect, in a coordinated manner, within and across organizational boundaries to access, exchange, and cooperatively use data amongst stakeholders to optimize the health of individuals and populations [9]. In other words, interoperability is the practical implementation and application of vertical integration within the health care industry.

There are varying degrees of interoperability. Each demonstrates the types of information exchange in which organizations may engage. There are 4 types of exchange [9]:

1. "Foundational" interoperability establishes the inter-connectivity requirements needed for one system or application to share data with and receive data from another;
2. "Structural" interoperability defines the syntax (the arrangement of words and phrases to create well-formed sentences in the language) of the data exchange, where there is a uniform movement of healthcare data from one system to another such that the clinical or operational purpose and meaning of the data is preserved;
3. "Semantic" interoperability is the ability of 2 or more systems to exchange information and to interpret and use that information. This ability supports the electronic exchange of patient data and information among authorized parties via potentially disparate health information and technology systems and products; and
4. "Organizational" interoperability facilitates the secure, seamless, and timely communication and use of data within and between organizations and individuals.

It becomes apparent from the definitions of vertical integration and interoperability that their dual roles are essential in organizational health care and its maximally efficient communications, coordination, delivery, and ultimate quality patient outcomes.

Table Intro 2.1 Health care systems and concepts.
• Big data analytics
• Blockchain
• Public Health
• Health information and records: Electronic Health Record (EHR)
• Population Health
• Precision Health
• Personalized Health
• Predictive Analytics
• Descriptive Analytics
• Prescriptive Analytics
• Preventive Health

The practical


implementation of the concepts and tools needed to realize these efficiencies (Table Intro 2.1) lies in the systems AI is making possible through its digital manipulative capabilities of massive amounts of data (i.e., big data analytics). The analysis and examples of the applications of these concepts and tools become the basis for our discussions in the following Chapters 4–7.

You will recall the in-depth descriptions of big data analytics and blockchain from Section 1, Chapter 3. They are included in Chapter 3, "The Science and Technologies of Artificial Intelligence (AI)," versus this Chapter 4, "AI Applications in the Business and Administration of Health Care," because they are more AI-related technologies than actual health care technologies. But their vital and fundamental relationship to health care (in numerous systems) will become evident in this Section. Hopefully, the information in the Chapter 3 descriptions will help you better understand the applications of big data analytics and blockchain technologies directly to the business and administration of health care. If at any time these applications and/or benefits are not apparent to you, a quick review of the Chapter 3 explanations may be helpful.

Because the AI tools and health care concepts we will be discussing are vertically integrated and interoperable, by definition, many of them will apply to multiple systems to be addressed. As such, it becomes impossible to explain some concepts before they are mentioned relative to another system. What we will do for clarity and continuity in this Chapter 4 is start with a generic discussion of "AI applications in government agencies (GOVs), nongovernmental organizations (NGOs), and third-party health insurers." Then we will follow with numbered discussions [labeled TEXT #1 through TEXT #10] of each of the major AI health care systems and concepts to be covered in the Chapter. Wherever the application of an idea or system is mentioned relative to another idea or system discussed (i.e., interoperability), it is accompanied by its respective [TEXT #, Page #]. If needed for better contextual understanding, at any point, you can read or review the individual concept or system referenced.

Finally, the goal of Section 2 of this book is to present as many as possible of AI's applications and influences on the most relevant business and clinical issues in health and wellness care. In that the amount of current and evolving applications has become overwhelming, I pledge to try my best to cover the most significant developments in each category. As I mentioned in the introduction to Section 1, what we will be doing in this Chapter 4 is highlighting for discussion the "top 3" AI technologies and/or programs (listed as "Primary [Topics] 1 through 3") in each of the health care categories to be covered. Then we will include a listing (with footnote citations) of 3 additional related AI technologies (listed as "Additional [Topics] 4 through 6") in current use or development, no less significant but impossible to cover in full. The reader will be able to select any of the 3 additional citations that they find interesting or that may have direct relevance to them personally or professionally for further referenced reading and research. And finally, every 2–3 years, we will present subsequent editions of this book with new "Primary" and "Additional" listings of updated AI technologies for your review and reference.


References
[1] World Health Organization. About the World Health Organization. Constitution. http://www.who.int/governance/eb/constitution/en/ [accessed 27.12.17].
[2] Well-being concepts. National Center for Chronic Disease Prevention and Health Promotion, Division of Population Health; 2018.
[3] Winslow CEA. Introduction to public health. Office of Public Health Scientific Services, Center for Surveillance, Epidemiology, and Laboratory Services, Division of Scientific Education and Professional Development; 2018.
[4] Sherman E. U.S. health care costs skyrocketed to $3.65 trillion in 2018. Fortune Magazine; 2019.
[5] Intel AI Insights Team. AI and healthcare: a giant opportunity. Forbes; 2019.
[6] Laraski O. Technology alone won't save healthcare, but it will redefine it. Forbes Insights; 2019.
[7] Integrate. Merriam-Webster dictionary; 2020.
[8] Pan American Health Organization. Integrated delivery networks: concepts, policy options, and road map for implementation in the Americas; 2008.
[9] National Academy of Medicine (NAM). Procuring interoperability: achieving high-quality, connected, and person-centered care. Washington, DC; 2018.

4 AI applications in the business and administration of health care

4.1 AI applications in government agencies (GOVs), non-governmental organizations (NGOs) and third-party health insurers

In 2013 the Governor of Ohio, John Kasich, said, "For those that live in the shadows of life, those who are the least among us, I will not accept the fact that the most vulnerable in our state should be ignored. We can help them" [1]. Such is the belief that government (GOV) agencies (Table 4–1 [2]) and non-governmental organizations (NGOs) (Table 4–2 [3]) abide by. Notwithstanding that laudable goal, the administrative complexity of the United States health care system is one reason why the U.S. spends double the amount per capita on health care compared with other high-income countries [4]. GOVs and NGOs are now using AI to automate decision-making and parts of their supply chains, and to refine compliance functions and financial and tax reporting. As AI affects more of the healthcare system, consumers may not even recognize these influences because most happen behind the scenes [5].

Systemwide health care costs in 2017 were set at $1.1 trillion. Of that sum, $504 billion is excess from supplier time wastefulness, waste, fraud, and misuse [6]. The second greatest classification of loss ($190 billion), as indicated by the NIH Institute of Medicine, is credited to excessive regulatory costs (Fig. 4–1). The utilization of artificial intelligence in expanding efficiencies and in distinguishing, monitoring, and correcting misuse is basic to this monetary crisis in health care services. AI has effectively identified a few areas for reform, from the design of treatment plans to automating repetitive tasks to prescription and medication creation and management [7]. Forbes Magazine predicts that AI for healthcare IT applications will exceed $1.7 billion by 2019. They also predict that introducing AI into healthcare workflows will result in a 10–15% productivity gain over the next 2–3 years [8]. This increased productivity will benefit health care delivery administrative and cost efficiencies. Examples of these GOV, NGO, and third-party AI applications are documented in the following literature reviews.

4.1.1 Primary AI applications GOVs, NGOs, and third-party health insurers (1, 2, 3)

1. Transferring time-consuming human tasks to machines [9]: The "iron triangle" in health care defines 3 interlocking elements: access, affordability, and effectiveness. AI applications can provide conventional strategies for cutting costs, improving treatment,


Table 4–1 United States governmental health agencies.
• ACL Administration for Community Living (https://acl.gov)
• Administration for Children and Families (www.acf.hhs.gov)
• Agency for Healthcare Research and Quality (www.ahrq.gov)
• Agency for Toxic Substances and Disease Registry (www.atsdr.cdc.gov)
• Centers for Disease Control and Prevention (www.cdc.gov)
• Centers for Medicare and Medicaid Services (www.cms.hhs.gov)
• Food and Drug Administration (www.fda.gov)
• Health Care Finance Administration (www.ncbi.nlm.nih.gov/books/NBK218421/)
• Health Resources and Services Administration (www.hrsa.gov)
• Home of the Office of Disease Prevention and Health Promotion (https://health.gov)
• Indian Health Service (www.ihs.gov)
• Inspector General Office, Health and Human Services Department (https://oig.hhs.gov)
• Institute of Medicine website (www.iom.edu)
• Maternal and Child Health Bureau (https://mchb.hrsa.gov)
• Mine Safety and Health Administration (MSHA) (www.msha.gov)
• National Center for Health Statistics CDC (www.cdc.gov/nchs/index.htm)
• National Institutes of Health (www.nih.gov)
• National Library of Medicine (https://www.nlm.nih.gov)
• National Institute of Nursing Research (www.ninr.nih.gov)
• Occupational Safety and Health Administration (https://www.osha.gov)
• Office of Minority Health (OMH) (www.minorityhealth.hhs.gov)
• Substance Abuse and Mental Health Services Administration (www.samhsa.gov)
• The National Institute for Occupational Safety and Health (NIOSH) CDC (www.cdc.gov/niosh/index/htm)
• U.S. Public Health Service Home (www.usphs.gov)
• US Department of Health and Human Services (www.hhs.gov)
• US Environmental Protection Agency (www.epa.gov)
• VA.gov (www.va.gov)

Data from Federal Registry, 2019.

and improving availability. "What we see now is a path to unlocking that triangle so you can improve one side without breaking the other," says Kaveh Safavi, head of Accenture's global health practice. The solution to health care's cost-structure issue lies in transferring time-consuming human tasks to machines while empowering patients, where possible, to self-manage their needs. Such an approach can decrease the amount of human time and effort required to improve patients' health. AI could help address some 20% of unmet clinical demand, according to Accenture.
2. Performance-based contracting [10]: When AI serves various stakeholders (providers, insurers/payers, pharma companies, etc.), more innovations are accomplished in healthcare. Challenges in implementing precision medicine [Text #10, page 101] have been addressed through a process called "performance-based contracting." This is a transactional methodology that attempts to align payments with real-world clinical performance. This new methodology shares the risk across a range of stakeholders. Financial incentives for real-world outcomes give stakeholders a share of the financial risk and

Table 4–2 Nongovernmental organizations (NGOs) working in global health.

International organizations
• The Global Fund to Fight AIDS, Tuberculosis and Malaria
• Joint United Nations Programme on HIV/AIDS (UNAIDS)
• World Bank
• World Health Organization (WHO)
Scientific organizations
• American Association for the Advancement of Science (AAAS)
• American Society for Microbiology (ASM)
• American Society of Tropical Medicine and Hygiene (ASTMH)
• American Thoracic Society (ATS)
• Coalition for Epidemic Preparedness Innovations (CEPI)
• Consortium of Universities for Global Health (CUGH)
• CRDF Global
• The Global Health Network
• Infectious Diseases Society of America (IDSA)
• International Society for Infectious Diseases (ISID)
• International Diabetes Federation (IDF)
• Planetary Health Alliance
Advocacy/policy organizations
• Center for Strategic and International Studies Global Health Policy Center
• GBCHealth
• The Earth Institute
• Global Alliance for Chronic Diseases (GACD)
• Global Health Council
• Global Health Technologies Coalition (GHTC)
• Kaiser Family Foundation (KFF) U.S. Global Health Policy
• Research!America Global Health R&D Advocacy
Foundations
• Africare
• Bill and Melinda Gates Foundation
• Foundation for NIH (FNIH)
• UN Foundation (UNF)
• Wellcome Trust
Other resources
• Gapminder
• Institute for Health Metrics and Evaluation (IHME)
• Worldmapper
Data from Nongovernmental organizations (NGOs) working in global health research. Fogarty International Center at the National Institutes of Health; 2019.

rewards to a specified treatment. Challenges that do exist with this contracting methodology include managing uncertainties, identifying metrics, and forecasting results. The full risk remains an uncertainty, and the concept of betting on best treatments is a game of chance. Prediction rates have improved more than 20-fold with neural networks compared to non-predictive methodologies, with preliminary accuracy levels of over 80%. By


FIGURE 4–1 Sources of waste in American health care. Over $500 billion in health care spending is excess from supplier time wastefulness, waste, fraud and misuse. From: Institute of Medicine. National Academy of Medicine. NIH. 2020.

leveraging these AI models, uncertainty diminishes by more than 50%, meaning more predictive and reliable performance-based terms. AI capabilities can serve to align value with health outcomes through such methodologies. The value of implementing performance-based agreements at scale across multiple stakeholders will extend to individual patients. More positive and more precise treatment results for patients will be the result of incentivizing health outcomes. Given the cost advantages, shared cost-reduction will occur by mitigating financial risks.
3. Fraud detection [11]: Besides being illegal, fraud is a costly problem for health care organizations and insurers. Its detection relies on computerized (rules-based) and manual reviews of medical claims. Besides being time-consuming, this process is cumbersome and unable to intervene quickly, identifying anomalies only after the incident. AI-supported data mining through neural networks, as an example, can conduct high-speed searches of Medicare claims for patterns associated with medical reimbursement fraud. It is estimated such AI systems could provide $17 billion in annual savings by improving the speed and accuracy of fraud detection.
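As a hedged illustration of the pattern-search idea (not the actual system the source describes), the sketch below applies an off-the-shelf unsupervised anomaly detector to synthetic claim features; the features, values, and contamination rate are assumptions for demonstration only.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic claims: columns are [billed_amount, n_procedures]; the values and
# features are invented for illustration, not real Medicare data.
normal = np.column_stack([rng.normal(200, 50, 500), rng.poisson(2, 500)])
suspect = np.array([[5000, 40], [3500, 25]])          # implausibly large outliers
claims = np.vstack([normal, suspect])

# Unsupervised anomaly detection: flag claims deviating from the bulk of
# historical patterns, as a first pass before manual review.
model = IsolationForest(contamination=0.01, random_state=0).fit(claims)
flags = model.predict(claims)                          # -1 marks an anomaly
print(np.where(flags == -1)[0])                        # indices flagged for review
```

The advantage over rules-based review is speed and the ability to surface unusual patterns no rule anticipated; flagged claims would still go to human investigators.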

4.1.2 Additional AI applications to GOVs, NGOs, and third-party health insurers (4, 5, 6)

4. "IBM Watson Care" [12]: This Watson program helps organizations unlock and integrate the full breadth of information from multiple systems and care providers, automate care management workflows, and scale to meet the demands of growing populations under management.


5. "Costly problem of dosage errors" [13]: A mathematical formula developed with the help of AI has correctly determined the dose of immunosuppressant drugs to administer to organ transplant patients.
6. "AI for Humanitarian Action" [14]: Microsoft is sponsoring a $40 million, five-year AI initiative to focus on 4 priorities: (1) helping the world recover from disasters; (2) addressing the needs of children; (3) protecting refugees and displaced people; and (4) promoting respect for human rights.

4.2 Big data analytics in health care [Text #1]

A comprehensive definition and description of big data analytics as a concept and tool of AI are presented in Chapter 3 (page 29). In this chapter, we will focus on the applications of big data analytics in health care. Perhaps more than any other concept or technology discussed in the forthcoming chapters, you will find a considerable amount of vertical integration and interoperability associated with big data analytics.

Seventy-seven percent of healthcare executives reported that their organizations are accelerating investments in big data analytics and artificial intelligence (AI) [15]. Big data (AI) technology enables healthcare organizations to analyze an immense volume, variety, and velocity of data across a wide range of healthcare networks. This supports evidence-based decision-making and actionable intervention. The various analytical techniques include systems such as descriptive analytics [Text #6, page 98]; diagnostic analytics [Text #7, page 99]; mining/predictive analytics [Text #8, page 99]; and prescriptive analytics [Text #9, page 100]. These methodologies are ideal for analyzing a large proportion of text-based health (structured) documents and other unstructured (e.g., physicians' written notes and prescriptions, medical imaging, etc.) clinical data [16]. The main challenge in this field is the enormous and continuing increase in the data generated within the health care domain. Big data analytical tools save lives, develop innovative techniques to diagnose and treat various diseases, and provide innovative healthcare services by extracting knowledge from patients' care data. The following are active examples of this big data analytics effect.

4.2.1 Primary AI literature reviews of big data analytics (1, 2, 3)

1. Clinical Decision Support Services (CDSS) [17]: Big Data technologies provide new, powerful instruments that gather and jointly analyze large volumes of data collected for different purposes, including clinical care, administration, and research. This makes the “Learning Healthcare System Cycle” possible, where healthcare practice and research work together. Providing fast access to the same set of data available for research purposes enables clinical decision support. The proper design of dashboard-based tools enables precision medicine [Text #10, page 101] decision-making and case-based reasoning.


The concept of Clinical Decision Support Services (CDSSs) embedded in Big Data represents a new way to support clinical diagnostics, therapeutic interventions, and research. This information, properly organized and displayed, reveals clinical patterns not previously considered. New reasoning cycles are generated, and explanatory assumptions can be formed and evaluated. To properly model and update the different aspects of clinical care, CDSSs need to support their relationships. Models of clinical guidelines and care pathways can be very effective in comparing analytic results with expected behaviors. This may allow effective revision of routinely collected data to gain new insights about patients’ outcomes and to explain clinical patterns. These actions are the essence of a Learning Health Care System.

2. Big data and cost of health care [18]: As the aging population increases the burden on the healthcare system, administrators and politicians assess the most effective ways to manage the country’s healthcare network. Meanwhile, the health care provider pool continues to decrease. Healthcare administrators are looking seriously at AI as a way to improve the efficiency of care delivery. Eventually, big data technology supported by AI will increase the range and effectiveness of providers and help mitigate the global shortage of medical professionals. Big data and AI can help address the growing care provider shortage in many ways [19]. Healthcare providers will have to leverage big data technology to make the biggest impact. These technologies could support the frameworks for medical technologies ranging from robotic surgical assistants to highly advanced expert diagnostic systems. Healthcare providers will continue to provide care to underserved populations by using big data technology to make treatment more accessible. Big data technologies will also support care providers in making improvements in the delivery of healthcare services [20]. Big data systems, in combination with AI technology, can enhance the ability to analyze tumors and make accurate diagnoses. Medical researchers and the healthcare community are also enthusiastic about AI’s potential to serve as a resource for quickly restoring functionality for injured and sick patients. Yet another valuable potential benefit of big data technology is decreasing the cost of caregiving. The management consulting firm Accenture forecasts that care providers could use AI to slash operational costs by $150 billion per year by 2026 [18]. Using AI technology, healthcare organizations will be able to find new opportunities for analyzing the big data collected from patients’ Internet-connected medical devices as well as healthcare provider information networks. This approach will allow organizations to cut costs, improve community wellness, and lower the cost of providing care.

3. Big data for critical care [21]: Data dependency is paramount in the critical care department (CCD) in any of its forms: intensive care unit (ICU), pediatric intensive care unit (PICU), neonatal intensive care unit (NICU), or surgical intensive care unit (SICU). This dependency involves very practical implications for medical decision support systems (MDSS) at the point of care [22]. ICUs care for acutely ill patients. Many of these, and particularly SICU patients, are technologically dependent on life-sustaining devices,


including infusion pumps, mechanical ventilators, catheters, and so on. Beyond treatment alone, prognosis is extremely important and is assisted by combining different data sources in critical care. Medical device connectivity is essential for providing a clinical decision support framework in the different types of ICU. While EHRs [Text #3, page 88] offer enormous workflow benefits, documentation, and charting systems, they are no stronger than the data they convey. Providers’ care can be augmented by automated and validated data collection through a seamless form of medical device connectivity and interoperability, supported both inside and outside the clinical environment and capable of following the patient throughout the care process. Although 80% of medical data is unstructured (physician notes, medical correspondence, individual EMRs, lab and imaging systems, CRM systems, finance and claims), it is all clinically relevant [23]. These data can be coordinated across multiple locations for better treatment and prognosis. Big Data technology can capture all of the information about a patient, providing a more complete view of the medical record and access to this valued data. It can be categorized into clinical and advanced analytics, providing critical building blocks for sustainable healthcare systems that are more accessible. Such Big Data Analytics methods are invaluable in an extremely data-intensive environment such as the ICU.

4.2.2 Additional AI literature reviews of big data analytics (4, 5, 6)

4. Big data and reinforcement learning (RL) [24]: Big data and RL provide unique opportunities for optimizing treatments in healthcare.

5. Big data and IoT in a “smart ambulance” system [25]: A novel approach uses a smart ambulance equipped with IoT (Internet of Things) and big data technology to disseminate information to a main center or hospital [25].

6. The promise of big data and AI in clinical medicine and biomedical research [26]: The use of big data and AI to help guide clinical decisions is a central aspect of precision medicine [Text #10, page 101] initiatives.

4.3 Blockchain in health care [Text #2]

The description of blockchain technology in Chapter 3 (page 59) stated that “. . .the intersection and convergence of blockchain technology with AI and big data [Text #1, page 83] may prove to be one of the strongest (and most disruptive) advances in the future of health care.” Forty percent of health executives see blockchain as one of their top 5 priorities [27]. Blockchain, in combination with AI and big data, has emerged as a new way to enable secure and efficient transactions as well as to help health care organizations utilize emerging technologies. With the addition of AI technologies, including (but not limited to) computer vision, natural language processing, machine learning, and speech recognition, blockchain technology delivers high accuracy in information treatment and workforce augmentation [28].
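To ground the discussion, the sketch below builds a toy hash-linked ledger in Python. It is illustrative only, and it omits the consensus protocols, permissioning, and encryption that real health care blockchains require; what it does show is why a chained ledger resists tampering.

```python
# Minimal sketch of a hash-linked ledger: each block commits to the previous
# block's hash, so altering any record invalidates every later block.
# Illustrative only; omits consensus, permissioning, and encryption.
import hashlib
import json

def make_block(record: dict, prev_hash: str) -> dict:
    body = {"record": record, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

chain = [make_block({"patient": "anon-001", "event": "lab result"}, "0" * 64)]
chain.append(make_block({"patient": "anon-001", "event": "prescription"},
                        chain[-1]["hash"]))

def verify(chain) -> bool:
    # Recompute every hash and check each link to its predecessor.
    prev = "0" * 64
    for block in chain:
        body = {"record": block["record"], "prev_hash": block["prev_hash"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev or digest != block["hash"]:
            return False
        prev = block["hash"]
    return True

print(verify(chain))  # True; editing any stored field makes this False
```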


Computer vision is used in coordination with natural language processing to convert the millions of health records and notes, transmitted as digital documents in the form of images, into usable text. A machine learning feedback loop, driven by manual correction of incorrectly transcribed information, continuously trains the system for greater accuracy. AI’s interoperability with a health care blockchain can then enable the benefits of distributed records, combined in sophisticated and powerful ways with cognitive technologies, to target health insights for continuous secured, encrypted access. AI’s ability within blockchain to identify patterns hidden (to the naked eye) in semi-structured datasets provides meaningful connections among thousands of disparate entities. Permissible distribution of health information through blockchain could make a wide array of executable datasets available for these highly specialized, trusted, narrow AI agents. The ultimate goal is not just aggregating and analyzing data, but improving the care delivered to patients. The following reviews and programs illustrate the potential of AI and blockchain technology in health care.

4.3.1 Primary AI literature reviews of blockchain (1, 2, 3)

1. Health care industry analysis of blockchain from 2018 to 2025 [29]: The health care system is woefully inadequate at sharing information between hospitals, physicians, institutions, and other health providers. This lack of interoperability makes transmission, retrieval, and analysis of data very difficult, producing concerns regarding security, privacy, storage, and exchange of data. These shortcomings in the system can be addressed effectively by blockchain, in conjunction with AI. First, blockchain enables the complete elimination of third-party intermediaries. Second, it streamlines operational processes and provides significant cost containment through a more transparent procedure. Third, blockchain can also allow for a shift to a more value-based healthcare system versus the existing fee-based healthcare systems. It can use natural language processing (NLP) to improve patient involvement and create opportunities for more consumer-centric product segments and revenue streams. Blockchain can save the healthcare industry up to $100 billion per year by 2025 in data breach-related costs, IT costs, operations costs, support function and personnel costs, counterfeit-related fraud, and insurance fraud. Provider-associated health insurance fraud is also expected to see up to $10 billion in annual cost reductions. AI and blockchain used for healthcare data exchange will contribute the largest share, reaching a value of $1.89 billion by 2025. The use of blockchain has the potential to solve the widespread problems in healthcare information systems related to interoperability and non-standardization in the industry. Such shifts to blockchain-based solutions will require significant investments and efforts, including seamless integration with the current infrastructure. There may be provider and patient resistance to blockchain solutions. Also, industry support will be


necessary to bring standardization and promote interoperability between different networks developed by different enterprises and run on different consensus protocols.

2. A new project, Pharmeum (2019) [30]: The Pharmeum project is a system heavily reliant on blockchain and AI technology that allows for coexistence between doctors, pharmacies, regulators, and patients. Once achieved, it can help with predictive health [Text #8, page 99] in many ways, including analyzing patient health history and evolving patterns to provide definite auditability of prescriptions and medication flows. In evolving the current digital-analog hybrid system into a fully digital, AI- and blockchain-based system, Pharmeum will help doctors reduce wasted effort, lowering the potential for incorrect diagnosis, treatment, and prescription assignment. Doctors’ prescription histories are available at will in the system, and patients’ health history can be monitored, preventing incorrect medication or dosage instructions by using tablets and laptops for medical record collection and dissemination through a medication pad. Pharmeum also can resolve pharmacy inefficiencies with blockchain by writing review periods into prescription smart contracts and minimizing patient waiting times for medication. Pharmeum will include an AI function that will let pharmacies create diagnoses earlier in case of terminal illness through access to patient history, family history, symptom information, and other vital data. According to the British Medical Journal, 70% of medication-related treatment errors come from the prescribing process alone [31]. Eighty percent of these could be reduced or eliminated with an efficient, digitized AI prescription system. The technologies included in the Pharmeum blockchain include zero-knowledge proofs, prescription tokens, medicine asset tokens, and Pharmeum tokens. Personally Identifiable Information (PII) will be stored off-chain, while non-PII will be stored on-chain for use with AI and other factors.

3. Centers for Disease Control and Prevention (CDC) [32]: The CDC’s Office of Technology and Innovation and the National Center for Health Statistics worked with IBM to create a pilot program for an EHR blockchain. The digital ledger nature of blockchain makes it ideal to record transactions for data owners, healthcare providers, the CDC, and other agencies. Thus, all of the players involved, including users and auditors, can be confident that the data is secure and accurate for transfer. It also allows them to determine whether there has been interference with the data. The National Ambulatory Medical Care Survey, a service that collects information about ambulatory care services in hospital emergency rooms and doctors’ offices, submits data to the CDC in the pilot program. “There are a number of edits that we conduct on the data. There’s a rigorous process on the National Center for Health Statistics (NCHS) side to review the data, then store the data, and then make a public use file. We’ll be able to see exactly as the data moves through the lifecycle, who has access to it, and at which point. There are no limitations on the frequency of the data received or the size of it.” [32] The blockchain pilot program provides “complete transparency” to


the healthcare providers and CDC officials, while complying with privacy laws like the Health Insurance Portability and Accountability Act (HIPAA). The publication of immunization Clinical Decision Support (CDSi) algorithms by the CDC is another recent development. These algorithms are publicly available and can be embedded directly into patient EHRs [Text #3, page 88]. They enable adaptive immunization forecasting to tailor person-specific vaccine recommendations. They also couple with clinical, claims, social, and immunization registry data to recognize and interpret critical health factors. This ensures the right immunization is offered to the right person at the right time. This technology leveraging CDSi logic gives providers personalized prompts whenever a patient is seen, not just at annual visits [33].
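The CDC’s actual CDSi logic is far richer than anything shown here, but a toy sketch conveys the general shape of rule-driven immunization forecasting. The vaccine series name and intervals below are invented for illustration and are not the CDC schedule.

```python
# Toy sketch of rule-driven immunization forecasting in the spirit of CDSi.
# The series and intervals are invented; real CDSi logic encodes the full
# CDC schedule, including ages, intervals, and contraindications.
from datetime import date, timedelta

SCHEDULE = {"hep_b": [timedelta(days=0), timedelta(days=28), timedelta(days=168)]}

def next_dose_due(vaccine: str, birth: date, doses_given: list[date]) -> date | None:
    intervals = SCHEDULE[vaccine]
    if len(doses_given) >= len(intervals):
        return None  # series complete
    if not doses_given:
        return birth + intervals[0]
    return doses_given[-1] + intervals[len(doses_given)]

print(next_dose_due("hep_b", date(2024, 1, 1), [date(2024, 1, 1)]))  # 2024-01-29
```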

4.3.2 Additional AI literature reviews of blockchain (4, 5, 6)

4. Authenticating medical records [31]: Medical errors are the third leading cause of death in the United States. Blockchain technology is being used to facilitate the authenticity of medical records.

5. Clinical research and data monetization [34]: A major benefit of blockchain technology is moving data ownership from institutions and corporations into the hands of the people who generated the data.

6. Blockchain for diabetes health care [35]: Blockchain-based platforms exist for AI-assisted sharing of cross-institutional diabetic health care data.

4.4 Health information and records (electronic health record or EHR) [Text #3]

Record keeping is the foundation of the health care business. The introduction of the electronic health record (EHR, originally referred to as EMR for “electronic medical record”) has brought that process to new heights. EHR systems have the capacity to store data for each patient encounter, including demographic information, diagnoses, laboratory tests and results, prescriptions, radiological images, clinical notes, and more [36]. Primarily designed to improve healthcare efficiency, the secondary use of the EHR for clinical informatics applications is of value in many studies [37]. Secured, encrypted data can now be stored and processed electronically at all levels of data processing, from private office desktop computers to the most sophisticated database servers. Through IoT, blockchain [Text #2, page 85] technologies, and the “cloud,” AI can now be applied to all of these digital electronic systems. In 2009, 12.2% of hospitals and doctors were using EHR systems. By 2017, the percentage had grown to 75.5% [38]. Then, with the inclusion of electronic requirements for reimbursement under the “Administrative Simplification Compliance Act (ASCA)” [39] from CMS (Centers for Medicare and Medicaid Services) and HIPAA (Health Insurance Portability and Accountability Act), by 2019 virtually 100% of all health providers had implemented EHR systems. Beyond the administrative issues associated with EHRs, the dynamic clinical patient


information captured is of far more value. It provides opportunities for research, risk prediction algorithms, and the advancement of population health [Text #4, page 91] and precision medicine [Text #10, page 101] [40]. Notwithstanding the power of AI technology and its relationship to the EHR, the risks and dangers must also be recognized regarding the security of the highly personal and sensitive information in the health records process. Among the EHR systems highlighted below, you will see that security is the major issue with most, and blockchain technology is often the common denominator in providing such protection.

4.4.1 Primary AI literature reviews of health information and records (EHR) (1, 2, 3)

1. Diagnostic and/or predictive algorithms [41]: Leveraging digital health data through AI enables computer information systems to understand and improve care. Routinely collected patient data is approaching a volume and complexity never seen before. Using big data analytics [Text #1, page 83] to sort this data into predictive statistical models provides major benefits in precision medicine [Text #10, page 101], not only for patient safety and quality but also in reducing healthcare costs [42,43]. A problem in this approach to data processing is that 80% of the effort in an analytic model is preprocessing, merging, customizing, and cleaning datasets, not analyzing them for insights. This limits the scalability (expansion) of predictive models [44,45]. Also, there can be thousands of potential predictor variables (e.g., free-text notes from doctors, nurses, and other providers) in the EHR that are often excluded. This exclusion reduces the potential accuracy of the data and may produce imprecise, false-positive predictions [46]. Deep learning and artificial neural networks (ANNs) address these challenges to effectively unlock the information in the EHR. Natural language processing (NLP) is also invaluable in sequencing predictions and resolving mixed data settings. Given these strengths, AI systems are most capable of handling large volumes of relatively “messy” data, including errors in labels and large numbers of input variables [47,48]. These neural networks are able to learn representations of key factors and interactions from the data. The deep learning approaches can incorporate the entire EHR, including free-text notes. They produce predictions for a wide range of clinical problems and outcomes that outperform state-of-the-art traditional predictive models. By mapping EHR data into a highly organized set of structured predictor variables and then feeding them into a statistical model, the system can simultaneously coordinate inputs and predict diagnostics and medical events through direct machine learning [49].

2. Data-mining medical records: According to data from the U.S. Department of Health and Human Services, there is progress in the value-based healthcare delivery system (the healthcare delivery model in which providers, including hospitals and physicians, are paid based on patient health outcomes [50]) in the U.S. This runs almost parallel to the significant implementation rate of electronic health records (EHR) [51].


As the volume of patient data increases, finding tools to extract insights could grow more challenging and complex. Researchers are exploring how AI can extract useful information from massive and complex medical data sets [52]. Certain types of AI applications are emerging to improve analyses of medical records and are being implemented in the healthcare market. These AI applications include:

• Diagnostic Analytics [Text #7, page 99]: Defined as “a form of advanced analytics which examines data or content” to determine why a health outcome happened. More time and research will be needed to determine whether diagnostic analytics AI applications gain greater traction in the healthcare industry.

• Predictive Analytics [Text #8, page 99]: Companies and healthcare professionals use machine learning to analyze patient data in order to determine possible patient outcomes, such as the likelihood of a worsening or improving health condition, or the chances of inheriting an illness in an individual’s family. Predictive analytics platforms appear to deliver trends that are proving meaningful to healthcare systems and investors in the healthcare space.

• Prescriptive Analytics [Text #9, page 100]: An example of prescriptive analytics is when research firms develop machine learning algorithms to perform comprehensive analyses of patient data to improve the quality of patient management, such as handling patient cases or coordinating the flow of tasks, such as ordering tests.

3. EHR integration and interoperability: The Office of the National Coordinator for Health Information Technology (ONC) defines interoperability as “the ability of a system to exchange electronic health information with and use electronic health information from other systems without special effort on the part of the user.” [53] Users must be able to find and use the information, whether sending or receiving, as well as send to, and receive information from, third-party systems (independent IT vendors). In practical terms, integration is having automatic access (versus manual entry) into the EHR for clinical data from sources within and outside the health system, and being able to use that information when treating a patient. Interoperability is anticipated to be a key enabler of population-based alternative payment models, delivery reforms, and improved performance measurements. However, integrating third-party information involves more than finding, sending, and receiving. Only the more advanced EHR systems support integration, which is why there has been slower progress toward it [54]. EHR integration faces 2 main challenges. First, the technical infrastructure must make relevant information available through the user interface. APIs will be able to integrate relevant information when individual screens are accessed, or provide tabs linking to third-party content for worklists with pertinent details. Second, administrative challenges exist regarding willingness among key players (health systems, insurers, and vendors) to make integration happen. EHR integration will necessitate the healthcare industry establishing standards and prioritizing functional integration. Again, APIs can enable platforms that aggregate data across multiple providers or vendors. This could ultimately help facilitate a widely adopted, fully integrated digital healthcare ecosystem. Further integration can


occur as publicly available “open” APIs allow users to access data with few restrictions. The U.S. Department of Health and Human Services will likely create a definition of open APIs in healthcare that will include openly published specifications [55].
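As a minimal illustration of what an open, standards-based API makes possible, the sketch below reads a single FHIR Patient resource over HTTP. The base URL is hypothetical, and a production system would add OAuth2/SMART on FHIR authorization, which is omitted here.

```python
# Minimal sketch of reading a Patient resource through an open FHIR REST API.
# The base URL is hypothetical; production systems require OAuth2 / SMART on
# FHIR authorization, which this sketch omits.
import requests

FHIR_BASE = "https://fhir.example-hospital.org"   # hypothetical endpoint

resp = requests.get(f"{FHIR_BASE}/Patient/12345",
                    headers={"Accept": "application/fhir+json"},
                    timeout=10)
resp.raise_for_status()
patient = resp.json()
print(patient["resourceType"], patient.get("birthDate"))
```

The point of the standard is that this same request shape works against any conformant FHIR server, which is precisely the “without special effort” interoperability the ONC definition describes.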

4.4.2 Additional AI literature reviews of health information and records (EHR) (4, 5, 6)

4. Data extraction from faxes, with clinical notes entered into the EHR automatically [56];

5. Recording of the patient/doctor conversation incorporated into the EHR [57];

6. An innovative data pipeline using raw EHR data along with delivering FHIR (Fast Healthcare Interoperability Resources) standard output data [58].

4.5 Population health [Text #4]

Broadly defined, population health is “the health outcomes of a group of individuals, including the distribution of such outcomes within the group” [59]. It is a health research and organizational concept that has advanced enormously with expanding applications of AI. Because of its intimate association with AI and its vertical integration with so many organizational, business, administrative, and clinical health care issues, its elements must be understood individually as well as in their associations with the other health care topics discussed in this chapter. Population health comprises 3 main components: health outcomes, health determinants, and policies [60]. Population health outcomes are the product of multiple “inputs” or determinants of health, including policies, clinical care, public health [Text #12, page 111], genetics, behaviors (e.g., smoking, diet, and treatment adherence), social factors (e.g., employment, education, and poverty), environmental factors (e.g., occupational, food, and water safety), and the distribution of disparities in the population. Thus, population health can be thought of as the science of mathematically analyzing (with AI) the inputs and outputs of the overall health and well-being of a population and using this knowledge to produce desirable population outcomes. A population’s health can be analyzed at various geographic levels (e.g., countries, states, counties, or cities), as can health disparities based on race or ethnicity, income level, or education level [61]. Population health, in more practical terms, is the influence of social and economic forces combined with biological and environmental factors that shape the health of entire populations. The “affecting influences” in a defined population group are referred to as “determinants” or “independent variables” and the outcomes as “dependent variables.” These variables contribute to policies and interventions that can benefit the population. All of these demographic and epidemiological variables occur in small sample populations (e.g., a nursing home) and in large samples (e.g., a country). Fig. 4–2 presents an example of a small population (we’ll use a nursing home as our example) where we measure some key demographic and epidemiological factors (independent variables) among the


residents, including age, companionship, physical health, and cognitive memory. AI algorithms (regression analysis, Bayesian probabilities, and inference logic) are used to analyze the statistical results of these independent variables against one another as well as against other potential dependent variables (e.g., health and wellness, risks of aging, etc.), and associations and outcomes result (Fig. 4–3). In this small population (nursing home) example, greater age correlated positively with greater risk of falls; less companionship correlated with reduced health; reduced physical health correlated with reduced companionship; and greater cognitive memory correlated positively with increased companionship. These positive and negative correlations allowed appropriate professionals and caregivers to introduce corrective interventions, e.g., more focused physical therapy, and promoting social interactions and companionship for certain patient cohorts. These evidence-based interventions represent applied population health (Fig. 4–4). Similar to the example above for a small sample, the same approach can be (and is being) used at a global level. Fig. 4–5 illustrates the population health concept expanded to analyze the health of an entire country’s population (e.g., Country X). Whereas the demographic and epidemiologic dependent variables may differ from the small example, the AI

FIGURE 4–2 Population health (independent variables small scale). Example of a small population model where demographic and epidemiological factors (independent variables) are measured.

FIGURE 4–3 Population health (dependent variables small scale). AI algorithms (regression analysis, Bayesian probabilities and inference logic) analyze the statistical results of independent variables against one another as well as against other potential dependent variables (e.g., health and wellness, risks of aging, etc.).


FIGURE 4–4 Population health (interventions small scale). AI algorithms measure positive and negative correlations between dependent and independent variables allowing appropriate professionals and caregivers the ability to introduce corrective interventions.

FIGURE 4–5 Population health (large scale). The population health concept can be used in large scale assessments where AI analysis of demographic and epidemiologic independent variables produce the dependent variable outcomes and the intervention similar to small models.

analysis process producing the dependent variable outcomes and the intervention process remains similar. The outcomes derived from the analysis of independent versus dependent variables provide a road map for interventions and policies that address the negative outcomes and reinforce the positives. In effect, the potential health of the “population,” versus individual care, is enhanced.
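The small-population analysis described above can be sketched in a few lines of Python. The data below are simulated to echo the nursing-home variables of Figs. 4–2 and 4–3, and a simple correlation matrix stands in for the fuller regression and Bayesian machinery.

```python
# Sketch of the nursing-home example: correlate simulated independent
# variables (age, companionship, physical health) with a dependent
# variable (fall risk). All values are invented for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 200
age = rng.normal(82, 6, n)
companionship = rng.normal(5, 2, n)                      # visits per week
physical_health = 10 - 0.05 * age + 0.3 * companionship + rng.normal(0, 1, n)
fall_risk = 0.08 * age - 0.4 * physical_health + rng.normal(0, 1, n)

df = pd.DataFrame({"age": age, "companionship": companionship,
                   "physical_health": physical_health, "fall_risk": fall_risk})
print(df.corr().round(2))   # e.g., age correlates positively with fall_risk
```

The signs in the resulting correlation matrix mirror the narrative above: age moves with fall risk, while companionship, acting through physical health, moves against it, and those signs are what point caregivers toward interventions.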


It is worthwhile to differentiate here between the terms “population health,” “public health” [Text #12, page 111], and “community health.” Population health is the health outcome of a group of individuals. In a broader sense, population health includes the health outcomes themselves and the efforts to influence those outcomes. Public health tends to focus primarily on large-scale concerns or threats, such as vaccination and disease prevention, injury and illness avoidance, healthy behaviors, and minimizing outbreaks that jeopardize public health. It tends to be more focused on creating conditions in which individuals can be healthy. Community health shares many similarities with both population health and public health, but it tends to be more geographically based. Community health also addresses a broader spectrum of issues than population health or public health. Such issues include public policy influences, creating shared community resources, and conducting a more holistic approach to health and wellness [62]. One other related term is “Population Health Management (PHM),” a systematic approach whose goal is to enable all people in a defined population to maintain and improve their health. It is a practical application of population health within the medical delivery system [60]. AI applications in population health management are expanding, from tabletop exercise software for identifying strengths and weaknesses in the preparation and management of medical disasters [63] to reconfiguring management protocols or strategies at the personal and/or population level. The continuing growth of AI applications in PHM is unlimited, with a projected estimate of more than $100 billion in the next 2 decades [64]. Indeed, COVID-19 (see Chapter 8) is demonstrating the significant value population health will provide in applications to active pandemic management and future related planning. Population health management is a discipline within the healthcare industry that studies and facilitates care delivery across the general population or groups of individuals. Thus, it is within this domain that AI can help reduce health care costs. Also, regulations such as the Health Insurance Portability and Accountability Act (HIPAA) and the HITECH Act have helped advance the digitization of the health care industry and incentivize the development and adoption of electronic health records (EHR) [Text #3, page 88]. However, which of the reams of medical data collected by organizations and companies will prove the most valuable assets remains to be seen. Population health management tools are currently being used to analyze health care datasets. However, countless patterns and trends go undiscovered because clinicians may not ask the right question. AI’s unsupervised learning can fill this gap with minimal human intervention by analyzing data and discovering common patterns and anomalies. AI algorithms powered by unsupervised learning can assess health records data, financial data, patient-generated data, IoT devices, and other relevant sources to automatically discover patient cohorts that share unique combinations of characteristics; a minimal sketch of this idea follows below. Patterns derived from population health data can help health institutions engage in preventive [Text #11, page 106] and predictive care [Text #8, page 99]. Discovering costly and complex diseases at early rather than late stages of management and treatment can result in cost savings.
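As promised above, here is a minimal sketch of such cohort discovery: the example clusters synthetic patients with k-means, where the features and the number of clusters are invented for illustration.

```python
# Sketch of unsupervised patient-cohort discovery: cluster simulated patients
# by utilization and risk features. Features, values, and k are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Columns: ER visits/year, chronic condition count, medication adherence (0-1)
patients = np.vstack([
    rng.normal([0.5, 0.5, 0.95], [0.3, 0.5, 0.03], size=(300, 3)),  # healthy
    rng.normal([4.0, 3.5, 0.60], [1.0, 1.0, 0.10], size=(100, 3)),  # high-risk
])

X = StandardScaler().fit_transform(patients)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for k in range(2):
    print(f"cohort {k}: n={np.sum(labels == k)}, mean ER visits="
          f"{patients[labels == k, 0].mean():.1f}")
```

No one told the algorithm which patients were “high-risk”; the cohort structure emerges from the data, which is the essence of the unsupervised approach described above.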


Personalized healthcare [Text #10, page 101] of this nature may seem unattainable within the traditional healthcare model because of costs and logistics, such as paying medical professionals and scheduling appointments. Nonetheless, AI can lower cost and logistical barriers and complement care delivered by healthcare providers. By reaching many patients and promoting population-based recommendations, it can integrate strategies of PHM while at the same time tailoring care to individual patients, using a model of healthcare based on an “N of 1” [65]. Population health is a moving target whose goals, once achieved, will quickly be replaced with new ones. Because of this complexity and dynamic nature, population health is especially suitable for machine learning, which uncovers hidden patterns and predictive trends in large and disparate sources of data in short timespans.

4.5.1 Primary AI literature reviews of population health (1, 2, 3)

1. Health intelligence: how artificial intelligence transforms population and personalized health [66]: To present a coherent portrait of population health [67], AI methods extract health and non-health data at different levels of degree, coordination, and integration about populations and communities, their epidemiology, and their history of control of chronic diseases. Health intelligence is the critical metric of initiatives such as precision medicine [Text #10, page 101] [68]. Initiatives like the National Institutes of Health (NIH) “All of Us” Research Program [69] seek to introduce new models of patient-powered research for better health care solutions for prevention, treatment of disease, prediction of epidemics and pandemics, and improving the quality of life. Personalized healthcare [Text #10, page 101] is a system that seeks to manage disease by considering individual variability in environment, lifestyle, and genes for each person. In personalized healthcare, information technology is necessary to improve communication, collaboration, and teamwork between patients, their families, healthcare communities, and multidisciplinary care teams. AI is used to process personalized data, to elicit patient preferences, and to help patients (and families) participate in the care process. It also allows physicians to use this participation to provide high-quality, efficient, personalized care by personalizing “generic” therapy plans and connecting patients with information not typically included in their care setting. Clinicians and public health [Text #12, page 111] practitioners use personalized healthcare to deliver “personalized” therapy or interventions based on the best evidence to maintain high-quality patient care and create healthier communities. Future challenges include the acceptance of new technologies, especially for diagnostics in a clinical setting, and concerns related to scalability, data integration and interoperability, security, privacy, and the ethics of accumulated data. But notwithstanding such limitations, evolving AI tools and techniques are already providing substantial benefits through in-depth knowledge of individuals’ health and prediction of population health risks, and their use in public health is likely to increase substantially in the near future.


2. Transforming diabetes care through artificial intelligence [70]: A recent study of 300,000 patients with type 2 diabetes who were started on medical treatment found that after 3 months, 31% of patients had discontinued their diabetes medications. This increased to 44% by 6 months and 58% by 1 year. Only 40% eventually restarted diabetes medications [71]. Optimal care for persons with diabetes is often hampered by the absence of real-time monitoring of key health information necessary to make informed choices associated with intensive therapy and tight diabetes control. AI is employed to gather massive amounts of vital information to meet consumer demand in every business, including health care and, specifically, diabetic care. A 2017 survey found that 68% of mobile health app developers and publishers believe that diabetes continues to be the single most important health care field with the best market potential for digital health solutions in the near future, and that 61% see AI as the most disruptive technology shaping the digital health sector [72]. A review of the published literature documents the substantial advances in AI technology over the last 10 years and how it is helping patients with diabetes and their clinicians make more informed choices. Studies are underway that represent a diverse and complex set of innovative approaches aimed at transforming diabetic care in 4 main areas: automated retinal screening, clinical decision support, predictive population risk stratification, and patient self-management tools. Today, research suggests that AI approaches in these 4 vital areas of diabetic care are rapidly transforming care. Despite challenges, this review of recently published, high-impact, and clinically relevant studies suggests that diabetes is attracting top health care AI technology companies as well as start-ups that are using innovative approaches like population health and PHM to tackle daily challenges faced by people with diabetes. Many of the applications have already received regulatory approval and are on the market today. Many more are on the way.

3. Inching toward the data-driven future of population health management [73]: The healthcare industry, data analytics, and population health management were highlighted in 2019. Nonetheless, the industry is at an early stage of population health management. Currently, population health efforts tend to focus primarily on healthy individuals (versus the diabetic example in #2 above) in need of preventive screenings, but little on patients with chronic conditions (see Chapter 7, page 411), and not much beyond that. Population health management starts with data. Fragmented silos of data and communication must be dismantled, and a single, centralized data source is one way to do that. Storing the relevant data in one place means all participants in the system can access the same information. Thus, when providers start working with a patient, they can use a single source to see what the needs and requirements are, leading to more targeted treatments. Providers will be looking at advanced analytics technologies as new sources of data gain more clinical importance. These include [74]:

• Claims data;
• Electronic health record (EHR) data [Text #3, page 88];
• Social and community determinants of health;


• Patient-generated health data (PGHD);
• Prescription and medication adherence data.

Healthcare organizations are focusing more on the social determinants of health, like transportation, housing stability, and food security. AI is ideally suited to analyzing that data to help providers identify patients’ needs. AI can also power analytic health tools to help doctors make more informed treatment decisions. Analyzing and powering these tools through AI will become increasingly important as analytic health data becomes more widely used. In 2018, CMS issued its final 2019 Physician Fee Schedule and Quality Payment Program, which included plans to reimburse healthcare providers for specific remote AI patient monitoring and telehealth services. These plans suggest a healthcare system that is beginning to embrace virtual care and looking for new ways to improve population health. The road to data-driven, advanced care delivery in the industry may take some time, but it is gradually making its way. Providers are beginning to recognize the importance of access to a central data source, and the value and power of mobile technologies like remote patient monitoring. As reimbursement and the acceptance of technology increase, as COVID-19 (see Chapter 8) continues to reshape all areas of care, and as providers gain better access to needed data sources, there will be more adoption of data-driven devices.

4.5.2 Additional AI literature reviews of population health (4, 5, 6)

4. Machine learning in population health: opportunities and threats [75]: AI methods that are explainable (e.g., explainable AI, XAI), that respect privacy, and that make accurate causal inferences will help us take advantage of this opportunity.

5. How mobile health technology and electronic health records will change care of patients with Parkinson’s disease [76]: Non-obtrusive data collection will dominate the market as these tools interoperate with the personal EHR and other potentially health-related electronic databases such as clinical warehouses and population health analytics platforms.

6. Machine learning in radiology: applications beyond image interpretation [77]: Machine learning has the potential to solve many challenges that currently exist in radiology beyond image interpretation, including implications for population health.

4.6 Healthcare analytics (descriptive, diagnostic, predictive, prescriptive, discovery) [78] [Text #5]

The growing healthcare industry generates a large volume of useful data on patient demographics, treatment plans, payment, and insurance coverage, attracting the attention of clinicians and scientists alike. The field of health analytics provides tools and techniques that extract information from these complex and voluminous datasets and translate them into knowledge to assist decision-making in healthcare. Health analytics develops insights through the efficient use of (big) data and the application of quantitative and qualitative analysis. It generates fact-based decisions used for “planning, management, measurement, and learning” purposes [78]. It is a multidisciplinary field that uses mathematics, statistics,


FIGURE 4–6 Healthcare analytics. Data mining techniques in health care analytics fall under 4 categories: (1) descriptive (i.e., exploration and discovery of information in the dataset); (2) diagnostic (i.e., why something happened); (3) predictive (i.e., prediction of upcoming events based on historical data); and (4) prescriptive (i.e., utilization of scenarios to provide decision support).

predictive modeling, AI algorithms, and machine learning techniques to discover meaningful patterns and knowledge. AI applications allow health analytics to use data mining, text mining, and big data analytics [Text #1, page 83] to assist healthcare professionals in disease prediction, diagnosis, and treatment. The “Internet of Things” (IoT) serves as a platform to obtain medical and health-related data in numerous healthcare application settings. IoT is a digital technology that connects physical objects that are accessible through the Internet (see Chapter 2, page 23). These physical objects are connected using RFID (radio-frequency identification) technologies, including sensors/actuators and communication [79]. Combining health care analytics with IoT enables all the relevant data to be turned into actionable insight. IoT and analytics become an ideal combination for better healthcare solutions. This is resulting in an improvement in service quality and a reduction in cost [80]. According to some estimates, the application of data mining can save $450 billion each year in the U.S. healthcare system [81]. Such data mining techniques fall under 4 categories of analytics (Fig. 4–6 [82]), which provide increasing value and difficulty [83]:

1. descriptive (i.e., exploration and discovery of information in the dataset);
2. diagnostic (i.e., why something happened);
3. predictive (i.e., prediction of upcoming events based on historical data); and
4. prescriptive (i.e., utilization of scenarios to provide decision support).

Each of these health analytic types is described below followed by examples (3 primary and 3 additional) of their use in health care programs.

4.6.1 Descriptive analytics [Text #6] [84]

Descriptive analytics is the most basic type. It serves to answer questions like “what has happened?” Descriptive analytics analyzes real-time incoming and historical data to derive


insights on how to approach the future. Basic statistics and mathematical techniques are among the most frequently used methods for descriptive analytics. Health care data is often “siloed” (stored) in repositories that do not communicate easily with one another. Besides the challenge of resources to store the large amounts of data required for analysis, gaining access to and extracting information from these different sources can consume as much time and resources as the data analysis itself. The healthcare industry relies largely on traditional analysis that yields merely descriptive analytics, which summarizes raw data to make it easier to understand but requires humans to interpret past behavior and consider how it influences future outcomes. Machine learning addresses these challenges in descriptive analytics.
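A minimal example of descriptive analytics, using invented encounter data, is simply summarizing “what has happened” with basic aggregates:

```python
# Descriptive analytics sketch: summarize "what has happened" in a small,
# invented encounters table with basic aggregates.
import pandas as pd

encounters = pd.DataFrame({
    "department": ["ER", "ER", "cardiology", "ER", "cardiology"],
    "length_of_stay_days": [1, 3, 2, 5, 4],
    "readmitted_30d": [0, 1, 0, 1, 0],
})
print(encounters.groupby("department")
                .agg(visits=("length_of_stay_days", "size"),
                     mean_los=("length_of_stay_days", "mean"),
                     readmit_rate=("readmitted_30d", "mean")))
```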

4.6.2 Diagnostic analytics [Text #7] [85]

Diagnostic health analytics works on answering “why something happened.” It requires extensive exploration and focused analysis of the existing data, using tools such as visualization techniques to discover the root causes of a problem, and it helps users realize the nature and impact of issues. This may include understanding the effects of input factors and processes on performance. Diagnostic data analytics works backward from the symptoms to suggest the cause of what has happened. Doctors continue to be responsible for the final diagnosis, but they can use data analytics to save time and avoid possible errors of judgment. Subsequently, the results of each diagnosis, together with a description of the symptoms and additional contributing factors, are added to the database used for the analytics. This helps the diagnostic data analytics become increasingly accurate [86].
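A correspondingly minimal sketch of diagnostic drill-down, again on invented data, compares an outcome rate across a candidate root-cause factor:

```python
# Diagnostic analytics sketch: drill down on "why" readmissions happened by
# comparing rates across a candidate root-cause factor (invented data).
import pandas as pd

df = pd.DataFrame({
    "readmitted_30d":    [1, 0, 1, 1, 0, 0, 1, 0],
    "discharged_friday": [1, 0, 1, 1, 0, 0, 1, 1],
})
rates = df.groupby("discharged_friday")["readmitted_30d"].mean()
print(rates)  # a gap between the groups points to a factor worth investigating
```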

4.6.3 Predictive analytics [Text #8, page 99] [78]

Predictive analytics is a higher level of analytics that serves to answer questions such as “what could happen in the future based on previous trends and patterns?” Thanks to AI, “intelligence” is added to many mainstream healthcare applications, whereas relying on descriptive analytics alone is limited to reactive, rather than proactive, solutions. Predictive analytics, combined with human interpretation, can drive better decision-making. Thus, the advent of AI in health analytics now allows healthcare organizations to benefit from leveraging predictive and prescriptive algorithms to improve operations, reduce fraud, manage patient health, and improve patient outcomes. Predictive analytics involves using empirical methods (statistical and others) to generate data predictions as well as to assess predictive power. It uses statistical techniques, including modeling, machine learning, and data mining, that analyze current and historical data to make predictions. As an example, predictive analytics can identify high-risk patients and provide them treatment early, thus reducing unnecessary hospitalizations or readmissions. Predictive analytics analyzes past data patterns and trends to predict what could happen in the future. Frequent techniques used for predictive analytics include linear regression, multivariate regression, logistic regression, decision trees, random forest, naïve Bayes, and k-means [84]. At this functional level, a healthcare organization can more effectively monitor operational data and develop predictive analytics to improve operational performance.
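Of the techniques just listed, logistic regression is perhaps the simplest to illustrate. The sketch below trains a readmission-risk model on simulated data and flags high-risk patients; every feature, coefficient, and threshold is invented for illustration.

```python
# Predictive analytics sketch: logistic regression estimating 30-day
# readmission risk from simulated features, then flagging high-risk patients.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([rng.integers(0, 6, n),        # prior admissions
                     rng.normal(70, 12, n),        # age
                     rng.integers(0, 2, n)])       # lives alone (0/1)
logit = -6 + 0.8 * X[:, 0] + 0.05 * X[:, 1] + 0.7 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))       # simulated outcomes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]
print(f"{(risk > 0.5).sum()} of {len(risk)} test patients flagged high-risk")
```

In practice, the flagged patients would be routed to outreach or early treatment, which is exactly the hospitalizations-avoided use case described above.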


Big data analytics [Text #1, page 83] goes beyond improving profits and cutting down on waste. It can contribute to predicting epidemics, curing diseases, improving the quality of life, and reducing preventable deaths. Among these applications, predictive analytics is considered the next revolution, both in statistics and in medicine around the world. The COVID-19 pandemic (see Chapter 8) puts predictive analytics center-stage in future public health strategies.

4.6.4 Prescriptive analytics [Text #9, page 100] [83]

Prescriptive analytics is an advanced form of analytics. It serves to answer questions like “what should a health organization do?” Prescriptive analytics advises on possible outcomes and results through actions that are likely to maximize critical metrics. Among the prominent techniques used for prescriptive analytics are optimization and simulation. Prescriptive analytics is used when decisions are made among a wide range of feasible alternatives. It enables decision-makers to look into the consequences and expected results of their decisions and see the opportunities or problems. It also provides decision-makers with the best course of action to take promptly. Successful prescriptive analytics depends mainly on the adoption of 5 essential elements: (1) utilizing hybrid data, including both structured and unstructured data types; (2) integrating predictions and prescriptions; (3) considering all possible side effects; (4) using adaptive algorithms that can be tailored easily to each situation; and (5) robust and reliable feedback mechanisms [87]. Prescriptive analytics offers the potential for optimal future outcomes for healthcare decision-makers. Based on decision optimization technology, such capabilities enable users not only to recommend the best course of action for patients or providers but also to compare multiple “what if” scenarios to assess the impact of choosing one action over another [88].
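Optimization, one of the two prominent techniques named above, can be sketched with a small linear program; the coefficients and staffing limits below are invented for illustration and are not drawn from the source.

```python
# Prescriptive analytics sketch: choose an action via optimization.
# Allocate limited nursing hours across two wards to minimize expected
# adverse events. All coefficients and limits are invented.
from scipy.optimize import linprog

# Decision variables: hours assigned to ward A and ward B.
# Assume each nursing hour reduces expected adverse events by 0.02 (A) / 0.03 (B).
c = [-0.02, -0.03]                 # maximize total reduction = minimize negative
A_ub = [[1, 1]]                    # total staffed hours available
b_ub = [120]
bounds = [(40, 100), (40, 100)]    # minimum coverage required per ward

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x)  # e.g., [40., 80.]: send the flexible hours to ward B
```

Rerunning the program with different limits or coefficients is the code-level analogue of the “what if” scenario comparison described above.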

4.6.5 Primary AI literature reviews of health analytics (1, 2, 3)

1. Preventing opioid abuse through predictive analytics [89]: Big data analytics in healthcare might be the answer everyone is looking for in the opioid abuse crisis. Data scientists at Blue Cross Blue Shield have started working with analytics experts at Fuzzy Logix to tackle the problem. Using years of insurance and pharmacy data, Fuzzy Logix analysts have successfully identified 742 risk factors that predict, with a high degree of accuracy, whether someone is at risk for abusing opioids. As Blue Cross Blue Shield data scientist Brandon Cosley states in Forbes [90], “It’s not like one thing ‘he went to the doctor too much’ is predictive . . . it’s like ‘well you hit a threshold of going to the doctor, and you have certain types of conditions, and you go to more than one doctor and live in a certain zip code. . .’ Those things add up.”

2. Enormous growth of health care databases through analytics and IoT [91]: A recent study by McKinsey reveals that the pieces of content uploaded to Facebook are in the 30 billion range, while the value of big data for the healthcare industry is about $300 billion [92]. This growth is driven by technological changes, including internal and external activities in electronic commerce (e-commerce), business operations, manufacturing, and


healthcare systems. Moreover, recent developments in in-memory databases have increased database performance, while the Internet of Things (IoT) and cloud computing facilities make large-scale data collection, persistent storage, and transformation achievable.

3. Chronic illness patients sharing experiences with other patients and doctors [93]: A vertically integrative healthcare analytics system called GEMINI allows point-of-care analytics for doctors when real-time, usable, and relevant information about their patients is required for the questions they ask about the patients for whom they are caring. GEMINI extracts various data sources for each patient and stores them as information in a patient profile graph. The data sources are complex and varied, consisting of both structured data (patients’ demographic data, laboratory results, and medications) and unstructured data (such as doctors’ notes). The patient profile graph provides holistic and comprehensive information on a patient’s healthcare profile. GEMINI can infer implicit information useful for administrative and clinical purposes and extract relevant information for performing predictive analytics. At its core, GEMINI acts as a feedback loop that keeps interacting with healthcare professionals to gather, infer, ascertain, and enhance the self-learning knowledge base.
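GEMINI’s internal design is not detailed here, but the general idea of a patient profile graph can be sketched with the networkx library; the node names and relations below are purely illustrative.

```python
# Generic sketch of a "patient profile graph" of the kind GEMINI is described
# as building: heterogeneous facts linked to one patient node. Illustrative
# only; GEMINI's actual schema and inference are not shown in the text.
import networkx as nx

g = nx.Graph()
g.add_node("patient:anon-001", kind="patient")
g.add_edge("patient:anon-001", "lab:HbA1c=8.1%", relation="has_result")
g.add_edge("patient:anon-001", "med:metformin", relation="takes")
g.add_edge("patient:anon-001", "note:'reports fatigue'", relation="documented")

# Point-of-care query: list everything attached to this patient.
for _, fact, data in g.edges("patient:anon-001", data=True):
    print(data["relation"], "->", fact)
```

Representing structured and unstructured facts as edges on one node is what lets a point-of-care query return a holistic profile in a single traversal.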

4.6.6 Additional AI literature reviews of health analytics (4, 5, 6)

4. AI analytics support the practice of precision medicine [Text #10] [94]: In the challenging setting of chronic diseases characterized by multiorgan involvement, erratic acute events, and prolonged illness progression latencies, AI health analytics is heavily utilized.

5. Real-time alerting using prescriptive analytics [89]: In hospitals, Clinical Decision Support (CDS) software analyzes medical data on the spot, providing health practitioners with advice as they make prescriptive decisions.

6. Risk of hospital readmissions [95]: Data analytics tools, like machine learning algorithms and predictive models, can discover whether patients are at risk of hospital readmission or medication non-adherence, and then take appropriate actions to mitigate that risk.

4.7 Precision health (aka precision medicine or personalized medicine) [Text #10]

According to the CDC’s (Centers for Disease Control and Prevention) “Precision Medicine Initiative,” this relatively new term describes an emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle for each person. It allows doctors and researchers to predict more accurately which treatment and prevention strategies for a particular disease will work in which groups of people, versus the “one-size-fits-all” approach [96]. Sometimes called “personalized medicine,” precision medicine is another vertically integrated concept in health and wellness care relating to numerous organizational, business,


administrative, and clinical health care issues. Its goal is to find unique disease risks and treatments that work best for patients. It includes:

• the use of family history and screening for diseases before you get sick;
• tailoring prevention;
• tailoring treatments;
• looking at your DNA;
• the “All of Us” initiative [97], which will track the history, physical findings, and genetic, behavioral, and environmental factors of 1 million Americans for several years to assess health factors;
• developing health care solutions that make the best decisions to prevent or treat disease;
• predicting epidemics; and
• improving the quality of life.

AI-based precision medicine combines medicine, biology, statistics, and computing. Researchers are taking the first steps toward developing personalized treatments for diseases. They are applying AI and machine learning to multiple data sources, including genetic data, EHRs [Text #3, page 88], sensor data, and environmental and lifestyle data. The most promising research in the field is characterized by sustained collaboration across disciplines and institutions. AI is being used by large corporations, universities, and government-funded research collectives to develop precision treatments for complex diseases [98].

4.7.1 Primary AI literature reviews of precision medicine/health (1, 2, 3)

1. Translating cancer genomics into precision medicine with artificial intelligence [99]: The integration of AI approaches such as machine learning, deep learning, and natural language processing (NLP) to transform big data into a clinically actionable plan is becoming the foundation of precision medicine. In the field of cancer genomics (see Chapters 5 and 7), the availability of multi-omics data, of genotype-phenotype data through genome-wide association studies (GWAS), and of literature mining has promoted the development of advanced AI techniques and solutions. This allows medical professionals to deliver personalized care through precision medicine [100]. Next-generation sequencing (NGS) is a valuable method being applied broadly for gaining insights into the genomic profile of tumors. Simultaneously sequencing millions of DNA fragments in a single sample to detect a wide range of aberrations provides a complete profile of the tumor. The adoption of NGS for clinical purposes has grown tremendously due to the comprehensive detection of aberrations, combined with improvements in reliability, sequencing chemistry, pipeline analysis, data interpretation, and cost [101]. NGS supports the discovery of novel biomarkers, including mutation signatures and tumor mutational burden (TMB). Statistical analyses are performed, and patterns are discovered, across the millions of mutations detected by NGS [102].

NGS is revolutionizing medical research and enabling multi-layer studies that integrate genomic data of high dimensionality, such as DNA-seq and RNA-seq, with multi-omics data such as the proteome, epigenome, and microbiome. The integrative analysis of multi-omics data provides a better view of biological processes, leading to a fuller understanding of these systems compared to single-layer analysis. The advancement of machine learning (ML) technologies impacts the interpretation of genomic sequencing data, which has traditionally relied on manual curation. The 2 critical limitations of manually curating and interpreting the results from genomics data are scalability and reproducibility. These limitations continue to grow as more genomic data become available, since the number of curations and the amount of time experts or variant scientists can dedicate daily to this task are limited. Organizations such as the American College of Medical Genetics and Genomics and the Association for Molecular Pathology (ACMG-AMP) are working to build and standardize multi-step AI protocols for variant classification to address such limitations. Cancer biologists and molecular pathologists are experts in classifying cancer sequence variants for their pathogenicity and clinical relevance. This is a complex process, difficult to compile into a set of rules comprehensive enough to cover all scenarios. To what degree can ML algorithms learn the complex clinical decisions made by individual pathologists and classify the variants automatically? Massachusetts General Hospital (MGH) ran the experiment and got very promising results. They selected ~500 features, built multiple ML models on ~20,000 clinical sign-out variants reported by board-certified molecular pathologists, and then compared the prediction results to find the best model [103]. The logistic regression model demonstrated the best performance, with only 1% false negativity and 2% false positivity, which is comparable to human decisions. Bio-NER (Named Entity Recognition) is the foundation of evidence extraction for precision medicine. NLP tools are being used in cancer genomics for the automated extraction of entities such as genes, genetic variants, treatments, and conditions. Identifying genetic variants is an essential step for tumor molecular profiling and downstream gene-protein or gene-disease relationship analysis. Medical providers need accurate identification and interpretation of genetic variation data to design effective personalized treatment strategies for patients. Unfortunately, there is no universal standard for how genetic variants are named, and there are multiple ways of presenting the same event in the literature and genomic databases, so more sophisticated AI learning methods are being developed. AI assists across the medical continuum from research to prognosis, therapy, and post-cancer treatment care. It remains the main driver of healthcare transformation toward precision medicine. How this revolutionary role of AI in healthcare translates into an improvement in the lives of patients remains to be demonstrated and depends heavily on the availability of patient outcome data.
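The MGH experiment can be pictured with a toy version: train a logistic regression on labeled variants and read off the false-negative and false-positive rates. The data below are synthetic stand-ins (the published work used ~500 features and ~20,000 pathologist-signed variants [103]); this illustrates the technique, not the actual model.

    # Toy sketch of variant pathogenicity classification; all data synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(1)
    n = 2000
    # Stand-ins for variant features (e.g., allele frequency, conservation score)
    X = rng.normal(size=(n, 5))
    # Synthetic pathogenic (1) / benign (0) labels standing in for sign-outs
    y = (X @ np.array([1.5, -0.8, 0.6, 0.2, 0.3]) + rng.normal(0, 0.5, n) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
    clf = LogisticRegression().fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
    print(f"false-negative rate: {fn / (fn + tp):.1%}")  # MGH reported ~1%
    print(f"false-positive rate: {fp / (fp + tn):.1%}")  # MGH reported ~2%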

2. From hype to reality: data science enabling personalized medicine [104]: Personalized medicine is deeply connected to and dependent on data science, specifically AI and machine learning. The key idea is to base medical decisions on individual patient characteristics (including biomarkers) rather than on averages over a whole population. The US Food and Drug Administration uses the term biomarker for any measurable quantity or score that can be used as a basis to stratify patients (e.g., genomic alterations, molecular markers, disease severity scores, lifestyle characteristics, etc.) [105]. The advantages of personalized medicine are summarized as [106]:
a. better medication effectiveness, since treatments are tailored to patient characteristics, e.g., genetic profile;
b. reduction of adverse event risks through avoidance of therapies showing no apparent positive effect on the disease while at the same time exhibiting (partially unavoidable) adverse side effects;
c. lower healthcare costs as a consequence of optimized and effective use of therapies;
d. early disease diagnosis and prevention by using molecular and non-molecular biomarkers;
e. improved disease management with the help of wearable sensors and mobile health applications; and
f. smarter design of clinical trials due to the selection of likely responders at baseline.
Patient characterization or "personalization" is difficult and requires state-of-the-art approaches offered by data science. Specifically, multivariate stratification algorithms using techniques from AI (including machine learning) play an important role. Driven by the increasing availability of large datasets, there is an increasing dependence on such data science-driven solutions. "Deep learning" techniques in particular have received a great deal of attention in the area of personalized medicine [107], and large commercial players entering the field emphasize the perceived potential of machine learning-based solutions [108]. Machine learning algorithms have the potential to integrate multi-scale, multimodal, and longitudinal patient data to make relatively accurate predictions, which may even exceed human performance [109]. However, the current hype around AI and machine learning must be kept in perspective. Major existing bottlenecks include:
a. lack of sufficient prediction performance due to a lack of signal in the employed data;
b. challenges with model stability and interpretation;
c. a lack of validation of stratification algorithms via prospective clinical trials that demonstrate benefit compared to the standard of care; and
d. general difficulties in implementing a continuous maintenance and updating scheme for decision support systems.
Current algorithms cannot recommend the right treatment at the right time and dose for each patient. Steps that bring us closer to this goal include:
a. innovative software tools that better link knowledge with machine learning-based predictions from multi-scale, multi-modal, and longitudinal data;
b. innovative modeling approaches, such as causal inference techniques and hybrid modeling, which go beyond typical state-of-the-art machine learning; and
c. new computational modeling approaches that allow us to identify critical transitions in a patient's medical trajectory.
No algorithm can replace a health care provider. Instead, the idea is to provide the health care provider with a tool that supports their decisions based on objective, data-driven criteria and the wealth of available biomedical knowledge.
3. The future of precision medicine: potential impacts for health technology assessment [110]: Technological progress in precision medicine is expected to continue, spearheaded by programs such as the Precision Medicine Initiative [111] and the 100,000 Genomes Project [112]. This has consequences for the generation of clinical and economic evidence, which means healthcare decision-makers, including health technology assessment (HTA) agencies and guideline producers, should consider how their methods and processes will accommodate these new technologies and services. A recent study suggests that precision medicine encompasses more than just pharmacogenetic and pharmacogenomic tests; it covers technologies that offer unique treatment pathways for individual patients [113]. Three major types of precision medicine technology likely to emerge over the next decade are identified: complex algorithms; digital health applications ("health apps"); and "omics"-based tests (all covered extensively in Chapters 5, 6 and 7). The experts anticipated increased use of algorithms that use AI to aid clinical decision-making over the next decade [114]. These algorithms require large datasets ("knowledge bases") that include a large number of variables, such as genetic information, sociodemographic characteristics, and electronic health records. Algorithms provide clinicians and patients with predictions on expected prognosis and optimal treatment choices using patient-level characteristics. These algorithms are regularly updated as new information is added to the knowledge base, an approach termed "evolutionary testing." Health apps include a wide range of tools that provide disease management advice, receive and process patient-inputted data, and record physical activity and physiological data such as heart rate. A subset of apps will likely fall under precision medicine, with the most advanced utilizing AI-based technologies. The number of health apps is expected to increase significantly over the next decade. Digital health experts predict that significant developments in this area will also involve apps that analyze social or lifestyle determinants of health, such as socioeconomic status or physical activity, to stratify patients, including apps linked to activity monitoring devices (or wearable technologies). The decision problem presented to HTA agencies and guideline developers becomes increasingly difficult to define when dealing with some precision medicine technologies and services. One expert noted that these issues are particularly relevant for whole-genome sequencing, which can be performed at any point during an individual's lifetime, inform care pathways for a wide range of diseases, and be analyzed using many different methods [115]. The fast pace of innovation in precision medicine may also mean that assessment bodies will face higher volumes of evaluations. Stratification of a patient population may result in smaller sample sizes recruited to trials for precision medicine interventions. Combined with more complex and variable treatment pathways, this could increase the uncertainty of cost-effectiveness estimates presented to decision-makers.

Another source of uncertainty is unit costs: "omics"-based tests, for example, vary in cost by laboratory [116]. Such tests may also yield continuous results where thresholds must be set to determine the outcome of testing [117]. Resolving some of these issues may require a balanced evaluation of the strengths and potential weaknesses of normative choices within an HTA framework. Any departure from currently established frameworks requires deliberation and co-operation between a wide range of entities across the health system. An appropriate solution depends upon:
a. the decision-making context within which the HTA agency exists;
b. the stated objectives of the health system as a whole;
c. the practicality of the assessment; and
d. the relevance of the framework to the technology type [118].
Precision medicine interventions will increase over the next decade and will change the way health care services are delivered and evaluated. Such changes may be driven first by the complexity and uncertainty around delivering therapies that use biomarker data and, second, by the innovative nature of AI-based technologies. Worldwide, healthcare systems need to consider adjusting their evaluative methods and processes to accommodate these changes.
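The "evolutionary testing" behavior described above, a model that is refreshed as new records enter the knowledge base, is easy to mimic with any online learner. A minimal sketch, assuming hypothetical patient features and a responder/non-responder label:

    # Sketch of 'evolutionary testing': the model updates as data accrues.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(2)
    clf = SGDClassifier(loss="log_loss", random_state=2)
    classes = np.array([0, 1])  # e.g., non-responder vs responder

    # Each month a new batch of records arrives and the model updates in place
    for month in range(6):
        X_new = rng.normal(size=(200, 4))  # hypothetical patient features
        y_new = (X_new[:, 0] + rng.normal(0, 1, 200) > 0).astype(int)
        clf.partial_fit(X_new, y_new, classes=classes)

    # The continually refreshed model then scores a newly presenting patient
    print(clf.predict_proba(rng.normal(size=(1, 4))))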

4.7.2 Additional AI literature reviews of precision medicine/health (4, 5, 6)

4. Radiomics with artificial intelligence for precision medicine in radiation therapy [119]: Radiomics, which emerged from radiation oncology, is a novel approach to precision medicine based on multimodality medical images that are noninvasive, fast, and low in cost. 5. Machine learning in medicine (MLm): addressing ethical challenges [120]: MLm algorithms use data that are subject to privacy protections, requiring that developers pay close attention to ethical and regulatory restrictions at each stage of data processing. One aim of the "All of Us" precision medicine research cohort in the US is to fund the development of more representative data sets that can be used for training and validation [121]. 6. The future of personalized care: humanism meets artificial intelligence [122]: Artificial intelligence holds tremendous promise for transforming the provision of health care services in resource-poor settings [123]. AI tools and techniques that are still in their infancy already provide substantial benefits in delivering in-depth knowledge on individuals' health and predicting population health (Text #4, page 91) risks, and their use in medicine and public health is likely to increase substantially in the near future.

4.8 Preventive medicine/healthcare [Text #11]

The holy grail of health care is preventive medicine (also referred to as preventive healthcare). It is something that has been talked about and striven toward for generations. Perhaps it can be postulated that through the greater emphasis on self-care (i.e., exercise, diet, healthier lifestyles), preventive medicine has made some progress. But in a broader perspective, the business and costs of health care suggest that little headway has been achieved.

Nonetheless, preventive health continues to be vertically integrated into virtually all of the health care categories discussed in this chapter. Mainly, preventive medicine focuses on the health of individuals, communities, and defined populations, with the goal of protecting, promoting, and maintaining health and well-being through the prevention of disease, disability, and premature death [124]. The 3 types of prevention generally referred to are primary, secondary, and tertiary. Primary (aka "prevention") means methods to avoid the occurrence of disease, either through eliminating disease agents or increasing resistance to disease. Secondary (aka "treatment") means methods to detect and address an existing disease prior to the appearance of symptoms. And tertiary (aka "rehabilitation") [125] means methods to reduce the negative impact of symptomatic diseases, such as disability or premature death, through rehabilitation and treatment [126]. Among the 3 types of prevention, there are universal concepts that can be ascribed to each. First is the concept of "disease prevention." This includes measures aimed at preventing and/or detecting specific conditions. Essential measures in this concept include screening, vaccinations, and preventive medication. Second is the concept of "health promotion." This focuses on promoting and maintaining a healthy lifestyle and a healthy social and physical environment. And third is "health protection," which aims to protect the population against health-threatening factors. Examples are: monitoring the quality of drinking and bathing water, waste disposal, and road safety [127]. All clinical and administrative health professionals involved in preventive health care address an array of components related to the field (Table 4–3 [128]). There are no fewer than 4 professional disciplines that address these multiple areas. They include: (1) occupational medicine specialists; (2) aerospace medicine specialists; (3) general preventive medical practitioners (physicians, nurses, social workers, etc.); and (4) public health professionals [Text #12, page 111] [128]. Beyond these general categories of preventive health and medical professionals, there are additional subspecialty categories in the field. They include "Addiction Medicine," which is concerned with the prevention, evaluation, diagnosis, and treatment of persons with the disease of addiction, of those with substance-related health conditions, and of people who show unhealthy use of substances including nicotine, alcohol, prescription medications, and other licit and illicit drugs.

Table 4–3 Components of preventive medicine.

• Biostatistics and the application of biostatistical principles and methodology;
• Epidemiology and its application to population-based medicine and research;
• Health services management and administration, including: developing, assessing, and assuring health policies; planning, implementing, directing, budgeting, and evaluating population health and disease management programs; and utilizing legislative and regulatory processes to enhance health;
• Control of environmental factors that may adversely affect health;
• Control and prevention of occupational factors that may adversely affect health and safety;
• Clinical preventive medicine activities, including measures to promote health and prevent the occurrence, progression, and disabling effects of disease and injury; and
• Assessment of social, cultural, and behavioral influences on health.

Data from Preventive Medicine. American Board of Preventive Medicine. www.theabpm.org.

"Medical Toxicologists" are physicians and PhDs who specialize in the prevention, evaluation, treatment, and monitoring of injury and illness from exposures to drugs and chemicals, as well as biological and radiological agents. "Undersea and Hyperbaric Medicine" physicians deal with decompression illness and diving accident cases. They use hyperbaric oxygen therapy to treat such conditions as carbon monoxide poisoning, gas gangrene, non-healing wounds, tissue damage from radiation, burns, and bone infections [128]. The categories of general preventive medical practitioners and public health professionals are the disciplines most related to the theme of this book, AI in health and wellness. The general preventive medical practitioners (physicians, nurses, social workers) deal mostly with the clinical aspects associated with preventive health care and, as such, receive full attention in Chapters 5–7. The category of public health professionals, while heavily related to the clinical aspects of preventive care, has profound relevance to the business and administrative aspects of health care. AI's influence on the programs conducted by public health professionals is addressed in this Chapter 4 through the Primary and Additional literature reviews discussed below.

4.8.1 Primary AI literature reviews of preventive medicine/healthcare (1, 2, 3)

1. A digital platform to identify health conditions and therapeutic interventions using an automatic and distributed artificial intelligence system [129]: Google has developed a method and system for automatic, distributed, computer-aided, and intelligent data collection/analytics, health monitoring, health condition identification, and patient preventive/remedial health advocacy. The system integrates: (1) distributed patient health data collection devices; (2) centralized or distributed data servers running various intelligent and predictive data analytics engines for health screening, assessment, patient health condition identification, and patient preventive/remedial health advocacy; (3) specifically designed data structures, including measurable health indicator vectors, patient health condition identification matrices, and patient health condition vectors; (4) portal servers to interface with distributed physician terminal devices; and (5) distributed patient terminal devices for delivering health condition identification, health interventions, and patient preventive/remedial health advocacy, and for monitoring and tracking patient activities. The various intelligent and predictive engines are programmed to learn and extract hidden features and correlations from the large amount of (big) data obtained from the distributed data collection devices. Patient health data (PHD) is normally collected and processed onsite in centralized medical centers, hospitals, clinics, and medical labs. The collected data are transmitted to electronic health record (EHR) systems to be examined and analyzed by medical professionals for further health screening, health risk assessment, disease prevention, patient health condition identification (PHCI), and patient preventive/remedial health advocacy (PPRHA). Patient preventive/remedial health advocacy may also be referred to as patient therapeutic interventions; the term "therapeutic" is used here to refer broadly to prescriptive or nonprescriptive medicine, supplements, self-directed management, at-home care, therapies, medical/biological tests, and referrals based on the patient's health conditions. The system is a method for automatic and intelligent PHCI and PPRHA carried out by processing circuitry that communicates with a data repository and a communication interface. It is based on computer technologies and designed to use AI tools to solve technical problems associated with computer-aided health screening, risk assessment, PHCI, and PPRHA.
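The named data structures suggest a simple linear scoring step at the system's core: a measurable health indicator vector multiplied by a condition identification matrix yields a patient health condition vector. The sketch below is only an interpretation; the indicators, conditions, and weights are entirely hypothetical.

    import numpy as np

    # Hypothetical measurable health indicator vector for one patient
    # (e.g., resting heart rate, systolic BP, fasting glucose, BMI), normalized.
    indicators = np.array([0.4, 0.8, 0.9, 0.6])

    # Hypothetical condition identification matrix: one row of learned
    # weights per candidate condition.
    conditions = ["hypertension", "type 2 diabetes", "metabolic syndrome"]
    W = np.array([
        [0.1, 0.9, 0.1, 0.3],
        [0.0, 0.2, 0.9, 0.4],
        [0.2, 0.5, 0.6, 0.7],
    ])

    # Patient health condition vector: one score per candidate condition.
    scores = W @ indicators
    for name, s in sorted(zip(conditions, scores), key=lambda t: -t[1]):
        print(f"{name}: {s:.2f}")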

2. Aging well: using precision to drive down costs and increase health quality [130]: Health is a prominent component of aging well (see Chapter 7, page 406). It is considered "a state of an individual characterized by the core features of physiological, cognitive, physical and reproductive function, and a lack of disease" [131]. The high prevalence and burden of chronic illnesses in the aging population also imposes a substantial economic toll on the health care system, which includes direct medical costs and income losses. It was estimated that approximately 3.8 trillion global healthcare dollars were spent on the 4 major diseases (respiratory, cardiovascular, diabetes, and cancer) in 2010 [132]. Precision Medicine [Text #10, page 101] brings the promise of "the right treatment for the right person at the right time." However, because the majority of diseases of aging are chronic, this paradigm risks pairing high health care costs with low health quality. Thus, an emphasis on precision medicine alone in aging care may aid in solutions responsive to the rapid development of epidemics, but it does not identify the root cause of the problem. Breaking the seemingly irreversible trend of ever-increasing healthcare costs demands rethinking the current approaches to chronic disease risk and management. Public health [Text #12, page 111] serves to shift the emphasis from treatment to prevention, and efforts to do so are increasing. The first step in prevention begins with health promotion and identification of risk factors. Studies show that 10 modifiable risk factors accounted for 90% of cerebrovascular disease risk alone [133]. Similarly, researchers have attributed 42% of all cancers to modifiable risk factors [134]. Findings like these call for improvements in risk-level knowledge at the individual as well as the population level to determine what early detection and health promotion efforts might prove useful. While more studies are needed to determine causality, current results underscore the potential of addressing risk factors to protect against disease. This is true at all disease stages, where objectives must include slowing onset, mitigating progression, or averting the disease state altogether. Favorable outcomes flow from preventive schemes successfully executed. New York City's approach to reducing sugary drink consumption is a good case study of this tactic. Using smoking cessation campaigns as a blueprint, the New York City (NYC) Department of Health and Mental Hygiene (DOHMH) implemented a mass media educational campaign identifying the health consequences of sugary drink consumption.

Over 7 years, these initiatives accomplished a 35% decrease among NYC adults, and a 27% decrease among public high school students, in the consumption of 1 or more sugary drinks a day [135]. Technology and AI are shaping the way we think of health and will improve current medical and public health practice. The economic impact will be substantial and will provide a roadmap for reining in healthcare expenditures and improving health and wellness. Alongside the medical model, a new business ecosystem comprising tens of thousands, if not more, new technology companies is emerging, centered on providing the right solution to the right person at the right time. Given the global pandemic of chronic diseases, the challenge is timely to bring together these key stakeholders to shorten the timeline for reaching precision health and preventive health objectives. 3. Artificial intelligence can predict premature death [136]: Computers capable of teaching themselves to predict premature death could significantly improve preventive healthcare. A team of healthcare data scientists and doctors has developed and tested a system of computer-based "machine learning" algorithms that predicts the risk of early death due to chronic disease in a mainly middle-aged population. They found this AI system to be accurate in its predictions, performing better than the current standard prediction approach developed by human experts [137]. The team used health data for just over half a million people aged between 40 and 69 recruited to the UK Biobank between 2006 and 2010 and followed up until 2016. The AI machine learning models used in the study are "random forest" and "deep learning" algorithms. This study builds on previous work by the University of Nottingham, which showed that 4 different AI algorithms, "random forest," "logistic regression," "gradient boosting," and "neural networks," were significantly better at predicting cardiovascular disease than an established algorithm used in current cardiology guidelines. The Nottingham researchers predict that AI will play a vital role in the development of future tools that can deliver personalized medicine and tailor risk management to individual patients. Further research will be required to verify and validate these AI algorithms in other population groups and to explore ways to implement these systems into routine healthcare.
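The Nottingham-style head-to-head comparison of algorithms can be sketched with cross-validated AUC on synthetic stand-in data (not UK Biobank records):

    # Hedged sketch: comparing two risk-prediction models by cross-validated AUC.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for a mortality-risk dataset (not real patient data)
    X, y = make_classification(n_samples=2000, n_features=20, random_state=3)

    models = {
        "logistic regression": LogisticRegression(max_iter=1000),
        "random forest": RandomForestClassifier(n_estimators=200, random_state=3),
    }
    for name, model in models.items():
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(f"{name}: mean AUC = {auc.mean():.3f}")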

4.8.2 Additional AI literature reviews of preventive medicine/healthcare (4, 5, 6)

4. AI in early detection of breast cancer [138]: Various factors are driving interest in the application of artificial intelligence (AI) for breast cancer (BC) detection. 5. Role of the primary care provider in preventive health exams [139]: The Affordable Care Act (ACA) of 2010 successfully shifted US health policy to emphasize disease prevention by mandating full coverage of approved preventive services by primary care providers (PCPs). 6. Use of preventive health services among cancer survivors in the U.S. [140]: Although many cancer survivors are in excellent health, the underlying risk factors and side effects of cancer treatment increase the risk of medical complications and secondary malignancies.

4.9 Public health [Text #12]

The CDC Foundation defines "Public Health" as "the science of protecting and improving the health of people and their communities by promoting healthy lifestyles, researching disease and injury prevention, and detecting, preventing and responding to infectious diseases. Overall, public health is concerned with protecting the health of entire populations. These populations can be as small as a local neighborhood, or as big as an entire country or region of the world" [141]. The core science of Public Health is epidemiology. From among 102 definitions of epidemiology reviewed in the literature, the currently accepted definition (World Health Organization [142]) reads as follows: "Epidemiology is the study of the distribution and determinants of health-related states or events (including disease), and the application of this study to the control of diseases and other health problems. Various methods are used to carry out epidemiological investigations: surveillance and descriptive studies can be used to study distribution; analytical studies are used to study determinants." Practically speaking, Public Health attempts to prevent people from getting sick or injured in the first place. It promotes wellness by encouraging healthy behaviors. It conducts scientific research, educates the public about health, and works to assure conditions in which people can be healthy. That can mean educating people about the risks of alcohol and tobacco, vaccinating children and adults to prevent the spread of disease, setting safety standards to protect workers, and developing school nutrition programs to ensure kids have access to healthy food. In its efforts to protect the general population, Public Health works to track disease outbreaks, prevent injuries, and shed light on why some of us are more likely to suffer from poor health than others. The many facets of public health range from speaking out for laws that promote smoke-free indoor air and seatbelt use to spreading the word about ways to stay healthy and offering science-based solutions to problems. Examples of the many fields of Public Health are listed in Table 4–4 [143]. Beyond the fields and providers of Public Health services, there is a series of public health systems (Table 4–5 [144]), commonly defined as "all public, private, and voluntary entities that contribute to the delivery of essential public health services within a jurisdiction." This concept ensures that all entities that contribute to the health and well-being of the community or state are vertically integrated and interoperable (Fig. 4–7 [144]) in delivering and assessing the provision of public health services. AI has the potential to improve the efficiency and effectiveness of an expanded public health continuum. It can make the concepts of precision/personalized health [Text #10, page 101], predictive health [Text #8, page 99], and preventive health [Text #11, page 106] realities. The use of AI presents a radical expansion in the scope of public health, and many of these activities are vertically integrated and made interoperable through organizations within and beyond established public health institutions [145].
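Surveillance, the first epidemiological method named in the WHO definition above, often comes down to watching counts over time and flagging departures from an expected baseline. A minimal sketch with made-up weekly case counts:

    import numpy as np

    # Hypothetical weekly case counts for a reportable disease
    counts = np.array([12, 9, 14, 11, 10, 13, 12, 11, 15, 13, 12, 31])

    baseline = counts[:-4]                     # historical weeks
    mean, sd = baseline.mean(), baseline.std(ddof=1)
    for week, c in enumerate(counts[-4:], start=len(baseline) + 1):
        z = (c - mean) / sd
        flag = "ALERT" if z > 2 else "ok"      # simple 2-sigma threshold
        print(f"week {week}: {c} cases, z={z:.1f} -> {flag}")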

Table 4–4 Fields and providers in public health.

• First responders
• Restaurant inspectors
• Health educators
• Scientists and researchers
• Nutritionists
• Community planners
• Social workers
• Epidemiologists
• Public health physicians
• Public health nurses
• Occupational health and safety professionals
• Public policymakers
• Sanitarians

Data from WHO. Epidemiology. 2017. http://www.who.int/topics/epidemiology/en/.

Table 4–5 10 essential public health services.

Public health activities that all communities should undertake:
1. Monitor health status to identify and solve community health problems;
2. Diagnose and investigate health problems and health hazards in the community;
3. Inform, educate, and empower people about health issues;
4. Mobilize community partnerships and action to identify and solve health problems;
5. Develop policies and plans that support individual and community health efforts;
6. Enforce laws and regulations that protect health and ensure safety;
7. Link people to needed personal health services and assure the provision of health care when otherwise unavailable;
8. Assure a competent public and personal health care workforce;
9. Evaluate the effectiveness, accessibility, and quality of personal and population-based health services; and
10. Research for new insights and innovative solutions to health problems.

From CDC.org. The public health system & the 10 essential public health services; 2018.

4.9.1 Primary AI literature reviews of public health (1, 2, 3)

1. P4 medicine [146]: Back in the introduction to this chapter, I spoke about the concepts of vertical integration and interoperability and their role in both the business and clinical aspects of health care. Throughout this chapter, the 2 concepts kept reappearing in different forms and functions across many current and evolving programs and strategies. We now arrive at the concept of P4 Medicine, which represents the perfect example of both vertical integration and interoperability in health care business and clinical delivery. Most of the components to be described as part of the P4 Medicine model (predictive analytics [Text #8, page 99], preventive care [Text #11, page 106], and precision or personalized care [Text #10, page 101]) have already been mentioned numerous times throughout this chapter.
FIGURE 4–7 The public health system. Public health is a network of vertically integrated and interoperable systems delivering and assessing the provision of public health services. From CDC.org. The Public Health System; 2018.

They now come together in an elegant public health and clinical delivery model of integrated health care. The quintessential example of the value of AI and its vertical integration and interoperability capabilities in promoting public health, personal health, and the business and administration of health care (i.e., cost containment, access, and quality) is the development of "P4 Medicine." This strategy facilitates 2 revolutions in medicine: (1) systems medicine (global or comprehensive approaches to disease); and (2) AI data-driven personalized measurements. Together they create a new system of health care built on 4 ("P4 Medicine") evolving health care concepts: (1) predictive analytics, (2) preventive care, (3) personalized (precision) healthcare, and (4) participatory care [147]. P4 medicine is a plan to radically improve the quality of human life via biotechnology. The premise of P4 Medicine is that, over the next 20 years or so, medical practice will be revolutionized by biotechnology to manage a person's health, instead of managing a patient's disease. Today's medicine is reactive, meaning we wait until someone is sick before administering treatment. Medicine of the future will be predictive and preventive, examining the unique biology of an individual to assess their probabilities of developing various diseases and then designing appropriate treatments, even before the onset of a disease [148]. We are rapidly approaching a point in healthcare evolution where it is possible to treat disease while considering the various factors that affect individual patients so that they may receive customized care. We are transitioning to patient-specific diagnostics and therapeutics. As it relates to predictive medicine (the first tier of P4 Medicine), AI technology is beginning to allow us to better understand not only our genomic DNA but also our epigenetic response to environmental changes.

We are moving further into the analysis of proteomics, transcriptomics, genomics, metabolomics, and lipidomics, which allows us to predict and target diseases [149]. Preventive care (the second tier) flows from predictive care, as does precision care (the third tier). In precision care, we classify people into subpopulations using their common genetic patterns, lifestyles, drug responses, and environmental and cultural factors. This provides enormous amounts of information that is used to deliver the most efficient treatment or preventive care at the right time, to the patients who will best benefit from it. The P4 system does not design a novel therapeutic intervention for each patient, such as a personalized one-patient drug, which would be financially and technically difficult [150]. The concept of participatory medicine relies on the idea that patients should play a decisive role in their healthcare by actively controlling their health status and by participating in the decision-making process regarding their treatments. Doctors are encouraged to evaluate treatment possibilities while considering the benefits and limitations of each alternative, rather than merely imposing a treatment. Decision-making becomes a much more complicated process with more information available and an increasing number of variables taken into consideration. Innovative ways to practice medicine lead to changes in the way medicine is taught. Medical education needs to shift its focus from pure content to the development of integrative skills and competencies. Basic subjects such as molecular and cell biology, genetics, and pathophysiology must be better understood to practice precision medicine. Interpersonal skills are another competency of great importance to medical education in the next era. Doctors will be required to communicate and discuss treatment options with their patients as well as with a series of other professionals with whom they must cooperate. It will also benefit the next generation of doctors to understand, at least at a basic level, subjects beyond the current medical curriculum, such as coding, AI, blockchain technology, 3-D printing, and other related topics [151]. 2. "geoAI" in environmental epidemiology [152]: The scientific field of geospatial AI (geoAI) has recently been formed from the combination of innovations in spatial science with the rapid growth of methods in AI to glean meaningful information from spatial big data. The innovation of geoAI lies partially in its applications to address real-world problems. These novel geoAI methods are being used to address human health-related problems in fields such as environmental epidemiology [153]. Ultimately, one of the higher goals for integrating geoAI with environmental epidemiology is conducting more accurate and highly resolved modeling of environmental exposures (compared to conventional approaches). This, in turn, would lead to a more accurate assessment of the environmental factors to which we are exposed, and that would improve our understanding of the potential associations between environmental exposures and disease in epidemiologic studies.

Given the advances and capabilities in recent research, we can begin to consider how geoAI technologies can be applied explicitly to environmental epidemiology. To determine the factors to which we may be exposed, and which thus may influence health, environmental epidemiologists can implement direct methods of exposure assessment, such as biomonitoring (e.g., measurements in urine), and indirect methods, such as exposure modeling. Spatial science has been enormously valuable in exposure modeling for epidemiologic studies over the past 2 decades. It has enabled environmental epidemiologists to use GIS technologies to create and link exposure models to health outcome data using geographic variables (e.g., geocoded addresses) in their investigation of the effects of factors such as air pollution on the risk of developing diseases such as cardiovascular disease [154]. geoAI applications for environmental epidemiology are moving us closer to the goal of providing a more accurate picture of the environmental factors to which we are exposed, which can be combined with other relevant information regarding health outcomes and confounders. This lets us investigate whether a particular environmental exposure is associated with a particular outcome of interest in an epidemiologic study. geoAI is an emerging interdisciplinary scientific field that captures the innovations of spatial science, AI (particularly machine learning and deep learning), data mining, and high-performance computing for knowledge discovery from spatial big data. As climate change increasingly threatens the environmental balance and ecosystems of our planet, geoAI and environmental epidemiology have become some of the most essential worldwide considerations in Public Health. Without immediate attention to the environmental factors adversely affecting climate (i.e., CO2 and carbon emissions, greenhouse gases, methane, and air and water pollution), the health of future generations in the intermediate and long-range future is at stake.
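Exposure modeling of the kind geoAI accelerates can be illustrated with the simplest spatial interpolator: inverse distance weighting of pollutant readings around a geocoded address. The monitor coordinates and PM2.5 readings below are hypothetical, and real geoAI pipelines use far richer models.

    import numpy as np

    # Hypothetical air-quality monitor locations (x, y in km) and PM2.5 readings
    monitors = np.array([[0.0, 0.0], [4.0, 1.0], [1.0, 5.0]])
    pm25 = np.array([8.0, 15.0, 11.0])

    def idw_exposure(address, power=2):
        """Inverse-distance-weighted pollutant estimate at a geocoded address."""
        d = np.linalg.norm(monitors - address, axis=1)
        w = 1.0 / np.maximum(d, 1e-6) ** power   # nearer monitors weigh more
        return np.sum(w * pm25) / np.sum(w)

    print(f"estimated PM2.5 exposure: {idw_exposure(np.array([2.0, 2.0])):.1f}")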

3. Public health research on gun violence: long overdue [155]: Gun violence is a defining and critical public health challenge of our time, particularly in the USA. Public health strategies are built on research to identify patterns of risk, illuminate productive targets for intervention, and assess the effectiveness of interventions. Unfortunately, the United States lacks a comprehensive public health approach to gun violence, due in large part to the absence of federal funding for research on gun violence for more than 2 decades. After the 2012 tragedy at Sandy Hook Elementary School, the Centers for Disease Control and Prevention (CDC) asked the Institute of Medicine (IOM) and the National Research Council (NRC) to define a public health research agenda for gun violence. The resulting consensus report, "Priorities for Research to Reduce the Threat of Firearm-Related Violence," laid out the highest-priority research questions to effect progress in a 3–5-year time frame [156]. A provision in a 1996 omnibus spending bill known as the Dickey Amendment forbade the CDC from using its funds to promote or advocate for gun control. This was interpreted as a prohibition on supporting any research on firearms, and the CDC program was dismantled. As a result, we lack even the most basic information about the prevalence and safety of firearms in the United States, as well as about the effectiveness of interventions aimed at reducing the probability of injury and death related to their use. The 2013 IOM/NRC report notes that public health research should be integrated with insights from criminal justice and other fields, since no single agency or research strategy can provide all of the answers. If implemented, the IOM/NRC public health research agenda would provide knowledge to inform the U.S. approach to minimizing firearm-related violence and its effects on the health of the American public. Scientific evidence generated by this research would also enable the development of sound policies that support the rights and responsibilities central to gun ownership in the United States. It is long overdue to bring the full power of science to bear on an issue of such significant concern to the United States. We need researchers from different disciplines, including public health, social and behavioral sciences, mental health, law enforcement, and AI, to work together to tackle this problem. That can only happen if we restore much-needed research funding. It is time to end the counterproductive research freeze on gun violence and its deleterious effects on the public health of the United States.

4.9.2 Additional AI literature reviews of public health (4, 5, 6)

4. Using AI to solve public health problems [157]: A public education program to combat opioid misuse and abuse. 5. AI opens a new frontier for suicide prevention [158]: AI is harnessed to monitor and respond to mental health crises. 6. How to solve homelessness with artificial intelligence [159]: Computer science and social work are merging to battle complex public health and societal problems.

The goal of Chapter 4 is to serve as a "guidebook" to how the applications of AI in the business and administrative aspects of health care have advanced an industry we all depend upon, perhaps more than any other in our lives. The vertical integration and interoperability of big data analytics; blockchain; GOV and NGO organizations; EHRs; population health; healthcare analytics; precision medicine/health (personalized health); preventive medicine/healthcare; and public health have changed our health care system, all for the ultimate betterment of our personal health, the health of our loved ones, and all humanity. The next chapters attempt to expand this "guidebook" as we drill down into the specifics of how AI is applied in the biosciences and in our clinical health care delivery system. They provide an in-depth analysis and review of AI applications in diagnostic technologies and services (Chapter 5), AI applications in medical therapies and services (Chapter 6), and AI applications in prevalent diseases and disorders (Chapter 7). Finally, Chapter 8 will address AI's role in the epidemiology, pathogenesis, clinical, and management aspects of the SARS-CoV-2 virus and the COVID-19 pandemic. It is the specific AI-enhanced clinical health care technologies and services presented in the next 4 chapters that we depend upon to protect, prevent, and assure our health and wellness. I hope you found the information in Chapter 4 valuable and that you will find the remaining chapters worthwhile.

References

[1] Kasich JR. State of the State Address. Lima: Ohio State Legislature; 2013.
[2] FederalRegister.gov; 2019.
[3] Nongovernmental organizations (NGOs) working in global health research. Fogarty International Center at the National Institutes of Health; 2019.
[4] Papanicolas I, Woskie LR, Jha AK. Health care spending in the United States and other high-income countries. JAMA 2018;319(10):1024–39.
[5] Marbury D. How health systems are using AI and future predictions. Managed Healthc Executives 2018.
[6] Woolhandler S, Himmelstein DU. Single payer reform: the only way to fulfill the President's pledge of more coverage, better benefits, and lower costs. Ann Intern Med 2017;166(8):587–8.
[7] Mesko B. Artificial intelligence will redesign healthcare. Medical Futurist; 2016.
[8] Das R. Top 8 healthcare predictions for 2019. Forbes; 2018.
[9] Insights Team. AI and healthcare: a giant opportunity. Forbes Insights; 2019.
[10] Muegge C. Generate tangible value in medicine, health economics and outcomes with AI and machine learning. Digital.AI; 2019.
[11] Joudaki H, Rashidian A, Minaei-Bidgoli B, et al. Using data mining to detect health care fraud and abuse: a review of literature. Glob J Health Sci 2015;7(1):194–202.
[12] IBM. Watson Care Manager. IBM.com; 2019.
[13] Kalis B, Collier M, Fu R. 10 promising AI applications in health care. Harvard Business Review; 2018.
[14] Smith B. Using AI to help save lives. Microsoft; 2018.
[15] Davenport TH, Bean R. How big data and AI are accelerating business transformation. NewVantage Partners LLC. Big Data and AI Executive Survey; 2019.
[16] Wang Y, Kung KA, Byrd TA. Big data analytics: understanding its capabilities and potential benefits for healthcare organizations. Technol Forecast Soc Change 2016. Available from: http://dx.doi.org/10.1016/j.techfore.2015.12.019.
[17] Dagliati A, Tibollo V, Sacchi L, et al. Big data as a driver for clinical decision support systems: a learning health systems perspective. Front Digit Humanit 2018;5:8. Available from: https://doi.org/10.3389/fdigh.
[18] Ayers R. Can big data help provide affordable healthcare? Dataconomy 2019.
[19] Ayers R. 4 ways AI is reshaping the future of health care. CoinBase; 2018.
[20] Kuch M. Understanding "Big Data" and its role in global health. Medium; 2017.
[21] Subías P, Ribas V. Big data for critical care. Big Data CoE. Version 1.0; 2018.
[22] High R, Low J. Scientific American blogs; 2014. <scientificamerican.com/mind-guest-blog/2014/10/20/expert-cancer-care-may-soon-be-everywhere-thanks-to-watson>.
[23] Marr B. How big data is transforming medicine. Forbes; 2016.
[24] Gottesman O, Johansson F, Komorowski M, et al. Guidelines for reinforcement learning in healthcare. Nat Med 2019;25:16–18.
[25] Dumka A, Sah A. Chapter 6: Smart ambulance system using concept of big data and internet of things. In: Healthcare data analytics and management (Advances in ubiquitous sensing applications for healthcare). Academic Press; 2019. p. 155–76.
[26] Rodriguez F, Scheinker D, Harrington RA. Promise and perils of big data and artificial intelligence in clinical medicine and biomedical research. Circulation Res 2018;123:1282–4.
[27] Fu R. Digital disruption: the transformation of blockchain in healthcare. Healthcare Weekly; 2018.

[28] Quarré F. The advent of AI and blockchain in health care. Forbes; 2019.
[29] BIS Research. Global blockchain in healthcare market: focus on industry analysis and opportunity matrix analysis and forecast, 2018–2025; 2018.
[30] Brennan B. Pharmeum: the world's first blockchain and AI platform enabling access to affordable, digital healthcare globally. Blockchain Healthcare Review; 2019.
[31] Makary M. Medical errors now third leading cause of death in United States. BMJ 2016;353:i2139.
[32] Heckman J. Blockchain prototype promises 'complete transparency' between CDC, health providers. Federal News Network; 2018.
[33] Siemienczuk J. No, technology can't solve our vaccination problem (but it sure can help). Forbes; 2019.
[34] Kamel MN, Wilson JT, Clauson KA. Geospatial blockchain: promises, challenges, and scenarios in health and healthcare. Int J Health Geographics 2018;17:25.
[35] Cichosz SL, Stausholm MN, Kronborg T, et al. How to use blockchain for diabetes health care data and access management: an operational concept. J Diabetes Sci Technol 2018.
[36] Birkhead GS, Klompas M, Shah NR. Uses of electronic health records for public health surveillance to advance public health. Annu Rev Public Health 2015;36:345–59.
[37] Botsis T, Hartvigsen G, Chen F, et al. Secondary use of EHR: data quality issues and informatics opportunities. AMIA Joint Summits on Translational Science proceedings. AMIA Summit Transl Sci 2010;2010:1–5.
[38] Gabriel CD, Searcy MT. Adoption of electronic health record systems among U.S. non-federal acute care hospitals: 2008–2014; 2015.
[39] Rep. Hobson DL. H.R.3323, Administrative Simplification Compliance Act, 107th Congress (2001–2002). Congress.gov; 2002.
[40] Goldstein BA, Navar AM, Pencina MJ, et al. Opportunities and challenges in developing risk prediction models with electronic health records data: a systematic review. J Am Med Inform Assoc 2017;24(1):198–208. Available from: https://doi.org/10.1093/jamia/ocw042.
[41] Davenport TH, Hongsermeier TM, Mc Cord KA. Using AI to improve electronic health records. Harvard Business Review; 2018.
[42] Parikh RB, Schwartz JS, Navathe AS. Beyond genes and molecules: a precision delivery initiative for precision medicine. N Engl J Med 2017;376:1609–12.
[43] Bates DW, Saria S, Ohno-Machado L, Shah A, Escobar G. Big data in health care: using analytics to identify and manage high-risk and high-cost patients. Health Aff 2014;33:1123–31.
[44] Press G. Cleaning big data: most time-consuming, least enjoyable data science task, survey says. Forbes; 2016.
[45] Lohr S. For big-data scientists, 'janitor work' is key hurdle to insights. NY Times; 2014.
[46] Chopra V, McMahon LF Jr. Redesigning hospital alarms for patient safety: alarmed and potentially dangerous. JAMA 2014;311:1199–200.
[47] LeCun Y, Bengio Y, Hinton G. Deep learning. Nature 2015;521:436–44.
[48] Wu Y, et al. Google's neural machine translation system: bridging the gap between human and machine translation. arXiv [cs.CL]; 2016.
[49] Bengio Y, Courville A, Vincent P. Representation learning: a review and new perspectives. IEEE Trans Pattern Anal Mach Intell 2013;35:1798–828.
[50] What is value-based healthcare? NEJM Catalyst; 2017.
[51] Henry J, Pylypchuk Y, Searcy T, et al. Adoption of electronic health record systems among U.S. non-federal acute care hospitals: 2008–2015. Healthit.gov; 2016.

[52] Lee CH, Yoon HJ. Medical big data: promise and challenges. Kidney Res Clin Pract 2017;36(1):3–11. Available from: https://doi.org/10.23876/j.krcp.2017.36.1.
[53] HHS announces next steps in advancing interoperability of health information. HHS.gov; 2019.
[54] Holmgren AJ, Patel V, Adler-Milstein J. Progress in interoperability: measuring US hospitals' engagement in sharing patient data; 2017. https://doi.org/10.1377/hlthaff.2017.0546.
[55] Orenstein D. EHR integration: achieving this digital health imperative. HealthCatalyst; 2018.
[56] Knowles M. 4 ways AI can make EHR systems more physician-friendly. HealthIT and CIO Report; 2018.
[57] Bresnick J. EHR users want their time back, and artificial intelligence can help. HealthAnalytics; 2018.
[58] Rajkomar A, et al. Scalable and accurate deep learning with electronic health records (EHRs). Digital Med 2018;1:18. Available from: https://doi.org/10.1038/s41746-018-0029-1.
[59] Kindig D, Stoddart G. What is population health? Am J Public Health 2003;93(3):380–3.
[60] Nash DB, Fabius RJ, Skoufalos A, Clarke JL, Horowitz MR. Population health: creating a culture of wellness. 2nd ed. Burlington, MA: Jones & Bartlett Learning; 2016.
[61] Jacobson DM, Teutsch S. An environmental scan of integrated approaches for defining and measuring total population health by the clinical care system, the government public health system, and stakeholder organizations. Available at: http://www.improvingpopulationhealth.org [accessed 08.15.15].
[62] Company Author. Population health vs community health and everything in-between. Eccovia Solutions; 2018.
[63] Steward D, Wan TTH. The role of simulation and modeling in disaster management. J Med Syst 2007;31:125–30.
[64] Wan TTH. Strategies to modify the risk for heart failure readmission: a systematic review and meta-analysis. In: Wan TTH, editor. Population health management for poly chronic conditions: evidence-based research approaches. New York: Springer; 2018. p. 85–105.
[65] Stein N. The future of population health management: artificial intelligence as a cost-effective behavior change and chronic disease prevention and management solution. MOJ Public Health 6(5):00188. Available from: https://doi.org/10.15406/mojph.2017.06.00188.
[66] Shaban-Nejad A, Michalowski M, Buckeridge DL. Health intelligence: how artificial intelligence transforms population and personalized health. npj Digital Med 2018;1, article number 53.
[67] Shaban-Nejad A, Lavigne M, Okhmatovskaia A, Buckeridge DL. PopHR: a knowledge-based platform to support integration, analysis, and visualization of population health data. Ann N Y Acad Sci 2017;1387:44–53.
[68] Collins FS, Varmus H. A new initiative on precision medicine. N Engl J Med 2015;372:793–5.
[69] The National Institutes of Health (NIH). All of Us research program. <https://allofus.nih.gov/> [accessed 29.08.18].
[70] Dankwa-Mullan I, Rivo M, Sepulveda M, et al. Transforming diabetes care through artificial intelligence: the future is here. Popul Health Manag 2019;22(3). Available from: https://doi.org/10.1089/pop.2018.0129.
[71] Latts L. ADA/IBM Watson Health study (N > 300,000) finds that nearly 60% of people with T2D discontinue therapy after one year. Presented at: American Diabetes Association 78th Scientific Session, June 22–26, 2018.
[72] Research 2 Guidance. Top 3 therapy fields with the best market potential for digital health apps. <https://research2guidance.com/top-3-therapy-fields-with-the-best-marketpotential-for-digital-health-apps/> [accessed 18.07.18].
[73] Kent J. Inching toward the data-driven future of population health management. Healthcare Analytics; 2019.
[74] Bresnick J. Which healthcare data is important for population health management? HealthIT Analytics; 2017.

[75] Flaxman AD, Vos T. Machine learning in population health: opportunities and threats. PLoS Med 2018;15(11):e1002702.
[76] Brundin P, Langston JW, Bloem BR. How mobile health technology and electronic health records will change care of patients with Parkinson's disease. J Parkinson's Dis 2018;8(s1):S41–5.
[77] Lakhani P, Prater AB, Hutson RK. Machine learning in radiology: applications beyond image interpretation. J Am Coll Radiol 2017.
[78] Saiful I, Hasan M, E-Alam N. A systematic review on healthcare analytics: application and theoretical perspective of data mining. Healthc (Basel) 2018;6(2):54. Available from: https://doi.org/10.3390/healthcare6020054.
[79] Madakam S, Ramaswamy R, Tripathi S. Internet of Things (IoT): a literature review. J Comput Commun 2015;3:164–73.
[80] Tomar D, Agarwal S. A survey on data mining approaches for healthcare. Int J Bio-Sci Bio-Technol 2013;5:241–66. Available from: https://doi.org/10.14257/ijbsbt.2013.5.5.25.
[81] Herland M, Khoshgoftaar TM, Wald R. A review of data mining using big data in health informatics. J Big Data 2014;1:2. Available from: https://doi.org/10.1186/2196-1115-1-2.
[82] 4 key types of healthcare analytics - is your practice using them? Integrated Medical Partners Blog; 2018.
[83] Khalifa M. Health analytics types, functions and levels: a review of literature. In: Hasman, et al., editors. Data, informatics and technology: an inspiration for improved healthcare. IOS Press; 2018. Available from: http://dx.doi.org/10.3233/978-1-61499-880-8-137.
[84] Zerbib LP, Peth L, Outcault S. Leveraging artificial intelligence to improve healthcare organization operations. Mazars; 2019.
[85] Simpao AF, Ahumada LM, Gálvez JA, et al. A review of analytics and clinical informatics in health care. J Med Syst 2014;38(4):45.
[86] Hertz I. Healthcare and BI: how can analytics improve patient care? Technology Advice; 2018.
[87] Basu A. Five pillars of prescriptive analytics success. Analytics Magazine; 2013. p. 8–12.
[88] Kuttappa S. Prescriptive analytics: the cure for a transforming healthcare industry. IBM Big Data Hub and Analytics; 2019.
[89] Lebied M. 12 examples of big data analytics in healthcare that can save people. Bus Intell 2018.
[90] Marr B. How big data helps to tackle the no 1 cause of accidental death in the U.S. Forbes; 2017.
[91] Ifeyinwa AA, Nweke HF. Big data and business analytics: trends, platforms, success factors and applications. Big Data Cogn Comput 2019;3:32.
[92] Brynjolfsson E, Hitt LM, Kim HH. Strength in numbers: how does data-driven decision-making affect firm performance? 2011. Available online: http://ssrn.com/abstract=1819486.
[93] Ling ZJ, Tran QT, Fan J, Gerald CHK. GEMINI: an integrative healthcare analytics system. Proc VLDB Endow 2017;7(13):1766–71.
[94] Miller DD, Brown EW. Artificial intelligence in medical practice: the question to the answer? Am J Med 2018;131:129–33.
[95] Kent J. 5 ways to ethically use social determinants of health data. Health IT Analytics; 2019.
[96] CDC. What is precision medicine? National Institutes of Health (NIH). Genetics Home Reference (GHR); 2019.
[97] The National Institutes of Health (NIH). All of Us research program. 2019. https://allofus.nih.gov/.
[98] Insights Team. How machine learning is crafting precision medicine. Forbes Insights; 2019.
[99] Xu J, Yang P, Xue S. Translating cancer genomics into precision medicine with artificial intelligence: applications, challenges and future perspectives. Hum Genet 2019;138(2):109–24.

[100] Li Y, Shi W, Wasserman WW. Genome-wide prediction of cis-regulatory regions using supervised deep learning methods. BMC Bioinform 2018;19:202. Available from: https://doi.org/10.1186/s12859-018-2187-1.
[101] Pennell NA, Mutebi A, Zhou ZY. Economic impact of next generation sequencing vs sequential single-gene testing modalities to detect genomic alterations in metastatic non-small cell lung cancer using a decision analytic model. ASCO 2018.
[102] Steuer CE, Ramalingam SS. Tumor mutation burden: leading immunotherapy to the era of precision medicine? J Clin Oncol 2018;36:631–2. Available from: https://doi.org/10.1200/JCO.2017.76.8770.
[103] Zomnir MG, Lipkin L, Pacula M, et al. Artificial intelligence approach for variant reporting. JCO Clin Cancer Inf 2018. Available from: https://doi.org/10.1200/CCI.16.00079.
[104] Fröhlich H, Balling R, Beerenwinkel N, et al. From hype to reality: data science enabling personalized medicine. BMC Med 2018;16:150.
[105] FDA. https://www.fda.gov/ucm/groups/fdagov-public/@fdagov-drugs-n/documents/document/ucm533161.pdf.
[106] Mathur S, Sutton J. Personalized medicine could transform healthcare. Biomed Rep 2017;7:3–5. Available from: https://doi.org/10.3892/br.2017.922.
[107] Beaulieu-Jones BK, Orzechowski P, Moore JH. Mapping patient trajectories using longitudinal extraction and deep learning in the MIMIC-III critical care database. Pac Symp Biocomput 2018;23:123–32.
[108] http://medicalfuturist.com/innovative-healthcare-companies/.
[109] Yu K-H, Zhang C, Berry GJ, Altman RB, Ré C, Rubin DL, et al. Predicting non-small cell lung cancer prognosis by fully automated microscopic pathology image features. Nat Commun 2016;7:12474. Available from: https://doi.org/10.1038/ncomms12474.
[110] Love-Koh J, Peel A, Rejon-Parrilla JC, et al. The future of precision medicine: potential impacts for health technology assessment. Pharmacoeconomics 2018;36(12):1439–51.
[111] Ashley EA. The precision medicine initiative: a new national effort. JAMA 2015;313(21):2119–20. Available from: https://doi.org/10.1001/jama.2015.3595.
[112] England NHS. 100,000 genomes project: paving the way to personalised medicine. London: NHS England; 2016.
[113] Ijzerman MJ, Manca A, Keizer J, Ramsey SD. Implementation of comparative effectiveness research in personalized medicine applications in oncology: current and future perspectives. Comp Eff Res 2015;5:65–72. Available from: https://doi.org/10.2147/CER.S92212.
[114] Alemayehu D, Berger ML. Big data: transforming drug development and health policy decision making. Health Serv Outcomes Res Methodol 2016;16(3):92–102. Available from: https://doi.org/10.1007/s10742-016-0144-x.
[115] Dewey FE, Grove ME, Pan C, Goldstein BA, Bernstein JA, Chaib H, et al. Clinical interpretation and implications of whole-genome sequencing. JAMA 2014;311(10):1035–45. Available from: https://doi.org/10.1001/jama.2014.1717.
[116] Fugel H-J, Nuijten M, Postma M, Redekop K. Economic evaluation in stratified medicine: methodological issues and challenges. Front Pharmacol 2016;7:113. Available from: https://doi.org/10.3389/fphar.2016.00113.
[117] Garattini L, Curto A, Freemantle N. Personalized medicine and economic evaluation in oncology: all theory and no practice? Expert Rev Pharmacoecon Outcomes Res 2015;15(5):733–8.
[118] Cowles E, Marsden G, Cole A, Devlin N. A review of NICE methods and processes across health technology assessment programmes: why the differences and what is the impact? Appl Health Econ Health Policy 2017;15(4):469–77. Available from: https://doi.org/10.1007/s40258-017-0309-y.
[119] Arimura H, Soufi M, Kamezawa H. Radiomics with artificial intelligence for precision medicine in radiation therapy. J Radiat Res 2018;60(1):150–7. Available from: https://doi.org/10.1093/jrr/rry077.

[120] Vayena E, Blasimme A, Cohen IG. Machine learning in medicine: addressing ethical challenges. PLoS Med 2018. Available from: https://doi.org/10.1371/journal.pmed.1002689.
[121] National Institutes of Health. All of Us research program. https://allofus.nih.gov/ [cited 28 Sept 2018].
[122] Shah DT. The future of personalized care: humanism meets artificial intelligence. Marshall J Med 2018;4(4): Article 1. Available from: http://dx.doi.org/10.18590/mjm.2018.vol4.iss4.1.
[123] Wahl B, Cossy-Gantner A, Germann S, et al. Artificial intelligence (AI) and global health: how can AI contribute to health in resource-poor settings? BMJ Glob Health 2018.
[124] Preventive Medicine. American College of Preventive Medicine. www.acpm.org; 2019.
[125] Goldston SE, editor. Concepts of primary prevention: a framework for program development. Los Angeles, CA: California Department of Mental Health; 1987.
[126] Leavell HR, Clark EG. Preventive medicine for the doctor and his community. New York: McGraw-Hill; 1965.
[127] Maas P. What is meant by prevention? Eindhoven University of Technology; 2016.
[128] Preventive Medicine. American Board of Preventive Medicine. www.theabpm.org.
[129] Google Patent. Digital platform to identify health conditions and therapeutic interventions using an automatic and distributed artificial intelligence system. US Patent Application US10327697B1. United States; 2018.
[130] Au R, Ritchie M, Hardy S, et al. Aging well: using precision to drive down costs and increase health quality. Adv Geriatr Med Res 2019;1:e190003. Available from: https://doi.org/10.20900/agmr20190003.
[131] Fuellen G, Jansen L, Cohen AA, Luyten W, Gogol M, Simm A, et al. Health and aging: unifying concepts, scores, biomarkers and pathways. Aging Dis 2018. Available from: https://doi.org/10.14336/AD.2018.1030.
[132] Bloom DE, Cafiero ET, Jané-Llopis E, Abrahams-Gessel S, Bloom LR, Fathima S, et al. The global economic burden of non-communicable diseases. Geneva: World Economic Forum; 2011.
[133] O'Donnell MJ, Chin SL, Rangarajan S, Xavier D, Liu L, Zhang H, et al. Global and regional effects of potentially modifiable risk factors associated with acute stroke in 32 countries (INTERSTROKE): a case-control study. Lancet 2016;388(10046):761–75. Available from: https://doi.org/10.1016/S0140-6736(16)30506-2.
[134] Islami F, Goding Sauer A, Miller K, Siegel R, Fedewa S, Jacobs E, et al. Proportion and number of cancer cases and deaths attributable to potentially modifiable risk factors in the United States. CA Cancer J Clin 2018;68(1):31–54. Available from: https://doi.org/10.3322/caac.21440.
[135] Kansagra SM, Kennelly MO, Nonas CA, Curtis CJ, Van Wye G, Goodman A, et al. Reducing sugary drink consumption: New York City's approach. Am J Public Health 2015;105(4):e61–4. Available from: https://doi.org/10.2105/AJPH.2014.302497.
[136] University of Nottingham. Artificial intelligence can predict premature death, study finds. ScienceDaily; 2019.
[137] Weng SF, Vaz L, Qureshi N, et al. Prediction of premature all-cause mortality: a prospective general population cohort study comparing machine-learning and standard epidemiological approaches. PLoS One 2019;14(3):e0214365. Available from: https://doi.org/10.1371/journal.pone.0214365.
[138] Houssami N, Kirkpatrick-Jones G, Noguchi N, et al. Artificial intelligence (AI) for the early detection of breast cancer: a scoping review to assess AI's potential in breast screening practice. Expert Rev Med Devices 2019;16(5):351–62. Available from: https://doi.org/10.1080/17434440.2019.1610387.
[139] Rao A, Kale MS. Characterizing the role of the primary care provider in preventive health exams: NAMCS 2011–2014. J Gen Intern Med 2019.
[140] Gupta BS, Cole AP, Marchese M, et al. Use of preventive health services among cancer survivors in the U.S. Am J Preventive Med 2018;55(6):830–8.
[141] CDC Foundation. What is public health. CDC.gov; 2019.

[142] WHO. Epidemiology. 2017. http://www.who.int/topics/epidemiology/en/.
[143] APHA. What is public health. American Public Health Association (APHA.org); 2019.
[144] CDC. The public health system. USA.gov; 2018.
[145] Panch T, Pearson-Stuttard J, Greaves F, et al. Artificial intelligence: opportunities and risks for public health. Lancet 2019. Available online: https://doi.org/10.1016/S2589-7500(19)30002-0.
[146] Hood LE. P4 medicine and the democratization of health care. Institute for Systems Biology. NEJM Catalyst; 2017.
[147] Hood L, Flores M. A personal view on systems medicine and the emergence of proactive P4 medicine: predictive, preventive, personalized and participatory. N Biotechnol 2012;29(6):613–24. Available online: http://dx.doi.org/10.1016/j.nbt.2012.03.004.
[148] Hood LE. Genomics and P4 medicine. Genomics for Everyone; 2019.
[149] Toledo RA, Sekiya T, Longuini VC, Coutinho FL, Lourenço Jr DM, Toledo SP. Narrowing the gap of personalized medicine in emerging countries: the case of multiple endocrine neoplasias in Brazil. Clinics 2012;67(Suppl. 1):3–6. Available from: http://dx.doi.org/10.6061/clinics/2012(Sup01)02.
[150] Psaty BM, Dekkers OM, Cooper RS. Comparison of 2 treatment models: precision medicine and preventive medicine. JAMA 2018;320(8):751–2. Available from: http://dx.doi.org/10.1001/jama.2018.8377.
[151] Rosa G, Gameiro I, Sinkunas V, et al. Precision medicine: changing the way we think about healthcare. Clinics 2018;73:e723.
[152] Trang Vo T, Hart JE, Laden F, et al. Emerging trends in geospatial artificial intelligence (geoAI): potential applications for environmental epidemiology. Environ Health 2018;17:40. Available from: https://doi.org/10.1186/s12940-018-0386-x.
[153] Baker D, Nieuwenhuijsen MJ. Environmental epidemiology: study methods and application. New York, NY: Oxford University Press; 2008.
[154] Hart JE, Puett RC, Rexrode KM, et al. Effect modification of long-term air pollution exposures and the risk of incident cardiovascular disease in US women. J Am Heart Assoc 2015;4(12).
[155] Dzau VJ, Leshner AI. Public health research on gun violence: long overdue. Ann Intern Med 2018.
[156] Institute of Medicine, National Research Council. Priorities for research to reduce the threat of firearm-related violence. Washington, DC: National Academies Press; 2013.
[157] Bostic B. Using artificial intelligence to solve public health problems. Becker's Health IT & CIO Report; 2018.
[158] Vogel L. AI opens new frontier for suicide prevention. CMAJ 2018;190(4):E119. Available from: https://doi.org/10.1503/cmaj.109-5549.
[159] Stern C. How to solve homelessness with artificial intelligence. OZY; 2019.

5 AI applications in diagnostic technologies and services

Let's begin this chapter on current AI diagnostic and treatment technologies with the "Top 10" parlor game we introduced way back in the Section 1 Introduction of this book. The game was merely to identify a timeframe and, within the limits of that timeframe, select your top 10 (or top 5) "most disruptive technologies." Then, have each of your willing friends do the same. The common denominators and differences among your listing and theirs would undoubtedly produce an interesting array of pro and con discussions. Given those game rules, let's try it (again) using as the topics those of this upcoming Chapter 5 on "diagnostic and treatment technologies." To make it more manageable for this brief example, let's do a "Top 5" listing of the most disruptive "diagnostic and treatment technologies" that will dominate the medical landscape by the year 2025. We'll use some Internet listings (there are dozens of "Top 5" and "Top 10" listings on the Internet, especially for healthcare technologies) for you to compare with yours. As in Section 1, I'll take a shot at my "Top 5" list (below). But before you read mine, make one of your own so you can compare your choices with mine and, more so, with some of the Internet listings we'll review. And don't cheat and look up Internet listings first (I didn't - honestly). And trust me! There's a pony in here somewhere, I promise. My Top 5 list of the most disruptive diagnostic and treatment technologies by 2025 includes:

1. DNA intrauterine (prenatal) diagnoses;
2. Routine genomic health profiling;
3. Population health and precision medicine for preventive health;
4. Diagnostic imaging using AI and deep learning algorithms;
5. Telemedicine Internet diagnoses and remote management.

Now that we've made our lists, let's go to the Internet and find some lists from the professional analysts and futurists who make projections for a living. Not all lists are projected out to 2025, but they are often still worth noting because of their interesting choices. Just about every significant listing I reviewed had common denominators with the others (mine included), which begins to hint at where I'm going with this exercise. For openers, the prestigious firm of Deloitte Consulting has a list of "Life Sciences and Health Care Predictions" up to the year 2022 (close enough!). Their list includes the following [1]:

1. Managing healthcare through the genome*;
2. "Smart healthcare" (wearables, digital, IoTs - see Chapter 3, page 55)*;
3. AI and machine learning to increase the pace and productivity of care*;
4. Artificial intelligence (AI) in health data analysis*;
5. Precision medicine (personalized healthcare)*.

Wow! Five out of five of their Top 5 are directly or indirectly related to AI. Tells you something? McKinsey and Company (Healthcare Systems and Services) had a "Top 8" list, so I included them all [2]:

1. Connected and cognitive devices*;
2. Electroceuticals;
3. Targeted and personalized medicine*;
4. Robotics process automation*;
5. 3D printing;
6. Big data and analytics*;
7. Artificial intelligence (AI)*;
8. Blockchain*.

There is an interesting list ("Top 6") by 2025 on a website (Nexter.org) that bills itself as ". . .a new generation of virality. . .ready to offer our readers the latest hot stories and latest news." This somewhat unique list includes:

1. Electronic skins that can measure vital signs;
2. Digital pills with embedded sensors reporting to the prescribing doctor;
3. 3-D bioprinting to replace organs with 3-D-printed copies;
4. Recreational cyborgs as human-machine parts and a new fashion trend;
5. Genomics: a medical tool to prevent and cure diseases*;
6. Robots and AI as surgery assistants and remote nurses*.

And finally, a "Top 5" list for 2025 from an economics perspective (The New Economy Journal), including [3]:

1. Gene therapy*;
2. Virtual reality;
3. Immuno-oncology*;
4. Chatbots*;
5. Artificial Intelligence (AI)*.

Yet another four-out-of-five list of AI technologies. Starting to see a pony? Now, finally, to make my point about this little game: whether it's serious consulting firms, healthcare analysts, trendy news sources, or economists, all include AI in their "Top" projections for the coming 3-5 years. And if you count up the items that I identified with an asterisk (*) after them, you will see that 17 of the total 24 predictions, or 70.8%, are directly or indirectly related to AI. This kind of review, albeit "very" informal, suggests that AI has the potential to have a dominant influence on our healthcare going forward. That being the case, I would hope you will consider the information from the previous Chapter 4 about AI and the business and administration of healthcare, and the information in the remaining Chapters 5-8, relevant to you, your friends (from the "Top 10" games), and your loved ones.

This Chapter 5 will address AI applications in diagnostic technologies and services. You will certainly be familiar with most, if not all, of the technologies and services we'll be discussing, and I think you will find the AI applications regarding them informative and enlightening. Chapter 6 will cover AI applications in what we might consider the "implementation half" of the healthcare delivery system, i.e., medical therapies and services, those which impact the patient directly. Chapter 7 ("AI applications in prevalent diseases and disorders") will address the bioscience and AI's applications to each of the major disease categories, all of which you are well aware of, and some, I'm sure, that have and will continue to impact your life. And finally, as a late but necessary addition to this book, Chapter 8 will address the "SARS-CoV-2 and COVID-19 Pandemic." Again, I think you will find the information compelling. Indeed, the combination of these 4 remaining chapters (5-8) should provide a "launchpad" for your further understanding, appreciation, and management of your health and wellness and of those for whom you care into the future.

We will approach Chapter 5's material in the following "guidebook" format. First, we will address 3 significant categories of AI-assisted diagnostic technologies:

1. Diagnostic imaging;
2. Laboratory testing; and
3. Genetic and genomic testing.

Next, we will address some "additional diagnostic technologies," some of which are playing an ever-increasing role in medical diagnosis due to their evolving AI relationships and applications. They include:

1. Vital signs;
2. Electrodiagnosis;
3. Telemedicine (aka Telehealth);
4. Chatbots;
5. Expert systems.

For each of the diagnostic categories and services, we will give a brief description of the relevant technologies and services in the category. Then we will discuss AI’s influences on each of the specific technologies and services. And finally, for each technology and service, I will present 3 literature reviews of recent papers (from among thousands in the literature) on the respective topics, some or all of which will hopefully be of interest to the readers of this guidebook.

5.1 Major diagnostic technologies [4] and their AI applications

Nothing in healthcare is more important than a timely and accurate diagnosis. AI can do nothing more valuable for humanity than to make a diagnosis as good as it can be. Few in healthcare doubt that AI is indeed bringing the art and science of clinical diagnosis to new levels. And in this particular arena of healthcare, the concerns over "replacing the doctor" are moot in that the AI-assisted diagnostic tools are, in fact, tools and technologies "assisted" by AI and managed by, controlled by, and evaluated for decision-making by the "human" doctor, nurse, or appropriate health professional.

The process of AI-assisted diagnostic technologies can be described as a combination of hardware for acquisition and data collection (input layer), AI software algorithms for interpretation (inner layer), and results (output layer) (remember the computer layers from Chapters 2 and 3?). This distinction between the hardware and software of diagnostic technologies will be an excellent way to categorize the vast array of methods and types of clinical testing done in routine healthcare. The categories (Table 5–1), divided by the nature and goals of the general testing category, and then each test within each category, will be sequenced and presented as described in the introduction to this chapter.

Table 5–1 Diagnostic testing procedures.

Diagnostic imaging:
• X-rays (conventional radiography);
• Mammography;
• Fluoroscopy;
• Radiomics;
• CT scan;
• MRI scan;
• Nuclear medicine scan;
• Ultrasound;
• Endoscopy;
• Fundus imaging;
• Medical photography.
Laboratory testing:
• Diabetic testing (A1c, FPG, RPG, OGTT);
• Hepatitis testing;
• Laboratory tests;
• Metabolic panel;
• Thyroid tests;
• Kidney tests;
• Liver function tests.
Genetic testing (and cancer screening):
• Biopsy;
• Genetic testing;
• Prenatal testing.
Additional diagnostic technologies:
• Vital signs:
  a. Blood pressure;
  b. Heart rate;
  c. Respiratory rate;
  d. Temperature;
• Electrodiagnosis;
• Telemedicine;
• Concurrent medical conditions ("comorbidity");
• Expert systems;
• Chatbots.
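To make the input/inner/output description above concrete, here is a minimal Python sketch of such a pipeline. Every name, type, and value in it is hypothetical - a toy illustration of the structure, not any product described in this chapter:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class DiagnosticResult:
    finding: str       # e.g., "normal" or "abnormal"
    probability: float  # the model's confidence in the finding

def run_pipeline(acquire: Callable[[], List[float]],
                 interpret: Callable[[List[float]], DiagnosticResult]) -> DiagnosticResult:
    """Input layer -> inner (AI) layer -> output layer."""
    raw_signal = acquire()          # hardware acquisition (input layer)
    result = interpret(raw_signal)  # AI algorithm (inner layer)
    return result                   # result reported to the clinician (output layer)

# Hypothetical usage: a fake sensor and a trivial rule-based "interpreter".
result = run_pipeline(
    acquire=lambda: [0.2, 0.9, 0.4],
    interpret=lambda x: DiagnosticResult("abnormal" if max(x) > 0.5 else "normal", max(x)),
)
print(result)  # DiagnosticResult(finding='abnormal', probability=0.9)
```

In a real system, the `acquire` step would be an imaging device or laboratory analyzer and the `interpret` step a trained model; the human professional remains the consumer of the output layer.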

5.1.1 Diagnostic imaging

Among the most promising innovative areas in healthcare is the application of AI in medical imaging, including, but not limited to, image processing and interpretation [5]. Along with clinical laboratory analysis (blood, urine, cultures), the process of diagnostic imaging is arguably the most valuable clinical information we can gather in the diagnostic process. With the irreversible increase in imaging data and the possibility of identifying findings that humans can or cannot detect, radiology is now moving from a subjective perceptual skill to a more objective science [6]. Moreover, deep learning networks (convolutional neural networks - CNNs) have led to more robust models for radiomics (discussed below, page 139). Radiomics is an emerging field that deals with the high-throughput extraction of peculiar quantitative features from radiological images [7].

A staggering 90% of all healthcare data comes from medical imaging [8]. This statistic being the case, it's exciting to realize that the greatest strength of AI's deep learning (CNNs) lies in pattern recognition (e.g., Graphic Processing Unit or GPU hardware and software [see Chapter 3, page 48]) and thus image analysis. No matter what the image (pattern) may be (Figs. 5–1 and 5–2) [9], using Bayesian probabilities and inference logic algorithms, AI's deep learning CNN process classifies the image for diagnosis. This makes AI-assisted clinical diagnostic image analysis one of, if not the, most important advances in clinical AI healthcare applications.

AI algorithms look at medical images to identify patterns after being trained using vast numbers of examinations and images. Those systems will be able to give information about the characterization of abnormal findings, mostly in terms of conditional probabilities to be applied to Bayesian decision-making [10]. AI systems look at specific labeled structures and also learn how to extract image features either visible or invisible to the human eye. This approach mimics human analytical cognition, allowing for better performance than that obtained with old CAD (computer-aided detection) software [11]. The current applications of these diagnostic imaging algorithms exceed all of the other approved and clinically utilized AI healthcare applications [12].

Radiological imaging methods are used to investigate regions of the body to detect potential abnormal pathology and aid diagnosis. Through these methods, large volumes of complex digital imaging data are generated from regional or whole-body scanning, which creates a challenge to "reading and interpreting" images. Combined with radiomics, a paradigm shift in radiology and all of medicine is occurring through the use of AI and advanced machine and deep learning algorithms. Neuroradiology focuses on diagnosing conditions of the spine, neck, head, and central nervous system using computed tomography (CT) or magnetic resonance imaging (MRI) machines. In such neuroradiological imaging, rapid diagnosis of acute neurological illnesses such as stroke, cranial aneurysm, hemorrhaging, and trauma is critical. Computer-assisted diagnosis (CAD) using 3-D convolutional neural networks (3D-CNN) to screen head CT scans, trained on a dataset of 37,236 scans, was compared to standard neuroradiologic methods [13]. This AI method to triage radiology workflow reduced the time to diagnosis from minutes to seconds in a clinical environment.

FIGURE 5–1 Convolutional neural networks (CNNs). Similar to the convolutional neural networks described in Chapter 3 (Figure 3–5), AI's deep learning CNN process is used to classify the image for diagnosis. In this example, 5 convolutional layers are followed by 3 fully connected layers, which then output a probability of the image belonging to each class. These probabilities are compared with the known class (stroke in the training example) and can be used to measure how far off the prediction was (cost function), which can then be used to update the weights of the different kernels and fully connected parameters using back-propagation. When the model training is complete and deployed on new images, the process will produce a similar output of probabilities, in which it is hoped that the true diagnosis will have the highest likelihood. Courtesy of American Journal of Neuroradiology.

Finally, we must remember that AI mimics human intelligence. Radiologists are key people for several current AI challenges, such as the creation of high-quality training datasets, the definition of the clinical task to address, and the interpretation of obtained results. Radiologists may play a pivotal role in the identification of clinical applications where AI methods may make a difference. They represent the final users of these technologies, who know where they can be applied to improve patient care. For this reason, their point of view is crucial to optimize the use of AI-based solutions in the clinical setting. The application of AI-based algorithms often leads to the creation of complex data that needs to be interpreted and linked to its clinical utility. In this scenario, radiologists may play a crucial role in data interpretation, cooperating with data scientists in the definition of useful results [14].
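The "conditional probabilities to be applied to Bayesian decision-making" mentioned above come down to Bayes' rule: the probability that disease is truly present given a positive finding depends not only on the test's sensitivity and specificity but also on disease prevalence. A minimal sketch, with purely illustrative numbers:

```python
def post_test_probability(prevalence: float, sensitivity: float, specificity: float) -> float:
    """P(disease | positive finding) via Bayes' rule:
    sens * prev / (sens * prev + (1 - spec) * (1 - prev))."""
    true_pos = sensitivity * prevalence            # diseased patients correctly flagged
    false_pos = (1.0 - specificity) * (1.0 - prevalence)  # healthy patients wrongly flagged
    return true_pos / (true_pos + false_pos)

# Hypothetical screening setting: 1% prevalence, a reader or algorithm at
# 90% sensitivity and 95% specificity.
print(round(post_test_probability(0.01, 0.90, 0.95), 3))  # ~0.154
```

Even a "good" test yields a modest post-test probability at low prevalence - a point worth keeping in mind when reading the positive predictive values quoted in the literature reviews that follow.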

FIGURE 5–2 Comparison of machine learning vs. deep learning CNNs. AI systems look at specific labeled structures and also learn how to extract image features either visible or invisible to the human eye. The comparison in this figure between classic machine learning and deep learning approaches is applied to a classification task. Both approaches use an artificial neural network organized into an input layer (IL), hidden layer (HL), and output layer (OL). The deep learning approach avoids the design of dedicated feature extractors by using a deep neural network that can represent complex features as a composition of simpler ones. Courtesy of American Journal of Neuroradiology.
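For readers who want to see what the figure captions describe in code, here is a minimal PyTorch sketch of a "5 convolutional layers followed by 3 fully connected layers" classifier. The layer sizes, class count, and input resolution are hypothetical, not those of any published model:

```python
import torch
import torch.nn as nn

class TriageCNN(nn.Module):
    """Toy image classifier: 5 conv stages -> 3 fully connected layers -> class probabilities."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        chans = [1, 16, 32, 64, 64, 128]  # grayscale input, then 5 convolutional stages
        self.features = nn.Sequential(*[
            layer
            for c_in, c_out in zip(chans, chans[1:])
            for layer in (nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        ])
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 4 * 4, 256), nn.ReLU(),  # 128x128 input halved 5 times -> 4x4
            nn.Linear(256, 64), nn.ReLU(),
            nn.Linear(64, n_classes),  # logits; softmax turns these into probabilities
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.softmax(self.classifier(self.features(x)), dim=1)

probs = TriageCNN()(torch.randn(1, 1, 128, 128))  # one fake 128x128 "radiograph"
print(probs)  # e.g., tensor([[0.49, 0.51]]) -- P(normal), P(critical)
```

In training, the raw logits (the pre-softmax outputs) would be fed to a cross-entropy cost function and the weights updated by back-propagation, exactly as the Figure 5–1 caption outlines.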

5.1.1.1 Categories of diagnostic imaging

• X-ray (conventional radiography): There are many types of medical imaging procedures that all work on the same principle. An X-ray beam passes through the body, and specific internal structures either absorb or scatter it, with the remaining X-rays producing a pattern sent to a detector (film or a computer screen) that is recorded or processed by a computer. Radiography is a technique for generating and recording an X-ray pattern to provide a static image(s) after the termination of the exposure. It is used to diagnose or treat patients by recording images of the internal structure of the body to assess the presence or absence of disease, foreign objects, and structural damage or anomaly. The recording of the pattern may occur on film or through electronic means.

5.1.1.1.1 AI's influence on conventional radiography [15]

From the early days of X-ray imaging in the 1890s to more recent advances in CT, MRI, and PET scanning, medical imaging continues to be a pillar of medical treatment. Moreover, and in contrast to traditional methods based on predefined features, as deep learning algorithms progress - that is, as more data are generated every day with ongoing research efforts - results will provide greater improvements in performance. All these advances promise increased accuracy and a reduction in the number of routine tasks that exhaust time and effort.

Enabling interoperability among the multitude of AI applications that are currently scattered across healthcare will result in a network of powerful tools. Utilizing such data to train AI on a massive scale will enable a robust AI that is generalizable across different patient demographics, geographic regions, diseases, and standards of care.

5.1.1.1.2 Literature reviews re AI's influence on conventional radiography

1. Automated triaging of adult chest radiographs with deep artificial neural networks [16]: This study aimed to develop and test an AI system, based on deep convolutional neural networks (CNNs), for automated real-time triaging of adult chest radiographs based on the urgency of imaging appearances. A data set of 470,388 consecutive adult chest radiographs was selected for deep learning. Annotation of the radiographs was automated by developing an NLP system able to process and map the language used in each radiology report [17]. NLP performance was excellent, achieving a sensitivity of 98%, specificity of 99%, PPV (positive predictive value) of 97%, and NPV (negative predictive value) of 99% for normal radiographs, and a sensitivity of 96%, specificity of 97%, PPV of 84%, and NPV of 99% for critical radiographs. The NLP system was able to extract the presence or absence of almost all the radiologic findings within the free-text reports with a high degree of accuracy, and it was able to assign a priority level with a sensitivity of greater than 90% and specificity of greater than 96%, as assessed with the reference standard data set. The computer vision algorithms were then used to build an automated radiograph prioritization system. AI performance was good, with a sensitivity of 71%, specificity of 95%, PPV of 73%, and NPV of 94% for normal radiographs, and a sensitivity of 65%, specificity of 94%, PPV of 61%, and NPV of 95% for critical radiographs. In short, this deep learning system, developed on the institutional data set of 470,388 adult chest radiographs, was able to interpret and prioritize chest radiographs: abnormal radiographs with critical or urgent findings could be queued for real-time reporting and completed sooner than under the current system [16].

2. Artificial intelligence rivals radiologists in screening X-rays for certain diseases [18]: A new artificial intelligence algorithm, CheXNeXt, developed at Stanford University, can reliably screen chest X-rays for more than a dozen types of disease in less time than it takes to read this sentence. The algorithm simultaneously evaluates X-rays for a multitude of possible maladies and returns results that are consistent with the readings of radiologists. Scientists trained the algorithm to detect 14 different pathologies. For 10 diseases, the algorithm performed as well as radiologists; for 3, it underperformed compared with radiologists; and for 1, the algorithm outdid the experts. Besides serving underserved areas, algorithms like CheXNeXt could one day expedite care, empowering primary care doctors to make informed decisions about X-ray diagnostics faster, without having to wait for a radiologist. Graduate student Pranav Rajpurkar said: "The algorithm has evaluated over 100,000 X-rays so far, but now we want to know how well it would do if we showed it a million X-rays - and not just from 1 hospital, but from hospitals around the world." Scientists used about 112,000 X-rays to train the algorithm. A panel of 3 radiologists then reviewed a different set of 420 X-rays, one by one, for the 14 pathologies. Their conclusions served as a "ground truth" - a diagnosis that experts agree is the most accurate assessment - for each scan. "We should be building AI algorithms to be as good or better than the gold standard of human, expert physicians. Now, I'm not expecting AI to replace radiologists any time soon. Still, we are not truly pushing the limits of this technology if we're just aiming to enhance existing radiologist workflows," Rajpurkar said. "Instead, we need to be thinking about how far we can push these AI models to improve the lives of patients anywhere in the world."

3. Artificial intelligence in the identification of risk groups for osteoporosis [19]: Osteoporosis is an osteometabolic disease characterized by low bone mineral density (BMD) and deterioration of the microarchitecture of the bone tissue, causing an increase in bone fragility and consequently leading to an increased risk of fractures. It affects all bones in the body and shows no signs or symptoms until a fracture occurs. The decrease in bone mineral density occurs with aging, and fracture rates increase over the years, causing morbidity and some mortality [20]. The goal of this paper was to present a critical review of the central systems that use artificial intelligence to identify groups at risk for osteoporosis or fractures. The systems considered for this study were those that fulfilled the following requirements: range of coverage in diagnosis, low cost, and capability to identify the more significant somatic factors. The application of artificial intelligence proved adequate for the prognosis of the disease or fracture. The critical review concluded that the construction of a hybrid system composed of artificial intelligence with a simplified method of examination of bone mineral density could provide better results. Given the proposal of greater population coverage, the system will have to deal with a high level of data complexity.

• Mammography: Mammography is a type of radiography used to capture images (mammograms) of internal structures of the breasts. Thus, mammography can detect breast cancer in its early, often treatable stages. The two types of procedures are:
1. Screen-film mammography, where X-rays are beamed through the breast to a cassette containing a screen and film that must be developed.
2. Full-field digital mammography, where X-rays are beamed through the breast to an image receptor. A scanner converts the information to a digital picture that is sent to a digital monitor and/or a printer.

5.1.1.1.3 AI's influence on mammography

Radiologic breast cancer screening has witnessed significant changes along with the successes of deep learning in biomedical imaging in general. One such advancement, published in Radiology, was developed by Rodriguez-Ruiz et al. [21]. The authors compared radiologists' performance in reading mammographic examinations unaided versus aided (supported by an AI system) and revealed that radiologists improved their cancer detection performance at mammography when using an AI system. Results also indicated that this benefit was obtained without requiring additional reading time. In a complementary study, data from 7 countries were curated by 101 radiologists. This broad experimental setting included a total of 2652 exams, and the stand-alone AI system was statistically similar to the radiologists' interpretations. The sensitivity and specificity of the system were also found to be better than those of the majority of radiologists, but always worse than the best radiologist, which is not surprising. These results indicate that AI tools can be used in much broader settings than ever before in the breast cancer diagnosis routine. However, for this to become regular clinical practice, there is still an expectation that much more experimentation should be done in both retrospective and prospective settings for independent validation.

The success of deep learning as a tool to build AI systems is pushing performance ever closer to humans in computer-aided diagnosis and screening in radiology rooms. The subjectivity in the underlying data (the type of lesions, racial and age differences, device manufacturers) when training the deep learning models remains a challenge that needs to be carefully addressed. A stand-alone AI system can supplement expert radiologists as a second reader, which can translate into a reduction in reading time. For AI to be used to its full potential in clinical practice, more studies should be performed in real-world settings. The increasing number of scans for diagnosing breast cancer generates a tremendous workload for radiologists. For efficient screening and precise diagnosis, AI can play its role, as shown in recent studies on breast cancer screening.

5.1.1.1.4 Literature reviews re AI's influence on mammography

1. Reduction of false-positive markings on mammograms [22]: A significant disadvantage of currently available computer-aided detection (CAD) systems is a high rate of false-positive marks. The usefulness of CAD can be assessed by a count of these marks on each image, i.e., false positives per image (FPPI). False-positive marks may distract the interpreting radiologist with too much "noise," and they can lead to unnecessary workups and biopsies [23]. Therefore, high FPPI is a common complaint of radiologists when reviewing CAD marks. Approximately $4 billion per year is spent in the USA on false-positive recalls and workups; the cost of diagnostic mammogram workups alone is $1.62 billion [24]. For the patient, these false-positive screening mammograms create unnecessary anxiety and may lead to a biopsy that was not needed. This study compared the performance of a recently developed AI-CAD algorithm directly against a commercially available conventional CAD software using the same test dataset of clinical cases. A retrospective study was performed on a set of 250 2-dimensional (2D) full-field digital mammograms (FFDM) collected from a tertiary academic institution based in the USA, which specializes in cancer healthcare. All of the mammograms were initially interpreted using the ImageChecker CAD, version 10.0 (Hologic, Inc., Sunnyvale, CA). Inclusion criteria were asymptomatic female patients of all ages and any race with 2D screening mammograms performed at the academic institution between 1 January 2013 and 31 March 2013, and whose mammogram records contain archived CAD markings. Mastectomy and breast implant patients were excluded from the analysis. CAD displayed false marks on 200 of the 242 non-cancer cases (83%) and no marks at all on 42 cases (17%). The AI-CAD software displayed wrong marks on 126 of the 242 non-cancer cases (52%) and no marks at all on 116 cases (48%). This equates to 37% fewer marked cases with AI-CAD and, simultaneously, 64% more mark-free cases with AI-CAD compared with CAD. There was a 69% reduction in overall FPPI with AI-CAD compared to CAD. Evaluation by lesion type shows the outperformance of AI-CAD for both masses and calcifications: there was an overall 83% reduction in FPPI for calcifications with AI-CAD and a 56% reduction for masses. FPPI for masses was higher than FPPI for calcifications for both systems. The significant decrease in FPPI with AI-CAD may translate into fewer false recalls, improved workflow, and decreased costs. The economic impact of false-positive recalls is considered one of the significant drawbacks of screening mammography. But more so, the known psychological and physical risks for patients related to false-positive recalls are incalculable [25]. The scare of a potential breast lesion can be a frightening experience; even when the workup results are benign, the psychological effects of anxiety can last up to 3 years [26].

2. Visual search in breast imaging: a review [27]: The diagnostic accuracy of radiologists' interpretation of mammograms is limited by human factors producing both false-positive and false-negative errors. Identifying causes of diagnostic errors may be achievable by better understanding visual search in breast images and finding ways to reduce those errors. Improving education for radiology residents is also essential. Studies showed that 70% of missed lesions on mammograms attract radiologists' visual attention and that a plethora of different reasons, such as satisfaction of search, incorrect background sampling, and an erroneous first impression, can cause diagnostic errors in the interpretation of mammograms. Recently, highly accurate tools, which rely on both eye-tracking data and the content of mammograms, have been proposed to provide feedback to radiologists. In the past few years, deep learning has led to improvements in the diagnostic accuracy of computerized diagnostic tools, and visual search studies will be required to understand how radiologists interact with the prompts from these tools and to identify the best way to utilize them. Improving these tools and determining the optimal pathway to integrate them into the radiology workflow could be a possible line of future research.

3. Assessing cancer risk from mammograms: deep learning is superior to conventional risk models [28]: On 28 March 2019, the U.S. Food and Drug Administration announced a proposed rule [29] to update the landmark policy passed by Congress in 1992 to ensure the quality of mammography for early breast cancer detection (known as the Mammography Quality Standards Act). Yala et al. [30] retrospectively examined nearly 90,000 consecutive screening mammographic examinations from almost 40,000 women obtained over 4 years (2009-2012) at Massachusetts General Hospital (Boston, Mass). The authors defined 4 breast cancer risk models intended to quantify the probability of discovering breast cancer within 5 years after a mammographic screening examination that was negative for disease. The third model, which used only DL analysis of full-resolution mammography, outperformed the first and second models. This suggested that there was more information about breast cancer risk on the mammograms than in the clinical data used by standard risk models. Because breast density scores are included in the standard models, it follows that those density scores do not reflect all the relevant information on the mammogram. DL results are not easily explainable to humans. DL methods are so-called black boxes (see Chapter 3, page 55) that provide little guidance about why conclusions were reached. Yala does not speculate on what exactly makes 1 mammography image predictive of the future occurrence of breast cancer. The effort to make deep learning (DL) methods fully explainable is an area of active research in academia and industry [31]. At the moment, it is unclear whether DL methods will ever be fully explainable (i.e., explainable AI, or XAI). Are we willing to follow computer recommendations in radiology without completely understanding them if we know that they provide sound advice? There can be little doubt that more deep learning studies will produce further advances of the sort described.

• Fluoroscopy: Fluoroscopy is a continuous X-ray image displayed on a monitor, providing real-time monitoring of a medical procedure or of the course of a contrast agent ("dye") through the body. It is used in a wide variety of examinations and procedures to diagnose or treat patients. Some examples are [32]:
• Barium X-rays and enemas (to view the gastrointestinal tract);
• Catheter insertion and manipulation (to direct the movement of a catheter through blood vessels, bile ducts, or the urinary system);
• Placement of devices within the body, such as stents (to open narrowed or blocked blood vessels);
• Angiograms (to visualize blood vessels and organs);
• Orthopedic surgery (to guide joint replacements and treatment of fractures).
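Because fluoroscopy is a continuous exposure, its main safety question - discussed next - is accumulated radiation dose. As a toy illustration of the kind of dose-threshold alerting described later in this section (the Imalogix review mentions alerts above 5 Gy and sentinel events above 15 Gy), here is a minimal sketch; the function, logic, and example numbers are illustrative only:

```python
def dose_alerts(doses_gy, alert_gy=5.0, sentinel_gy=15.0):
    """Classify a patient's accumulated fluoroscopy skin dose against review thresholds."""
    total = sum(doses_gy)  # cumulative peak-skin-dose estimate across procedures
    if total > sentinel_gy:
        return f"{total:.1f} Gy: sentinel event - immediate review and patient follow-up"
    if total > alert_gy:
        return f"{total:.1f} Gy: alert designated staff"
    return f"{total:.1f} Gy: within routine limits"

print(dose_alerts([2.1, 1.7]))        # within routine limits
print(dose_alerts([4.0, 3.5]))        # alert designated staff
print(dose_alerts([6.2, 5.8, 4.9]))   # sentinel event
```

A real dose-management platform would, of course, estimate peak skin dose from machine output, geometry, and exposure time rather than from a simple list of per-procedure totals.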

There are some risks associated with fluoroscopy, similar to other X-ray procedures. Depending on the given procedure, the patient may receive a relatively high dose of radiation, especially during complex interventional procedures. Procedures such as placing stents or other devices inside the body that require extended fluoroscopy time carry an increased risk of excessive radiation doses. The probability, however, that a patient will experience adverse effects from a fluoroscopic procedure is statistically very small. Fluoroscopy should always be performed with the lowest acceptable exposure for the shortest time necessary to minimize the radiation risk.

5.1.1.1.5 AI's influence on fluoroscopy [33]

Artificial intelligence is a powerful assistant in the fluoroscopy procedure room that allows a physician to integrate historical and real-time imaging information. Moreover, since AI is inspired by the operation of the human brain, it continually learns from each interaction with the operational environment around it, gradually improving its performance over time without the need for software engineering involvement. The AI system was trained on a set of 51 procedures from 8 clinical sites. Its performance was tested and measured over 18 independent procedures, comprising 398 configurations from 7 clinical sites, to quantify the AI's capabilities compared with a trained human operator offline, on prerecorded procedures. The AI component of the navigation and biopsy guidance system demonstrated a performance improvement from 80% to 95% in detecting surgical tools on fluoroscopic images. The AI system was agnostic to the operational environment - the particular fluoroscope - and its performance over time, tracking complex anatomy and operational tools on challenging fluoroscopic imaging, makes the guidance of diagnostic biopsy reliable and meaningful during a procedure. This, in turn, translates into better control of the procedure and a high diagnostic yield.

5.1.1.1.6 Literature reviews re AI's influence on fluoroscopy

1. Robotics-assisted versus conventional manual approaches for total hip arthroplasty [34]: Several studies have compared robotics-assisted (RA) and conventional manual (CM) approaches to total hip arthroplasty (THA), but their results are controversial. THA is an effective method for the management of severe hip joint disorders, and precise placement of cups and femoral stems is crucial to its efficacy [35]. However, this accuracy is difficult to achieve with the conventional manual approach. Computer-assisted orthopedic surgery (CAOS) has been performed over the last 30 years. The existing CAOS technologies can be broadly categorized into image-guided (based on computed tomography [CT] or X-ray fluoroscopy) and imageless navigation systems, positioning systems (patient-specific models, self-positioning robots), and semi-active or active robotics-assisted (RA) systems [36]. The advances in computer and artificial intelligence technology have resulted in parallel developments in robot-assisted THA [37]. Optical positioning is the most widely applied method in orthopedics with X-ray fluoroscopy-based navigation. RA-THA achieves the same clinical results as traditional manual techniques, with fewer intraoperative complications and better radiological assessment results. On the other hand, the advantages of the conventional methods are shorter operation time, lower revision rate, and fewer postoperative complications such as dislocation, which may also be related to the surgical approach. Despite some shortcomings and controversies, with the advancement of artificial intelligence technology, we believe that RA hip replacement technology has excellent potential for clinical application.

2. Use of fluoroscopy in endoscopy: indications, uses, and safety considerations [38]: Historically, fluoroscopy was a tool of the radiologist. Interventional cardiologists and vascular surgeons have revolutionized their respective fields by adopting and adapting its use to their practice. This expansion of fluoroscopic utilization has also flourished within the field of endoscopy and continues to evolve. Perhaps the most common use of fluoroscopy in the endoscopy suite is endoscopic retrograde cholangiopancreatography (ERCP). Numerous other therapeutic interventions can be performed, including biliary and pancreatic stent placement, biopsy brushings, and balloon sweeps of the bile duct to remove stones and debris. Fluoroscopy has been a great asset in the management of these biliary-pancreatic conditions. A growing indication for fluoroscopy in endoscopy is the placement of enteral stents. These include esophageal, gastric, duodenal, and colonic stents used in the setting of advanced malignancies for the palliative restoration of luminal patency. While the indications for fluoroscopy during endoscopic procedures continue to expand, formal training in radiation exposure and protection is still not widely emphasized during advanced endoscopy training [39]. The risks of adverse radiation effects are almost always outweighed by the patient's benefit from these procedures. However, to improve this risk-to-benefit ratio - particularly since only the patient receives the benefit while both the patient and staff assume the exposure risk - it is imperative that the operator understand the principles behind radiation and how to minimize exposure.

3. Imalogix brings fluoroscopy capabilities to radiation dose management platform [40]: Imalogix, an AI provider of process and workflow solutions, announced the availability of the latest evolution of its platform for diagnostic imaging, interventional radiology, cardiology, and surgery. The Imalogix Platform provides a comprehensive infrastructure to enable healthcare organizations to better understand and manage the process, quality, and safety related to diagnostic imaging services and interventional procedures, and to meet evolving regulatory standards. The platform enables departments utilizing fluoroscopy procedures to manage and enhance the safety and quality of exams through a comprehensive view of peak skin dose (PSD) across their entire patient population. The impending regulations surrounding fluoroscopy from TJC (The Joint Commission) state that this calculation is required for every fluoroscopy exam.

Now, with the ability to instantly calculate PSD per patient and classify this information by exam type, organizations can determine realistic radiation dose estimates for different procedure types to better protect patients and physicians. The Imalogix Platform also automatically alerts designated staff for procedures that emit over 5 Gray (Gy) and tracks cumulative patient dose to help ensure all sentinel events (>15 Gy) are either avoided or flagged for immediate review and patient follow-up.

• Radiomics [41]: The suffix "-omics" has been used numerous times in this book up to this point and will be used extensively throughout the balance of the book. When used as the suffix of any word in this text, it marks that word as "a field of study in biology" [42] (e.g., genomics, transcriptomics, metabolomics, and radiomics). This may help with the many terms you will encounter ending in "-omics." In the new era of precision medicine (see Chapter 4, page 101), radiomics is emerging as the research field that will translate associations extracted from qualitative and quantitative clinical images and clinical data, with or without associated gene-expression information, to support evidence-based clinical decision-making [43]. Dividing the process into separate steps can yield definable inputs and outputs, including image acquisition and reconstruction, image segmentation, feature extraction and qualification, analysis, and model building. Careful evaluation is needed at each step to construct robust and reliable models transferrable into clinical practice for prognosis, non-invasive disease tracking, and evaluation of disease response to treatment. Different kinds of features can be derived from clinical images. Quantitative traits are descriptors extracted from the images by software implementing mathematical algorithms [44]. They exhibit different levels of complexity and express, first, properties of the lesion shape, describing the shape of the traced region of interest and its geometric properties. Second, textural features [45] calculate the statistical inter-relationships between pixel intensities, providing a measure of the spatial arrangement of intensities, and hence of intra-lesion heterogeneity. They can be extracted either directly from the images or after applying different filters or transforms (e.g., the wavelet transform). The main innovation of radiomics relies on the -omics suffix, created for the molecular biology disciplines. This refers to the simultaneous use of large numbers of parameters extracted from a single lesion, which are mathematically processed with advanced statistical methods. The hypothesis is that an appropriate combination of parameters, along with clinical data, can express significant tissue properties, useful for diagnosis, prognosis, or treatment in an individual patient (personalization). Additionally, radiomics takes advantage of the full data-analysis experience developed by other -omics disciplines, as well as by big-data analytics (see Chapter 4, page 83) [46].
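The texture features just described are, in practice, simple statistics over pixel intensities. As one concrete example, the "spatial arrangement of intensities" is commonly captured with gray-level co-occurrence matrix (GLCM) statistics. A minimal scikit-image (0.19 or later) sketch, run here on a random stand-in for a segmented lesion region of interest, purely for illustration:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
roi = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)  # stand-in for a lesion ROI

# Gray-level co-occurrence matrix: how often intensity pairs occur at distance 1,
# in two directions; 64 gray levels keeps the matrix small.
glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2], levels=64,
                    symmetric=True, normed=True)

# A few classic second-order texture features (intra-lesion heterogeneity descriptors),
# averaged over the two directions.
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```

In a real radiomics pipeline, hundreds of such shape, intensity, and texture descriptors would be extracted per lesion and then fed, together with clinical data, into the statistical or machine-learning models discussed next.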


5.1.1.1.7 AI's influence on radiomics
Deep learning and radiomics are rapidly taking over in many areas of research, even though deep learning is a relatively young field that uses methods developed decades ago. Its embrace has been driven by the development of economical and more powerful computational methods, with early success in many different areas. For radiomics, the major challenge is the link to biology and function. Deep learning and radiomic methods will transform medical imaging and its application to personalized medicine (see Chapter 4, page 101) in the next 5 years. As deep learning and radiomics methods mature, their use will become part of clinical decision support systems. They can be used to rapidly mine patient data and radiological imaging biomarkers to move medicine towards the goal of precision medicine for patients [46].
5.1.1.1.8 Literature reviews re AI's influence on radiomics
1. Radiomics with artificial intelligence for precision medicine in radiation therapy [47]: Recently, the concept of radiomics has emerged from radiation oncology. It is a novel approach for solving the issues of precision (personalized) medicine and how it can be performed based on multimodality medical images that are non-invasive, fast, and low in cost. Radiomics is the comprehensive analysis of massive numbers of medical images to extract a large number of phenotypic features (radiomic biomarkers) reflecting cancer traits, and it explores the associations between the characteristics and patients' prognoses to improve decision-making in precision medicine. Individual patients can be stratified into subtypes based on radiomic biomarkers that contain information about cancer traits that determine the patient's prognosis. Machine-learning algorithms are boosting the power of radiomics for prediction of prognoses or factors associated with treatment strategies, such as survival time, recurrence, adverse events, and subtypes. Therefore, radiomic approaches, in combination with AI, may potentially enable practical use of precision medicine in radiation therapy by predicting outcomes and toxicity for individual patients.
2. What can artificial intelligence teach us about the molecular mechanisms underlying disease? [48] While molecular imaging with positron emission tomography (see Nuclear Medicine Scan, PET, page 146) or single-photon emission computed tomography already reports on tumor molecular mechanisms on a macroscopic scale, there is increasing evidence that there are multiple additional features within medical images. These can further improve tumor characterization, treatment prediction, and prognostication. Early reports have already revealed the power of radiomics to personalize and improve patient management and outcomes. What remains unclear is how these additional metrics relate to underlying molecular mechanisms of disease. Furthermore, the ability to deal with increasingly large amounts of data from medical images and beyond in a rapid, reproducible, and transparent manner is essential for future clinical practice. Here AI may have an impact. It encompasses a broad range of 'intelligent' functions performed by computers, including language processing (NLP), knowledge representation, problem-solving, and planning. While rule-based algorithms, e.g., computer-aided diagnosis, have been in use for medical imaging since the 1990s, the resurgent interest in AI is related to improvements in computing power and advances in machine learning (ML).


3. Radiomics in cancer research [15]: Radiographic images, coupled with data on clinical outcomes, have led to the emergence and rapid expansion of radiomics as a field of medical research [49]. Early radiomics studies were largely focused on mining images for a large set of predefined engineered features that describe radiographic aspects of shape, intensity, and texture. More recently, radiomics studies have incorporated deep learning techniques to learn feature representations automatically from example images, hinting at the substantial clinical relevance of many of these radiographic features. Within oncology, multiple efforts have successfully explored radiomics tools for assisting clinical decision making related to the diagnosis and risk stratification of different cancers [50]. Such findings have motivated exploration of the clinical utility of AI-generated biomarkers based on standard-of-care radiographic images [51], with the ultimate hope of better supporting radiologists in disease diagnosis, imaging quality optimization, data visualization, response assessment, and report generation.
• Computed tomography (CT or CAT) scan: CT (CAT) scans are procedures in which many X-ray images are recorded as the detector moves around the patient's body. A radiology technologist performs the CT scan, during which you lie on a table inside a large, doughnut-shaped CT machine. The table slowly moves through the scanner, and the X-rays rotate around your body. By repeating this process, several scans are generated, which the computer stacks one on top of the other to produce detailed images of organs, bones, or blood vessels. From these cross-sectional images or "slices," a computer reconstructs all the individual pictures of internal organs and tissues. CT scans are used for a long list of diagnostic assessments, including (but not limited to) [52]:
• detecting bone and joint problems, like complex bone fractures and tumors;
• spotting conditions like cancer, heart disease, emphysema, or liver masses, and helping doctors see any changes in them;
• showing internal injuries and bleeding, such as those caused by a car accident;
• helping locate a tumor, blood clot, excess fluid, or infection;
• guiding treatment plans and procedures, such as biopsies, surgeries, and radiation therapy;
• comparing scans over time to find out if specific treatments are working; for example, scans of a tumor can show whether it is responding to chemotherapy or radiation.
5.1.1.1.9 AI's influence on computed tomography (CT or CAT) scans [53]
AI has already proven its value to radiologists and pathologists looking to accelerate productivity and improve accuracy. Studies have shown that AI tools can identify features in images as quickly and precisely as, if not better than, humans.
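The slice-stacking step described above is conceptually simple. A minimal numpy sketch, with synthetic slices standing in for real DICOM data, shows how stacked axial "slices" become a volume that can be resliced in any plane:

```python
import numpy as np

# Synthetic stack of 100 axial CT slices, each 512 x 512 pixels, standing in
# for the images recorded as the detector moves around the patient.
axial_slices = [np.zeros((512, 512), dtype=np.int16) for _ in range(100)]

# The computer stacks the slices one on top of the other to form a 3-D volume.
volume = np.stack(axial_slices, axis=0)   # shape: (slice, row, column)

# Once stacked, any cross-section can be reconstructed from the same data.
axial = volume[50, :, :]      # one original slice
coronal = volume[:, 256, :]   # front-to-back reslice
sagittal = volume[:, :, 256]  # left-to-right reslice
print(volume.shape, axial.shape, coronal.shape, sagittal.shape)
```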


To foster standardized, safe, and effective AI for clinical decision support and diagnostics, the American College of Radiology Data Science Institute (ACR DSI) has released [54] several high-value use cases for artificial intelligence in medical imaging, which will be continuously updated as new opportunities present themselves. They include:
1. Identifying cardiovascular abnormalities;
2. Detecting fractures and other musculoskeletal injuries;
3. Aiding in the diagnosis of neurological diseases;
4. Flagging thoracic complications and conditions;
5. Screening for common cancers.

While more studies will be required to test the utility of AI for these and other use cases, ACR DSI appears confident that medical imaging is ready for artificial intelligence. Supplementing diagnostics and decision-making with AI could offer providers and patients life-changing insights into a variety of diseases, injuries, and conditions that may be difficult to identify with the human eye alone.
5.1.1.1.10 Literature reviews re AI's influence on computed tomography (CT or CAT) scans
1. Artificial intelligence and cardiovascular computed tomography [55]: Volumetric datasets with large numbers of axial slices are routinely acquired by cardiovascular computed tomography (CT). Review of the axial slices is an integral part of the image analysis, and a critical step is detailed reconstruction of multiple oblique planes and volumes on an advanced 3-D workstation [56]. This is performed by a combination of manual and (semi) automated reconstruction along oblique planes, vascular centerlines, and volumes. This analysis results in a large number of discrete qualitative and quantitative data points and is increasingly supported by dedicated "smart" software. The combination of large EHRs and automated analysis with ML algorithms allows information gathering, data analysis, and feedback to the individual practitioner and healthcare systems. Similar to the impact of data technology on many aspects of daily life, these changes will affect current models of doctor-patient relationships, with potential benefits for the individual patient and also larger patient populations. It may also have an impact on the work of specialists, including radiologists, but will not substitute for them. It will enhance their diagnostic workflow, allowing faster and more precise diagnosis. The role of the future imaging specialist will likely increasingly include generation and management of imaging data, providing access to discrete data beyond the traditional report. This will require close collaboration with IT specialists.
2. Advanced machine learning in action: identification of intracranial hemorrhage on computed tomography scans of the head with clinical workflow integration [57]: Intracranial hemorrhage (ICH) requires prompt diagnosis to optimize patient outcomes. This study hypothesized that machine learning algorithms could automatically analyze computed tomography (CT) of the head, prioritize radiology worklists, and reduce time to diagnosis of ICH. A deep convolutional neural network was trained on 37,074 studies and subsequently evaluated on 9499 unseen studies.


The predictive model was implemented prospectively for 3 months to re-prioritize "routine" head CT studies as "stat" on real-time radiology worklists if an ICH was detected. Time to diagnosis was compared between the re-prioritized "stat" and "routine" studies. AI provided improved timing. Computer-aided diagnosis (CAD) has been an active area of research over the past 5 decades [58]. Starting with the detection of breast cancer on mammograms [59], CAD has been extended to several other diseases such as lung cancer [60], colon cancer [61] and, more recently, several brain disorders such as Alzheimer's disease [62]. Despite all these efforts, only CAD for breast imaging has been widely adopted in clinical practice. Importantly, most clinical CAD systems are based on traditional computer vision techniques and do not utilize deep learning.
3. Disease staging and prognosis in smokers using deep learning in chest computed tomography [63]: Objective CT analysis provides clinically relevant insights into COPD. Still, it is a radiographic method of anatomic and physiologic analysis that relies on the prespecification of radiographic features considered most likely to be associated with specific clinical outcomes. New techniques in computer vision, natural image analysis, and machine learning have enabled the direct interpretation of imaging data, going directly from the raw image data to clinical outcome without relying on the specification of radiographic features of interest [64]. Convolutional neural network (CNN) analysis and other deep learning-based models are trained using large amounts of data from individuals with known outcomes, such as known disease diagnoses or clinical events like death. Once trained, the CNN model can then use data from other individuals to determine their probability for that event, and can rapidly assess risk across large populations without the need for the manual extraction or review of specific clinical or radiographic features [65]. It is hypothesized that deep learning analyses of imaging data could predict clinically relevant outcomes in smokers without the pre-specification of features of interest.
• MRI (Magnetic Resonance Imaging) scan: An MRI scan is a medical imaging procedure using strong magnetic fields and radio waves (radiofrequency energy) to create images. The signal in an MRI image is produced from the protons in fat and water molecules in the body. MRI imaging does not expose you to radiation. During an MRI exam, an electric current is passed through coiled wires to create a temporary magnetic field in a patient's body. Radio waves are sent from and received by a transmitter/receiver in the machine, and these signals are used to make digital images of the scanned area of the body. For some MRI exams, intravenous (IV) drugs given in the arm, such as gadolinium-based contrast agents (GBCAs), are used to change the contrast of the MR image. MRI imaging of the body is performed to evaluate [66]:
• organs of the chest and abdomen, including the heart, liver, biliary tract, kidneys, spleen, bowel, pancreas, and adrenal glands;


• pelvic organs, including the bladder and the reproductive organs such as the uterus and ovaries in females and the prostate gland in males;
• blood vessels (including MR Angiography);
• lymph nodes.
Physicians use an MRI examination to help diagnose or monitor treatment for conditions such as:
• tumors of the chest, abdomen or pelvis;
• diseases of the liver, such as cirrhosis, and abnormalities of the bile ducts and pancreas;
• inflammatory bowel diseases such as Crohn's disease and ulcerative colitis;
• heart problems, such as congenital heart disease;
• malformations of the blood vessels and inflammation of the vessels (vasculitis);
• a fetus in the womb of a pregnant woman.
5.1.1.1.11 AI's influence on MRI scans
One of the greatest strengths of MRIs is their ability to capture enormous amounts of data through their scanning process. Doing so, however, requires significant amounts of scanning time while the patient (who frequently is in distress) remains perfectly still inside a chamber that produces claustrophobic effects in many people. MRIs take so long because the machine creates a series of 2-D images, or slices, which it subsequently stacks to make a 3-D image. NYU has been working on a way to accelerate this process and is now collaborating with Facebook on a project ("fastMRI" [67]) to cut MRI durations by 90% by applying AI-based imaging tools. AI can capture less data, and therefore image faster, while still preserving, even enhancing, all the rich information content of the magnetic resonance images. The scan is run faster, collecting less raw data, and a machine learning algorithm trained for the data-to-image conversion then reconstructs the image at up to 10 times the speed. Interpreted traditionally, the fastMRI data would not be enough to create the 3-D image. The complexity of MRI instrumentation makes it prone to breakdowns. Many companies are using AI tools to predict maintenance needs before breakdowns occur, avoiding costly downtime. AI algorithms can make proactive predictions about maintenance and prevention. Hospitals use MRI data before, during, and after operations so surgeons can plan how to proceed with their care. At Boston's Brigham and Women's Hospital, an MRI is inside the operating room as part of a more extensive imaging setup called the Advanced Multimodality Image Guided Operating Suite (AMIGO) [68]. With a mass spectrometer added to the AMIGO equipment, machine learning analyzes the data collected by that component and compares it to a decade's worth of historical data about brain tumors while looking at a segmented MRI image. Surgeons then benefit from better insights about patients' tumors. As a result, people may undergo fewer future operations because the first attempts are maximally successful. Also, some algorithms are trained on over a million images, a process that could help physicians feel more confident when making diagnoses, thereby reducing potential mistakes and improper treatment plans.
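The core idea of scanning faster by collecting less raw data can be illustrated with a toy k-space experiment. A minimal numpy sketch follows; it shows only the naive zero-filled reconstruction, whereas fastMRI's contribution is replacing that step with a learned model, and the sampling mask here is an illustrative choice:

```python
import numpy as np

# Toy "anatomy": a rectangle standing in for one 2-D MRI slice.
img = np.zeros((128, 128))
img[40:90, 30:100] = 1.0

# MRI acquires data in k-space, the 2-D Fourier domain of the image.
kspace = np.fft.fftshift(np.fft.fft2(img))

# Accelerated scan: keep only every 4th phase-encode line (75% less data).
mask = np.zeros(kspace.shape)
mask[::4, :] = 1.0
undersampled = kspace * mask

# Naive zero-filled reconstruction shows aliasing artifacts; a trained
# network (as in fastMRI) learns to remove them, enabling faster scans.
recon = np.abs(np.fft.ifft2(np.fft.ifftshift(undersampled)))
print("retained k-space samples:", int(mask.sum()), "of", mask.size)
```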


5.1.1.1.12 Literature reviews re AI's influence on MRI scans
1. Artificial intelligence enhances MRI scans [69]: A research team at Massachusetts General Hospital, the Martinos Center for Biomedical Imaging, and Harvard University improved image reconstruction through AI and machine learning. Funded by several NIH components, the work was published on 21 March 2018 in Nature. The researchers used recent advances in technology, including more powerful graphic processing units (GPUs) in computers and artificial neural networks, to develop an automated reconstruction process. They named it AUTOMAP, for automated transform by manifold approximation. To train the neural network, the team used a set of 50,000 MRI brain scans from the NIH-supported Human Connectome Project. The team then tested how well AUTOMAP could reconstruct data using a clinical, real-world MRI machine and a healthy volunteer. They found that AUTOMAP enabled better images with less noise than conventional MRI. The signal-to-noise ratio was better for AUTOMAP than conventional reconstruction (21.6 vs. 17.6). AUTOMAP also performed better on a statistical measure of error known as root-mean-square error (6.7% vs. 10.8%). Also, AUTOMAP was faster than the manual tweaking now done by MRI experts. "Since AUTOMAP is implemented as a feedforward neural network, the speed of image reconstruction is almost instantaneous, just tens of milliseconds," Rosen says. "Some types of scans currently require time-consuming computational processing to reconstruct the images. In those cases, immediate feedback is not available during initial imaging, and a repeat study may be required to identify a suspected abnormality better. AUTOMAP would provide immediate image reconstruction to inform the decision-making process during scanning and could prevent the need for additional visits." There are many potential applications of AUTOMAP.
2. Artificial intelligence in cancer imaging: clinical challenges and applications [70]: Challenges remain in the accurate detection, characterization, and monitoring of cancers despite improved technologies. Radiographic assessment of disease most commonly relies upon visual evaluations, the interpretations of which may be augmented by advanced computational analyses. In particular, AI promises to make great strides in the qualitative interpretation of cancer imaging by expert clinicians, including volumetric delineation of tumors over time, extrapolation of the tumor genotype and biological course from its radiographic phenotype, prediction of clinical outcome, and assessment of the impact of disease and treatment on adjacent organs. AI may automate processes in the initial interpretation of images and shift the clinical workflow of radiographic detection, management decisions on whether or not to administer an intervention, and subsequent observation to a yet-to-be-envisioned paradigm. However, most studies evaluating AI applications in oncology to date have not been rigorously validated for reproducibility and generalizability. The results do highlight increasingly concerted efforts to push AI technology toward clinical use, and they will impact future directions in cancer care.


3. Emerging applications of artificial intelligence in neuro-oncology [71]: AI used with radiomics and radiogenomics in neuro-oncologic imaging will improve our diagnostic, prognostic, and therapeutic methods. This is helping to move the field into precision medicine. With the growth of computational algorithms, AI methods are poised to improve the precision of diagnostic and therapeutic approaches in medicine. The area of radiomics in neuro-oncology will likely continue to be at the forefront of this revolution. A variety of AI methods applied to conventional and advanced neuro-oncology MRI data can already delineate infiltrating margins of diffuse gliomas, differentiate pseudo-progression from actual progression, and predict recurrence and survival better than methods used in daily clinical practice. Radiogenomics will also advance our understanding of cancer biology, allowing noninvasive sampling of the molecular environment with high spatial resolution and providing a systems-level understanding of underlying heterogeneous cellular and molecular processes. By providing in vivo markers of spatial and molecular heterogeneity, these AI-based radiomic and radiogenomic tools have the potential to stratify patients into more precise initial diagnostic and therapeutic pathways. This will enable better dynamic treatment monitoring in this era of personalized medicine (see Chapter 4, page 101). Although substantial challenges remain, radiologic practice is set to change considerably as AI technology is further developed and validated for clinical use.
• Nuclear medicine scan: Also called radioisotope scans or radionuclide scans, these scans use radioactive substances to see structures and functions inside your body. Nuclear imaging allows for the visualization of organ and tissue structure as well as their function. A radiopharmaceutical absorbed, or "taken up," by a particular organ or tissue can indicate the level of function of the organ or tissue. Nuclear imaging uses a special camera that detects radioactivity. Before the test, a small amount of radioactive material is given as an injection, swallowed, or inhaled. The camera makes images while the patient lies still on a table. Areas where the radionuclide collects in more significant amounts are called "hot spots." Non-absorbed areas appear less bright on the scanned image and are referred to as "cold spots." Nuclear medicine scans are used to diagnose many medical conditions and diseases. Some of the more common tests include the following [72]:
• PET scan (positron emission tomography) is a nuclear medicine exam that produces a 3-dimensional image of functional processes in the body. A PET scan uses a small amount of a radioactive drug to show differences between healthy and diseased tissue. Some common uses of PET include [73]:
- to assess cancer;
- to determine the risk of cancer in a lung nodule;
- to evaluate the brain in some patients for memory disorders, brain tumors, or seizure disorders;
- to assess the heart for blood flow and ischemic heart disease.


• Renal scans. These are used to examine the kidneys and to find any abnormalities, including abnormal function or obstruction of the renal blood flow.
• Thyroid scans. These are used to evaluate thyroid function or to better evaluate a thyroid nodule or mass.
• Bone scans. These are used to evaluate degenerative and/or arthritic changes in joints, to find bone diseases and tumors, and/or to determine the cause of bone pain or inflammation.
• Gallium scans. These are used to diagnose active infectious and/or inflammatory diseases, tumors, and abscesses.
• Heart scans. These are used to identify abnormal blood flow to the heart, to determine the extent of the damage to the heart muscle after a heart attack, and/or to measure heart function.
• Brain scans. These are used to investigate problems within the brain and/or in the blood circulation to the brain.
• Breast scans. These are often used in conjunction with mammograms to locate cancerous tissue in the breast.
5.1.1.1.13 AI's influence on nuclear medicine scans [74]
Nuclear medicine has been using intelligent systems for some years with technologies such as voice recognition, radiomics, nuclear cardiology, oncology, and neurology. There has been success with automatic edge detection and tumor volume delineation, plus automatic detection of anatomy and pulmonary nodules on PET/CT (Fig. 5–3). Predictive algorithms such as FRAX (Fracture Risk Assessment Tool) are already in clinical use, and over the last 2–3 years there have been a growing number of publications covering a variety of clinical AI applications in imaging. The nonclinical parts of nuclear medicine are also benefitting from AI. Improved methods of monitoring doses, dose limit compliance, and algorithms leading to dose reduction are now in regular clinical use.

FIGURE 5–3 Number of diagnostic clinical algorithms by technology. Diagnostic imaging currently more than doubles all other forms of AI diagnostic algorithms, due primarily to advanced (GPU) image recognition software. However, it is felt that genetic testing will grow significantly in the coming years as a principal diagnostic testing modality. Courtesy of Mayo Clinic.


Proper education about ML, along with a willingness to try new things, would put nuclear medicine staff in an excellent position to embrace all that AI has to offer. Imaging quality could improve, along with dose reduction, increased speed and accuracy of reporting, and prediction of outcomes. AI could be deeply embedded in many aspects of nuclear medicine in the future. In the words of Bill Gates, "We always overestimate the change that will occur in the next 2 years and underestimate the change that will occur in the next 10." This will undoubtedly be the case in AI and nuclear medicine.
5.1.1.1.14 Literature reviews re AI's influence on nuclear medicine scans
1. Application of artificial neural networks to identify Alzheimer's disease using cerebral perfusion SPECT data [75]: This study aimed to demonstrate the usefulness of artificial neural networks in Alzheimer's disease (AD) diagnosis using data from brain single-photon emission computed tomography (SPECT). The results were compared with discriminant analysis. The study population consisted of 132 clinically diagnosed patients. The sensitivity of Alzheimer's disease detection by artificial neural network and discriminant analysis was 93.8% and 86.1%, respectively, and the corresponding specificity was 100% and 95%. Artificial neural networks and conventional statistical methods (discriminant analysis) are useful tools in Alzheimer's disease diagnosis. The results of this study indicate that artificial neural networks can discriminate AD patients from healthy controls. The study simulations provide evidence that artificial neural networks can be a useful tool for clinical practice.
2. Scanning the future of medical imaging [76]: As innovation accelerates, the medical device industry is undergoing rapid change in the form of new business models and AI, and as the Internet of Things creates disruptive possibilities in healthcare. Annual patent applications related to medical devices have tripled in 10 years due to innovations, and technology cycle times have halved in just 5 years (Moore's Law). Connectivity will have exploded by 2021: the world will have more than 3 times as many smart connected devices as people, and more and more medical devices and processes contain integrated sensors. A total of 13 companies had more than 50 transactions with a median value of slightly below $4 million, possibly reflecting the niche nature of markets for the newer nuclear medicine ligands (a ligand is a molecule that binds to another molecule). In nuclear and PET imaging, the introduction of specific ligands allows for better targeting in cancer diagnosis and reduces the need for tissue sampling. Noninvasive diagnostic techniques will play an increasing role in medicine, especially cancer diagnosis and management. And theranostic platforms, in which the nuclear diagnostic agent is paired with a complementary radiotherapeutic, hold promise for meeting unserved needs in oncology. As nuclear imaging grows and moves away from metabolic imaging for cancer localization toward pathology-specific cancer diagnosis (for instance, combining ligands specific to prostate or breast cancer with radioisotopes for definitive diagnosis), imaging companies will have increasing opportunities to engage in cancer diagnostics, and potentially therapeutics as well.


3. Four future trends in medical imaging that will change healthcare [77]: Artificial Intelligence (AI): AI in the medical imaging market is estimated to rise from $21.48 billion in 2018 to a projected value of $264.85 billion by 2026, according to Data Bridge Market Research's April 2019 report. These vendors will need to prove their ROI in a very competitive and crowded market, with hundreds of AI technology solutions being developed for medical imaging. Through its ability to sift through mountains of scans quickly, AI has the potential to revolutionize the medical imaging industry by offering providers and patients life-changing insights into a variety of diseases, injuries, and conditions that may be hard to detect without the supplemental technology. Virtual Reality & 3-D Imaging: Right now, the world can't get enough of virtual reality (VR). As amazing as MRIs and CT scans are, their current 2-D display requires physicians to use their imaginations to mentally stitch together a full picture of the 3-D organ or body part. Now, new augmented reality technologies, like EchoPixel True 3-D, have made it possible for radiologists or physicians to take slices of MRI pictures and create a 3-D image, which they can then examine with 3-D glasses, a VR headset, or even print using a 3-D printer and special plastic. Nuclear Imaging: With nuclear imaging, a patient is injected with or swallows radioactive materials called radiotracers or radiopharmaceuticals before a medical imaging scan such as positron emission tomography (PET) or single-photon emission computed tomography (SPECT). During the scan, the camera focuses on the area where the radioactive material concentrates, showing the doctor what kind of problem exists. These types of scans are particularly helpful when diagnosing thyroid disease, gall bladder disease, heart conditions, cancer, and Alzheimer's disease. Wearable Devices: Wearable medical devices are not only a top healthcare trend this year; they are also slated to revolutionize diagnostic imaging in 2019.
• Ultrasound (sonography): Ultrasound (sonography) is a type of imaging that uses high-frequency sound waves to look at organs and structures inside the body. Unlike X-rays, with ultrasound there is no exposure to radiation. With an ultrasound test, you lie on a table while a special technician moves a device called a transducer over part(s) of your body. The transducer transmits sound waves that bounce off the tissues and structures inside your body. The transducer then captures the waves that bounce back, and the ultrasound machine creates images from them. Conventional ultrasound displays images in thin, flat sections of the body. Ultrasound technology can also produce 3-dimensional (3-D) displays. Doppler ultrasound is another ultrasound technique that allows the doctor to evaluate blood flow through arteries and veins in the abdomen, arms, legs, and neck, and/or the brain (in infants and children), or within various body organs such as the liver or kidneys.


Table 5–2 Internal organs examined by ultrasound (sonography).
• Heart and blood vessels, including the abdominal aorta and its major branches;
• Liver;
• Gallbladder;
• Spleen;
• Pancreas;
• Kidneys;
• Bladder;
• Uterus, ovaries, and unborn child (fetus) in pregnant patients;
• Eyes;
• Thyroid and parathyroid glands;
• Scrotum (testicles);
• Brain in infants;
• Hips in infants;
• Spine in infants.

Ultrasound is used to examine multiple internal organs in the body (Table 5–2). It is also used to:
• guide procedures such as needle biopsies, in which needles are used to sample cells from an abnormal area for laboratory testing;
• image the breasts and guide biopsy of breast cancer;
• diagnose a variety of heart conditions, including valve problems and congestive heart failure, and to assess damage after a heart attack. (Ultrasound of the heart is commonly called an "echocardiogram" or "echo" for short.)
Doppler ultrasound images can help the physician to see and evaluate:
• blockages to blood flow (such as clots);
• narrowing of vessels;
• tumors and congenital vascular malformations;
• reduced or absent blood flow to various organs;
• greater than normal blood flow to different areas, which is sometimes seen in infections.
5.1.1.1.15 AI's influence on ultrasound (sonography) [78]
It is expected that AI will produce a distinctly different generation of products in the longer term, with the potential for significant economic value. There are promising signs of value-adding AI solutions in ultrasound imaging as they are brought to market. Vendors will need to make ultrasound systems more straightforward and more intuitive to use without compromising on scan quality. Innovations must include providing real-time support for clinicians during scans and procedures. AI technology and deep learning techniques are being used to address this challenge through image recognition capabilities and real-time training aids such as probe placement guidance and organ detection. The market is starting to develop solutions to address some of these more technically challenging possibilities. However, they are at an early stage of development, and accurate scan reads, as well as scan quality, remain a concern for regulators. Ultimately, the longer-term outcome is that AI could enable ultrasound to become accessible to anyone and everyone. However, there are sizeable technical challenges and commercial hoops to jump through before this vision can be realized.


We are now starting to see AI and deep learning capabilities being deployed to provide anatomy-awareness, where specific body parts can be automatically recognized. This image recognition capability is opening the possibility of contextually-aware systems that can assist sonographers in real time by suggesting relevant tools as well as providing diagnostic or decision support. For inexperienced users, these real-time aids might be as simple as communicating which organ or body part is being scanned, providing support on how to best position the probe, and guiding the user on the anatomical features relevant to the scan. There are many emerging markets for the use of ultrasound in breast imaging. The main limitations of ultrasound for breast imaging include lengthy exam and reading times, a relatively high number of false positives, and the lack of repeatability of results due to operator dependence. Some vendors have already identified the opportunity to deploy AI-powered solutions to address these limitations, and various propositions have been brought to market. The ultimate step for AI within this part of the workflow will be to arrive at a credible and accurate diagnosis independent of the radiologist. There are still many unanswered technical challenges, including how well algorithms and neural networks trained using localized datasets will perform when applied to more extensive populations.
5.1.1.1.16 Literature reviews re AI's influence on ultrasound (sonography)
1. Automatic classification of pediatric pneumonia [79]: Pneumonia is one of the most prevalent causes of death among children under 5 years of age [80], with its highest case-fatality rate among infants in the post-neonatal period. Thoracic ultrasound is becoming a useful and readily available technique for physicians assessing a variety of respiratory, hemodynamic, and traumatic conditions [81]. To facilitate the diagnosis of pneumonia in low-resource settings, it would be useful to have an automatic system to assist in the interpretation of ultrasound images. Artificial intelligence based on artificial neural networks is a common approach for automated computer learning and classification. Based on the measurable characteristics of a particular phenomenon, and after a training and validation step based on a selected dataset, it can assign a classification that can be used as the basis of a diagnosis [82]. This study demonstrates that it is possible to train an artificial neural network to detect evidence of pneumonia infiltrates in ultrasound lung images collected from young, hospitalized children with a diagnosis of pneumonia. The method achieved a sensitivity of 90.9% and a specificity of 100% in detecting vectors associated with pneumonia consolidations when compared to the visual recognition performed by an expert analyst. An algorithm like the one developed in this study proves that it is possible to create a methodology to automatically detect lung infiltrates due to pneumonia in young children using ultrasound. Moreover, this technology might be applied to more portable and less expensive ultrasound devices and be taken to remote, rural areas. High sensitivity and specificity were obtained for the classification of characteristic vectors associated with pneumonia infiltrates. This has the possibility of reducing deaths in resource-limited areas of the world if data is adequately processed to make automatic interpretation feasible for a larger group of patients. In these populations, the lack of diagnostics for pneumonia is critical.


2. Artificial intelligence in breast ultrasound [83]: AI is receiving much attention for its excellent performance in image-recognition tasks. It is increasingly applied in breast ultrasound, a first-line imaging tool for breast lesion characterization, for its availability, cost-effectiveness, acceptable diagnostic performance, and noninvasive and real-time capabilities. The use of AI in breast ultrasound has also been combined with other novel technology, such as ultrasound radiofrequency (RF) time series analysis [84], multimodality GPU-based computer-assisted diagnosis of breast cancer using ultrasound and digital mammography images [85], optical breast imaging, QT-based breast tissue volume imaging [86], and automated breast volume scanning (ABVS) [87]. AI has been increasingly applied in ultrasound and has proved to be a powerful tool that provides reliable diagnoses with higher accuracy and efficiency and reduces the workload of physicians. It is roughly divided into early machine learning, controlled by manually designed input algorithms, and deep learning (DL), with which software can self-study. Soon, it is believed, AI in breast ultrasound will not only distinguish between benign and malignant breast masses but also further classify specific benign diseases, such as inflammatory breast mass and fibroplasia. Moreover, AI in ultrasound may even predict Tumor-Node-Metastasis classification [88], prognosis, and the treatment response for patients with breast cancer.
3. Diagnosis of coronary artery diseases and carotid atherosclerosis using intravascular ultrasound images [89]: Cardiovascular ultrasound examination complements other imaging modalities such as radiography and allows more accurate diagnostic tests to be conducted. This modality is non-invasive and widely used in the diagnosis of cardiovascular diseases. Recently, 2 leading ultrasound-based techniques were used for the assessment of atherosclerosis: B-mode ultrasound, used in the measurement of carotid artery intima thickness, and intravascular ultrasound. These techniques provide images in real time, are portable, cost substantially less, and use no harmful ionizing radiation in imaging. The processing of ultrasound images plays a significant role in the accurate diagnosis of the disease level. The diagnostic accuracy depends on the time taken to read the image and the experience of the practitioner in interpreting the correct information. AI computer-aided methods for the analysis of intravascular ultrasound images can assist in better measurement of plaque deposition in the coronary artery. In this study, the level of plaque deposition is identified using Otsu's segmentation method, and classification of the plaque deposition level is performed using a Back Propagation Network (BPN) and a Support Vector Machine (SVM). The results show the SVM classifies significantly better than the BPN network.
• Endoscopy: Endoscopy is the generic term for a procedure that uses an instrument called an endoscope, or scope for short. These scopes have a tiny camera attached to a long, thin tube.


The tube is moved through a body passageway or opening to see inside an organ. Sometimes scopes are used for surgery, such as for removing polyps from the colon. There are many different kinds of endoscopy, including (but not limited to):
• Arthroscopy to examine joints;
• Bronchoscopy to examine the lungs;
• Colonoscopy and sigmoidoscopy to examine the large intestine;
• Cystoscopy and ureteroscopy to examine the urinary system;
• Laparoscopy to examine internal structures (e.g., abdomen, pelvis, etc.) and perform surgical procedures on internal structures;
• Upper gastrointestinal endoscopy to examine the esophagus and stomach.
5.1.1.1.17 AI's influence on endoscopy [90]
Computer-aided methods for medical image recognition have been researched continuously for years [91]. Most traditional image recognition models use feature engineering, which is essentially teaching machines to detect explicit lesions specified by experts. As opposed to feature engineering, AI based on deep learning enables recognition models to learn the most predictive features from large data sets of labeled images and perform image classification spontaneously [92]. In this way, AI is now considered more efficient and has become increasingly popular. Recent studies demonstrate that AI would be able to overcome diagnostic subjectivity, which is caused by endoscopists who unconsciously take additional characteristics into account other than microstructures and capillaries. Therefore, it could be a useful real-time aid for nonexperts, providing an objective reference during endoscopy procedures. In the near future, with electronic chromoendoscopy combined with AI, optical diagnosis will achieve diagnostic accuracy comparable with standard histopathologic examination. This will reduce medical costs by avoiding unnecessary resection and pathologic evaluation.
5.1.1.1.18 Literature reviews re AI's influence on endoscopy
1. Artificial intelligence and colonoscopy: current status and future perspectives. Digestive Endoscopy [93]: In the field of gastrointestinal endoscopy, computer-aided diagnosis (CAD) for colonoscopy is the most investigated area, although it is still in the preclinical phase. Because colonoscopy is carried out by humans, it is inherently an imperfect procedure. CAD assistance is expected to improve its quality regarding automated polyp detection and characterization (i.e., predicting the polyp's pathology). It could help prevent endoscopists from missing polyps as well as provide a precise optical diagnosis for those detected. Research on automated polyp detection has been limited to experimental assessments using an algorithm based on ex vivo videos or static images. Performance for clinical use was reported to have >90% sensitivity with acceptable specificity. In contrast, research on automated polyp characterization seems to surpass that for polyp detection.


2. GI disease screening with artificial intelligence is close [94]: As a tool for the screening and diagnosis of diseases in the gastrointestinal (GI) tract, AI is advancing rapidly. Much of the recently updated literature is on screening colonoscopy. Still, the same principles are relevant and being pursued for other GI conditions, such as dysplasia screening in patients with Barrett's esophagus and the assessment of mucosal healing in inflammatory bowel disease. "A computer can consider a thousand features when evaluating a polyp, which is way beyond what we can do," said Dr. Michael Byrne, clinical professor in the division of gastroenterology at Vancouver General Hospital. Even with advances to improve visualization in screening colonoscopy, such as improved resolution and better lighting, the reason that AI is expected to prevail is that "the human eye is just not accurate enough." "There are many technologies to improve screening and diagnosis of GI diseases, but I believe these will struggle if they do not also have some kind of built-in machine intelligence," he said. Recognizing dysplasia associated with Barrett's esophagus has parallels with identifying adenomatous polyps in screening colonoscopy. Still, Dr. Byrne also discussed machine learning as an "optical biopsy" for evaluating the mucosa of patients with IBD. No longer a screening approach, the characterization of inflammatory bowel disease (IBD) tissue could help with therapeutic decisions. Overall, there is abundant evidence that "optical biopsy is feasible," Dr. Byrne said. He indicated that clinical applications are approaching quickly.
3. Artificial intelligence in upper endoscopy: location, location, location [95]: Not every endoscopy is done under optimal conditions, and sometimes endoscopists do not clean as thoroughly as possible or take the time and effort required to inspect all locations within the stomach. The result is lesions not being recognized, because the speed of the exam does not allow recognition of subtle changes, lesions are partially or entirely covered, or lesions never were in the field of view. Interval colorectal cancer is most likely related to the skillset of the endoscopist: some have few or no patients who develop interval colorectal cancer, and some have more patients who develop it. The same appears to be the case for upper endoscopic screening for gastric cancer. The critical quality question for upper and lower endoscopy is the same: how can we change the behavior of the endoscopist with more interval lesions toward that of the endoscopist with few or no interval lesions? Inspection quality varies among endoscopists; complete inspection of the stomach requires time. An AI method that verifies a view of every location within the stomach does not in itself guarantee that lesions will not be missed. But it is very likely to improve the odds of not missing lesions by prompting those with an incomplete inspection to inspect any missed locations.
• Fundus imaging (fundoscopy or ophthalmoscopy): The fundus of the eye is the inner portion that can be seen during an eye examination by looking through the pupil with an instrument called an ophthalmoscope (a light source with a condensing magnifying lens). This provides a direct view of the retina, macula, and optic nerve.


One of the most significant values of this test is that it is the only noninvasive way to observe the vasculature and nerve tissue of the body in live activity ("in vivo"). Thus, capturing photographic, video, and/or digital imagery of these tissues in vivo provides a unique opportunity for the exam. It allows for the study of multiple body functions and disorders of the vascular (e.g., cardiovascular, diabetes) and neurological (e.g., central nervous system [CNS] abnormalities, tumors, aneurysms) systems. Ways to capture fundus imagery for diagnosis include [96]:
• Fundus photography (standard, digital [smartphone], 2-dimensional images);
• Visual field analysis (patterns of retinal nerve fiber function);
• Optical Coherence Tomography (OCT) analyzing retinal layers;
• Fluorescein angiography (analysis of the status and function of retinal vessels).
5.1.1.1.19 AI's influence on fundus imaging [97]
One of the earliest associations of AI's influence in healthcare came with the development of a screening method for diabetic retinopathy using a machine learning algorithm trained to identify abnormal (diabetic) fundi against a database of normal fundus photographs. Deep learning (DL) algorithms have gone well beyond just diabetic screening, now being applied to optical coherence tomography and visual fields. They can achieve robust classification performance in the detection of retinopathy of prematurity, the glaucoma-like disc, macular edema, and age-related macular degeneration. DL in ocular imaging is being used in conjunction with telemedicine as a solution to screen, diagnose, and monitor major eye diseases for patients in primary care and community settings. Due to the strengths of AI convolutional neural networks (CNNs) in pattern recognition, fundoscopy applications for clinical screening and diagnosis of multiple ocular diseases (glaucoma, retinal disease, etc.), as well as systemic diseases (cardiovascular, neurologic, kidney disease) and beyond, are being introduced clinically and continue to advance at a rapid rate.
5.1.1.1.20 Literature reviews re AI's influence on fundus imaging
1. Prediction of cardiovascular risk factors from retinal fundus photographs via deep learning [98]: Using deep-learning models trained on data from 284,335 patients and validated on 2 independent datasets of 12,026 and 999 patients, cardiovascular risk factors were predicted that were not previously thought to be present or quantifiable in retinal images. These include age (mean absolute error within 3.26 years), gender (area under the receiver operating characteristic curve [AUC] = 0.97), smoking status (AUC = 0.71), systolic blood pressure (mean absolute error within 11.23 mmHg), and major adverse cardiac events (AUC = 0.70). It was also demonstrated that the trained deep-learning models used anatomical features, such as the optic disc or blood vessels, to generate each prediction.
2. Automated diabetic retinopathy detection in smartphone-based fundus photography using artificial intelligence [99]:


The role of artificial intelligence (AI)-based automated software for detection of diabetic retinopathy (DR) and sight-threatening DR (STDR) by fundus photography was assessed using a smartphone-based device and validated against ophthalmologists' grading. Three hundred and one patients with type 2 diabetes underwent retinal photography with Remidio 'Fundus on phone' (FOP), a smartphone-based device, at a tertiary care diabetes center in India. Grading of DR was performed by the ophthalmologists using the International Clinical DR (ICDR) classification scale. STDR was defined by the presence of severe non-proliferative DR, proliferative DR, or diabetic macular edema (DME). Retinal images of 296 patients were graded. DR was detected by the ophthalmologists in 191 (64.5%) and by the AI software in 203 (68.6%) patients, while STDR was detected in 112 (37.8%) and 146 (49.3%) patients, respectively. The AI software showed 95.8% (95% CI 92.9–98.7) sensitivity and 80.2% (95% CI 72.6–87.8) specificity for detecting any DR, and 99.1% (95% CI 95.1–99.9) sensitivity and 80.4% (95% CI 73.9–85.9) specificity in detecting STDR. Automated AI analysis of FOP smartphone retinal imaging has a very high sensitivity for detecting DR and STDR. Thus, it can be an initial tool for mass retinal screening in people with diabetes.
3. Artificial intelligence in glaucoma [100]: AI techniques can successfully analyze and categorize data from visual fields, optic nerve structure (e.g., optical coherence tomography [OCT] and fundus photography), ocular biomechanical properties, and combinations thereof. This allows for the identification of disease severity, determination of disease progression, and/or recommendation of referral for specialized care. Algorithms continue to become more complex, utilizing both supervised and unsupervised methods of artificial intelligence. These algorithms often outperform standard global indices and expert observers. Artificial intelligence has the potential to revolutionize the screening, diagnosis, and classification of glaucoma, both through the automated processing of large data sets and by earlier detection of new disease patterns. Also, artificial intelligence holds promise for fundamentally changing research aimed at understanding the development, progression, and treatment of glaucoma, by identifying novel risk factors and by evaluating the importance of existing ones.
• Medical (clinical) photography: Finally, both historically and to the present day, clinical photography has been the essence of medical documentation. Photography has changed the way care providers document, discuss, and deliver modern medical care. Medical documentation is now a critical part of patient care. Specialties relying on visual diagnosis, like dermatology, have integrated photography into routine practice: rash and lesion appearance and progression can be documented, and patients can even self-document their skin examination for early detection of skin cancer [101]. Photographic documentation encompasses nearly every specialty in medicine. It is crucial in wound management, allowing wound care teams to track the progression of wound healing.


Mobile retinal imaging is changing ophthalmologic evaluation. Pre- and postoperative imaging in plastic and reconstructive surgery is critical for documentation and identification of subtle contour changes. Digital cameras, smartphones, and laparoscopic/endoscopic image capture are now ubiquitous. Documentation of physical abuse mandates the use of photography and is often vital evidence for legal proceedings [102]. Photography is used to educate both the health professional in training and patients. Part of a physician's task is to help patients understand their diseases, and photography plays a significant role. As long as the photograph has existed, images have been used in textbooks, atlases, meeting presentations, and case reports. Images are used to publish newly identified diseases, very rare diseases, or unique presentations of common diseases. Nothing has endured as much as a healthcare educational and diagnostic tool as the clinical photograph [102].
5.1.1.1.21 AI's influence on medical (clinical) photography
As was stated in the opening discussion on "Diagnostic Imaging" (page 129), "One of the most promising areas of health innovation is the application of AI in medical imaging." With pattern recognition and GPU image analysis among its greatest strengths, AI is revolutionizing digital imagery as a diagnostic resource in all areas of healthcare. And among image capturing devices in the clinical setting, the medical photograph is perhaps the most popular source of capture. Indeed, in external disease categories such as dermatology, the medical photograph, which has always been a go-to diagnostic tool, is now enjoying expanded use through AI and deep learning (CNN) analysis [103]. There are already several artificial intelligence studies focusing on skin disorders (see Chapter 7, page 355) such as skin cancer, psoriasis, atopic dermatitis, and onychomycosis. Even more significant expansions of AI's applications are occurring in eye care [104]. Major clinical areas for these applications include (but are not limited to) vision and refractive care; blindness prevention; cornea and ocular surface; anterior segment (cataracts, uveitis, etc.); retina (especially diabetic retinopathy); glaucoma; and neuro-ophthalmic disorders.
5.1.1.1.22 Literature reviews re AI's influence on medical (clinical) photography
1. AI in skin cancer [105]: Significant advances in recent years relate to the utilization of convolutional neural networks (CNNs) for dermatologic image analysis, especially dermoscopy. Recent studies show CNN-based approaches can perform as well as or even better than humans in diagnosing close-up and dermoscopic images of skin lesions. Limitations for AI development include the need for large datasets, "ground truth" diagnoses, and a lack of widely accepted standards. Despite recent breakthroughs, the adoption of AI in clinical settings for dermatology is in the early stages. Close collaboration between researchers and clinicians may provide the opportunity to investigate the implementation of AI in clinical settings to give real benefit to both clinicians and patients.


2. Performance of deep learning architectures and transfer learning for detecting glaucomatous optic neuropathy in fundus photographs [106]: An extensive database of fundus photographs (n = 14,822) from a racially and ethnically diverse group of individuals (over 33% of African descent) was evaluated by expert reviewers and classified as glaucomatous optic neuropathy (GON) or healthy. Several deep learning architectures and the impact of transfer learning (a technique by which features learned for one task are applied to other tasks) [107] were evaluated. Results suggest that deep learning methodologies have high diagnostic accuracy for identifying fundus photographs with glaucomatous damage to the optic nerve head (ONH) in a racially and ethnically diverse population. The best performing model was the transfer-learning ResNet [108] architecture, which achieved an AUC of 0.91 in identifying GON from fundus photographs. Given the increasing burden of glaucoma on the healthcare system as our population ages, and the proliferation of ophthalmic imaging devices, automated image analysis methods will serve an important role in decision support systems for patient management and in population and primary care-based screening approaches for glaucoma detection.
3. Doctors' use of mobile devices in the clinical setting: a mixed-methods study [109]: Mobile device use has become almost ubiquitous in daily life, and this therefore includes use by doctors in clinical settings. A study was conducted to explore how doctors use mobile devices in the clinical setting and to understand the drivers for use. The study included doctors in a pediatric and adult teaching hospital in 2013. Focus groups explored doctors' reasons for using or refraining from using mobile devices in the clinical setting, and their attitudes about others' use. The survey, completed by 109 doctors, showed that 91% owned a smartphone and 88% used their mobile devices frequently in the clinical setting. Trainees were more likely to use their mobile devices for learning and accessing information related to patient care, as well as for personal communication unrelated to work. Focus group data highlighted a range of factors that influenced doctors' use of mobile devices in the clinical setting, including convenience for medical photography, as well as factors that limited use. Distraction in the clinical setting due to the use of mobile devices was a critical issue. Personal experience and confidence in using mobile devices affected their use and were guided by role modeling and expectations within a medical team. Doctors use mobile devices to enhance efficiency in the workplace. In the current environment, doctors are making their own decisions based on balancing the risks and benefits of using mobile devices in the clinical setting. There is a need for guidelines around acceptable and ethical use that is patient-centered and that respects patient privacy.
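Transfer learning of the kind used in the GON study above takes only a few lines in a modern framework. A minimal PyTorch/torchvision sketch follows; the two-class head, frozen backbone, and dummy data are illustrative choices, not the study's actual configuration:

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from a ResNet pretrained on ImageNet: features learned for one
# task are transferred to fundus-photograph classification.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Freeze the pretrained backbone and train only a new classification head.
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # GON vs. healthy

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224 x 224 RGB images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print("loss:", loss.item())
```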

5.1.2 Laboratory (clinical diagnostic) testing
Laboratory tests are clinical studies using medical devices that are intended for use on samples of blood, urine, or other tissues or substances taken from the body to help diagnose a disease or other conditions. They are used primarily to support the healthcare practitioner:

• identify changes in your health condition before any symptoms occur;
• diagnose or aid in diagnosing a disease or condition;
• plan your treatment for an illness or condition;
• evaluate your response to treatment; and/or
• monitor the course of a disease over time.

After the sample is collected, it is sent to a laboratory to see whether it contains certain substances and, if so, how much. Depending on the test, the presence, absence, or amount of an analyte (substance) may mean you do or do not have the particular condition in question. Sometimes laboratories compare your results to results obtained from previous tests to see if there has been a change in your condition.

Some types of lab tests show whether or not your results fall within normal ranges. Normal test values are usually given as a range, rather than as a specific number, because normal values vary from person to person; what is normal for one person may not be normal for another. Other types show whether a particular substance is present or absent, such as a mutation in a gene or an infectious organism, which indicates whether you have a disease or an infection, or may or may not respond to therapy. Some laboratory tests are precise, reliable indicators of specific health problems, while others provide more general information that gives health practitioners clues to your possible health problems. Information obtained from laboratory tests may help decide whether other tests or procedures are needed to make a diagnosis or to develop or revise a previous treatment plan. All laboratory test results must be interpreted within the context of your overall health and should be used along with other exams or tests [110].

The number of specific laboratory tests currently used in healthcare is too large for a discussion of each (350 are listed in Table 5–3). Comprehensive resources covering all lab tests can be found at the U.S. National Library of Medicine, MedlinePlus, Laboratory Tests (https://medlineplus.gov/laboratorytests.html) and Medical Tests (https://medlineplus.gov/lab-tests/). Also, detailed descriptions (for patients) and normative range values for each test can be found at the American Association for Clinical Chemistry website (https://labtestsonline.org/patient-resources). Discussions in Chapters 5 and 6 will identify selected diagnostic tests utilized in the more common diagnostic categories.
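As a toy illustration of the reference-range logic just described, the following Python sketch flags results as low, high, or within range. The analytes and intervals shown are illustrative placeholders only; real reference intervals vary by laboratory, method, age, and sex.

```python
# A minimal sketch of flagging lab results against reference ranges.
# Analytes and intervals below are illustrative placeholders, not clinical values.
REFERENCE_RANGES = {
    "glucose_fasting_mg_dL": (70, 99),
    "potassium_mmol_L": (3.5, 5.2),
    "TSH_mIU_L": (0.4, 4.0),
}

def interpret(analyte: str, value: float) -> str:
    low, high = REFERENCE_RANGES[analyte]
    if value < low:
        return f"{analyte} = {value}: LOW (reference {low}-{high})"
    if value > high:
        return f"{analyte} = {value}: HIGH (reference {low}-{high})"
    return f"{analyte} = {value}: within reference range ({low}-{high})"

for analyte, value in [("glucose_fasting_mg_dL", 112), ("TSH_mIU_L", 2.1)]:
    print(interpret(analyte, value))
```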

5.1.2.1 AI's influence on laboratory testing
The clinical laboratory in healthcare has been among the earliest entities to adopt robotics and algorithms into its workflow. AI technologies known as "Expert Systems" (see Chapter 3, page 53) introduced knowledge-based systems that provided sequential laboratory testing and interpretation as early as 1984 [111]. Expert systems don't include the ability to learn by themselves; instead, they make decisions based on the accumulated knowledge with which they are programmed.

Computational pathology applies computational models, machine learning, and visualization to make lab output both more useful and more easily understood by the clinical decision-maker. Computational pathology has clinical value in all aspects of medicine via a focus on computational methods that incorporate clinical pathology, anatomic pathology (including digital imaging), and molecular/genomic pathology datasets (more below under "Genetic Testing").


Table 5–3 Clinical laboratory tests.

1. 5-HIAA
2. 17-Hydroxyprogesterone
3. Acetaminophen
4. Acetylcholine Receptor (AChR) Antibody
5. Acid-Fast Bacillus (AFB) Testing
6. Activated Clotting Time (ACT)
7. Acute Viral Hepatitis Panel
8. Adenosine Deaminase
9. Adrenocorticotropic Hormone (ACTH)
10. Alanine Aminotransferase (ALT)
11. Albumin
12. Aldolase
13. Aldosterone and Renin
14. ALK Mutation (Gene Rearrangement)
15. Alkaline Phosphatase (ALP)
16. Allergy Blood Testing
17. Alpha-1 Antitrypsin
18. Alpha-fetoprotein (AFP) Tumor Marker
19. AMAS
20. Aminoglycoside Antibiotics
21. Ammonia
22. Amniotic Fluid Analysis
23. Amylase
24. Androstenedione
25. Angiotensin-Converting Enzyme (ACE)
26. Anti-DNase B
27. Anti-dsDNA
28. Anti-LKM-1
29. Anti-Müllerian Hormone
30. Anti-Saccharomyces cerevisiae Antibodies (ASCA)
31. Antibiotic Susceptibility Testing
32. Anticentromere Antibody
33. Antidiuretic Hormone (ADH)
34. Antimitochondrial Antibody and AMA M2
35. Antineutrophil Cytoplasmic Antibodies (ANCA, MPO, PR3)
36. Antinuclear Antibody (ANA)
37. Antiphospholipid Antibodies
38. Antistreptolysin O (ASO)
39. Antithrombin
40. Apo A-I
41. Apo B
42. APOE Genotyping, Alzheimer Disease
43. APOE Genotyping, Cardiovascular Disease
44. Arbovirus Testing
45. Aspartate Aminotransferase (AST)
46. Autoantibodies
47. B Vitamins
48. B-cell Immunoglobulin Gene Rearrangement
49. Bacterial Wound Culture
50. Basic Metabolic Panel (BMP)
51. BCR-ABL1
52. Beta-2 Glycoprotein 1 Antibodies
53. Beta-2 Microglobulin Kidney Disease
54. Beta-2 Microglobulin Tumor Marker
55. Bicarbonate (Total CO2)
56. Bilirubin
57. Blood Culture
58. Blood Gases
59. Blood Ketones
60. Blood Smear
61. Blood Typing
62. Blood Urea Nitrogen (BUN)
63. BNP and NT-proBNP
64. Body Fluid Testing
65. Bone Markers
66. Bone Marrow Aspiration and Biopsy
67. BRCA Gene Mutation Testing
68. Breast Cancer Gene Expression Tests
69. C-peptide
70. C-Reactive Protein (CRP)
71. CA 15-3
72. CA-125
73. Calcitonin
74. Calcium
75. Calprotectin
76. CALR Mutation
77. Cancer Antigen 19-9
78. Carbamazepine
79. Carcinoembryonic Antigen (CEA)
80. Cardiac Biomarkers
81. Cardiac Risk Assessment
82. Cardiolipin Antibodies
83. Catecholamines
84. CD4 Count
85. Celiac Disease Antibody Tests
86. Cell-Free Fetal DNA
87. Cerebrospinal Fluid (CSF) Analysis
88. Ceruloplasmin
89. Chemistry Panels
90. Chickenpox and Shingles Tests
91. Chlamydia Testing
92. Chloride
93. Cholesterol
94. Cholinesterase Tests
95. Chromogranin A
96. Chromosome Analysis (Karyotyping)
97. Chymotrypsin
98. CK-MB
99. Clopidogrel (CYP2C19 Genotyping)
100. Clostridium difficile and C. diff Toxin Testing
101. Coagulation Cascade
102. Coagulation Factors
103. Cold Agglutinins
104. Complement
105. Complete Blood Count (CBC)
106. Comprehensive Metabolic Panel (CMP)
107. Continuous Glucose Monitoring
108. Copper
109. Cortisol
110. Creatine Kinase (CK)
111. Creatinine
112. Creatinine Clearance
113. Cryoglobulins
114. Cyclic Citrullinated Peptide Antibody
115. Cyclosporine
116. Cystatin C
117. Cystic Fibrosis (CF) Gene Mutations Testing
118. Cytomegalovirus (CMV) Tests
119. D-dimer
120. Dengue Fever Testing
121. Des-gamma-carboxy prothrombin (DCP)
122. DHEAS
123. Digoxin
124. Direct Antiglobulin Test
125. Direct LDL Cholesterol
126. Drug Abuse Testing
127. EGFR Mutation Testing
128. Electrolytes
129. Emergency and Overdose Drug Testing
130. Epstein-Barr Virus (EBV) Antibody Tests
131. Erythrocyte Sedimentation Rate (ESR)
132. Erythropoietin
133. Estimated Glomerular Filtration Rate (eGFR)
134. Estrogen Receptor, Progesterone Receptor Breast Cancer Testing
135. Estrogens
136. Ethanol
137. Extractable Nuclear Antigen Antibodies (ENA) Panel
138. Factor V Leiden Mutation and PT 20210 Mutation
139. Fecal Fat
140. Fecal Immunochemical Test and Fecal Occult Blood Test
141. Ferritin
142. Fetal Fibronectin (fFN)
143. Fibrinogen
144. FIP1L1-PDGFRA
145. First Trimester Screening
146. Follicle-stimulating Hormone (FSH)
147. Fructosamine
148. Fungal Tests
149. G6PD
150. Gamma-Glutamyl Transferase (GGT)
151. Gastrin
152. Gastrointestinal Pathogens Panel
153. Genetic Tests for Targeted Cancer Therapy
154. Glucose Tests
155. Gonorrhea Testing
156. Gram Stain
157. Growth Hormone
158. Haptoglobin
159. hCG Tumor Marker
160. HDL Cholesterol
161. Heavy Metals
162. Helicobacter pylori (H. pylori) Testing
163. Hematocrit
164. Hemoglobin
165. Hemoglobin A1c
166. Hemoglobinopathy Evaluation
167. Heparin Anti-Xa
168. Heparin-induced Thrombocytopenia PF4 Antibody
169. Hepatitis A Testing
170. Hepatitis B Testing
171. Hepatitis C Testing
172. HER2
173. Herpes Testing
174. High-sensitivity C-reactive Protein (hs-CRP)
175. Histamine
176. Histone Antibody
177. HIV Antibody and HIV Antigen (p24)
178. HIV Antiretroviral Drug Resistance Testing, Genotypic
179. HIV Viral Load
180. HLA Testing
181. HLA-B27
182. Homocysteine
183. Human Epididymis Protein 4 (HE4)
184. Human Papillomavirus (HPV) Test
185. Human T-cell Lymphotropic Virus (HTLV) Testing
186. IGRA TB Test
187. Immunoglobulins (IgA, IgG, IgM)
188. Immunophenotyping
189. Immunoreactive Trypsinogen (IRT)
190. Influenza Tests
191. Insulin
192. Insulin-like Growth Factor-1 (IGF-1)
193. Interleukin-6
194. Intrinsic Factor Antibody
195. Iron
196. Iron Tests
197. Islet Autoantibodies in Diabetes
198. JAK2 Mutation
199. Kidney Stone Risk Panel
200. Kidney Stone Testing
201. KRAS Mutation
202. Lactate
203. Lactate Dehydrogenase (LD)
204. Lactoferrin
205. Lactose Tolerance Tests
206. LDL Cholesterol
207. LDL Particle Testing (LDL-P)
208. Lead
209. Legionella Testing
210. Leptin
211. Levetiracetam
212. Lipase
213. Lipid Panel
214. Lipoprotein (a)
215. Lithium
216. Liver Panel
217. Lp-PLA2
218. Lupus Anticoagulant Testing
219. Luteinizing Hormone (LH)
220. Lyme Disease Tests
221. Magnesium
222. Marijuana (THC) Testing
223. Maternal Serum Screening, Second Trimester
224. Measles and Mumps Tests
225. Mercury
226. Metanephrines
227. Methotrexate
228. Methylmalonic Acid
229. Mononucleosis (Mono) Test
230. MRSA Screening
231. MTHFR Mutation
232. Mycophenolic Acid
233. Mycoplasma
234. Myoglobin
235. Nicotine and Cotinine
236. Non-High Density Lipoprotein Cholesterol
237. Opioid Testing
238. Osmolality and Osmolal Gap
239. Ova and Parasite Exam
240. Pap Smear (Pap Test)
241. Parathyroid Hormone (PTH)
242. Parietal Cell Antibody
243. Partial Thromboplastin Time (PTT, aPTT)
244. Parvovirus B19
245. PCA3
246. Pericardial Fluid Analysis
247. Peritoneal Fluid Analysis
248. Pertussis Tests
249. Pharmacogenetic Tests
250. Phenobarbital
251. Phenytoin
252. Phosphorus
253. Plasma Free Metanephrines
254. Platelet Count
255. Platelet Function Tests
256. Pleural Fluid Analysis
257. PML-RARA
258. Porphyrin Tests
259. Potassium
260. Prealbumin
261. Pregnancy Test (hCG)
262. Pregnenolone
263. Prenatal Group B Strep (GBS) Screening
264. Procalcitonin
265. Progesterone
266. Prolactin
267. Prostate Specific Antigen (PSA)
268. Protein C and Protein S
269. Protein Electrophoresis Immunofixation Electrophoresis
270. Prothrombin Time and International Normalized Ratio (PT/INR)
271. PSEN1
272. Red Blood Cell (RBC) Antibody Identification
273. Red Blood Cell (RBC) Antibody Screen
274. Red Blood Cell Count (RBC)
275. Red Cell Indices
276. Renal Panel
277. Respiratory Pathogens Panel
278. Respiratory Syncytial Virus (RSV) Testing
279. Reticulocytes
280. Rheumatoid Factor (RF)
281. Rubella Test
282. Salicylates
283. Semen Analysis
284. Serotonin
285. Serum Free Light Chains
286. Sex Hormone Binding Globulin (SHBG)
287. Shiga toxin-producing Escherichia coli
288. Sickle Cell Tests
289. Sirolimus
290. Smooth Muscle Antibody (SMA) and F-actin Antibody
291. Sodium
292. Soluble Mesothelin-Related Peptides
293. Soluble Transferrin Receptor
294. Sputum Culture, Bacterial
295. Stool Culture
296. Stool Elastase
297. Strep Throat Test
298. Sweat Chloride Test
299. Synovial Fluid Analysis
300. Syphilis Tests
301. T-Cell Receptor Gene Rearrangement
302. T3 (Free and Total)
303. T4, Free
304. Tacrolimus
305. Tau Protein and Beta Amyloid
306. TB Skin Test
307. Testosterone
308. Theophylline and Caffeine
309. Therapeutic Drug Monitoring
310. Thiopurine methyltransferase (TPMT)
311. Thrombin Time
312. Thyroglobulin
313. Thyroid Antibodies
314. Thyroid Panel
315. Thyroid-stimulating Hormone (TSH)
316. TORCH
317. Total IgE
318. Total Protein and Albumin/Globulin (A/G) Ratio
319. Toxoplasmosis Testing
320. Trace Minerals
321. Transferrin and Iron-binding Capacity (TIBC, UIBC)
322. Trichomonas Testing
323. Triglycerides
324. Troponin
325. Tryptase
326. Tumor Markers
327. Uric Acid
328. Urinalysis
329. Urine Albumin and Albumin to Creatinine Ratio
330. Urine Culture
331. Urine Metanephrines
332. Urine Protein and Urine Protein to Creatinine Ratio
333. Valproic Acid
334. Vancomycin
335. Vanillylmandelic Acid (VMA)
336. VAP
337. Vitamin A
338. Vitamin B12 and Folate
339. Vitamin D Tests
340. Vitamin K
341. VLDL Cholesterol
342. von Willebrand Factor
343. Warfarin Sensitivity Testing
344. West Nile Virus Testing
345. White Blood Cell (WBC) Differential
346. White Blood Cell Count (WBC)
347. Widal Test
348. Xylose Absorption Test
349. Zika Virus Testing
350. Zinc Protoporphyrin

Source: American Association for Clinical Chemistry; 2019.

Continuous remote sensing of patients using "wearables" such as glucose monitors and oximetry, temperature, heart rate, and respiratory rate monitors connected to a central computing device via the "Internet of Things" (IoT) will become the norm. AI-enhanced microfluidics and compact, interactive point-of-care-testing (POCT) labs are set to alter the way diagnostics are carried out. An example is the "Maverick Detection System" from Genalyte [112]. Biological probes bound to silicon biosensor chips bind macromolecules in the serum; the binding is detected by a change in light resonance, which is measured photometrically. Genalyte plans to detect up to 128 analytes (substances in the serum) from a single sample using disposable chips.


Today's clinical labs already use advanced robotics to test minute volumes of blood, serum, and other body fluids from thousands of samples in a day. They give highly accurate and reproducible answers to clinical questions, at scales almost too complicated for humans to duplicate. These machines are driven by conventional algorithmic programs that represent and use data, iterating repetitively and exhaustively through a decision sequence of mathematics and equations, and finally presenting a number or result within confidence limits. In the future, robots used in the clinical laboratory will be heuristic (self-learning), using Bayesian logic and inferential processes, with numerous ways to derive the best decision possible, even allowing for missing information. Artificial intelligence programs combined with databases, data mining, statistics, mathematical modeling, pattern recognition, computer vision, natural language processing, mixed reality, and ambient computing will change the way laboratories generate and display clinical information.

AI and machine learning software are beginning to establish themselves as tools for efficiency and accuracy within pathology. Software is being developed by start-ups, often in tandem with prominent educational institutions or large hospital research laboratories, addressing different diseases and conditions. A review of the functionalities of AI and machine learning software in the field of pathology reveals predominant usage in whole-slide imaging analysis and diagnosis, tumor tissue genomics and its correlation to therapy, and companion diagnostic devices.

The ICU (Intensive Care Unit) of the future will have AI programs that concurrently evaluate the continuous streams of data from multiple monitors and data collection devices. The programs will pool their information and present a comprehensive picture of the patient's health to doctors, autonomously adjusting equipment settings to keep the patient in optimal condition [113].

ML offers significant potential to improve the quality of laboratory medicine. ML-based algorithms in commercial and research-driven applications have demonstrated promising results. Laboratory medicine professionals will need to understand what can be done reliably with the technology and what its pitfalls are, and to establish what constitutes best practice as ML models are introduced into clinical workflows [114].
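To make the ML point concrete, here is a minimal, hypothetical Python/scikit-learn sketch that trains a classifier on a synthetic panel of analyte values and reports a held-out AUC. It stands in for the kind of model described above, not for any validated laboratory application; the analytes and labels are fabricated for illustration.

```python
# A minimal sketch of ML on laboratory data: a classifier trained on a panel
# of analyte values to flag a condition. Synthetic data stand in for real,
# labeled laboratory results; this is illustrative, not a validated model.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
# Columns could represent, e.g., glucose, HbA1c, creatinine (arbitrary units).
X = rng.normal(size=(n, 3))
# A synthetic "diagnosis" correlated with the first two analytes.
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.2f}")
```

The held-out AUC is exactly the kind of performance figure that, as noted above, laboratory professionals must learn to interpret critically before such models enter clinical workflows.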

5.1.3 Genetic and genomic screening and diagnosis
The best way to begin a discussion of AI's applications and influences in genetic and genomic screening and diagnosis is to review the relevant science, including anatomy (molecular biology), cytogenetics (the examination of chromosomes), and genes and genomes, as well as the related AI categories (i.e., machine learning, big data analytics, precision medicine, predictive analytics, preventive health, EHR, and robotics). Beyond the science discussion, the remaining information in this Chapter 5 will focus on the screening and diagnostic applications of the related science(s). The genetic and genomic therapeutic considerations will be addressed in Chapter 6 ("AI applications in medical therapies and services"); however, all of the basic science discussed herein applies equally to the Chapter 6 and 7 therapeutic discussions. Finally, it's important to note that all of the genetic information throughout these chapters (and, in fact, throughout this book) is a direct result of (and owed in gratitude to) the Human Genome Project, completed in 2003 [115].


FIGURE 5–4 Chromosome. In the nucleus of each cell, the DNA molecule is packaged into thread-like structures called chromosomes. Within each DNA helix are "sequences" ("genetic code") made up of four nitrogen base compounds, paired as "base pairs" (adenine paired with thymine and guanine paired with cytosine). Together, a base pair along with a sugar and phosphate molecule is called a nucleotide. Credit: From National Library of Medicine, NIH.

5.1.3.1 The science
A gene is a fundamental physical and functional unit of heredity made up of DNA (deoxyribonucleic acid, the carrier of genetic information). It is estimated that humans have between 20,000 and 25,000 genes. Every person has 2 copies of each gene, 1 inherited from each parent. Most genes are the same in all people, but a small number of genes (less than 1% of the total) are slightly different between people. Alleles are forms of the same gene with slight differences in their sequence of DNA base compounds. These small differences contribute to each person's unique physical features (their phenotype) [116].

The human body is composed of trillions of cells. In the nucleus of each cell, the DNA molecule is packaged into thread-like structures called chromosomes (Fig. 5–4). Virtually every cell in the body contains a complete copy of the approximately 3 billion DNA base pairs (nucleotides), or letters (adenine [A], thymine [T], guanine [G], and cytosine [C]), that make up the human genome. Each chromosome has a constriction point called the centromere, which divides the chromosome into 2 sections, or "arms." The location of the centromere on each chromosome gives the chromosome its characteristic shape and can be used to help describe the location of specific genes. The overall number and shape of all your chromosomes are called a karyotype (Fig. 5–5). Your genotype is the genetic information you carry for a trait, and your phenotype is how that trait is manifested in your physical body.

Genetics is defined as a branch of biology (molecular biology) concerned with the study of genes, genetic variation, and heredity in organisms [117]. Genes express specific traits, the phenotype, which may be physical (e.g., hair, eye color, skin color, etc.), while others may carry the risk of certain diseases and disorders that may pass from parents to offspring. Thus, genetics is the study of heredity, or how the characteristics of living organisms are transmitted from 1 generation to the next via DNA [118,119]. It is the


FIGURE 5–5 Normal human karyotype. The overall number and shape of all your chromosomes are called a karyotype. Credit: From National Library of Medicine, NIH.

FIGURE 5–6 The cellular biology of the human genome. There are a number of elements that make up what is referred to as the human genome including the cellular biology of genetics which includes the cell, its nucleus, chromosomes within the nucleus, the DNA strands within the chromosomes and the base compounds of the genes within the chromosomes. Courtesy: Creative Commons License; By Sponk, Tryphon, Magnus, Manske.

bioscience that has the highest potential of influencing virtually every category of health, wellness, and prevention.

Genomics is a more recently popularized term that describes the study of all of a person's genes (their genome), including interactions of those genes with each other and with the person's environment. A genome is an organism's complete set of deoxyribonucleic acid (DNA), the chemical compound that contains the genetic instructions to develop and direct the activities of every organism. This science deals with the immense volume of clinical material in the human genome through cellular and molecular biology to advance genetic therapies in the study of the human immune system (immunogenomics) and the treatments, cures, and prevention of disease [119].

Let's summarize the many cellular elements (Fig. 5–6) that collectively make up the human genome. They are "relatively straightforward" to understand (kind of) and include [120]:


• The human cell, within which is its nucleus;
• Within the cell nucleus reside chromosomes (23 diploid pairs in somatic cells or 23 single haploid strands in embryonic or germ cells);
• Within each chromosomal strand is a double-stranded spiral (helix) of DNA (deoxyribonucleic acid);
• Within each DNA helix are "sequences" ("genetic code") made up of 4 nitrogen base compounds, paired as "base pairs" (adenine paired with thymine and guanine paired with cytosine);
• Together, a base pair along with a sugar and phosphate molecule is called a nucleotide;
• These nucleotides are held together by hydrogen bonds and arranged in 2 long strands that form the double-stranded spiral mentioned above, called the DNA double helix;
• The mapping of these double helixes in the cell of a living organism is called its "karyotype" (see Fig. 5–5);
• Defined groups (from a few hundred to a few million) of these base-compound paired sequences on a DNA double helix are called genes (humans have between 20,000 and 25,000 genes);
• Pairs or series of inherited genes on a chromosome that determine hereditary characteristics (e.g., hair color, eye color, height, etc.) are called alleles;
• The specific makeup and positioning (loci) of these alleles on a chromosome are called a genotype;
• A pair of alleles in the same gene is either autosomal dominant or recessive. Homozygous means that both copies of a gene or loci match, while heterozygous means that the copies do not match. Two dominant alleles (AA) or two recessive alleles (aa) are homozygous; one dominant allele and one recessive allele (Aa) is heterozygous. An autosomal dominant allele will always be preferentially expressed over a recessive allele;
• The visible or observable expression of the results of the genotype, combined with environmental influences (any bodily adjustment to the environment over time), is called the phenotype.

DNA molecules are made of 2 twisting, paired strands (the double helix). Each strand is made of 4 chemical units, called nucleotide bases (collectively called the human exome): adenine (A), thymine (T), guanine (G), and cytosine (C). Bases on opposite strands pair specifically: an A always pairs with a T, and a C always pairs with a G. Sequencing DNA means determining the order of the 4 chemical bases that make up the DNA molecule. The sequence tells scientists the genetic information that is carried in a particular DNA segment. Sequence data can also highlight changes in a gene that may cause disease [121].

Besides dictating the phenotype of the organism, the critical function of the genetic process, and thus of the genes, is the production of amino acids, the building blocks of the large, complex protein molecules that play many critical roles in the structure, function, and regulation of the body's tissues and organs. A gene traditionally refers to the unit of DNA that carries the instructions for making a specific protein or set of proteins. Each of the estimated 20,000 to 25,000 genes in the human genome codes for an average of 3 proteins.
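The base-pairing rules just described are mechanical enough to express in a few lines of code. The following Python sketch derives the complementary strand, and a toy mRNA transcript, from an arbitrary made-up sequence; it is an illustration of the pairing rules only, not a bioinformatics tool.

```python
# A minimal sketch of the base-pairing rules described above: A pairs with T
# and C pairs with G, so the complementary DNA strand (and an mRNA transcript,
# where uracil replaces thymine) can be derived mechanically from a sequence.
DNA_COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement_strand(sequence: str) -> str:
    """Return the complementary DNA strand, read in the same direction."""
    return "".join(DNA_COMPLEMENT[base] for base in sequence)

def transcribe(sequence: str) -> str:
    """Return the mRNA paired against the template strand (T -> U pairing)."""
    return complement_strand(sequence).replace("T", "U")

gene_fragment = "ATGGCCATTGTAATG"        # an arbitrary toy sequence
print(complement_strand(gene_fragment))  # TACCGGTAACATTAC
print(transcribe(gene_fragment))         # UACCGGUAACAUUAC
```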


Located on 23 pairs of chromosomes in the nucleus of a human cell, the genes direct the production of proteins with the assistance of enzymes and messenger molecules (ribonucleic acid, or RNA). During a process called transcription, information stored in a gene's DNA is transferred to a smaller ribonucleic acid molecule called mRNA (messenger RNA), which carries the information ("message") from the nucleus into the cell cytoplasm. In the next step, translation, the mRNA interacts with a specialized complex called a ribosome, which "reads" the sequence of mRNA bases. This genetic process will be revisited (in a negative light) as the life cycle of the SARS-CoV-2 virus in Chapter 8 (page 450). Finally, a type of RNA called transfer RNA (tRNA) assembles amino acids into functional body proteins. This stepped process, in which hereditary information in DNA is used to make proteins, is called the "central dogma of molecular biology," or the science of "transcriptomics" [122]. Beyond transcriptomics, which defines the expression of the genes' proteins (the proteome), "proteomics" studies their biochemistry, functions, and interactions within the body [123]. If a cell's DNA is mutated, an abnormal protein may be produced, which disrupts the body's usual processes and can lead to a disease such as cancer [124].

Chemical compounds added to a single gene can regulate its activity. Such modifications are known as epigenetic changes [125]. The epigenome comprises all chemical compounds that have been added to the entirety of one's DNA (genome) to regulate the activity (expression) of all the genes within the genome. The chemical compounds of the epigenome are not part of the DNA sequence; they exist on or attached to DNA ("epi-" means above in Greek). Epigenetic modifications remain as cells divide, and sometimes they can be inherited through generations. Patterns of epigenetic modification vary among individuals, in different tissues within an individual, and even in different cells. Because errors in the epigenetic process, such as modifying the wrong gene or failing to add a compound to a gene, can lead to abnormal gene activity or inactivity, they can cause genetic disorders. Conditions including cancers, metabolic disorders, and degenerative disorders have all been found to be related to epigenetic errors. Examples of these errors are mentioned below in the discussion of immunogenetics.

The description of the genetic components and process, albeit "challenging," is only part of the genetics and genomics story. The more overwhelming feature of these sciences lies in the astronomical numbers their components represent. Those numbers include the ~37.2 trillion (37.2 × 10¹²) cells in the human body, the ~3 billion (3.0 × 10⁹) DNA base pairs within the nuclei of those cells, and the 4 base compound sequences within the DNA (deoxyribonucleic acid) helixes constituting the 20,000 to 25,000 genes. The number of possible combinations within these sequences ("genetic codes") of base compounds is astronomical. And yet, it is among these prodigious numbers of gene sequences that congenital (hereditary) and acquired mutations occur. These mutations (structural changes resulting in a genetic variant) are the underlying cause of all abnormalities in the human organism [116]. Using the standard combination formula, C(n, r) = n! / (r!(n − r)!), where 'n' represents the total number of items (in this case, the number of genes, or 25,000) and 'r' represents the number of items being chosen at a time (4 base compounds), the number of possible mutations in the human genome, spread among 37.2 trillion


cells, is 2.5 × 10²⁰. (You can appreciate why we have been using exponential notation instead of whole numbers with "lots and lots of zeros!") Locating and identifying those mutations and their clinical manifestations is a challenge. Enter AI and the Human Genome Project!

Since the completion of the Human Genome Project in 2003 and the continuing advances in AI, machine, and deep learning, the scientific community now has the tools (e.g., Next Generation Sequencing [NGS] [126]) to locate and identify genetic mutations in timeframes of minutes, hours, and days, versus the original "non-AI processes" of weeks, months, and years, if at all. Given the picture of the average human genome, the science of genomics is now able to identify the 2.5 × 10²⁰ potential mutations in a human being and where, among the 37.2 trillion somatic cells, they may be occurring. This ability has changed the current face of healthcare. It provides a future of continually expanding diagnostic capabilities (locating and identifying mutations) and treatment options (engineering gene mutations to reduce their negative effects, presented in Chapter 7). Finally, it introduces real personalized, precision health prevention (more below) by correcting gene abnormalities before they produce their negative effects.

5.1.3.2 Cytogenetics
Cytogenetics involves the examination of chromosomes to identify structural abnormalities. It's hard to get one's head around such testing methodology when we consider the numbers in the previous section ("The science" of genetics). Think about the multiples of "20,000 to 25,000 genes" in "trillions of cells" in the human body with "3 billion DNA base pairs" in each cell. Exponentially, that sum would be 7.5 × 10²⁵ (I think?) or 75,000,000,000,000,000,000,000,000. And that would be the genome for 1 human being. Finding an abnormality (mutation or variant) in that genome is what "big data analytics" (below) is all about.

Progress in genetic testing was long stalled by the complexity and enormity of the data that needed to be evaluated. Now, however, the extensive datasets (e.g., 7.5 × 10²⁵) in cytogenetics provide training for deep learning (CNN) algorithms, resulting in dramatically faster (than human) and more accurate analysis. With such advances in artificial intelligence and machine learning applications, researchers are now able to interpret data better and faster, and to act on genomic data through genome sequencing. Because AI systems can do it faster, cheaper, and more accurately, they gain perspective on the particular genetic blueprint that orchestrates all activities of an organism. These insights help AI make decisions about care, what an organism might be susceptible to in the future, what mutations might cause different diseases, and how to prepare for the future [127].

5.1.3.3 Genetic testing [128]
Genetic testing diagnostic applications are ranked second only to diagnostic imaging.¹ But remember our "Top 5 parlor game" from back in this chapter's introduction. In my list, I included genetic testing as my number 1 and 2 choices and diagnostic imaging as number 4. I stand by my prediction in spite of the published rankings (i.e., genetic testing second and diagnostic imaging first) because almost all medical and non-medical prognosticators agree that genetic testing will surpass all forms of diagnostic tests by 2022 [129]. The downside of that phenomenon will be "buyer beware." Not all genetic testing is created equal, and besides legitimate, medically controlled and supervised testing, there will be a proliferation of proprietary tests (e.g., CRIGenetics, 23andMe, Ancestry) that may or may not be accurate and validated.

¹ Fei Jiang, Yong Jiang, Hui Zhi, et al. Artificial intelligence in healthcare: past, present and future. Stroke Vasc Neurol. June 21, 2017;2:e000101. https://doi.org/10.1136/svn-2017-000101.

Genetic testing is a medical test that identifies changes in chromosomes, genes, or proteins. The results of a genetic test can confirm or rule out a suspected genetic condition, or they can help determine a person's risk of developing or passing on a genetic disorder. Available types of testing include the following:

• Newborn screening is used just after birth to identify genetic disorders that may be amenable to early treatment. Millions of babies are tested each year in the United States. All states currently test infants for genetic diseases such as phenylketonuria (a genetic disorder that causes intellectual disability if left untreated) and congenital hypothyroidism (a disorder of the thyroid gland). Some states also test for other genetic diseases.
• Diagnostic testing identifies or rules out a specific genetic or chromosomal condition. In many cases, genetic testing confirms a diagnosis when a particular condition is suspected based on physical signs and symptoms. Diagnostic testing can be performed before birth (prenatal, see below) or at any time during a person's life, but is not available for all genes or all genetic conditions.
• Pharmacogenetics is used to determine what medication and dosage will be most effective and beneficial for you if you have a particular health condition or disease.
• Carrier testing is used to identify people who carry 1 copy of a gene mutation that, when present in 2 copies, causes a genetic disorder. This type of testing is offered to individuals who have a family history of a genetic disorder and to people in certain ethnic groups with an increased risk of specific genetic conditions. If both parents are tested, the test can provide information about a couple's risk of having a child with a genetic condition.
• Prenatal testing is used to detect changes in a fetus's genes or chromosomes before birth. This type of testing is offered during pregnancy if there is an increased risk that the baby will have a genetic or chromosomal disorder. In some cases, prenatal testing can lessen a couple's uncertainty or help them make decisions about a pregnancy. (As mentioned above, this was my #1 choice in my "Top 5" listing of the most disruptive diagnostic and treatment technologies by 2025 at the beginning of this chapter, page 125.)
• Preimplantation testing, also called preimplantation genetic diagnosis (PGD), is a specialized technique that can reduce the risk of having a child with a particular genetic or chromosomal disorder. It is used to detect genetic changes in embryos that were created using assisted reproductive techniques such as in-vitro fertilization.
• Predictive and pre-symptomatic testing is used to detect gene mutations associated with disorders that appear after birth, often later in life. These tests can be helpful to people who have a family member with a genetic disorder, but who have no features of the disease themselves at the time of testing. Predictive analytics testing can identify mutations that increase a person's risk of developing disorders with a genetic basis, such

as certain types of cancer. The results of predictive and pre-symptomatic testing can provide information about a person's risk of developing a specific disorder and help with making decisions about medical care (see Cancers and breast cancer in Chapter 7).
• Forensic testing uses DNA sequences to identify an individual for legal purposes. Unlike the tests described above, forensic testing is not used to detect gene mutations associated with disease. This type of testing can identify crime or catastrophe victims, rule out or implicate a crime suspect, or establish biological relationships between people (for example, paternity).

Determining the order of DNA building blocks (nucleotides) in an individual's genetic code, called DNA sequencing, has advanced genetics both for research and clinically. Two methods, whole-exome sequencing and whole-genome sequencing, are being used more frequently in healthcare and research to identify genetic variations. Both methods rely on AI and big data analytics, which allow the rapid sequencing of large amounts of DNA. These approaches are known as next-generation sequencing (or next-gen sequencing). The original sequencing technology (the Sanger method) would take years to sequence all of a person's DNA. Next-generation sequencing has sped up the process (taking only days to weeks to sequence a human genome) while reducing the cost.
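At its core, identifying a genetic variant means comparing a sequenced sample against a reference. The toy Python sketch below flags single-base substitutions by direct position-wise comparison; real NGS pipelines align millions of short reads and use statistical variant callers, so this is only a schematic of the idea, and both sequences are fabricated for illustration.

```python
# A minimal sketch of variant identification: comparing a sequenced fragment
# against a reference, position by position, to flag substitutions.
# Real pipelines perform alignment and statistical calling; this is a schematic.
reference = "ATGGAGTACCTGCGAT"
sample    = "ATGGAGTACCAGCGAT"   # hypothetical read from a patient

variants = [
    (position, ref_base, alt_base)
    for position, (ref_base, alt_base) in enumerate(zip(reference, sample))
    if ref_base != alt_base
]

for position, ref_base, alt_base in variants:
    print(f"Substitution at position {position}: {ref_base} -> {alt_base}")
# Substitution at position 10: T -> A
```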

5.1.3.4 Big data analytics in genomics [130]
Genomic medicine is a clinical science that attempts to build individual strategies for diagnostic or therapeutic decision-making, utilizing patients' genomic information to make assessments. Big Data analytics reveals hidden patterns, unknown correlations, and other insights by examining large-scale datasets. Notwithstanding the challenges of integrating and manipulating diverse genomic data and comprehensive electronic health records (EHRs) at Big Data scale, the process also provides an opportunity to develop an efficient and practical approach to identifying clinically actionable genetic variants for individual diagnosis and therapy. Next-generation sequencing (NGS) technologies, such as whole-genome sequencing (WGS), whole-exome sequencing (WES), and/or targeted sequencing, are increasingly applied in bioscience studies and medical practice to identify disease- and/or drug-associated genetic variants to advance precision medicine [131]. Precision medicine (see Chapter 4, page 101) allows scientists and clinicians to predict more accurately which therapeutic and preventive approaches to a specific illness will work effectively in subgroups of patients based on their genetic make-up, lifestyle, and environmental factors [132].

Clinical research leveraging EHRs has become feasible as EHRs have been widely implemented [133]. In healthcare research, clinical data derived from EHRs have expanded from digital formats of individual patient medical records into high-dimensional data of enormous complexity. Big Data refers to novel technological tools delivering scalable capabilities for managing and processing large and diverse data sets. On a single-patient level, approaches such as natural language processing (NLP) allow the incorporation and exploration of textual data alongside structured sources. On a population level, Big Data provides the


possibility to conduct a large-scale investigation of clinical consequences to uncover hidden patterns and correlations [134].

5.1.3.5 AI in genetic cancer screening
Because its forms vary as the disease evolves, cancer offers a unique context for medical decisions (see also Cancer in Chapter 7, page 314). Radiographic analyses of the disease commonly rely upon visual evaluations, the interpretations of which may be augmented by AI's advanced computational studies. In particular, AI is making great strides in the qualitative interpretation of cancer imaging, including volumetric delineation of tumors over time and extrapolation of the tumor genotype and biological course from its radiographic phenotype. AI can also provide a prediction of clinical outcome and an assessment of the impact of disease and treatment on adjacent organs. It can automate processes in the initial interpretation of images and shift the clinical workflow of radiographic detection and management decisions [135].

The automated capabilities of AI offer the potential to enhance the qualitative expertise of clinicians, including parallel tracking of multiple lesions, translation of intratumoral phenotypic nuances to genotype implications, and outcome prediction through cross-referencing individual tumors against databases of potentially limitless (millions, perhaps billions of) comparable cases. The strengths of AI are well suited to the current generation of targeted and immunotherapies (see also Immunology, Chapters 6 and 7), which can produce a clear clinical benefit that is poorly captured by endpoints based on RECIST (Response Evaluation Criteria in Solid Tumors). These endpoints rely on the assumption that a successful response to therapy will be reflected by tumor shrinkage; in particular, the measurement of response based on tumor diameter assumes that tumors are spherical and undergo uniform spatial change after treatment. Targeted therapies and immunotherapies lead to novel patterns of response that confound current RECIST-based endpoints and may contribute to the high failure rate of clinical trials and the cost of drug development. Thus, the ability of AI and big data to quantify the biological processes associated with a response other than size answers an urgent need in the field [135].

Several firms focus specifically on diagnosis and treatment recommendations for certain cancers based on their genetic profiles. Since many cancers have a genetic basis, human clinicians have found it increasingly complex to understand all genetic variants of cancer and their response to new drugs and protocols. Firms like Foundation Medicine [136] and Flatiron Health [137], both now owned by Roche, specialize in this approach.
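The geometry behind the RECIST criticism above can be shown with a few lines of arithmetic. Assuming a roughly spherical lesion, the Python sketch below (with illustrative numbers) shows how a 30% decrease in diameter, RECIST's partial-response threshold, corresponds to roughly a 66% decrease in volume, which is why diameter-based endpoints can misrepresent volumetric change.

```python
# A minimal sketch of diameter-based vs. volumetric response for a sphere.
# RECIST flags a partial response at a >=30% decrease in the sum of lesion
# diameters; for a sphere that diameter change implies a much larger volume
# change. The lesion sizes below are illustrative only.
import math

def sphere_volume(diameter_mm: float) -> float:
    return math.pi * diameter_mm ** 3 / 6.0

before, after = 40.0, 28.0  # lesion diameter in mm, a 30% decrease
diameter_change = (after - before) / before
volume_change = (sphere_volume(after) - sphere_volume(before)) / sphere_volume(before)

print(f"Diameter change: {diameter_change:.0%}")  # -30%
print(f"Volume change:   {volume_change:.0%}")    # about -66% (0.7**3 = 0.343)
```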

5.1.3.6 AI in immunogenetics (see also Immunology, Chapters 6 and 7)
Immunogenetics is the study of the genetic basis of the immune response. It includes the study of normal immunological pathways and the identification of genetic variations that result in immune disorders, which may lead to new therapeutic targets for immune diseases [138]. Computational immunogenetics (immunoinformatics) encompasses the use and application of AI, bioinformatics methods, mathematical models, and statistical techniques. Computational approaches are increasingly vital to understanding the implications of the wealth of gene expression and epigenomics data being gathered from immune cells. Dozens of immune databases play a vital role in organizing the vast quantities of experimental data generated by


modern high-throughput technologies [139]. The sum of these epigenetic mechanisms is fundamental to the regulation of diverse cellular processes through differential transcriptional readout of the same genetic material. The importance of epigenetics is underscored by the many diseases that can develop due to mutations in epigenetic regulatory proteins, dysregulation of the epigenetic machinery, and aberrant placement or removal of epigenetic marks [140].

5.1.3.7 Genetics, precision medicine, and AI
One of the most exciting prospects of gene technology is the development of precision or personalized medicine (see Chapter 4, page 101). The field, which enables interventions specific to a patient or a population of genetically similar individuals, is expected to reach $87 billion by 2023 [141]. Historically, cost and technology limited the implementation of personalized medicine, but machine learning techniques are helping to overcome these barriers. Machines help identify patterns within genetic data sets, and computer models can then make predictions about an individual's odds of developing a disease or responding to interventions.

AI will remain the primary driver of healthcare's transformation toward precision medicine. AI approaches, such as machine learning, deep learning, and natural language processing (NLP), will address the challenges of scalability and high dimensionality, transforming big data into the clinically actionable knowledge that is becoming the foundation of precision medicine. Precision medicine, or personalized medicine, addresses disease by tailoring treatment to the genomic, lifestyle, and environmental characteristics of each patient. With precision medicine and the advancement of next-generation sequencing (NGS), genomic profiles of patients are increasingly used for risk prediction, disease diagnosis, and the development of targeted therapies.

The number of publications each year, as indexed in PubMed, has exceeded 1 million since 2011. This volume and velocity of publication indicate that multiple hypotheses are being tested at the same time, which makes it harder for researchers to stay up to date in their field in the absence of automated assists and impacts their ability to generate meaningful and coherent conclusions in the timely manner required for evidence-based recommendations in the context of precision medicine [142]. In cancer genomics, publications per year can easily run into the tens of thousands, far more than a researcher can keep up with, and this growth has driven the rapid adoption of text mining and NLP techniques.

Capturing data on individual variability in genomic, lifestyle, and clinical factors is at the core of precision medicine, which would empower patients to be more engaged in their healthcare. To facilitate patient participation in this AI-empowered digital health transformation, medical professionals should provide robust patient education initiatives related to precision medicine, the benefits and risks of AI, and data sharing and protection. Healthcare providers need to be sensitive to varying degrees of patient preference for privacy and properly obtain consent for patient data collection and use [143].

5.1.3.8 Literature reviews re AI's influence on genetics and genomics
1. The immunogenetics of neurological disease [144]: Genes encoding antigen-presenting molecules within the human major histocompatibility complex (MHC) account for the greatest component of genetic risk for


many neurological diseases, such as multiple sclerosis, neuromyelitis optica, Parkinson's disease, Alzheimer's disease, schizophrenia, myasthenia gravis, and amyotrophic lateral sclerosis. Myriad genetic, immunological, and environmental factors contribute to an individual's susceptibility to neurological disease. Taken together, the findings of human leukocyte antigen (HLA) and killer-cell immunoglobulin-like receptor association studies are consistent with a polygenic model of inheritance, reflecting the heterogeneous and multifactorial nature of complex traits in various neurological diseases. The majority of neurological conditions, such as MS, NMO, PD, AD, SCZ, MG, and ALS, are considerably more frequent among individuals transmitting specific HLA alleles. This further strengthens the decades-long contention of a strong immune component in the determination of clinical outcomes of neurological diseases.

2. The emerging role of epigenetics in human autoimmune disorders [145]: Epigenetic mechanisms, known for their ability to regulate gene transcription and genomic stability, are key players in maintaining normal cell growth, development, and differentiation. Epigenetic dysregulation directly influences the development of autoimmunity (see Autoimmunity, Chapter 7) by regulating immune cell functions [146]. A more in-depth exploration of the complex epigenetic interactions may be useful for the development of promising treatment strategies targeting the epigenome. Epigenetics will very likely aid in providing further progress in the field of autoimmunity.

3. Predicting response to cancer immunotherapy using noninvasive radiomic biomarkers [147]: Cancer immunotherapy has made promising strides as a result of improved understanding of the biological interactions between tumor cells and the immune system. The recent emergence of quantitative imaging biomarkers provides promising opportunities. Unlike traditional biopsy-based assays that represent only a sample of the tumor, images reflect the entire tumor burden, providing information on each cancer lesion with a single noninvasive examination. Computational imaging approaches originating from AI have achieved impressive successes in automatically quantifying the radiographic characteristics of tumors [15]. Radiomics-based biomarkers have shown success in different tumor types [148], but there is no evidence yet in immunotherapy. Tumor morphology, visualized on imaging, is likely influenced by several aspects of tumor biology. Findings suggest associations between radiomic characteristics and immunotherapy response, showing consistent trends across cancer types and anatomical locations. Lesions that are more likely to respond to immunotherapy typically present with more heterogeneous morphological profiles, with nonuniform density patterns and compact borders.

5.2 Additional diagnostic technologies and their AI applications
Whether consciously or unknowingly, all of us use and rely on AI every day in countless ways: things now taken for granted, like the Internet, email, search engines, social media, chatbots, GPS, online services, voice recognition (natural language processing


[NLP]), and on and on. So too in healthcare, AI is being used extensively by healthcare providers and administrators in medical practice, EHRs, hospital care, third-party insurance, emergency medical services (EMS) and rescue, robotics, and, again, on and on. Oops, by the way, I almost forgot to mention the ubiquitous smartphone and all those healthcare apps (as the saying goes, "There's an app for that").

I mention all this in the context of the "additional diagnostic technologies" created, supported, and continually expanded by AI. All of the previously discussed AI applications in diagnostic technologies and services are the skeletal framework of healthcare diagnosis. Ultimately, the human level (human intelligence) is the engine that drives, coordinates, and facilitates the machine learning of AI. It is important to keep in mind that the additional resources, technologies, and services presented in the following section, as with those discussed previously, are supporting assets in a diagnostic armamentarium supplemented, driven, and managed by the human "captain of the ship."

5.2.1 Vital signs
We are all very familiar with the vital signs used in diagnosis at the acute and chronic levels of care. The signs include body temperature (BT), blood pressure (BP), pulse rate (PR), and respiration (breathing) rate (RR).

The normal BT range is from 97.8°F (Fahrenheit) or 36.5°C (Celsius) to 99°F (37.2°C). It is obtained (with a mercury or electronic thermometer) orally, rectally, axillary, or by ear or skin. BP is measured using a manual sphygmomanometer or more modern electronic devices. Systolic BP is the higher number, the pressure inside the artery when the heart contracts and pumps blood through the body; the lower number is diastolic BP, the pressure inside the artery when the heart is at rest and filling with blood. PR measures heartbeat rhythm and strength, with normal at 60–100 beats per minute. It is measured at the wrist (radial artery) or the side of the neck (carotid artery). The RR is the number of breaths a person takes per minute. It is usually measured when a person is at rest and simply involves counting the number of times the chest rises in 1 minute. Normal RR for an adult at rest ranges from 12 to 16 breaths per minute.

Monitoring vital signs is critical in emergencies, hospital care, and the chronically ill patient. In emergency care, "vital signs are vital" is a common refrain. In hospital care and with chronically ill patients, monitoring vital signs often requires waking patients to check the signs (although this routine is being challenged as unnecessary and even potentially harmful [149]). Under any circumstances, AI is considered a viable adjunct for increasing the efficiency of the process as well as assuring timely and accurate monitoring. Devices such as the ViSi (Sotera Wireless) [150] are approved and currently being used by many health systems. ViSi (an IoT device) is a mobile system platform designed to detect deterioration earlier and keep clinicians connected to their patients. The system enables highly accurate, continuous monitoring of core vital signs through wearable sensors that allow freedom of movement.

The concept of wearable devices (IoTs) in healthcare merges ideally with AI as the system attempts to decentralize. There are many non-acute health decisions that people make daily.


These decisions do not require a skilled clinician, but they play a significant role in determining a patient's health and, ultimately, the cost of healthcare. Thanks to AI-driven models, patients now have access to interventions and reminders throughout their day to process decisions based on changes in their vital signs.

In southeast England, patients discharged from a group of hospitals serving 500,000 people are being fitted with a Wi-Fi-enabled armband that remotely monitors vital signs [151]. By deploying AI, the NHS program is able to scale up not only in the U.K. but also internationally. Current Health, the venture-capital-backed maker of the patient monitoring devices used in the program, recently received FDA clearance to pilot the system in the U.S. and is now testing it with New York's Mount Sinai Hospital. It's part of an effort to reduce patient readmissions, which cost U.S. hospitals about $40 billion annually [152].

Work from MIT's CSAIL helps doctors predict if and when a patient will need mechanical ventilation, vasopressors for blood pressure control, or other interventions. Another CSAIL algorithm helps determine the optimal time to transfer a patient out of the ICU, with the objective of reducing hospital stays as well as preventing mortality. Other CSAIL algorithms center on the ICU, lessening the burden on nurses through automated surveillance with a combination of cameras and algorithmic processing of vital signs [153].

There is, however, no FDA-approved device for home use yet. Until we have FDA-approved devices for home use that are automatic, accurate, inexpensive, and integrated with remote monitoring facilities, an obstacle remains.
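The monitoring logic such systems automate can be sketched simply. The Python example below checks readings against the adult resting ranges quoted earlier in this section and flags deviations; the thresholds (including the assumed blood-pressure band) are illustrative, not a validated early-warning score, and real systems add trend analysis and clinically validated scoring.

```python
# A minimal sketch of automated vital-sign checking against the adult resting
# ranges cited in this section. Thresholds are illustrative textbook values,
# not a clinical early-warning standard; the BP band is an added assumption.
ADULT_RESTING_RANGES = {
    "temperature_F": (97.8, 99.0),
    "pulse_bpm": (60, 100),
    "respiration_bpm": (12, 16),
    "systolic_mmHg": (90, 120),   # assumed illustrative range
}

def flag_vitals(readings: dict) -> list:
    alerts = []
    for sign, value in readings.items():
        low, high = ADULT_RESTING_RANGES[sign]
        if not low <= value <= high:
            alerts.append(f"{sign} = {value} outside {low}-{high}")
    return alerts

print(flag_vitals({"temperature_F": 100.8, "pulse_bpm": 112,
                   "respiration_bpm": 14, "systolic_mmHg": 118}))
```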

5.2.2 Electrodiagnosis
Electrodiagnosis (EDX) is a broad term describing methods of medical diagnosis that obtain information about diseases by passively recording the electrical activity of body parts (that is, their natural electrophysiology) or by measuring their response to external electrical stimuli (evoked potentials) [154]. The term EDX includes the following diagnostic tests:

• Electroencephalography (Intracranial EEG, stereoelectroencephalography)
• Magnetoencephalography (MEG)
• Evoked potentials
• Electrogastrogram (EGG)
• Magnetogastrography
• Electrocochleography
• Electrooculography (EOG)
• Electroretinography (ERG)
• Electronystagmography (ENG)
• Electrocardiography (ECG)
• Vectorcardiography
• Magnetocardiography
• Electromyography (Facial electromyography) (EMG)
• Nerve conduction study (NCS)

Table 5–4 Electrodiagnostic tests.

Central nervous system
• Electroencephalography (Intracranial EEG, stereoelectroencephalography): An electrophysiological monitoring method to record electrical activity of the brain.
• Magnetoencephalography (MEG): A functional neuroimaging technique for mapping brain activity by recording magnetic fields produced by electrical currents occurring naturally in the brain.
• Evoked potentials: An electrical potential in a specific pattern recorded from a specific part of the nervous system, especially the brain.

Digestive system
• Electrogastrogram (EGG): Records the electrical signals that travel through the stomach muscles and control the muscles' contractions.
• Magnetogastrography: Recordings of magnetic fields resulting from electrical currents in the stomach.

Ears
• Electrocochleography: A technique of recording electrical potentials generated in the inner ear and auditory nerve in response to sound stimulation.

Eyes
• Electrooculography (EOG): A technique for measuring the corneo-retinal standing potential that exists between the front and the back of the human eye.
• Electroretinography (ERG): Measures the electrical responses of various cell types in the retina, including the photoreceptors, inner retinal cells, and the ganglion cells.
• Electronystagmography (ENG): Records involuntary movements of the eye caused by a condition known as nystagmus. It can also be used to diagnose the cause of vertigo, dizziness, or balance dysfunction by testing the vestibular system.

Heart
• Electrocardiography (ECG): A recording (a graph of voltage versus time) of the electrical activity of the heart.
• Vectorcardiography: A method of recording the magnitude and direction of the electrical forces that are generated by the heart.
• Magnetocardiography: Records the magnetic fields generated by the heart.

Peripheral nervous system
• Electromyography (EMG): An electrodiagnostic medicine technique for evaluating and recording the electrical activity produced by skeletal muscles.
• Nerve conduction study (NCS): A medical diagnostic test commonly used to evaluate the function, especially the ability of electrical conduction, of the motor and sensory nerves of the human body.

Table 5–4 lists all major electrodiagnostic tests with thumbnail descriptions of each. Each of these electrodiagnostic studies can help establish diagnoses for disorders affecting nerves and muscles; they can also identify other problems and define the severity of the problem. An injury or disease can interrupt electrical signals that travel from the brain through motor nerves to muscles. Indications for electrodiagnostic testing include:
• Entrapment neuropathies, including:
  • Carpal tunnel syndrome
  • Cubital tunnel syndrome
  • Radial nerve palsy
  • Peroneal nerve palsy
  • Tarsal tunnel syndrome
• Radiculopathy in the cervical and lumbar regions
• Brachial and lumbosacral plexopathy
• Peripheral nerve injuries
• Pain, numbness, or weakness of upper or lower extremities
• Polyneuropathy
• Diseases of the neuromuscular junction, like myasthenia gravis
• Myopathy
• Motor neuron disease
• Autonomic neuropathy

Relative to the AI algorithms used in clinical diagnosis, electrodiagnosis represents the third leading area of usage. Each of the forms of electrodiagnostic testing listed above has adopted machine learning and deep learning to enhance and refine the information gathered. Much of that has to do with the availability of GPUs, which make parallel processing faster, cheaper, and more powerful. It also has to do with the simultaneous increase in storage capacity and the flood of data generated by each type of testing [155].

An example of deep learning in electrodiagnosis is an early diagnostic system based on Echo State Networks (ESN) that takes specific features of EEG data as input. This input can predict whether the subject is likely to develop Parkinson's Disease (PD) 10–15 years before developing any symptoms. Results have shown prediction of individual PD development with 85% accuracy [156]. More recently, other deep learning techniques achieved similar performance while reducing the computational cost and avoiding the need for feature selection [157]. This would allow implementing preventive treatments before the disease develops, rather than after symptoms appear, when it is too late for effective treatment. (A minimal sketch of the echo state network idea follows below.)
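An echo state network keeps a large, fixed, random recurrent "reservoir" and trains only a linear readout, which is what makes it cheap relative to fully trained recurrent networks. The sketch below is a generic, minimal ESN classifier in NumPy, not the EEG/Parkinson's system of [156]; the reservoir size, scalings, and synthetic data are all invented for illustration.

```python
import numpy as np

# Minimal echo state network (ESN): a fixed random recurrent reservoir plus a
# trained linear readout. Generic illustration only; sizes and data are made up.
rng = np.random.default_rng(0)

n_in, n_res = 4, 200                       # input features, reservoir units
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def reservoir_state(series):
    """Run a (T, n_in) feature sequence through the reservoir; return final state."""
    x = np.zeros(n_res)
    for u in series:
        x = np.tanh(W_in @ u + W @ x)      # leaky integration omitted for brevity
    return x

# Synthetic stand-ins for per-subject EEG feature sequences and binary labels.
X = np.stack([reservoir_state(rng.normal(size=(50, n_in))) for _ in range(40)])
y = rng.integers(0, 2, size=40)

# Train the readout only: ridge regression on the reservoir states.
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

def predict(series):
    """Classify one sequence from its final reservoir state."""
    return float(reservoir_state(series) @ W_out) > 0.5
```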

5.2.3 Telemedicine (aka telehealth)

Some things just go together: bacon and eggs, peanut butter and jelly, wine and cheese, Ben and Jerry, King Kong and Fay Wray (for seniors), Bert and Ernie (for young...and old), Lennon and McCartney (that was fun!). So too, telemedicine and AI just go together. Perhaps even more so, smartphones (cameras) and AI go together.

Telemedicine (aka telehealth) is defined by the Health Resources and Services Administration (HRSA) of the US Department of Health and Human Services as the use of electronic information and telecommunications technologies to support and promote long-distance clinical healthcare, patient and professional health education, public health, and health administration. Technologies include videoconferencing, the internet, store-and-forward imaging, streaming media, and terrestrial and wireless communications. Telehealth applications include [158]:
• Live (synchronous) videoconferencing: a 2-way audiovisual link between a patient and a care provider;
• Store-and-forward (asynchronous) videoconferencing: transmission of a recorded health history to a health practitioner, usually a specialist;
• Remote patient monitoring (RPM): the use of connected electronic tools, including the Internet of Things (IoT), to record personal health and medical data in 1 location for review by a provider in another location, usually at a different time (a minimal store-and-forward sketch appears after this list);
• Mobile health (mHealth): healthcare and public health information provided through mobile devices. The information may include general educational information, targeted texts, and notifications about disease outbreaks.

There is some debate over the use of the terms "telemedicine" and "telehealth" [159]. How do telehealth and telemedicine technology differ? In most cases, they both rely on the same telecommunications technology. However, because telemedicine always deals with private patient health information, it needs to be secure and HIPAA compliant. Telehealth technology, on the other hand, shares general health information or health education and, thus, does not need to follow the same security requirements. The Federal Communications Commission (FCC) defines telehealth as a broader form of telemedicine, encompassing "remote healthcare services beyond the doctor-patient relationship" [160], such as interactions with nurses, pharmacists, and social services personnel. Thus, for our discussion, we will simply use telehealth, which seems to be the broader term.

Telehealth technology is constantly evolving. Several decades ago, physicians and other health professionals engaged in telehealth using radios and telephones. Since the advent of the Internet Age and the widespread use of smartphones, telehealth technology has changed dramatically. Now, telehealth technology might include a smartphone app or online video conferencing software. And with the growth of mobile medical devices, telehealth equipment is starting to incorporate sophisticated tools that can measure a patient's vitals or scan health data in the home without supervision by a medical professional.

The implementation of telehealth practices and technology is showing increased adoption among healthcare providers and institutions. Results from a 2017 survey [161] of 436 medical professionals conducted by telemedicine software company REACH Health show that 51% ranked telemedicine as a "top" or "high" priority in their practice. To increase clinical and administrative capacity through telehealth, researchers are developing AI-driven technology for healthcare professionals and consumers [162]. To say the least, the 2020 commencement and escalation of the COVID-19 pandemic has driven telehealth to new heights and broad acceptance among healthcare providers worldwide.

AI applications in telehealth fall into the 4 categories of telehealth mentioned above: live (synchronous) videoconferencing; store-and-forward (asynchronous) videoconferencing; remote patient monitoring (RPM); and mobile health (mHealth). Considering the impact of AI within these 4 categories provides a dramatic picture of the extent to which AI-supported telehealth is influencing healthcare.
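To make the store-and-forward/RPM idea concrete, the sketch below packages a home reading as a small structured record that a portal could queue for later clinician review, as the asynchronous model describes. The field names and the in-memory "queue" are invented for the example; a production system would need authentication, encryption, and HIPAA-compliant transport and storage.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Illustrative store-and-forward record for remote patient monitoring.
# Field names are hypothetical; real systems replace the in-memory list
# below with secure, HIPAA-compliant server-side storage.
@dataclass
class RPMReading:
    patient_id: str
    recorded_at: str
    vitals: dict          # e.g., {"pulse": 88, "spo2": 95}
    note: str = ""

review_queue: list[str] = []  # stand-in for a secure server-side queue

def submit_reading(reading: RPMReading) -> None:
    """Serialize the reading; a provider reviews it later (asynchronously)."""
    review_queue.append(json.dumps(asdict(reading)))

submit_reading(RPMReading(
    patient_id="demo-001",
    recorded_at=datetime.now(timezone.utc).isoformat(),
    vitals={"pulse": 88, "spo2": 95, "temp_f": 98.4},
    note="Feeling short of breath after stairs.",
))
print(len(review_queue), "reading(s) awaiting provider review")
```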


The value of synchronous and task-oriented computer-generated dialogue has been observed for a broad range of applications [163]. Automated conversational interactions offer many other opportunities across the care spectrum to augment and, in some cases, replace human caregiver tasks. These may include:

• reminders and motivational messages, e.g., for medication, nutrition, and exercise;
• routine condition checks and health maintenance, based on personal monitoring data;
• answering health queries and provision of targeted health information and education;
• providing a personalized means to address social isolation and community involvement;
• acting as an intermediary or broker entity between multiple carers or service agencies.

The nature and complexity of conversational agent (or virtual assistant) solutions can vary considerably. Speech-to-text conversion utilities and chatbots capable of audio or typed inputs and outputs are examples of such technologies. These solutions are best suited to interactions where the context of the situation and the user is simple and established. AI mechanisms for these agents are typically rule-based, using expert systems or decision tree logical constructs [164].

Asynchronous or store-and-forward telehealth is a platform that allows both patients and providers to interact on their own timelines. The patient can enter data through a secure portal by filling out an interactive medical questionnaire and uploading images, labs, or other diagnostic information. Providers log in at the other end, review the information, render a diagnosis, and issue treatment recommendations. The provider can always communicate with the patient to ask questions, and can also decide that the case is not suitable for a telehealth diagnosis and recommend a visit to the ER or doctor's office [165]. CaptureProof's [166] "asynchronous telehealth" uses an advanced computer vision system that allows healthcare providers and doctors to monitor patient progress, provide instant feedback, update health records, and send instructional media. The patient can also monitor healing progress over time on their own, and coordinate reports to be shared with other healthcare providers (including home health aides, social workers, physical therapists, etc.) involved with the care. The last World Health Organization global eHealth observatory survey [167] noted 4 exemplary well-embedded telehealth services: tele-radiology, tele-pathology, tele-dermatology, and tele-psychiatry; of these, the first 3 follow asynchronous models of care, and the fourth is synchronous.

Remote monitoring (or telemonitoring) involves data acquisition using an appropriate sensor, transmission of data from patient to clinician, integration of those data with other data describing the state of the patient (e.g., sourced from the electronic health record), synthesis of an appropriate action, response, or escalation in the care of the patient with associated decision support, and storage of data [168]. AI systems for telemonitoring depend on, and also expand the scope of, other health system ICT (Information and Communication Technology) components. They can potentially outperform humans in many ways. They consistently, mathematically execute their instructions, with a fundamental reliance on inbuilt logic moderated by statistical evidence extracted by machine learning methods from large-scale datasets. They can immediately incorporate and coordinate
data from additional tools like location finders (GPS), accelerometers, motion sensors, gyroscopes, etc. Sourcing such additional data by humans is tedious and would require education and training for incorporation into care delivery [164].

Telemonitoring has been evaluated for the remote surveillance of patients with chronic diseases, such as chronic heart failure [169], chronic obstructive pulmonary disease (COPD) [170], and diabetes mellitus [171]. In COPD, AI methods have been applied to the management and surveillance of the condition. A Classification and Regression Tree (CART) algorithm for the early identification of patients at high risk of an imminent exacerbation has been validated using telehealth measurement data recorded from patients with moderate/severe COPD living at home [172]. Similar approaches could be used as real-time exacerbation event detectors in several chronic conditions (a minimal sketch of such a decision tree classifier appears at the end of this section).

Among the categories of telehealth, perhaps the most versatile is the mobile health (mHealth) platform. The dramatic proliferation of smartphone apps designed to foster health and well-being provides a range of communication. These include programs that send targeted text messages encouraging healthy behaviors, alerts about disease outbreaks, and programs or apps that help patients with reminders and adherence to specific care regimens. Increasingly, smartphones are using cameras, microphones, and other sensors or transducers to capture vital signs for input to apps, bridging all areas of remote patient monitoring (RPM). A developing mobile-app, AI-supported platform, "Curbside Consults/Deep Doc (C2D2)," will provide 24/7 stat, ASAP, and routine communications between primary care providers and medical/surgical specialists, as well as audio/visual provider/patient communications, continuing education, search engine functions, and additional professional services [173]. The National Institutes of Health (NIH) and the Office of the National Coordinator for Health Information Technology (ONC) support telehealth activities, the development of mobile technologies (such as remote sensors), and research that assesses the effectiveness of care delivered remotely [174].

AI-enabled telehealth offers contributions in the form of quality improvement and enhancement of existing practice, as well as the instigation of new models of care. Examples of the role of AI in the remote delivery of healthcare include the use of tele-assessment, tele-diagnosis, tele-interactions, and telemonitoring. Most AI methods require a substantial learning phase and can only achieve reliability after a very long time; hence they should be subject to continuous testing and refinement to better develop the human-AI interchange [164].
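As context for the CART approach mentioned above, the sketch below trains a small classification tree on synthetic home-telehealth measurements to flag a high risk of imminent exacerbation. It is a generic illustration using scikit-learn, not the validated model of [172]; the features, thresholds, and data are invented for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Generic CART illustration, not the validated COPD model cited in the text:
# features, labels, and data are synthetic stand-ins for daily home-telehealth
# measurements (e.g., SpO2, heart rate, self-reported symptom score).
rng = np.random.default_rng(1)
n = 300
spo2 = rng.normal(94, 2.5, n)           # oxygen saturation, %
heart_rate = rng.normal(85, 10, n)      # beats per minute
symptom_score = rng.integers(0, 6, n)   # self-reported, 0 (none) to 5

# Synthetic label: exacerbation risk rises with low SpO2, high HR, and symptoms.
risk = (94 - spo2) * 0.4 + (heart_rate - 85) * 0.05 + symptom_score * 0.5
y = (risk + rng.normal(0, 0.5, n) > 1.5).astype(int)

X = np.column_stack([spo2, heart_rate, symptom_score])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The learned tree is human-readable -- one reason CART suits clinical use.
print(export_text(tree, feature_names=["spo2", "heart_rate", "symptom_score"]))

# Score today's reading for one patient (values are illustrative).
today = [[91.0, 98.0, 4]]
print("High exacerbation risk" if tree.predict(today)[0] else "Low risk")
```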

5.2.4 Chatbots

The form and function of chatbots were first mentioned in Chapter 3 (page 62) under the "Robotics" discussion and then again multiple times in Chapters 4 and 5. In Chapter 3, chatbots were defined as computer programs that simulate human conversation through voice commands (NLP), text chats, or both. The name chatbot was described as a contraction of "chatterbot." There are several synonyms for a chatbot, including talkbot, bot, IM bot, interactive agent, conversational agent, virtual agent, and PACT [175]. (Rather than add yet another acronym to this text, we'll continue to use chatbots [mostly] in this discussion.) Effectively, a chatbot is an AI feature that can be embedded and used through any major messaging application.


AI (in the form of natural-language processing, machine learning, and deep learning, discussed in Chapter 3, page 51) makes it possible for chatbots to "learn" by discovering patterns in data. Without further training, these chatbots can then apply the learned patterns to similar problems or slightly different questions. This gives them the "intelligence" to perform tasks, solve problems, and manage information without human intervention.

Chatbots date back to 1966, when Joseph Weizenbaum, a professor at the Massachusetts Institute of Technology (MIT), developed the ELIZA program, named after a character in Pygmalion, a play (and Broadway musical/movie, My Fair Lady) about a Cockney girl (Eliza Doolittle) who learns to speak and think like an upper-class lady. The introduction of machine learning capabilities in bots has vastly improved the humanlike quotient of their conversations. Given the advancements in artificial intelligence and the considerable amount of time most customers spend on messaging platforms, where many chatbots are deployed, it is not an exaggeration to say that AI chatbots are becoming a necessity in some industries. Talking bots are an excellent choice for systematic, time-consuming tasks. Chatbots are already part of virtual assistants, such as Siri, Alexa, Cortana, and Google Assistant. In addition, so-called "social bots" promote issues, products, or candidates. At the cutting edge of the chatbot spectrum are bots performing more complex, nuanced functions in fields where the human touch remains essential, such as law. In marketing, the belief that AI and chatbots are the next big thing is widespread [176].

Chatbots in healthcare may have the potential to provide patients with access to immediate medical information, recommend early diagnoses at the first sign of illness, or connect patients with suitable healthcare providers (HCPs) in their community [177]. Theoretically, in some instances, chatbots may be better suited to help patient needs than a human physician because they have no biological gender, age, or race and elicit no bias toward patient demographics. Early research has demonstrated the benefits of using healthcare chatbots, such as helping with diagnostic decision support [178], promoting and increasing physical activity [179], and cognitive behavioral therapy for psychiatric and somatic disorders [180], providing effective, acceptable, and reasonable healthcare with accuracy comparable to that of human physicians. Patients may also feel that chatbots are safer interaction partners than human physicians and be willing to disclose more medical information and report more symptoms to chatbots [181]. However, despite the demonstrated efficacy and cost-effectiveness of healthcare chatbots, the technology is usually associated with poor adoption by physicians and poor adherence by patients [182]. This may be because of the perceived lack of quality or accountability of computerized chatbots as opposed to traditional face-to-face interactions with human physicians. Again, the COVID-19 pandemic is rapidly changing such attitudes.

The areas where physicians believed chatbots would be most helpful were the improvement of nutrition, diet, and treatment compliance, as well as logistical tasks such as scheduling appointments, locating clinics, and providing medication reminders. The major challenges perceived were the inability of chatbots to understand emotions and address the
full extent of a patient's needs. Physicians believe that healthcare chatbots could replace a substantial role of human HCPs sometime in the future. However, chatbots can best be applied to help physicians rather than replace them [183]. Reverting once again to our "Top 10 Listing" game, the following are the Top 10 chatbot companies as of mid-2019 [184]:

1. Baidu, Inc.: Founded in 2000 and headquartered in Beijing, China, Baidu, Inc. is a technology-based media company engaged in providing Chinese-language internet search through its website Baidu.com.
2. Sensely Inc.: Founded in 2013 and headquartered in California, U.S., Sensely develops clinical assistance platforms that help clinicians manage their patients based on the severity of their symptoms and get a better understanding of their health conditions.
3. Your.MD Limited: Founded in 2013 and headquartered in London, U.K., Your.MD Limited develops personalized health assistant applications for smartphone users worldwide.
4. Babylon Health Services: Founded in 2013 and headquartered in London, U.K., Babylon Health Services operates a subscription-based mobile healthcare application.
5. HealthTap, Inc.: Founded in 2010 and headquartered in California, U.S., HealthTap, Inc. operates an online platform that connects people looking for health information to a network of doctors who answer their health questions.
6. Buoy Health, Inc.: Founded in 2014 and headquartered in Massachusetts, U.S., Buoy Health, Inc. develops software that analyzes the symptoms communicated by users, gives out a list of possible diagnoses, and guides them toward the next steps for their care.
7. Infermedica SP: Founded in 2012 and headquartered in Wroclaw, Poland, the company provides artificial intelligence technology solutions for healthcare companies.
8. ADA Digital Health, Ltd.: Incorporated in 2011 and headquartered in Berlin, Germany, the company is a provider of AI-based health applications.
9. PACT Care BV: Incorporated in 2018 and headquartered in Amsterdam, Netherlands, PACT Care BV builds a platform that connects people with the services, resources, and people they need for their health.
10. Woebot Labs, Inc.: Incorporated in 2017 and headquartered in California, U.S., Woebot Labs, Inc. develops healthcare software. The company offers a platform that makes therapy accessible and stigma-free for patients suffering from anxiety, depression, and other mental health issues.
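The simplest healthcare chatbots of the rule-based kind described earlier in this chapter reduce to a decision tree of scripted questions. The sketch below shows that idea in miniature; the questions, branches, and advice strings are invented for illustration and bear no resemblance to a clinically validated triage protocol, which real products rely on.

```python
# A toy rule-based (decision tree) health chatbot of the kind described above.
# All questions, branches, and advice are invented placeholders; production
# triage bots use clinically validated content plus NLP for free-text input.
TRIAGE_TREE = {
    "question": "Do you have a fever? (yes/no)",
    "yes": {
        "question": "Is it above 103 F or lasting more than 3 days? (yes/no)",
        "yes": {"advice": "Please contact a clinician today."},
        "no":  {"advice": "Rest, fluids, and monitor your temperature."},
    },
    "no": {
        "question": "Are you having trouble breathing? (yes/no)",
        "yes": {"advice": "Seek urgent care now."},
        "no":  {"advice": "No urgent flags found. Ask me something else."},
    },
}

def run_chatbot(node=TRIAGE_TREE):
    """Walk the scripted tree, one question per turn, until advice is reached."""
    while "advice" not in node:
        answer = input(node["question"] + " ").strip().lower()
        node = node.get(answer, node)  # repeats the question on unrecognized input
    print(node["advice"])

if __name__ == "__main__":
    run_chatbot()
```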

5.2.5 Expert systems

Our last "additional diagnostic technology or service" in this Chapter 5 is an AI application, expert systems, whose basic structure was described in detail in Chapter 3, page 53. Here we will discuss the expert system's direct influences on healthcare. Specifically, we will focus on
expert systems as a part of current machine-learning-based approaches [185] that combine the best of both old and new-age approaches, i.e., latent causal relationships that are encoded in such expert systems but are further probabilistically refined and continually self-improving by learning from new data.

Medical diagnosis is difficult for a number of reasons:

• the cost of acquiring information, including testing and talking to the patient;
• reliance on bias and the need to arrive quickly at a diagnosis [186];
• the lack of appropriate diagnostic tools to gather complete information about the patient for a multitude of diseases and disorders;
• incomplete patient medical records; and
• the dynamic nature of the diagnostic process.

Expert systems seek to emulate the diagnostic decision-making ability of human experts. They include 2 components: (1) a knowledge base (KB), which encapsulates the evidence-based medical knowledge that is selected and organized by experts; and (2) a rule-based inference engine devised by the expert, which operates on the knowledge base to generate a differential diagnosis (see Figure 3–7). Diagnostic knowledge bases generally consist of diseases, findings (i.e., symptoms, signs, history, or lab results), and their relationships. In many cases, they explicitly lay out the relationships between a set of findings and the things that cause them (diseases). Specifically, the strength (positive predictive value) captures how strongly one should consider a disease if the finding was observed, while the frequency (sensitivity) models how likely it is that a patient with a disease manifests a particular finding. The rule-based inference engine outputs a ranked differential diagnosis by scoring the diseases in the knowledge base as a function of their relationship strengths over all of the input findings. In other words, given a set of findings from a patient, the inference engine examines the strength of the relationships those findings have with each disease in the KB and sorts the diseases based on some defined scoring function [187] (a toy version of this scoring appears at the end of this section).

To address the constraints on the expert system (e.g., dedicated experts, lag times in identifying new diseases, environmental "noise"), a new approach has been developed that lets the knowledge encoded in expert systems serve as the prior for learning a new diagnosis model from scratch. The central idea is that you can utilize the expert system as a data generator, and the generated synthetic medical cases can be used as labeled data for training a model [185]. This machine learning approach to expert systems provides a methodology that not only preserves the properties of the original expert system but also improves accuracy and flexibility. The approach enables combining data generated by an expert knowledge base with data coming from real-world medical records. These types of methods represent an exciting area of AI research, as they apply the massive step-functions deep learning has introduced in the past decade to achieve better accuracy and resiliency to noise. This approach can be a valuable tool in settings where you can cobble together several diverse but related data sets [188].
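To make the KB-plus-inference-engine idea concrete, the sketch below scores a toy knowledge base against a set of patient findings and returns a ranked differential. The diseases, findings, strengths, and the log-odds scoring rule are all invented for illustration; real systems such as those described in [185,187] use far larger knowledge bases and more careful probabilistic scoring.

```python
import math

# Toy diagnostic knowledge base: disease -> {finding: relationship strength}.
# Strengths stand in for how strongly a finding suggests the disease (a crude
# proxy for positive predictive value). All values here are invented.
KB = {
    "influenza":    {"fever": 0.8, "cough": 0.6, "myalgia": 0.5},
    "common_cold":  {"cough": 0.5, "rhinorrhea": 0.7, "fever": 0.2},
    "strep_throat": {"fever": 0.6, "sore_throat": 0.9},
}

def differential(findings):
    """Rank diseases by summed log-odds-style scores over the observed findings."""
    scores = {}
    for disease, links in KB.items():
        score = 0.0
        for finding in findings:
            strength = links.get(finding, 0.05)  # small floor for unrelated findings
            score += math.log(strength / (1 - strength))
        scores[disease] = score
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example: a patient reporting fever and cough yields a ranked differential.
for disease, score in differential({"fever", "cough"}):
    print(f"{disease:14s} score={score:+.2f}")
```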


5.2.5.1 Literature reviews re AI's influences on "additional diagnostic technologies"

1. Association of vital signs and process outcomes in emergency department patients [189]: In discharged elderly patients, specific vital sign abnormalities (systolic blood pressure [SBP] < 97 mm of mercury [mmHg], heart rate > 101 beats per minute, body temperature > 37.3°C, and pulse oximetry < 92% SpO2) were associated with twice the odds of admission within 7 days of emergency department discharge (Mayo Clinic Hospital) [190]. If vital sign abnormalities are consistently associated with undesirable process outcomes, AI programs could notify emergency physicians (EPs) prior to final disposition. Recent work has focused on the development of predictive tools based on ED vital signs to assist EPs in identifying patients at risk for decompensation [191]. Despite the associations of vital signs with negative process outcomes, most patients discharged or admitted to the floor with abnormal vital signs did not have adverse results, limiting the utility of vital signs alone as a predictive tool. This suggests a need to incorporate additional factors in any predictive algorithm. Age, serum bicarbonate, and lactic acid have separately been shown to be associated with inpatient deterioration [192]. AI may soon be able to prospectively identify patients at risk of both inpatient and outpatient deterioration. Although vital sign data by themselves were insufficient to create a sensitive and specific algorithm, the addition of other clinical data could lead to the development of a useful AI tool to alert EPs to potentially unsafe dispositions.

2. Recent patient health monitoring platforms incorporating internet of things-enabled smart devices [193]: Synergistic integration of the Internet of Things (IoT), cloud computing, and big data technologies in healthcare has led to the notion of "smart health." Smart health is an emerging concept that addresses the provision of healthcare services for prevention, diagnosis, treatment, and follow-up management at any time or any place by connecting information technologies and healthcare. As a significant breakthrough in smart healthcare development, IoT-enabled smart devices allow medical centers to carry out preventive care, diagnosis, and treatment more competently. Smart health is a major up-and-coming research topic that is based on emerging ICT (Information and Communications Technology) and has attracted cross-disciplinary researchers. The use of IoT technology helps automate the entire patient care workflow. In other words, IoT-enabled smart devices have started to facilitate accurate care and treatment services and strategies by healthcare providers, including doctors, hospitals, and clinics. Patients can use these devices anywhere and immediately transmit their health conditions and test results using IoT-enabled devices and integrated apps, making it easier to fit testing into daily life. For doctors, real-time, remote patient monitoring makes it easier to stay up-to-date and in contact with patients without in-person visits.

3. Physicians' perceptions of chatbots in healthcare: cross-sectional web-based survey [183]: Physicians believe in both costs and benefits associated with chatbots, depending on the logistics and specific roles of the technology. Physicians agreed that there were
significant risks associated with chatbots, including inaccurate medical information. These findings suggest that physicians may be comfortable using chatbots to automate simple logistical tasks but do not believe that chatbots are advanced enough to replace complex decision-making tasks requiring expert medical opinion. This is not to say that healthcare chatbots carry a particular stigma, but rather that improvements are needed before future use to overcome the risks and challenges related to the technology. Nevertheless, nearly half of the physicians surveyed believed that healthcare chatbots could replace a significant role of human HCPs sometime in the future. However, chatbots can best be applied to help physicians rather than replace them. Chatbots are cost-effective to run and can automate repetitive administrative tasks, thus freeing time for physicians to provide higher quality, personalized, and empathetic care to their patients. This research lays the foundation for future investigations of the factors influencing physician adoption of chatbots. Providing physicians with evidence-based research on the advantages and disadvantages of this emerging technology will help inform them of the most appropriate uses to complement their practice rather than impede their work.

So ends our journey through AI's applications and influences in healthcare's diagnostic technologies and services. As I said in the opening comments of this chapter, "Nothing in healthcare is more important than a timely and accurate diagnosis." And indeed, that is so. However, once the diagnosis is established, our job in healthcare is to provide proper care to ameliorate what is abnormal. This is when medical therapies and services become the capstone of health and wellness. Whereas AI's applications in diagnostic technologies may be of direct value to the health professional, they are relatively indirect to the patient. In medical therapies, however, the technologies and services "touch" the patient directly, and thus, AI's applications and influences may be more felt and appreciated. See how you feel about that as you read Chapters 6 and 7.

References [1] The future awakens. Life sciences and health care predictions 2022. Deloitte Consulting; 2017. [2] Singhal S, Carlton S. The era of exponential improvement in healthcare? McKinsey and Company; 2019. [3] Ballard B. Top 5 healthcare innovations shaping the industry’s future. The New Economy; 2018. [4] Unless footnoted separately, all Diagnostic Technologies descriptions in this section come from MedlinePlus. U.S. National Library of Medicine. U.S. Department of Health and Human Services. National Institutes of Health [Updated May 31, 2019]. [5] Lakhani P, Prater AB, Hutson RK, et al. Machine learning in radiology: applications beyond image interpretation. J Am Coll Radiol 2018;15(2):350 9. [6] Jha S, Topol EJ, Jha S, Topol EJ. Adapting to artificial intelligence: radiologists and pathologists as information specialists. JAMA 2016;316(22):2353 4. [7] Lambin P, Leijenaar RTH, Deist TM, et al. Radiomics: the bridge between medical imaging and personalized medicine. Nat Rev Clin Oncol 2017;14(12):749 62.


[8] Patel A. Health technology: the digital revolution - Part 1: AI & imaging. Science Entrepreneur; 2019. [9] Zaharchuk G, Gong E, Wintermark M, et al. Deep learning in neuroradiology. Am J Neuroradiol 2018; 39(10):1776 84. Available from: https://doi.org/10.3174/ajnr.A5543. [10] Sardanelli F, Hunink MG, Gilbert FJ, et al. Evidence-based radiology: why and how? Eur Radiol 2010; 20(1):1 15 Jan. [11] Kohli M, Prevedello LM, Filice RW, et al. Implementing machine learning in radiology practice and research. AJR Am J Roentgenol 2017;208(4):754 60. [12] Jiang F, et al. Stroke Vasc Neurol 2017; svn-2017-000101. [13] Titano JJ, Badgeley M, Schefflein J, et al. Automated deep-neural-network surveillance of cranial images for acute neurologic events. Nat Med 2018;24(9):1337 41. [14] Pesapane F, Codari M, Sardanelli F. Artificial intelligence in medical imaging: threat or opportunity? Radiologists again at the forefront of innovation in medicine. Eur Radiol Exp 2018;2:35. Available from: https://doi.org/10.1186/s41747-018-0061-6. [15] Hosny A, Parmar C, Quackenbush J, et al. Artificial intelligence in radiology. Nat Rev Cancer 2018; 18(8):500 10. Available from: https://doi.org/10.1038/s41568-018-0016-5. [16] Annarumma M, Withey SJ, Bakewell RJ, et al. Automated triaging of adult chest radiographs with deep artificial neural networks. Radiology 2019. Available from: https://doi.org/10.1148/ radiol.2018180921. [17] Zech J, Pain M, Titano J, et al. Natural language-based machine learning models for the annotation of clinical radiology reports. Radiology 2018;287(2):570 80. [18] Armitage H. Artificial intelligence rivals radiologists in screening X-rays for certain diseases. Stanford Medicine; 2018. [19] Cruz AS, Lins H, Medeiro RV, et al. Artificial intelligence on the identification of risk groups for osteoporosis, a general review. Biomed Eng Online 2018;17:12. Available from: https://doi.org/10.1186/s12938-018-0436-1. [20] Kanis JA. Diagnosis of osteoporosis and assessment of fracture risk. Lancet 2002;359(9321):1929 36. Available from: https://doi.org/10.1016/S0140-6736(02)08761-5. [21] Rodríguez-Ruiz A, Krupinski E, Mordang JJ, et al. Detection of breast cancer with mammography: effect of an artificial intelligence support system. Radiology 2019;290:305 14. [22] Cody R, Kent D, Chang L. Reduction of false-positive markings on mammograms: a retrospective comparison study using an artificial intelligence-based CAD. J Digital Imaging 2019;1 7. [23] Alcusky M, Philpotts L, Bonafede M, et al. The patient burden of screening mammography recall. J Women’s Health 2014;23(S1):S-11 9. [24] Siu AL. Screening for breast cancer: U.S. Preventive services task force recommendation statement. Ann Intern, Med 2016;164:279 96. [25] Bond M, Pavey T, et al. Systematic review of the psychological consequences of false-positive screening mammograms. Health Technol Assess 2013;17(13):1 170. [26] Andrykowski MA, et al. Psychological impact of benign breast biopsy: a longitudinal, comparative study. Health Psychol 2002;21(5):485 94. [27] Gandomkar Z, Mello-Thoms C. Visual search in breast imaging: a review. Br J Radiol 2019;20190057. Available from: https://doi.org/10.1259/bjr.20190057. [28] Sitek A, Wolfe JM. Assessing cancer risk from mammograms: deep learning is superior to conventional risk models. Radiology 2019. Available from: https://doi.org/10.1148/radiol.2019190791. [29] To the Mammography Quality Standards Act, FDA; 2019. [30] Yala A, Lehman C, Schuster T, et al. 
A deep learning mammography-based model for improved breast cancer risk prediction. Radiology 2019;292:60 6.


[31] Seah JCY, Tang JSN, Kitchen A, et al. Chest radiographs in congestive heart failure: visualizing neural network learning. Radiology 2019;290(2):514 22. [32] The U.S. Food and Drug Administration. Fluoroscopy. 06/14/2019. [33] Kalanjeri S, et al. State-of-the-art modalities for peripheral lung nodule biopsy. Clin Chest Med 2018;39:125 38. [34] Han P, Chen C, Li P, et al. Robotics-assisted versus conventional manual approaches for total hip arthroplasty: a systematic review and meta-analysis of comparative studies. Int Jour Med Robot 2019. [35] Murayama T, Ohnishi H, Mori T, Okazaki Y, Sujita K, Sakai A. A novel non-invasive mechanical technique of cup and stem placement and leg length adjustment in total hip arthroplasty for dysplastic hips. Int Orthop 2015;39(6):1057 64. [36] Joskowicz L, Hazan EJ. Computer-aided orthopedic surgery: incrementall shift or paradigm change? Med Image Anal 2016;33:84 90. [37] Zheng G, Nolte LP. Computer-assisted orthopedic surgery: current state and future perspective. Front Surg 2015;2:66. [38] Sippey M, Maskal S, Anderson M, Marks J. Use of fluoroscopy in endoscopy: indications, uses, and safety considerations. Ann Laparosc Endosc Surg 2019;4:59. [39] Sethi S, Barakat MT, Friedland S, et al. Radiation training, radiation protection, and fluoroscopy utilization practices among US therapeutic endoscopists. Dig Dis Sci 2019. [40] www.Imalogix.com. Brings fluoroscopy capabilities to radiation dose management platform. Imaging Technology News; 2018. [41] Rizzo S, Botta F, Raimondi S, et al. Radiomics: the facts and the challenges of image analysis. Eur Radiol Exp 2018. Available from: https://doi.org/10.1186/s41747-018-0068-z. [42] “OMICS.” Definitions.net. STANDS4 LLC, 2019. Web. 2019. [43] Gillies RJ, Kinahan PE, Hricak H. Radiomics: images are more than pictures. They are data. Radiology 2016;278(2):563 77. [44] Larue RTHM, van Timmeren JE, de Jong EEC, et al. Influence of gray level discretization on radiomic feature stability for different CT scanners, tube currents, and slice thicknesses: a comprehensive phantom study. Acta Oncol 2017;56(11):1544 53. [45] Ergen B, Baykara M. Texture based feature extraction methods for content-based medical image retrieval systems. Biomed Mater Eng 2014;24(6):3055 62. [46] Parekh VS, Jacobs MA. Integrated radiomic framework for breast cancer and tumor biology using advanced machine learning and multiparametric MRI. NPJ Breast Cancer 2017;3(1):43. [47] Arimura H, Soufi M, Kamezawa H, et al. Radiomics with artificial intelligence for precision medicine in radiation therapy. J Radiat Res 2019;60(1):150 7. Available from: https://doi.org/ 10.1093/jrr/rry077. [48] Cook GJR, Goh V. What can artificial intelligence teach us about the molecular mechanisms underlying disease? Eur J Nucl Med Mol Imaging 2019. [49] Aerts HJ. The potential of radiomic-based phenotyping in precision medicine: a review. JAMA Oncol 2016;2(12):1636 42. [50] Kolossváry M, Kellermayer M, Merkely B, et al. Cardiac computed tomography radiomics: a comprehensive review on radiomic techniques. J Thorac Imaging 2018;33(1):26 34. [51] O’Connor JP, Aboagye EO, Adams JE, et al. Imaging biomarker roadmap for cancer studies. Imaging biomarker roadmap for cancer studies. Nat Rev Clin Oncol 2017;14(3):169 86. [52] Chang L. What is a CT scan? WebMD. National Institute of Biomedical Imaging and Bioengineering: “Computed Tomography” 2018. [53] Bresnick J. Top 5 use cases for artificial intelligence in medical imaging. Health IT Analytics; 2018.


[54] https://www.acr.org/Media-Center/ACR-News-Releases/2018/ACR-Data-Science-Institute-Releases-Landmark-Artificial-Intelligence-Use-Cases. [55] Schoenhagen P, Zimmermann M. Artificial intelligence and cardiovascular computed tomography. J Med Artif Intell 2018;1:11. [56] Schoenhagen P, Numburi U, Halliburton SS, et al. 3-dimensional imaging in the context of minimally invasive and transcatheter cardiovascular interventions using multi-detector computed tomography: from pre-operative planning to intra-operative guidance. Eur Heart J 2010;31:2727 40. [57] Arbabshirani MR, Fornwalt BK, Mongelluzzo GJ, et al. Advanced machine learning in action: identification of intracranial hemorrhage on computed tomography scans of the head with clinical workflow integration. NPJ Digital Med 2018;1:9. Available from: https://doi.org/10.1038/s41746-017-0015-z. [58] Doi K. Computer-aided diagnosis in medical imaging: historical review, current status, and future potential. Comput Med Imaging Graph 2007;31:198 211. [59] Winsberg F, Elkin M, Macy Jr. J, Bordaz V, Weymouth W. Detection of radiographic abnormalities in mammograms utilizing optical scanning and computer analysis 1. Radiology 1967;89:211 15. [60] Monnier-Cholley L, et al. Computer-aided diagnosis for the detection of interstitial opacities on chest radiographs. Ajr Am J Roentgenol 1998;171:1651 6. [61] Yoshida H, Masutani Y, Maceneaney P, Rubin DT, Dachman AH. Computerized detection of colonic polyps at ct colonography based on volumetric features: pilot study 1. Radiology 2002;222:327 36. [62] Arbabshirani MR, Plus S, Sui J, Calhoun VD. Single subject prediction of brain disorders in neuroimaging: promises and pitfalls. Neuroimage 2016;145:137 65. [63] González G, Ash SY, Vegas-Sánchez-Ferrero G, et al. Disease staging and prognosis in smokers using deep learning in chest computed tomography. AJRCCM 2018;197(2). [64] Ren S, He K, Girshick R, Sun J. Faster R-CNN: towards real-time object detection with region proposal networks. In: Cortes C, Lawrence ND, Lee DD, Sugiyama M, Garnett R, editors. Advances in neural information processing systems 28 (NIPS 2015). New York: Curran Associates, Inc.; 2015. p. 91 9. [65] Esteva A, Kuprel B, Novoa RA, Ko J, Swetter SM, Blau HM, et al. Dermatologist-level classification of skin cancer with deep neural networks. Nature 2017;542:115 18. [66] General ultrasound. RadiologyInfo.org. 2018. [67] Verger R. AI could make MRI scans as much as ten times faster. In medical imaging, fewer data could be better. Popular Science; 2018. [68] Brigham Health. Inside the advanced multimodality image guided operating suite. Brigham and Women's Hospital; 2019. [69] Piazza G. Artificial intelligence enhances MRI scans. NIH Research Matters; 2018. [70] Wenya L, Hosny A, Schabath MB, et al. Artificial intelligence in cancer imaging: clinical challenges and applications. CA: A Cancer Jour Clinicians 2019. Available from: https://doi.org/10.3322/caac.21552. [71] Rudie JD, Rauschecker AM, Bryan RN, et al. Emerging applications of artificial intelligence in neuro-oncology. Radiology 2019. Available from: https://doi.org/10.1148/radiol.2018181928. [72] Nuclear Medicine. Johns Hopkins Medicine. Johns Hopkins University, 2019. [73] What is positron emission tomography? RadioInfo.org; 2019. [74] Hall M. Artificial intelligence and nuclear medicine. Nucl Med Commun 2019;40(1):1 2. [75] Świetlik D, Białowąs J. Application of artificial neural networks to identify Alzheimer's Disease using cerebral perfusion SPECT data. Int J Env Res Public Health 2019;16(7):1303. [76] Alexander A, McGill M, Tarasova A, et al. Scanning the future of medical imaging. J Am Coll Radiol 2019;16:501 7.


[77] Waldron T. 4 future trends in medical imaging that will change healthcare. Definitive Healthcare; 2019. [78] Kusta S. Artificial intelligence within ultrasound. Signify Research; 2018. [79] Correa M, Zimic M, Barrientos F, et al. Automatic classification of pediatric pneumonia based on lung ultrasound pattern recognition. PLoS One 2018;13(12):e0206410. Available from: https://doi.org/ 10.1371/journal.pone.0206410. [80] UNICEF. Pneumonia and diarrhoea: tackling the deadliest diseases for the worlds most impoverished children. New York: UNICEF; 2012. 2013. [81] Trinavarat P, Riccabona M. Potential of ultrasound in the pediatric chest Eur J Radiol 2014;83 (9):1507 18PMID. Available from: 24844730. [82] Wang S-C. Artificial neural network. Interdisciplinary computing in Java programming. Springer; 2003. p. 81 100. [83] Wu G, Zhou L, Xu J, et al. Artificial intelligence in breast ultrasound. World J Radiol 2019;11(2):19 26. Available from: https://doi.org/10.4329/wjr.v11.i2.19. [84] Uniyal N, Eskandari H, Abolmaesumi P, et al. Ultrasound RF time series for classification of breast lesions. IEEE Trans Med Imaging 2015;34(2):652 61. [85] Sidiropoulos KP, Kostopoulos SA, Glotsos DT, et al. Multimodality GPU-based computer-assisted diagnosis of breast cancer using ultrasound and digital mammography images. Int J Comput Assist Radiol Surg 2013;8(4):547 60. [86] Malik B, Klock J, Wiskin J, Lenox M. Objective breast tissue image classification using quantitative transmission ultrasound tomography. Sci Rep 2016;6:38857. [87] Wang HY, Jiang YX, Zhu QL, et al. Automated breast volume scanning: identifying 3-D coronal plane imaging features may help categorize complex cysts. Ultrasound Med Biol 2016;42:689 98. [88] Plichta JK, Ren Y, Thomas SM, Greenup RA, et al. Implications for breast cancer restaging based on the 8th edition AJCC staging manual. Ann Surg. 2018;271(1):169 76. [89] Archana KV, Vanithamani R. Diagnosis of coronary artery diseases and carotid atherosclerosis using intravascular ultrasound images. Artif Intell 2019;281 8. [90] Wang Z, Zhao S, Bai Y. Artificial intelligence as a third eye in lesion detection by endoscopy. Clin Gastroenterol Hepatol 2018;16(9):1537. [91] Kegelmeyer Jr WP, Pruneda JM, Bourland PD. Computer-aided mammographic screening for spiculated lesions. Radiology 1994;191(2):331 7. [92] LeCun Y, Bengio Y, Hinton G. Deep learning. Nature 2015;521(7553):436 44. Available from: https:// doi.org/10.1038/nature14539. [93] Kudo S, Mori Y, Misawa M. Artificial intelligence and colonoscopy: current status and future perspectives. Digestive Endosc 2019. Available from: https://doi.org/10.1111/den.13340. [94] Bosworth T. GI disease screening with artificial intelligence is close. GI and Hepatology News; May 14, 2019. [95] de Groen PC. Artificial intelligence in upper endoscopy: location, location, location. gastroenterology. Expert Opinion / Commentary; 2019. [96] Duker JS, Freund B, Sarraf D. Retinal imaging: choosing the right method. Eyenet Mag Amer Acad Ophth 2014. [97] Shu D, Ting W, Pasquale LR, et al. Artificial intelligence and deep learning in ophthalmology. Br J Ophthalmol 2018;103(2). [98] Poplin R, Varadarajan AV, Blumer K, et al. Prediction of cardiovascular risk factors from retinal fundus photographs via deep learning. Nat Biomed Eng 2018;2:158 64. [99] Rajalakshmi R, Subashini R, Anjana RM, et al. Automated diabetic retinopathy detection in smartphone-based fundus photography using artificial intelligence. Eye 2018;32:1138 44.


[100] Zheng C, Johnson T, Garg A, et al. Artificial intelligence in glaucoma. Curr Opin Ophthalmol 2019; 30(2):97 103. [101] Molenda SR, Mostow EN. The introduction of skin self-photography as a supplement to skin self-examination for the detection of skin cancer. J Am Acad Dermatol 2014;70:e15. [102] Harting MT, DeWees JM, Vela KM, et al. Medical photography: current technology, evolving issues, and legal perspectives. Int J Clin Pract 2015;69(4):401 9. Available from: https://doi.org/10.1111/ ijcp.12627. [103] Hogarty DT, Su JC, Phan K, et al. Artificial intelligence in dermatology—where we are and the way to the future: a review. Am J Clin Dermatol 2019;1 7. [104] Catania LJ, Nicolitz E. Artificial intelligence and its applications in vision and eye care. Adv Ophthalmol Optometry 2018;3(1):21 38. [105] Reiter O, Rotemberg V, Kose K, et al. Artificial intelligence in skin cancer. Curr Dermatol Rep 2019; 8(3):133 40. [106] Christopher M, Belghith A, Bowd C, et al. Performance of deep learning architectures and transfer learning for detecting glaucomatous optic neuropathy in fundus photographs. Nat Sci Rep 2018;8 Article number: 16685. [107] Weiss K, Khoshgoftaar TM, Wang DD. A survey of transfer learning. J Big Data 2016;3. [108] Szegedy C, Ioffe S, Vanhoucke V. Inception-v4, Inception-ResNet, and the impact of residual connections on learning Pattern Recognit Lett 2016;42Arxiv 12. Available from: https://doi.org/10.1016/ j.patrec.2014.01.008. [109] Nerminathan A, Harrison A, Phelps M, et al. Doctors’ use of mobile devices in the clinical setting: a mixed-methods study. Intern Med J 2017;47(3):291 8. Available from: https://doi.org/10.1111/ imj.13349. [110] U.S. Food and Drug Administration. Tests used in clinical care. 2018. [111] Politakis P, Weiss SM. Using empirical analysis to refine the expert system knowledge bases. Artif Intell 1984;22(1):23 48. [112] Friedman B. Genalyte reveals a POCT system with QC in the cloud. Clinical Lab Industry News; 2018. [113] Menon PK, Medicity T. Effect of artificial intelligence in the clinical laboratory. Medlab Magazine. [email protected]; 2019. [114] Durant TJS. Machine learning and laboratory medicine: now and the road ahead. AACC.org; 2019. [115] Lindor MN, Thibodeau SN, Burke W. Whole-genome sequencing in healthy people. Mayo Clin Proc 2017;92:159 72. Available from: https://doi.org/10.1016/j.mayocp.2016.10.019. [116] Genetics Home Reference (GHR). What is a gene? U.S. Library of Congress; 2019. [117] The definition of genetics. www.dictionary.com. Retrieved October 25, 2018. [118] Genetics Home Reference (GHR). What are genetics and genomics? U.S. Library of Congress; 2019. [119] Genetics vs. Genomics. The Jackson Laboratory; 2019. [120] Genetics Home Reference. What is a genome? U.S. National Library of Medicine. USA.gov; 2019. [121] National Human Genome Institute Research Institute. DNA sequencing factsheet. 2015. [122] Transcriptomics. Nature.com; 2019. [123] Pocket K. No. 15: ‘Omics’ sciences: genomics, proteomics, and metabolomics. International Service for the Acquisition of Agri-biotech Applications (ISAAA); 2019. [124] National Human Genome Institute Research Institute. A brief guide to genomics. 2003. [Updated August 27, 2015]. [125] Genetics Home Reference (GHR). What is epigenetics? U.S. Library of Congress; 2019.


[126] NCI Dictionary of Genetics Terms. U.S. Department of Health and Human Services. National Institutes of Health. National Cancer Institute; 2019. USA.gov. [127] Marr B. The wonderful ways artificial intelligence is transforming genomics and gene editing. Forbes; 2018. [128] Genetic Testing. Lister Hill National Center for Biomedical Communications U.S. National Library of Medicine. National Institutes of Health. Department of Health & Human Services. https://ghr.nlm.nih. gov/; 2019. [129] Brodwin E. Genetic testing is the future of healthcare, but many experts say companies like 23andMe are doing more harm than good. Business Insider; 2019. [130] He KY, Ge D, He MM. Big data analytics for genomic medicine. Int J Mol Sci 2017. [131] Collins FS, Varmus H. A new initiative on precision medicine. N Engl J Med 2015;372:793 5. [132] Vassy JL, Korf BR, Green RC. How to know when physicians are ready for genomic medicine. Sci Transl Med 2015;7:287fs219. [133] Gottesman O, Kuivaniemi H, Tromp G, et al. The electronic medical records and genomics (eMERGE) network: past, present, and future. Genet Med 2013;15:761 71. [134] Peters SG, Buntrock JD. Big data and the electronic health record. J Ambul Care Manag 2014;37:206 10. [135] Bi WL, Hosny A, Schabath MB, et al. Artificial intelligence in cancer imaging: clinical challenges and applications. Cancer J Clin 2019. Available from: https://doi.org/10.3322/caac.21552. [136] Staff Reporter. Roche to acquire outstanding shares of foundation medicine for $2.4B. GenomeWeb; 2018. [137] Das R. The flatiron health acquisition is a shot in the arm for Roche’s oncology real-world evidence needs. Forbes; 2018. [138] Immunogenetics. Nature.com; 2019. [139] Gómez Perosanz M, Pappalardo F. Computational immunogenetics. Encycl Bioinforma Comput Biol 2019. [140] Agarwal SK, Weinstein LS. Epigenetics. Genetics of bone biology and skeletal disease. 2nd ed. 2018. [141] Cision. Precision medicine market size to exceed $87 billion by 2023: Global Market Insights Inc.; 2016, [142] Harmston N, Filsell W, Stumpf MP. What the papers say: text mining for genomics and systems biology. Hum Genom 2010;5:17 29. [143] Xu J, Yang P, Xue S, et al. Translating cancer genomics into precision medicine with artificial intelligence: applications, challenges, and future perspectives. Hum Genet 2019;138(2):109 24. [144] Misra MK, Damotte V, Hollenbach JA. The immunogenetics of neurological disease. Immunology 2017. Available from: https://doi.org/10.1111/imm.12869. [145] Mazzone R, Zwergel C, Artico M, et al. The emerging role of epigenetics in human autoimmune diseases. The emerging role of epigenetics in human autoimmune disorders. Clin Epigenetics 2019; 11 Article number: 34. [146] Moosavi A, Motevalizadeh AA. Role of epigenetics in biology and human diseases. Iran Biomed J 2016;20(5):246 58. [147] Trebeschi S, Drago SG, Birkbak NJ, et al. Predicting response to cancer immunotherapy using noninvasive radiomic biomarkers. Ann Oncol 2019;30(6):998 1004. [148] Coroller TP, Agrawal V, Narayan V, et al. Radiomic phenotype features predict pathological response in non-small cell lung cancer. Radiother Oncol 2016;119(3):480 6. [149] Thompson D. Waking hospital patients to check vital signs may do more harm than good. MedicineNet; 2019. [150] Sotera. ViSi. Sotera Wireless Inc.; 2019.


[151] Miyashita M, Brady M. The health care benefits of combining wearables and AI. Harv Bus Rev 2019. [152] McCann C. Current health receives FDA clearance for its remote patient monitoring solution, enabling earlier intervention and improved patient outcomes. Current Health; 2019. [153] Matheson R. Automating artificial intelligence for the medical decision-making model replaces the laborious process of annotating massive patient datasets by hand. MIT News Office; 2019. [154] U.S. National Library of Medicine. Electrodiagnosis MeSH Descriptor Data. 2019. [155] Dubreuil L. How can we apply AI, machine learning, or deep learning to EEG? Neuroelectrics; 2018. [156] Ruffini G, IbañezMarta D, Dunne S, et al. EEG-driven RNN classification for prognosis of neurodegeneration in at-risk patients. International conference on artificial neural networks ICANN 2016: Artificial neural networks and machine learning ICANN 2016; 2016, pp. 306 13. [157] Ruffinia G, Iba~neza D, Kroupia E. Deep learning using EEG spectrograms for prognosis in idiopathic rapid eye movement behavior disorder (RBD). BioRxiv 2018. Available from: http://doi.org/10.1101/240267. [158] HealthIT.gov. Telemedicine and telehealth. Official Website of The Office of the National Coordinator for Health Information Technology (ONC); 2017. [159] Telehealth and telemedicine technology. eVisic; 2018. [160] Connect2HealthFCC. Telehealth, telemedicine, and telecare: what’s what?. Federal Communications Commission; 2019. [161] 2017 US telemedicine industry benchmark survey. Reach Health; 2017. [162] Sennaar K. Artificial intelligence in telemedicine and telehealth
4 current applications. Emerj; 2019.
[163] Laranjo L, Dunn AG, Tong HL, Kocaballi AB, Chen J, Bashir R, et al. Conversational agents in healthcare: a systematic review. J Am Med Inf Assoc 2018;25(09):1248 58. [164] Kuziemsky C, Maeder AJ, John O, et al. Role of artificial intelligence within the telehealth domain. Yearb Med Inf 2019. Available from: https://doi.org/10.1055/s-0039-1677897. [165] Wicklund E. Asynchronous telehealth gives providers an alternative to DTC video. mHealth Intelligence; 2019. [166] CaptureProof.com. Keep track of your health over time. 2019. [167] World Health Organization. Telemedicine: opportunities and developments in member states. Report on the second global survey on eHealth. Geneva: World Health Organization; 2010. [168] Nangalia V, Prytherch DR, Smith GB. Health technology assessment review: remote monitoring of vital signs-current status and future challenges. Crit Care 2010;14(05):233. [169] Inglis SC, Clark RA, McAlister FA, et al. Which components of heart failure programs are effective? A systematic review and meta-analysis of the outcomes of structured telephone support or telemonitoring as the primary component of chronic heart failure management in 8323 patients: abridged cochrane review. Eur J Heart Fail 2011;13(09):1028 40. [170] Bolton CE, Waters CS, Peirce S, Elwyn G. Insufficient evidence of benefit: a systematic review of home telemonitoring for COPD. J Eval Clin Pract 2011;17(06):1216 22. [171] Polisena J, Tran K, Cimon K, Hutton B, McGill S, Palmer K. Home telehealth for diabetes management: a systematic review and meta-analysis. Diabetes Obes Metab 2009;11(010):913 30. [172] Mohktar MS, Redmond SJ, Antoniades NC, Rochford PD, Pretto JJ, Basilakis J, et al. Predicting the risk of exacerbation in patients with chronic obstructive pulmonary disease using home telehealth measurement data. Artif Intell Med 2015;63(01):51 9. [173] Catania L, Jackson R, Armitage B. Curbside consults/deep doc (C2D2). Atlantic Beach, FL: Louis J. Catania Inc; 2020. [174] Report to Congress. E-health and telemedicine. US Department of Health and Human Services; 2016.

198

Foundations of Artificial Intelligence in Healthcare and Bioscience

[175] Frankenfield J. Chatbot. Investopedia; 2019. [176] Farrell G. Artificial intelligence chatbots are changing the way you do business and may impact your bottom line. SmartSheet; 2019. [177] Amato F, Marrone S, Moscato V, et al. CEUR workshop proceedings. Chatbots Meet eHealth: Automatizing Healthcare. ,http://ceur-ws.org/Vol-1982/paper 6.pdf.; 2017 [accessed 26.02.19]. [178] Razzaki S, Baker A, Perov Y, et al. A comparative study of artificial intelligence and human doctors for triage and diagnosis. arXiv 2018. Available from: https://arxiv.org/pdf/1806.10698.pdf [accessed 26.02.19]. [179] Kramer J, Künzler F, Mishra V, et al. Investigating intervention components and exploring states of receptivity for a smartphone app to promote physical activity: protocol of a microrandomized trial. JMIR Res Protoc 2019;8(1):e11540. [180] Suganuma S, Sakamoto D, Shimoyama H. An embodied conversational agent for unguided internetbased cognitive behavior therapy in preventative mental health: feasibility and acceptability pilot trial. JMIR Ment Health 2018;5(3):e10454. [181] Lucas GM, Rizzo A, Gratch J. Reporting mental health symptoms: breaking down barriers to care with virtual human interviewers. Front Robot AI 2017;4:51. [182] Andersson G. Internet-delivered psychological treatments. Annu Rev Clin Psychol 2016;12:157 79. [183] Palencia A, Flaschner P, Thommandram A. Physicians’ perceptions of chatbots in health care: cross-sectional web-based survey. JMIR 2019;21(4). [184] Meticulous Blog. Top 10 companies in healthcare chatbots market. Meticulous Research; 2019. [185] Ravuri M, Kannan A, Tso GJ, et al. From expert systems to machine-learned diagnosis models. Proc Mach Learn Res 2018;85:1 16. [186] Saposnik G, Redelmeier D, Ruff CC, et al. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inf Decis Mak 2016;16:138. Available from: https://doi.org/10.1186/ s12911-016-0377-1. [187] Ravuri M, Kannan A, Tso GJ, et al. Learning from the experts: from expert systems to machine-learned diagnosis models. Cornell University; 2018. V1. Revised August 14. [188] Kannan A. The science of assisting medical diagnosis: from expert systems to machine-learned models. Medium; 2019. [189] Hodgson N, Poterack R, Mi KA, et al. Association of vital signs and process outcomes in emergency department patients West J Emerg Med 2019;20(3):433 7eScholarship UC Irvine. Available from: https://doi.org/10.5811/westjem.2019.1.41498. [190] Gabayan GZ, Gould MK, Weiss RE, et al. Emergency department vital signs and outcomes after discharge. Acad Emerg Med 2017;24(7):846 54. [191] Imperato J, Mehegan T, Henning DJ, et al. Can an emergency department clinical “triggers” program based on abnormal vital signs improve patient outcomes? CJEM 2017;19(4):249 55. [192] Henning D, Oedorf K, Day D, et al. Derivation and validation of predictive factors for clinical deterioration after admission in emergency department patients presenting with abnormal vital signs without shock. West J Emerg Med 2015;16(7):1059 66. [193] Kang M, Park E, Hwan Cho B, et al. Recent patient health monitoring platforms incorporating internet of things-enabled smart devices. Int Neurourol J 2018;22(Suppl. 2):S76 82. Available from: https://doi. org/10.5213/inj.1836144.072.

6 Current AI applications in medical therapies and services

The range and depth of today's medical therapies and services are enormous. Any attempt to discuss them comprehensively and understandably requires that they be divided into generic categories based on their purpose and goal(s). That being said, doing so yields a fairly extensive listing, hopefully one with no major omissions. Thus, the listing I have evolved for this Chapter includes:

A. Medical care (primary, secondary, tertiary, quaternary care);
B. Pharmaceutical and biopharmaceutical care;
C. Hospital care;
D. Nursing care;
E. Home health care, nursing home, and hospice care;
F. Concurrent medical conditions ("Comorbidity," aka "Multimorbidity");
G. Precision medicine;
H. Medical/surgical robotics;
I. Stem cells and regenerative medicine;
J. Genetics and genomics therapies.

One other guidepost I must explain in furtherance of the theme of this book as a user-friendly guide to AI in healthcare and bioscience, and in the interest of a comprehensive and understandable approach to the healthcare-related categories listed above: where applicable, I will reference the AI-related technologies and services presented in the previous Chapters (2–5) to provide a reasonably comprehensive discussion for each of the medical therapy categories above. A listing with Chapter notations for each of the AI-related technologies and services includes:

1. Big data analytics (Chapters 3 and 4);
2. Health information and records (EHR) (Chapters 3 and 4);
3. Medical research and clinical trials (Chapter 4);
4. Blockchain (Chapters 3 and 4);
5. Internet of Things (IoT) (Chapter 3);
6. Telemedicine/telehealth (Chapter 5);
7. Chatbots (Chapter 5);
8. Natural language processing (NLP) (Chapter 3);
9. Expert systems (Chapters 3 and 5);
10. Robotics (Chapters 3 and 6);
11. Population health (demographics and epidemiology) (Chapter 4);
12. Healthcare analytics (Chapter 4);
13. Preventive medicine/healthcare (Chapter 4);
14. Public health (Chapter 4);
15. Access and availability (Chapter 6).

In the interest of "saving trees," I will not repeat the full descriptions of each of the AI technologies and services listed above, which have all been thoroughly covered in previous Chapters (identified with each listed item). However, I would urge the reader to refer back to the respective Chapter when necessary to review a given topic before reading about its applications in the specific medical therapies or services discussed in this Chapter.

You will also notice throughout the discussions in this Chapter that there are repeated "cross-references" between AI technologies, therapies, and the health-related categories. This is the product of extensive horizontal and vertical integration and interoperability among these technologies, therapies, and categories, all ultimately enhancing the overall delivery of care. Also, in this particular approach, the business considerations from Chapter 4 have as much significance as the Chapter 5 diagnostic considerations. Thus, all will appear frequently in this comprehensive review.

The applications of AI-related therapies in each of the health care technologies and service categories are extensive and are having profound and beneficial effects for you, those for whom you care, and your loved ones. Many of these AI applications you will recognize from your own health care, and many are embedded and concealed within the delivery of your care. All are providing benefits to you and the health care system, and yes, all of them are "disruptive" in nature. As we discuss them by category, you will begin to appreciate how many AI therapies are affecting you and have become part of your personal life, your professional career, your care, and your health and wellness.

And finally, one other point regarding this Chapter's categories and content: the total body of related literature is enormous. So, what I have tried to do is select current information that I hope you will find, in most cases (but not all), relevant to your interests. Some topics are addressed with descriptive details, and some are presented through clinical studies and research. Once again, if a subject is not discussed adequately regarding your "need to know," online and/or library research will provide as much additional information as you could want or need. (See the listing of resources I mention in the Section 1 Introduction, Table Intro 2.)

6.1 Medical care (primary, secondary, tertiary, quaternary care)

The essence of health care starts with the health care professional providing care to persons in need. That care is usually divided into primary care, secondary care, tertiary care, and quaternary care. While definitions vary widely, there are broad, general concepts that define each level [1]:

• Primary care: This level provider, often referred to as the "gatekeeper," is the first point of medical consultation. A primary care physician (PCP) (also called a general practitioner or family physician) or a nurse practitioner (nurse clinician) usually serves as the patient's entry point to the health care system. Care is provided at the doctor's office, a health center, or an urgent care center. Emergency rooms are also a source (not an optimal one) of primary care for uninsured patients.
• Secondary care: This level includes medical specialists, surgeons, and other health professionals whose training and services focus primarily on a specific field (e.g., neurology, cardiology, dermatology, oncology, orthopedics, ophthalmology, and other specialized fields) and who typically don't have initial contact with patients. Acute care hospitals are also considered a category of secondary care. This category covers care for admitted patients, the hospital emergency room (ER), childbirth, medical imaging services, and the intensive care unit. Physical therapists, respiratory therapists, speech therapists, occupational therapists, and other allied health professionals often work in secondary care.
• Tertiary care: This level refers to a higher level of care in a hospital, usually on referral from a primary or secondary provider. Health professionals and equipment at this level are highly specialized. Tertiary care services include areas such as cardiac surgery, cancer treatment, acute and chronic pulmonary care and management, burn treatment, plastic surgery, neurosurgery, and other complicated treatments or procedures.
• Quaternary care: This level identifies a standard of care more complicated than tertiary care. Highly specialized care, experimental treatments, and complicated procedures are considered quaternary care.

6.1.1 Big data analytics and AI in medical care

There is no AI technology more vital to and integrated into medical care than big data analytics. Big data is generated from historical clinical activities and has significant effects on medical care and the health care industry. It assists in planning treatment paths for patients, processing clinical decision support (CDS), and improving health care technology and systems [2]. Big data comes from hospital information resources, surgeons' work, anesthesia activities, physical examinations, radiography, magnetic resonance imaging (MRI), computed tomography (CT), patient information, pharmacy, treatment, medical imaging, and imaging reports [3]. Clinical activities generate a large number of records, including patients' identification information, diagnoses, medication care, physicians' notes, and sensor data [3].

AI and biomedical informatics grapple with amounts of data beyond the capabilities of standard data processing technologies. AI analytical methods, combining statistical and computational techniques, have been developed to deal with such data. Thus, "big data" has always been a characteristic of medical care [4], and the need for "big data analytics" has driven the development of corresponding methods. New deep machine-learning methods and data science have been explored by computer scientists, biomedical informaticians, and AI researchers for decades. Indeed, some experts consider machine learning and AI nearly synonymous [4].
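
To make the preceding discussion concrete, the short Python sketch below trains a simple machine-learning classifier on synthetic clinical records to flag patients at elevated risk of readmission. Everything here (the feature set, the coefficients, and the data itself) is invented for illustration; a real clinical model would require curated data, rigorous validation, and regulatory review.

# A minimal sketch: predicting 30-day readmission from a few
# structured clinical features (all data here is synthetic).
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical features: age, number of prior admissions, systolic BP
X = np.column_stack([
    rng.normal(65, 12, n),   # age in years
    rng.poisson(1.5, n),     # prior admissions
    rng.normal(130, 18, n),  # systolic blood pressure
])
# Synthetic outcome loosely tied to age and prior admissions
risk = 0.03 * (X[:, 0] - 65) + 0.6 * X[:, 1] - 2.0
y = (rng.random(n) < 1 / (1 + np.exp(-risk))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))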

6.1.2 Health information and records (EHR) and AI in medical care

Important health information from clinical activities includes the electronic health record (EHR)/electronic medical record (EMR), the personal health record (PHR), and medical images. The EMR encompasses structured and unstructured data containing all of a patient's medical activity information and is often used for treatment and treatment decisions. The EHR, by contrast, holds broader health-related information, such as medical and financial information closely related to the health care of individuals [5]. The EHR focuses on health management, whereas the EMR focuses on the clinical diagnosis of patients. The EHR also contains demographic data, medical history, medications and allergies, immunization status, laboratory test results, radiology images, vital signs, personal statistics, and billing information. The EMR is the record of the care delivery organization (CDO) and belongs to the CDO, while the EHR is a subset of the CDO's record and belongs to the patients or stakeholders [6].

The PHR is derived from a variety of the patient's health and social information. Its primary role is as a data source for medical analysis and clinical decision support [7]. For example, it includes data on allergies and adverse drug reactions (ADRs), chronic diseases, family history, illnesses and hospitalizations, imaging reports, laboratory test results, medications and dosing, prescription records, surgeries and other procedures, vaccinations, nursing home events, and observations of daily living (ODLs). Medical imaging, on the other hand, mainly comes from X-ray, CT, PET, radiography, magnetic resonance imaging (MRI), nuclear medicine, ultrasound, elastography, tactile imaging, photoacoustic imaging, echocardiography, and so on [7].

Even though EHR use is now extensive, receptionists, medical assistants, and doctors still do a lot of manual entry. Meanwhile, voice recognition capabilities (NLP) are increasingly replacing keyboards: the user can now simply speak into the EHR system rather than typing in information. Also, video-based image recognition capabilities will supplement EHRs and provide additional insight into patients' conditions, insight that AI is capable of analyzing and humans may miss [8].

6.1.3 Research/clinical trials and AI in medical care

The application of emerging digital technologies has increased our understanding of disease in larger patient cohorts and has fostered the development of personalized therapies. For example, most of the new molecular entities approved by the U.S. FDA in recent years were designed to target specific disease aberrations, in the spirit of precision medicine [9]. Machine learning and computer vision enhance aspects of human visual perception and identify clinically meaningful patterns in, for example, imaging data [10]. Neural networks are being used for tasks ranging from medical image segmentation and generation to the classification and prediction of clinical data sets [11]. Perspectives and commentaries have recently highlighted applications of deep neural networks (DNNs) to imaging data sets, pharmaceutical properties of compounds, clinical diagnoses and genomics, computer vision applications for medical imaging, and NLP applied to EHRs. Studies have predominantly focused on data from primary care or the hospital ecosystem and on early drug discovery applications [12].

Recent research in computer science and medicine, a proactive regulatory landscape, and the availability of large data sets offer examples of, and promise for, delivering faster cures to patients through AI and ML methods. This perspective seeks to engage and inform researchers from fields such as computer science, biology, medicine, engineering, and biostatistics, as well as policymakers, about the value of the emerging technologies of AI and ML in solving the challenges facing the modernization of the current clinical development process [13].

6.1.4 Blockchain and AI in medical care

AI could be incorporated into a unified health information blockchain to accelerate the discovery, development, and delivery of personalized medicine. If the blockchain is populated with codified, abstracted, and distributed health data, with non-alterable control over privacy, AI could then be permitted by patients and organizations to search the records, scanning a representation of the patient's databank for health-related red flags, trends, insights, outbreaks, and overlooked cures. Health data reflects who we are at our hereditary level, the care we receive, and its outcomes.

It might seem helpful for corporations and universities to have that data arranged accessibly on the blockchain itself, but the transparency blockchain offers, along with the sheer size of medical records, may make that impossible. However, with the correct model, there is no reason why the broader learning potential of the data generated in health care can't be utilized, as long as the records are not stored on the blockchain directly. Instead, the blockchain can be used as a mechanism by which access and authorization for medical records are offered.

AI models use semi-structured datasets. They uncover hidden patterns and create meaning by linking thousands of disparate entities. A permissible distribution of health information via blockchain could make available a wide array of executable datasets for these highly specialized, trusted, narrow AI agents for the first time in history. The ultimate goal would be not just aggregating and analyzing data, but bettering the care delivered to patients everywhere [14].
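
A minimal Python sketch (with invented field names) of the pattern just described: the medical records themselves stay off-chain, while each block in a hash-linked ledger records only an access grant or revocation, so any tampering with the authorization history becomes detectable.

# Minimal hash-chained ledger of record-access grants (illustrative).
# Medical records stay off-chain; the chain holds only who authorized
# whom, for what record, and when.
import hashlib, json, time

def make_block(prev_hash, grant):
    body = {"prev": prev_hash, "time": time.time(), "grant": grant}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {"hash": digest, **body}

def verify(chain):
    for prev, blk in zip(chain, chain[1:]):
        body = {k: blk[k] for k in ("prev", "time", "grant")}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if blk["prev"] != prev["hash"] or blk["hash"] != recomputed:
            return False
    return True

chain = [make_block("genesis", {"patient": "P001", "grants": "research-AI",
                                "record": "ehr:P001", "scope": "read"})]
chain.append(make_block(chain[-1]["hash"],
                        {"patient": "P001", "revokes": "research-AI",
                         "record": "ehr:P001"}))
print("chain intact:", verify(chain))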

6.1.5 Internet of Things (IoT) and AI in medical care [15]

The application of the Internet of Things (IoT) in healthcare, as with so many things, is enormous. It includes, at a minimum, remote monitoring, personal healthcare, smart sensors (see CarePredict, Chapter 5, page 125), and medical device integration. Also included are pharmaceutical industry integration, healthcare insurance, healthcare building facilities, robotics, smart pills, and even treatments of diseases. IoT is keeping patients safe and healthy as well as improving how physicians deliver care.

To some degree, IoT in healthcare is heavily focused on remote monitoring and telemonitoring, and on the tracking, monitoring, and maintenance of assets. Its principal function involves the EHR and the concept of computerized health records. Its use with EMRs promises to advance the coordination of medical and nursing home care, facilitate interaction with patients and families, and improve efficiency. This has also become the main idea of telehealth (see below), that is, a pool of technologies delivering virtual medical, health, and education services. The integration of EHR systems with the IoT can create broad personalized healthcare solutions that could enable the following (a minimal sketch of such an alerting loop appears after the list):

• Connecting any wearable/portable device to the cloud, and pulling and analyzing collected patient data in real time;
• Monitoring vital health indicators collected by portable devices;
• Visualizing charts and diagrams based on collected data;
• Monitoring patients at home with the help of live video and audio streaming;
• Sending intelligent emergency notifications to a physician and/or family.
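
The toy Python loop below, with hypothetical vital-sign thresholds and a print statement standing in for a real notification service, illustrates the monitoring-and-alerting pattern just listed.

# A minimal sketch (hypothetical thresholds and device data) of an
# IoT-style monitoring loop: read vitals, flag out-of-range values,
# and emit an emergency notification.
LIMITS = {"heart_rate": (50, 120), "spo2": (92, 100)}  # illustrative

def check_reading(reading):
    alerts = []
    for vital, value in reading.items():
        lo, hi = LIMITS.get(vital, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            alerts.append(f"{vital}={value} outside [{lo}, {hi}]")
    return alerts

# Simulated stream from a wearable device
stream = [{"heart_rate": 72, "spo2": 97},
          {"heart_rate": 134, "spo2": 89}]
for reading in stream:
    for alert in check_reading(reading):
        print("NOTIFY physician/family:", alert)  # stand-in for a real push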

6.1.6 Telehealth and AI in medical care [16]

Health care inevitably necessitates some face-to-face clinical interaction, with required follow-up depending on the circumstances. Nonetheless, telehealth and Information and Communication Technology (ICT) tools can be used to address many critical issues in health care, such as the misdistribution of demand versus supply of healthcare services. AI could boldly assist with this issue, using algorithms to match the availability of care providers with appropriate clinical skillsets to the need for such skillsets in varying locations. Telehealth also introduces several operational issues, such as failures of the technology or the unavailability of a remote care clinician. AI could potentially alleviate such situations by providing mechanisms for human or virtual interactions to occur, thereby addressing difficulties in the timing and availability of clinicians, such as nursing home coverage and the time it takes to understand the patient's problem or to take a history.
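
A toy sketch of that matching idea follows, with invented clinician and patient records; a production system would weigh licensure, scheduling, caseload, and many other constraints.

# A toy sketch of matching telehealth requests to available clinicians
# by skillset (greedy, first-fit; all names are hypothetical).
clinicians = [
    {"name": "Dr. A", "skills": {"cardiology"}, "available": True},
    {"name": "NP B", "skills": {"primary care", "diabetes"}, "available": True},
]
requests = [{"patient": "P1", "needs": "diabetes"},
            {"patient": "P2", "needs": "cardiology"}]

for req in requests:
    match = next((c for c in clinicians
                  if c["available"] and req["needs"] in c["skills"]), None)
    if match:
        match["available"] = False
        print(req["patient"], "->", match["name"])
    else:
        print(req["patient"], "-> queued for virtual agent follow-up")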

6.1.7 Chatbots and AI in medical care [16]

Virtual assistants (chatbots) can provide a viable alternative to traditional healthcare delivery models in situations like dealing with cognitively impaired individuals [17], improving the accessibility of online clinical information [18], or providing avatar-based patient agents for the elderly at home and in nursing facilities [19]. These cases rely upon a more sophisticated conversational level and knowledge base, which AI can accommodate through deeper understanding and data accumulation by an AI agent. Sometimes an authentic conversational dialogue needs to incorporate aspects of affective behavior, using multimodal contextual awareness mechanisms [20], for example, when issues arising from a patient's past interactions or past medical history need to be considered in making conversational decisions. Here, a personalized model of the individual's context will be required (as in NLP) in addition to the context model for the current conversation.


6.1.8 Natural language processing (NLP) and AI in medical care

There are four primary areas where natural language processing (NLP) improves the delivery of health care [21]:

1. NLP improves EHR data usability: An NLP-based EHR interface can make it easier for clinicians to access patient information. The interface is organized into sections including (a) words describing patients' concerns during an encounter; (b) information relating to those words (e.g., all mentions of fatigue would show on a timeline, with notes about the word); and (c) buried diagnostic data surfaced for discovery.
2. NLP enables predictive analytics: NLP enables predictive analytics that address population health concerns. For example, health professionals can use NLP to monitor individuals showing psychological risk factors on social media. A study in 2018 [22] showed a 70% suicide prediction rate.
3. NLP boosts phenotyping capabilities: NLP enables analysts and clinicians to extract and analyze unstructured data such as follow-up appointments, vitals, charges, orders, encounters, and symptoms, which enables the creation of phenotypes for patient groups. NLP empowers analysts to extract this type of data, as well as pathology reports, medications, and genetic data, to answer complex and specific questions.
4. NLP enables health system quality improvement: NLP automates and accelerates the data collection process for the federal government and associated agencies, which require all hospitals and nursing homes to report specific outcome metrics.

Healthcare organizations are increasingly using NLP to get at the simpler diagnostic entities (e.g., "chest pain"), and major tech companies like Amazon are developing health-related, clinical NLP tools [23]. Many open-source tools are also available at no cost for classification, for finding phrases, and for looking for contextual information that provides clues about family history. But to maximize NLP's potential, healthcare-specific vendor systems must develop programs that integrate into existing workflows.
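
As a simplified illustration of this kind of clinical text processing, the Python sketch below spots a target concept in a note and checks the preceding words for negation cues, a toy version of the context handling that established tools such as the NegEx algorithm formalize.

# A simplified sketch of clinical NLP: find a target concept in a note
# and check nearby words for negation.
import re

NEGATIONS = {"no", "denies", "without", "negative"}

def mentions(note, concept):
    hits = []
    for m in re.finditer(re.escape(concept), note, re.IGNORECASE):
        window = note[:m.start()].split()[-3:]  # 3 preceding words
        negated = any(w.strip(",.").lower() in NEGATIONS for w in window)
        hits.append({"concept": concept, "negated": negated})
    return hits

note = "Patient denies chest pain. Reports fatigue for two weeks."
print(mentions(note, "chest pain"))  # negated mention
print(mentions(note, "fatigue"))     # affirmed mention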

6.1.9 Expert systems and AI in medical care

Expert or knowledge-based systems are the most frequent type of AI medical (AIM) system in routine clinical use [24]. They contain medical knowledge, usually about a very explicitly defined task, and they can reason with data from individual patients to come up with reasonable conclusions. Although there are many variations, the knowledge in an expert system is typically represented in the form of a set of rules [25]. Expert systems are found most frequently in clinical laboratories and educational settings, in clinical surveillance, and in data-rich clinical environments like intensive care units [26].

There are many types of clinical tasks to which expert systems can be applied. In real-time situations, an expert system attached to a monitor can warn of changes in a patient's condition. In less acute circumstances, an expert system can scan laboratory results or drug orders and send reminders or warnings through an email system. In a more complicated case scenario, or if the person making the diagnosis is simply inexperienced, an expert system can help come up with likely diagnoses based on patient data [27].

Laboratory expert systems usually do not participate directly in clinical practice. Instead, they are embedded within the process of care, and clinicians working with patients do not need to interact with them. For the ordering clinician, the system can print a report with a diagnostic hypothesis for consideration. Still, it does not assume responsibility ("garbage in, garbage out") for information gathering, examination, assessment, and treatment. Although there is much practical use of expert systems in routine clinical settings, at present, machine learning systems still seem to be used in a more experimental way. Given a set of clinical cases, a machine learning system will produce a systematic description of the clinical features that uniquely characterize the clinical conditions.
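
The rule-based representation described above can be sketched as a small set of condition-and-message pairs evaluated against patient data. The rules and thresholds below are illustrative only, not clinical guidance.

# A minimal rule-based sketch: each rule is a condition over patient
# data plus a message to fire. Thresholds are illustrative only.
RULES = [
    (lambda p: p["potassium"] > 5.5,
     "Warn: serum potassium above reference range"),
    (lambda p: p["creatinine"] > 1.3 and "metformin" in p["drugs"],
     "Remind: review metformin dose with reduced renal function"),
]

def run_rules(patient):
    return [msg for cond, msg in RULES if cond(patient)]

patient = {"potassium": 5.9, "creatinine": 1.6, "drugs": {"metformin"}}
for msg in run_rules(patient):
    print(msg)  # in practice, routed as an email or monitor alert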

6.1.10 Robotics and AI in medical care

During 2018, more than 5000 surgical procedures, including orthopedic, ophthalmologic, neurologic, and dental procedures, were conducted by AI-assisted robots [28]. The first medical robotic application appeared in 1985, with a robotic surgical arm assisting in a neurosurgical biopsy. The first fully FDA-approved system for laparoscopic surgery (the da Vinci surgical system) emerged 15 years later, giving surgeons the ability to control surgical instruments indirectly via a console [29].

Robotics serves mostly as a high-tech surgical assistant, helping doctors perform minimally invasive surgeries, especially in hard-to-reach or micro areas. Most of these systems, classified as robotically-assisted surgical (RAS) devices by the FDA, enable surgeons to use a console to operate surgical arms, cameras, and other instruments that perform the procedure. RAS systems can result in fewer and smaller incisions, lowering the likelihood of blood loss and infection, and often translate into less pain and fewer complications for patients. RAS systems are being used increasingly in orthopedic surgery, particularly in knee and hip surgeries.

Another interesting development is remotely performed robotic surgery, which could be particularly beneficial for use cases such as battlefield medical treatment and even long space exploration missions. Additionally, artificial limbs and rehabilitation robots are being introduced in various forms. Robots are also expanding into diagnosis, from microscopic bots that travel inside the human body to robots that diagnose diseases, detect abnormalities, or identify potentially at-risk patients (see Chatbots and AI in Surgical/Robotics below, page 206, and Fig. 6–2 [30]). Capsule endoscopy, a procedure FDA-approved and in use since 2001, involves putting a tiny camera inside a pill-sized casing. Patients swallow the "pill," and while it makes its way through the GI tract, the camera takes images that doctors can use to determine if there are abnormalities [31].

Medical robotics technology is here to stay, with continuing advances in the field. There are, however, challenges to be overcome for these technologies to impact patient care over the long run. Complex and expensive R&D, and factors such as regulations, pricing, and training for medical professionals, will all affect the evolution of robotics. And indeed, emotional and ethical considerations in the field will be a factor as well [32].

6.1.11 Population health (demographics and epidemiology) and AI in medical care

The concept of population health has become a high priority in the delivery of medical care and, thus, is on the radar screen of industry. BaseHealth of Sunnyvale, CA, an industry innovator in the predictive analysis of health and disease risk and in interventional management, has established an ongoing partnership with one of the country's largest health systems, Banner Health. It has implemented its precision insights platform to enable Banner Health to identify both rising health risks within populations and actionable clinical interventions. The system allows clinicians and health managers to form customized, proactive care plans for their plan members, designed to intervene in identified risk factors.

BaseHealth explored categories of risk and cost for 100,000 Banner Health members. It also implemented its analytic platform on 50,000 of the network's Medicare Advantage members. The platform incorporates data on health indicators for more than 42 common chronic and acute diseases, in particular identifying members transitioning from low to moderate to high risk for chronic and acute conditions, with more information being added on an ongoing basis. "This gives us a much better way to target the right patient with the right intervention for their risk. It provides us with a more proactive approach to healthcare by alerting us to the specific factors driving disease risk. Then, we can plan and implement clinical and health management efforts," said Michael Parris, Senior Director for Business Intelligence and Analytics at Banner Health.

This partnership between industry and a medical delivery system represents a forward-thinking, scientific approach to population health and the management of rising risk. Besides identifying who is at risk, this analytic approach also tells us who can avoid predisposed conditions through specific clinical interventions [33].

6.1.12 Precision medicine/health (personalized health) and AI in medical care

Precision medicine targets medications and treatments based on a patient's medical history, genetic makeup, and data recorded by wearable devices. Corporations, universities, doctors, and government-funded research collectives are using AI to develop precision treatments for complex diseases. Their goal is to gain insight from increasingly massive datasets into what makes patients healthy at the individual level. With those insights, new drugs could be developed, new uses found for old drugs, personalized drug combinations suggested, and disease risk predictions refined [34].

Dr. Bertalan Meskó, director of the Medical Futurist Institute, has suggested that "there is no precision medicine without AI." Although forward-looking, his point acknowledges that without AI analysis, patient data will remain severely untapped [35]. Through the application of AI and machine learning to multiple data sources such as genetic data, EHRs, sensor data, and environmental and lifestyle data, researchers are taking the first steps toward developing personalized treatments for diseases like cancer, depression, and more.

The National Institutes of Health (NIH) research program plans to collect data on 1 million patients to study precision medicine. It began enrolling participants in May 2018 to create a massive patient information database that researchers can analyze using various methods, including AI, to develop precision treatments. Eric Topol, a geneticist, cardiologist, and director of the Scripps Research Translational Institute, stated, "Much more can be gleaned, many more strides can be made once we start working with this multi-modal data." AI-based diagnostic tools, such as the FDA-approved imaging tool [36] for diagnosing diabetic eye disease, have already entered hospitals, but AI-based treatments are still at the foundational stage, Topol says.

Dr. Elizabeth Krakow is using machine learning to develop precision cancer treatments at the Fred Hutchinson Cancer Research Center in Seattle, WA. Dr. Krakow treats and studies leukemia patients who have relapsed after stem cell transplant. "Past treatments, the current complexity of the disease, side effects—all that info needs to be integrated to intelligently choose the new treatment," says Krakow. Dr. Krakow and her team assembled the medical data of 350 relapsed patients, about 1000 pages per patient. They built a machine-learning algorithm to predict the best treatment sequence for any patient at any point in time. Her study will enable future work by creating a gold standard that accounts for the sequential nature of cancer treatment, which clinical trials have failed to establish [37].

6.1.13 Healthcare analytics and AI in medical care

Transforming big data into useful clinical information is the essence of the various forms of health care analytics (descriptive, diagnostic, predictive, and prescriptive analytics [38]; see also Chapter 4, page 99). Academic research suggests that the quality and efficiency of health care can grow by 15%–20% by transforming big data into analytics [39]. There are many analytical tools to analyze the data, covering analysis, abstraction, processing, integration, management, coordination, and storage. The analytic results can be "visualized" and presented in pictorial or graphical format for understanding complex data and for better decision making. They can then be used to understand patterns and correlations among the data.

The potential application areas are fraud detection, epidemic spread prediction, omics, clinical outcomes, medical device design, and the insurance industry. The data can be applied in almost all areas of personalized patient care, manufacturing, pharmaceutical development, etc. [40]. Big data is widely adopted in personalized healthcare, which offers an individual-centric approach [41]. Its application in "omics" enables the realization of strategies against diseases and increases the specificity of medical treatments (e.g., "precision medicine") [42]. Healthcare insurance companies/payers are using big data in underwriting, fraud detection, and claims management. Big data implementation allows a wider set of device materials, delivery methods, tissue interactions, and anatomical configurations to be evaluated. It is used during all phases of pharmaceutical development, particularly drug discovery [43]. Specifically, data-derived insights will prompt suitable updates of diagnostic assistance, clinical guidelines, and patient triage to permit more specific and tailored treatment and to advance medical results for patients [44].

The most challenging parts of big data and healthcare analytics are data privacy, data leakage, data security, efficient handling of large volumes of medical imaging data, and information confidentiality. Misuse of health data, failure to safeguard healthcare information, and the difficulty of understanding unstructured clinical notes in the right context and extracting potentially useful information from them must also be addressed [45].
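
In miniature, the sketch below contrasts descriptive analytics (what happened) with a naive stand-in for predictive analytics (what might happen next) over synthetic encounter data; real prescriptive analytics would add recommended actions on top. All fields and numbers are made up.

# Miniature illustration of descriptive vs. predictive analytics over
# synthetic encounter data (fields and numbers are invented).
from statistics import mean

encounters = [
    {"month": 1, "dept": "cardiology", "cost": 1200},
    {"month": 2, "dept": "cardiology", "cost": 1350},
    {"month": 3, "dept": "cardiology", "cost": 1500},
]

# Descriptive: what happened?
avg_cost = mean(e["cost"] for e in encounters)
print("average cost:", avg_cost)

# Predictive (naively): extrapolate the month-over-month trend
deltas = [b["cost"] - a["cost"] for a, b in zip(encounters, encounters[1:])]
print("projected next month:", encounters[-1]["cost"] + mean(deltas))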

6.1.14 Preventive health and AI in medical care

The health care system has always focused primarily on reacting to disease rather than preventing it, treating people after they become patients. The exploding costs of disease treatment, particularly of chronic and late-stage diseases, have led to a renewed focus on disease prevention. The goal of preventive health care is to assist reactive medicine by preempting disease through preventive measures and early detection. Though this is not a new concept [46], modern-day technological advances and AI have created a unique opportunity to combine preventive medicine with personalized data. Utilizing precision health, the practice of personalized health can change the way society perceives health care, so that individuals feel empowered to protect themselves and reduce their risk of disease.

Disease risk has traditionally been evaluated based on patient age, family history, and, more recently, genetic screening. Genome-wide association studies have allowed for a more in-depth exploration of the genotypic landscape. This has advanced beyond monogenic disorders to complex diseases, such as diabetes and heart disease [47]. However, many commonly identified genetic variants have only modest effects on disease. Thus, the "exposome," the nongenetic exposures that affect human health and disease, must not be ignored.

The effects and benefits of disease risk prediction and monitoring must be followed with well-designed clinical trials to determine whether there is a true benefit in outcomes. However, the data can be used regardless of the outcome to further our understanding of the disease. Precision and preventive health must be just that and must be applied precisely. Not all diseases will benefit from the same degree of monitoring, and not all patients will be monitored the same way for a given disease, given their individual risk [48].
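
One simple way to picture combining modest genetic effects with exposome factors is an additive risk score. The variants, exposures, and weights in the sketch below are entirely hypothetical.

# A hypothetical additive risk score combining small genetic effects
# with exposome (non-genetic) factors; weights are illustrative only.
GENETIC_WEIGHTS = {"variant_rsX": 0.3, "variant_rsY": 0.2}  # modest effects
EXPOSOME_WEIGHTS = {"smoker": 1.5, "sedentary": 0.8}

def risk_score(variants, exposures):
    g = sum(GENETIC_WEIGHTS.get(v, 0) for v in variants)
    e = sum(EXPOSOME_WEIGHTS.get(x, 0) for x in exposures)
    return g + e

score = risk_score({"variant_rsX"}, {"smoker"})
tier = "high" if score > 1.5 else "moderate" if score > 0.5 else "low"
print(f"score={score:.1f} -> {tier}-risk monitoring plan")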

6.1.15 Public health and AI in medical care

Public health researchers collect data from many sources and analyze the incidence and prevalence of different health conditions and related risk factors. AI presents an accurate picture [49] of population health by extracting health and non-health data at different levels to coordinate and integrate information about populations and communities, including evidence on the epidemiology and control of chronic diseases.


Recent AI methodological developments enable multi-level modeling to combine individual-level data with sociomarkers, measurable indicators of social conditions at the group level, to improve disease surveillance, disease prediction, and the implementation and evaluation of population health interventions. Many of these sociomarkers shape the health and well-being of individuals and communities, with roots outside the conventional healthcare system.

The internet and AI have expanded public health research beyond its traditional realm [50]. Other surveillance methods collecting data from clinical systems and registries are complemented by tracking Internet-based health indicators. Advances in intelligent web-based applications, online smart devices, and social media platforms are assisting public health practitioners and researchers in disease surveillance, epidemic detection, behavior monitoring, and public health communication and education.

AI is currently used to process personalized data, to elicit patient preferences, and to help patients (and families) participate in the care process. This participation allows physicians to provide high-quality, efficient, personalized care by personalizing "generic" therapy plans and connecting patients with information beyond that available within their care setting [51]. Clinicians and public health practitioners use this AI-based technology to deliver tailored therapy or interventions based on the best evidence, to maintain high-quality patient care, and to create healthier communities.

Patient complexity is increasing, with the average life expectancy in the US on the decline (Table 6–1) [52]. Baby boomers are aging (20% of the population will be 65+ by 2029), and multimorbidity affects 60% of this population (see Comorbidity below, page 242), associated with over twice as many patient-physician encounters. Social and behavioral contexts are critical in the management of these complex patients, and as such, these factors become crucial components of technology-based solutions.

[Table 6–1: Life expectancy in the United States compared to other industrial countries. Data from National Center for Health Statistics; 2016.]

Despite some limitations, AI tools and techniques, still in their infancy, already provide substantial benefits in delivering in-depth knowledge on individuals' health and in predicting population health risks. Their use in medicine and public health is likely to increase substantially in the near future.

6.1.16 Access and availability and AI in medical care

Adequately integrated and applied, AI could make health care more accessible to underserved communities while lowering costs, both sorely needed in the United States. The U.S. ranks poorly on many health metrics despite an average annual health care cost of $10,739 per person [53]. AI systems could relieve overworked doctors and reduce the risk of medical errors that may kill tens of thousands, if not hundreds of thousands, of U.S. patients each year [54]. Yet critics point out that all such promise could vanish if AI tramples patient privacy rights, overlooks biases and limitations, or fails to provide services that improve health outcomes for most people.

To make the most of AI predictions in health care, humans must help make the decisions that can have health and financial consequences. Because AI systems lack human intelligence, their predictions could prove harmful if blindly followed by physicians and hospitals. If you follow the model blindly, says Kenneth Jung, a research scientist at the Stanford Center for Biomedical Informatics Research, "then you're hosed. Because the model is saying: 'Oh, this kid with asthma came in and they got pneumonia, but we don't need to worry about them, and we're sending them home with some antibiotics.'"

Just who stands to benefit most from AI health care services isn't exactly clear. There are already health care disparities: according to the World Bank and the World Health Organization, half of the globe's population lacks access [55] to crucial health care services, and nearly 100 million people are pushed into extreme poverty by health care expenses. AI could either reduce these inequalities or make them worse. Accenture, a consulting firm, predicts that top AI applications could save the U.S. economy $150 billion per year by 2026. But it's unclear whether patients and health care systems supplemented by taxpayer dollars would benefit, or whether more money might simply flow to the tech companies, health care providers, and insurers. "The question of who is going to drive this and who is going to pay for this is an important question," says Isaac Kohane, a biomedical informatics researcher at Harvard Medical School.

An AI assistant may not sound as exciting as an AI doctor, but it could help free up physicians to spend more time with their patients and improve the quality of care. Family physicians frequently spend more than half of their working days entering data into electronic health records, the main factor behind physical and emotional burnout, which has dire consequences [56], including even patient deaths. Ironically, EHRs were supposed to improve medical care and cut costs. Now many experts point to them, and to AI, more cautiously.

6.2 Pharmaceutical and biopharmaceutical care

6.2.1 Big data analytics and AI in pharmaceutical care

Healthcare databases are being used more and more to support drug discovery [57]. Large repositories of data traditionally used for pharmacovigilance include spontaneous reporting system (SRS) databases and networks of healthcare databases that contain medical records or medical claims. The gold standard for adverse drug reaction (ADR) detection is the spontaneous suspected-ADR reporting system [58]. The Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) contains publicly available information on adverse events and medication errors submitted to the FDA.

It is essential to highlight the role of healthcare database networks as big data applied to pharmacovigilance. Such networks reflect the attributes of big data in routine pharmacovigilance, but they are a recent development in the field. Database networks are commonly made up of EMRs and claims databases. They can also be linked to genomic data, biological specimens held in biobanks, or social media. The growing use of healthcare database networks has led to an expanding range of analytic methodologies to harness the large volumes of heterogeneous data, and these networks are developing beyond simply building distributed data networks for analysis.

The availability of large amounts of healthcare data, and of increasingly powerful tools to analyze them, is an opportunity to study drug use and safety on ever wider scales and in greater detail. The combined use of different types of databases in a distributed database network augments the power and potential of drug utilization and safety studies. Still, more pharmacovigilance activity does not necessarily mean more effective signal detection and strengthening [59].
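
One standard disproportionality method used for signal detection in spontaneous-report databases of this kind (though not named in the text above) is the reporting odds ratio (ROR). The sketch below computes it, with a 95% confidence interval, from an illustrative 2 × 2 table of report counts; the counts are invented, not real FAERS data.

# Reporting odds ratio (ROR), a standard disproportionality measure
# for signal detection in spontaneous ADR report databases.
import math

a, b = 40, 960     # drug of interest: with / without the event
c, d = 200, 98800  # all other drugs:  with / without the event

ror = (a / b) / (c / d)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(ror) - 1.96 * se)
hi = math.exp(math.log(ror) + 1.96 * se)
print(f"ROR = {ror:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
# A lower CI bound above 1 is commonly read as a potential signal.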

6.2.2 Health information and records (EHR) and AI in pharmaceutical care

Data integration is a process that consists of retrieving, cleaning, and organizing data, usually obtained from many different sources. Critically important information exists in electronic health records (EHRs) and other sources of medical data. Medical information obtained from patients' EHRs and imaging data can sometimes be a key factor in the success of projects in the pharmaceutical area. All these datasets need to be integrated and organized to become useful for a given objective.

Mining EHRs is a process that can also provide useful information for drug discovery, drug recycling, and drug safety research. Given the current focus on evidence-driven medicine, EHRs will become one of the most valuable resources hospitals and health institutions have to manage and explore, with the help of the biotechnology industry and academia [60]. Both structured and unstructured data in EHRs can be mined, the latter requiring the use of natural language processing technologies that are just now coming of age. Future developments in pharmacogenomics will be strongly dependent on the ability of companies to integrate the information in EHRs with the genetic information of patients, which will become progressively more common.

Big data, artificial intelligence, and machine learning will become instrumental in all future biotechnology research. More researchers in biotechnology will have to become aware of the methods required to deal with large amounts of data and will need to include on their research teams people with the ability to integrate, organize, and explore these data. Researchers specialized in bioinformatics and EHR data will become vital members of any biotechnology research team.

6.2.3 Research/clinical trials and AI in pharmaceutical care

AI-based research models are helping with clinical trial design and patient recruitment, and AI-based monitoring systems aim to increase study drug adherence and decrease dropout rates. "AI is not a magic bullet and is very much a work in progress, yet it holds much promise for the future of healthcare and drug development" [61]. AI can potentially boost the success rate of clinical trials by:

• Efficiently measuring biomarkers that reflect the effectiveness of the drug being tested;
• Identifying and characterizing patient subpopulations best suited for specific drugs. Less than a third of all phase II compounds advance to phase III, and 1 in 3 phase III trials fail, not because the drug is ineffective or dangerous, but because the trial lacks enough patients or the right kinds of patients.

Start-ups, large corporations, regulatory bodies, and governments are all exploring and driving the use of AI for improving clinical trial design. IBM's Stefan Harrer says, "What we see at this point are predominantly early-stage, proof-of-concept, and feasibility pilot studies demonstrating the high potential of numerous AI techniques for improving the performance of clinical trials." The authors also identify several areas showing the most real-world promise of AI for patients. For example:

• AI-enabled systems might allow patients more access to and control over their data;
• Coaching via AI-based apps could occur before and during trials;
• AI could monitor individual patients' adherence to protocols continuously in real time;
• AI techniques could help guide patients to trials of which they may not have been aware (a toy sketch of such eligibility matching follows this list);
• In particular, Harrer says, the use of AI in precision medicine is promising, such as applying the technology to advance how efficiently and accurately professionals can diagnose, treat, and manage neurological diseases. "AI can have a profound impact on improving patient monitoring before and during neurological trials," he says.
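
As promised in the list above, here is a toy sketch of patient-trial matching against simple structured eligibility criteria; the trials, criteria, and patient are invented, and real matching would also have to parse free-text protocol documents.

# A toy sketch of patient-trial matching: filter trials by simple
# structured eligibility criteria. Trials and criteria are invented.
trials = [
    {"id": "T-001", "condition": "type 2 diabetes", "min_age": 40, "max_age": 75},
    {"id": "T-002", "condition": "heart failure", "min_age": 50, "max_age": 85},
]

def eligible(patient, trial):
    return (patient["condition"] == trial["condition"]
            and trial["min_age"] <= patient["age"] <= trial["max_age"])

patient = {"age": 62, "condition": "type 2 diabetes"}
matches = [t["id"] for t in trials if eligible(patient, t)]
print("candidate trials:", matches)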


Potential implications for pharma were also evaluated, including:

• Computer vision algorithms that could potentially pinpoint relevant patient populations through a range of inputs, from handwritten forms to digital medical imagery;
• Applications of AI analysis to failed clinical trial data to uncover insights for future trial design;
• The use of AI capabilities such as machine learning (ML), deep learning (DL), and natural language processing (NLP) for correlating large and diverse data sets, such as electronic health records, medical literature, and trial databases, to help pharma improve trial design, patient-trial matching, and recruiting, as well as for monitoring patients during trials.

The authors also identified several important takeaways for researchers:

• "Health AI" is a growing field connecting medicine, pharma, data science, and engineering;
• The next generation of health-related AI experts will need a broad array of knowledge in analytics, algorithm coding, and technology integration;
• Ongoing work is needed to assess data privacy, security, and accessibility, as well as the ethics of applying AI techniques to sensitive medical information.

AI methods have been applied to clinical trials for only the past 5 to 8 years. Thus, it will take several years in a typical 10- to 15-year drug-development cycle for AI's impact to be accurately assessed. During those years, rigorous research and development will be necessary to ensure the viability of these innovations, Harrer says. "Major further work is necessary before the AI demonstrated in pilot studies can be integrated into clinical trial design," he says. "Any breach of a research protocol or premature setting of unreasonable expectations may lead to an undermining of the trust and, ultimately, the success of AI in the clinical sector" [62].

An acute need for AI intervention in drug trials is being felt with the fast-track development of vaccines for the COVID-19 pandemic. The U.S. government has initiated a program, "Operation Warp Speed," to move vaccine trials at a much faster pace than traditionally used in such large double-blind, placebo-controlled studies. Little information has been provided about the trial protocols, but it is more than likely (or should be) that AI is having a significant influence on the process.

6.2.4 Blockchain and AI in pharmaceutical care

The pharmaceutical sector helps introduce new drugs into the market and assists in ensuring the safety and validity of drugs sold to the end consumer. It also aids in the evaluation and processing of safe medications, which ultimately assists in quicker patient recovery. Drug companies face challenges in tracking their products and face severe risks from counterfeiters compromising their production or introducing fake drugs into the system. During the production and research and development (R&D) of these drugs, blockchain could become a best-fit technology for evaluating, monitoring, and ensuring the production processes of potential drugs. For the effective delivery of reliable and authentic medicines to patients, there is a need for monitoring, evaluating, and ensuring the overall process of pharmaceutical drug development and supply through the use of AI digital technologies.

A digital drug control system (DDCS) [63] could be a solution to the prevention of counterfeit drugs. Using a blockchain-based DDCS, big pharmaceutical companies (Sanofi, Pfizer, and Amgen) launched a joint pilot project for the inspection and evaluation of new drugs [64]. Using blockchain, it would be possible to track the production and location of drugs at any given time, to improve the traceability of falsified drugs [65] and the security of the drug supply system [66], and to guarantee the quality of drugs supplied to consumers or end-users [63]. Blockchain can also record and report the dissemination of pharmaceuticals. This arrangement increases the speed of processing data, guarantees the transparency of document dissemination, and reduces the likelihood of loss, damage, or fabrication of documents. The technology itself enforces these properties: a created block can be neither altered nor erased, so blockchain guarantees that the data cannot be tainted [67].
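
Stripped of the cryptography, the track-and-trace logic described above reduces to verifying that every custody transfer names the previous holder. The sketch below (with hypothetical parties and lot numbers) shows how a break in the chain, such as a counterfeit insertion, is detected; on an actual blockchain, these events would additionally be hash-linked and replicated across parties, as in the earlier access-grant sketch.

# A toy track-and-trace sketch: each custody event for a drug package
# names the previous holder, so a gap in the chain (e.g., a
# counterfeit insertion) is detectable. Parties are hypothetical.
events = [
    {"pkg": "LOT42-0001", "from": None, "to": "manufacturer"},
    {"pkg": "LOT42-0001", "from": "manufacturer", "to": "wholesaler"},
    {"pkg": "LOT42-0001", "from": "pharmacy", "to": "patient"},  # gap!
]

def custody_intact(events):
    holder = None
    for e in events:
        if e["from"] != holder:
            return False, f"break before transfer to {e['to']}"
        holder = e["to"]
    return True, "chain of custody complete"

print(custody_intact(events))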

6.2.5 Internet of Things (IoT) and AI in pharmaceutical care

Digital health innovations are being produced by the convergence of AI, the Internet of Things (IoT), and cyber-physical systems (CPS) in health care and pharma (IoT and CPS are described in Chapter 3, page 55). CPS creates a dynamic digital map of virtually all things, one that can be analyzed in much more sophisticated ways than with a bar-code scanning system. CPS examples include self-driving cars, wearables for digital monitoring of heart arrhythmias (AliveCor [68]), industrial AI-powered robots in smart factories, and health robots delivering home care services to disabled persons. Another exciting prospect of digital health powered by AI, IoT, and CPS is the remote characterization of phenotypic data on pharmaceutical outcomes in clinical trials that are meaningful to patients. The IoT could bring about pharmacy and health services innovation for rural or remote communities with limited access to medical product information [69].

These IoT and CPS innovations have led to what is being referred to as Pharma 4.0. The title comes from the four evolutionary stages of manufacturing, the fourth of which is called Industry 4.0, with a complementary step called Health 4.0 [70]. The concept of Pharma 4.0 integrates the manufacturing control process for drugs with the entire manufacturing process, as well as with quality control and business functions. "Vertical integration, manufacturing execution, and enterprise resource planning systems allow real-time, data-driven decisions. Horizontal integration of laboratory systems with the manufacturing process, equipment, and facility systems allows feedforward and backward controls." A primary goal of Pharma 4.0 is making pharmaceutical production safer and more efficient along the whole value chain [71].

6.2.6 Telehealth and AI in pharmaceutical care

Typical data sources in telehealth settings include medical sensor devices such as blood pressure meters and body weight scales. Patients can also enter data regarding their subjective wellbeing and their medication intake [72]. To monitor adherence to medications, a commonly used approach is smartphone apps, where "app" denotes software specifically designed for mobile devices like tablets or smartphones [73]. In hospitals and other inpatient settings, longer lists of monitored drugs may exist. In these scenarios, answering prompts on a touchscreen can provide a more comfortable alternative to physically interacting with every single medication. The prescribed medications can be displayed on the screen together with default values for the administration of a particular drug, e.g., 1–3 times daily, 100–200 mg, single dose. Rulesets for medication intake can be applied in telehealth settings through the implementation of reminders or notification messages, triggered by events derived from the automated analysis of biosignals [74].
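
A minimal sketch of such a ruleset follows, with invented drugs, schedules, and print statements standing in for push notifications.

# A minimal sketch of a telehealth medication ruleset: a reminder
# fires if a scheduled dose has no confirmation, and a biosignal
# event can trigger an extra prompt. Values are illustrative.
schedule = [{"drug": "drug A", "dose": "100 mg", "hour": 8},
            {"drug": "drug A", "dose": "100 mg", "hour": 20}]
confirmed = {(8, "drug A")}  # patient tapped "taken" at 08:00
biosignal_events = ["elevated blood pressure"]

now_hour = 21
for slot in schedule:
    if slot["hour"] <= now_hour and (slot["hour"], slot["drug"]) not in confirmed:
        print(f"Reminder: {slot['drug']} {slot['dose']} (due {slot['hour']}:00)")
for ev in biosignal_events:
    print(f"Notify care team: {ev} detected by home monitoring")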

6.2.7 Chatbots and AI in pharmaceutical care

The benefit of chatbots to patients is the provision of advice and information for a healthy life: a patient can interact with a bot about any personal health care query without having to be physically present at the health care professional's office or the hospital for a small problem. A voice-to-text (natural language processing) analysis bot engages patients in conversation about their medical issues and answers their queries [75]. PharmaBot, developed in 2015 [76], helps clinicians securely file prescription errors into the hospital's system via their smartphones. The bot should, in theory, enable greater security and save time for staff. Medxnote [77], another bot, plugs directly into a hospital's electronic medical records system. If a clinical pharmacist spots an error, they open the app, tell the bot that something is wrong, describe the error, and take a photo of the patient's chart. This data is then logged into the system, on the spot, in a couple of seconds.

6.2.8 Natural language processing (NLP) and AI in pharmaceutical care Researchers have focused on developing and improving the use of natural language processing to recognize and extract information from medical text sources (e.g., detection of medication-related information [78] or patient safety events [79] from patient records; drug-drug interactions from the biomedical literature [80]; or disease information from emergency department free-text reports [81]). Natural language processing techniques have also been applied to extracting information on adverse drug reactions from the vast amounts of unstructured data generated by the discussion and exchange of health-related information among health consumers on social media [82]. The results demonstrate that it is feasible to apply AI to automate safety case processing. Machine-learning algorithms were able to train successfully solely on database content (i.e., no source document annotations), and the multiple combined accuracy metrics allowed adjudication of the different vendor algorithms [83].
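As a toy illustration of the extraction task described above, the sketch below pairs drug and adverse-event mentions that co-occur in a sentence. The lexicons and the co-occurrence heuristic are assumptions for demonstration only; production systems use full biomedical vocabularies and trained models rather than string matching.

```python
# Illustrative sketch: dictionary-based extraction of (drug, adverse event)
# pairs from free text. The vocabularies below are toy assumptions, not a
# production pharmacovigilance pipeline.
import re

DRUGS = ("warfarin", "metformin", "lisinopril")        # assumed lexicon
ADVERSE_EVENTS = ("bleeding", "nausea", "dizziness")   # assumed lexicon

def extract_drug_event_pairs(text: str) -> list[tuple[str, str]]:
    """Return (drug, adverse event) pairs co-occurring in the same sentence."""
    pairs = []
    for sentence in re.split(r"[.!?]", text.lower()):
        drugs = [d for d in DRUGS if d in sentence]
        events = [e for e in ADVERSE_EVENTS if e in sentence]
        pairs.extend((d, e) for d in drugs for e in events)
    return pairs

print(extract_drug_event_pairs(
    "Patient reports gum bleeding and dizziness since starting warfarin."
))  # -> [('warfarin', 'bleeding'), ('warfarin', 'dizziness')]
```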

6.2.9 Expert systems and AI in pharmaceutical care The introduction of a technology-based information expert system to identify drug-related problems marks a new generation in pharmacy. It is based on patient data captured from the pharmacy system and other external data systems. Together with workflow robotics, this means less routine work for the pharmacist, who can then shoulder more responsibility for identifying serious drug-related problems [84]. Overprescribing of antibiotics continues to be a problem in health care. An expert system, TREAT [85], was developed based on a causal probabilistic network to improve antibiotic therapy in hospitalized patients. A system proposed and in use at the University of South Carolina applies user-centered design techniques to infectious disease diagnosis and antibiotic prescription in intensive care units, with the aim of preventing the overuse of antibiotics.
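TREAT's causal probabilistic network is far richer than can be shown here, but the core idea, combining prior probabilities with observed findings before recommending therapy, can be sketched with Bayes' rule. All pathogen names and probabilities below are invented for illustration.

```python
# Minimal sketch of the causal-probabilistic-network idea behind systems like
# TREAT: combine a prior over pathogens with conditional evidence to rank
# likely causes before choosing an antibiotic. All numbers are invented.
PRIOR = {"E. coli": 0.4, "S. aureus": 0.3, "other": 0.3}

# Assumed P(finding | pathogen) for one observed finding, e.g. a urinary source.
P_URINARY_SOURCE = {"E. coli": 0.7, "S. aureus": 0.1, "other": 0.2}

def posterior(prior, likelihood):
    """Bayes rule: P(pathogen | finding) is proportional to
    P(finding | pathogen) * P(pathogen), then normalized."""
    unnorm = {p: prior[p] * likelihood[p] for p in prior}
    z = sum(unnorm.values())
    return {p: v / z for p, v in unnorm.items()}

post = posterior(PRIOR, P_URINARY_SOURCE)
for pathogen, prob in sorted(post.items(), key=lambda kv: -kv[1]):
    print(f"{pathogen}: {prob:.2f}")
# A real system would weigh these posteriors against antibiotic coverage,
# local resistance patterns, and the cost of overprescribing.
```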

6.2.10 Robotics and AI in pharmaceutical care Collaborative robots, or “cobots,” are improving efficiency in pharmaceutical research, drug production, and quality control. These cobots offer pharma greater reliability, consistency, and precision. Once a cobot is programmed, repetitive tasks are completed with a low error rate. They are ideal for protecting sterile environments from contamination, and many cobots can work 24 hours a day. As a result, a cobot's return on investment occurs within a year in most industries [86]. With the advances in AI, robots are more trustworthy. As a result, doctors and a large number of institutions are now employing robots, with human supervision, to carry out activities previously done by humans. The major advantage of AI is that it reduces the time needed for drug development, and it reduces the costs associated with drug development. It also enhances the returns on investment and may lower the price for the end-user [87].

6.2.11 Population health (demographics and epidemiology) and AI in pharmaceutical care Pharmacists can play a dramatic role in improving population health through active chronic disease and drug therapy management. Further, by assuring appropriate patient education and compliance in association with medication therapy, pharmacists can not only improve health outcomes but also reduce readmissions, improve patient safety, and ultimately reduce healthcare costs. The literature is replete with examples of pharmacist involvement and leadership on patient care teams [88]. Programs that include medication therapy management, disease state management, wellness promotion through initiatives in areas such as smoking cessation, medication management during care transitions, population health research, and the application of pharmacoeconomics are just a few examples of where pharmacists' opportunities lie.

As a part of a population health initiative, pharmacists must be wholly integrated with a focus on improving medication use, adherence, and outcomes. As medication experts, pharmacists are uniquely qualified to play an important role in population health management. They must take a leadership position in developing new strategies to deliver comprehensive patient care [89].

6.2.12 Precision medicine/health (personalized health) and AI in pharmaceutical care Medication nonadherence remains a substantial public health problem. Between 25% and 50% of patients worldwide do not take their medications as recommended [90]. In the USA, suboptimal adherence has been associated with 125,000 deaths, 10% of hospitalizations, and costs of up to $289 billion annually [91]. Medication adherence is a complex and multifaceted behavior, but the adverse public health effects of nonadherence are preventable. Combining the principles of patient-centered care with the “big data” and predictive analytic tools employed by precision medicine, this dynamic approach to health care will allow researchers to identify patients at greatest risk for adherence problems and enable clinicians to deliver adherence interventions to those who need them. While precision medicine and population health take different approaches, there is a need for both. Aspects of precision medicine, such as tailoring therapies or intervention content to specific patients' needs, can be scaled up and applied in a population health model. Innovative approaches are needed in precision medicine and population health to advance the field of medication adherence, including standardizing taxonomy to describe adherence and report adherence-related study results, integrating data sources to develop innovative measurements of adherence, and considering innovative mechanisms (i.e., behavioral economics principles) to incentivize both patients and payers to engage in improving medication adherence [92].
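A hedged sketch of the risk-identification step is below: a small logistic regression flags patients at elevated nonadherence risk. The features, toy training labels, and intervention threshold are all assumptions for illustration, not a validated clinical model.

```python
# Hedged sketch: predictive analytics flagging patients at risk of
# medication nonadherence. Features, toy data, and threshold are assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Features per patient: [daily pill count, copay in $, prior refill gaps]
X = np.array([[2, 5, 0], [8, 40, 3], [5, 15, 1], [9, 60, 4], [1, 0, 0], [6, 25, 2]])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = became nonadherent (toy labels)

model = LogisticRegression().fit(X, y)

new_patient = np.array([[7, 35, 2]])
risk = model.predict_proba(new_patient)[0, 1]
print(f"Estimated nonadherence risk: {risk:.2f}")
if risk > 0.5:  # assumed intervention threshold
    print("Flag for pharmacist outreach / adherence intervention.")
```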

6.2.13 Healthcare analytics and AI in pharmaceutical care Pharmacovigilance involves post-marketing monitoring and detection of adverse drug reactions (ADRs) to ensure patient safety [93]. Pharmacovigilance studies use descriptive analytics focused on identifying associations between adverse drug effects and medications [94]. Big Data and advanced analytics are driving a paradigm shift in the healthcare and pharmaceutical industry, with multiple innovations ranging from precision medicine and digital therapeutics to the adoption of accountable and value-based care models. Drug developers are making substantial investments in Big Data and artificial intelligence-driven drug discovery platforms to shorten the process of successfully discovering promising compounds. In addition, Big Data technologies are increasingly being utilized to streamline clinical trials, enabling biopharmaceutical companies to significantly lower costs and accelerate productive trials [95]. According to McKinsey, “Big data and machine learning in pharma and medicine could generate a value of up to $100 billion annually, based on better decision-making, optimized innovation, improved efficiency of research/clinical trials, and new tool creation for physicians, consumers, insurers and regulators” [96]. The growing applications of ML in pharma and medicine suggest a strong potential for the synchronization of data, analysis, and innovation in the future of pharmacy and health care.
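The descriptive pharmacovigilance analysis mentioned above is often quantified with disproportionality measures such as the reporting odds ratio (ROR) over a 2x2 table of spontaneous reports. The sketch below computes an ROR with invented counts; real signal detection adds many further checks.

```python
# Sketch of a standard pharmacovigilance disproportionality measure, the
# reporting odds ratio (ROR). The report counts below are invented.
import math

a = 40     # reports: drug of interest AND adverse event of interest
b = 960    # reports: drug of interest, other events
c = 200    # reports: all other drugs, event of interest
d = 48800  # reports: all other drugs, other events

ror = (a * d) / (b * c)
# Approximate 95% confidence interval on the log scale.
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(ror) - 1.96 * se)
hi = math.exp(math.log(ror) + 1.96 * se)
print(f"ROR = {ror:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# A lower CI bound above 1 is a common (though not sufficient) signal criterion.
```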

6.2.14 Preventive health and AI in pharmaceutical care The practice of pharmacy and the industry is evolving. It has set improving the quality of healthcare as its priority and is driving its efforts toward positive, value-based outcomes. Pharmacies are the most accessible and affordable healthcare providers and have the potential to become preventive and health management centers instead of only medication “counters and pourers.” AI programs help practices in the areas of diagnosis, treatment protocol development, drug development, personalized medicine, and patient monitoring and care. AI technology can help pharmacies provide more personalized healthcare and offer advice, guidance, and an expanded suite of services (e.g., immunizations, screenings, MTM, disease state management). The primary aim of AI applications in pharmaceutical care is to analyze relationships between prevention or treatment techniques and patient outcomes. The next generation in pharmacy technology is the introduction of an AI technology-based information expert system that identifies drug-related problems in a timely manner based on patient data captured from the pharmacy system and other external data systems [97].

6.2.15 Public health and AI in pharmaceutical care Total US pharmaceutical expenditures are reported at $1443 per capita [98]. This finding is consistent with US Department of Health and Human Services calculations that included medications administered in hospitals, physician offices, and other facilities, and showed that 16.7% of total personal health care spending (an estimated $457 billion in 2015) was attributable to pharmaceuticals [99]. No other category of US health-related spending accounts for as much of the costs as pharmaceuticals. More AI support and data management should encourage the push for regulation of drug prices, but the fundamental obstacle is political will. However, concern for public health and wellness, combined with public pressure, will begin to drive the political process as rising health care costs fuel more public demand. The bill for health care products and services is mostly not paid by the people receiving the care, but collectively by society through insurance and taxes. Thus, higher spending on health care by one person affects public health by taking money away from the rest of society. The United States can reduce the cost of health care. It will take the inclusion of AI systems and a medical profession, health systems, payers, and policymakers dedicated to controlling costs. In many ways, the future of the US health care system is in their hands [100].

6.2.16 Access and availability and AI in pharmaceutical care The role of artificial intelligence in life sciences has a proven track record as a useful tool in pharmaceutical research. Companies are also rushing to capture the transformational impact it can have on their commercial operations, such as sales and marketing. Machine learning platforms can transform the ability of any life science company to mine big data for the right answers. They can influence formulary decisions and clinical guideline development, and separate customers into highly specific segments. This empowers sales teams to personalize their activity to health providers to a greater degree. Materials can be recommended based on the sales rep's previous meetings with a doctor. For example, if a doctor has previously expressed an interest in diabetes drugs, the sales rep will arrive equipped with information on appropriate medications, ensuring the interaction is personalized and impactful. Machine learning will not replace marketing operations teams and reps because it cannot think on its own. However, companies and sales teams that use machine learning and AI in life sciences like pharmaceuticals will reach markets and patients that others can't [101].

6.3 Hospital care 6.3.1 Big data analytics and AI in hospital care Innovations in AI healthcare technology are streamlining the patient experience, helping hospital staff process millions, if not billions, of data points faster and more efficiently. A health care company, CloudMedX [102], helps hospitals and clinics manage patient data, clinical history, and payment information by using predictive analytics to intervene at critical junctures in the patient care experience. The Cleveland Clinic is using AI to gather information on trillions of administrative and health record data points to streamline patient experiences [103]. Johns Hopkins Hospital recently announced a partnership with GE to improve the efficiency of patient operational flow through predictive AI techniques [104]. A risk prediction company, KenSci, combines big data with artificial intelligence to predict clinical, financial, and operational risk, taking data from existing sources to predict everything from who might get sick to what's driving up hospital costs. It has partnered with big tech and data science companies, including GE, KPMG, Allscripts, and Microsoft [105]. Other major hospitals and AI companies (Mayo, IBM, Google, Amazon) are working with hospital systems to predict ICU transfers, improve clinical workflows, and even pinpoint a patient's risk of hospital-acquired infections. In the case of infection, hospitals using AI to mine health data can predict and detect sepsis, which ultimately reduces death rates.
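Deployed sepsis-surveillance systems learn risk from many EHR signals; as a simplified, rule-based stand-in, the sketch below scores streamed vitals with the published qSOFA screen (respiratory rate >= 22/min, systolic blood pressure <= 100 mmHg, altered mentation). It is an illustration, not clinical guidance.

```python
# Simplified sketch of rule-based sepsis surveillance over streamed vitals,
# mirroring the qSOFA screen. Real AI systems learn risk from far richer
# EHR data; this toy is for illustration only.
def qsofa_score(resp_rate: int, systolic_bp: int, altered_mentation: bool) -> int:
    return sum([resp_rate >= 22, systolic_bp <= 100, altered_mentation])

def screen(patient_id: str, resp_rate: int, systolic_bp: int, altered: bool):
    score = qsofa_score(resp_rate, systolic_bp, altered)
    if score >= 2:  # threshold used in the qSOFA literature
        print(f"{patient_id}: qSOFA {score} - alert rapid response team")
    else:
        print(f"{patient_id}: qSOFA {score} - continue routine monitoring")

screen("bed-12", resp_rate=24, systolic_bp=96, altered=False)   # alerts
screen("bed-07", resp_rate=16, systolic_bp=128, altered=False)  # no alert
```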

6.3.2 Health information and records (EHR) and AI in hospital care Most hospitals are now integrating electronic health records (EHRs). The growth rate of this data exceeds that coming from any other medium [106]. However, EHRs store data in complex structures from different sources, making it difficult to build large datasets. Integrating EHRs requires first creating a dictionary that translates system codes into text readable by physicians or processable by NLP.
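A minimal sketch of that dictionary step appears below. The three mappings shown are genuine ICD-10-CM descriptions, but a production dictionary would cover tens of thousands of codes per source system.

```python
# Minimal sketch of the "dictionary" step: translating system-specific EHR
# codes into readable text before building datasets.
ICD10_DICTIONARY = {
    "E11.9": "Type 2 diabetes mellitus without complications",
    "I10": "Essential (primary) hypertension",
    "J45.909": "Unspecified asthma, uncomplicated",
}

def translate(codes: list[str]) -> list[str]:
    return [ICD10_DICTIONARY.get(c, f"<unmapped code: {c}>") for c in codes]

print(translate(["I10", "E11.9", "Z99.9"]))  # last code falls through as unmapped
```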

The potential utility of gathering all available data from an EHR for research and AI applications is immense. The availability of millions of data points on a single topic could revolutionize descriptive and epidemiological studies as well as train machine and deep learning algorithms to predict clinical situations and help clinicians in clinical decision-making processes. Rapid data collection from EHRs, together with algorithm evaluations, real-time predictions, and links to clinical recommendations, creates an AI smart support system that can be implemented in a hospital to facilitate clinical decision-making processes [107]. The availability of large amounts of EHR data, the use of ML, and the high level of performance of new computers suggest the immense power of AI in improving medical and hospital care [108].

6.3.3 Research/clinical trials and AI in hospital care Clinical trials are a valuable tool for creating new treatments. But finding the right volunteer subjects is difficult and can undermine the effectiveness of the studies. Researchers at Cincinnati Children's Hospital Medical Center designed and tested a new computerized solution that uses AI to identify eligible subjects from Electronic Health Records (EHRs) [109]. Compared to manually screening EHRs to identify study candidates, the system, called the Automated Clinical Trial Eligibility Screener© (ACTES), reduced patient screening time by 34% and improved patient enrollment by 11.1%. The system also improved the number of patients screened by 14.7% and those approached by 11.1%. The system uses natural language processing (NLP) to analyze vast amounts of linguistic data. Machine learning allows computerized systems to learn and evolve from experience without being explicitly programmed. This AI ability makes it possible for computer programs to process data, extract information, and generate knowledge independently. The system extracts structured information, including patient demographics and clinical assessments, from EHRs. It also identifies unstructured information from clinical notes, including the patients' clinical conditions, symptoms, treatments, and so forth. The extracted information is matched with the study's eligibility requirements to determine a subject's qualifications for a specific clinical trial [110].
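The final matching step can be pictured with the toy sketch below, which compares a patient's structured fields and NLP-extracted concepts against a trial's criteria. The record, the criteria, and the matching logic are invented simplifications; ACTES itself uses far richer features.

```python
# Hedged sketch of eligibility matching: structured fields plus NLP-extracted
# concepts are compared against a trial's criteria. All data are invented.
patient = {
    "age": 9,
    "conditions": {"asthma"},     # e.g., extracted from notes via NLP
    "medications": {"albuterol"},
    "exclusions": set(),          # e.g., prior enrollment, comorbidities
}

trial = {
    "min_age": 6, "max_age": 17,
    "required_conditions": {"asthma"},
    "excluded_conditions": {"cystic fibrosis"},
}

def eligible(p, t) -> bool:
    return (
        t["min_age"] <= p["age"] <= t["max_age"]
        and t["required_conditions"] <= p["conditions"]  # subset check
        and not (t["excluded_conditions"] & (p["conditions"] | p["exclusions"]))
    )

print("Approach for consent" if eligible(patient, trial) else "Not eligible")
```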

6.3.4 Blockchain and AI in hospital care The potential for the use of blockchain technology in hospitals has started to be tested in a pilot platform designed to help the Food and Drug Administration's Office of Translational Sciences explore how to use the technology for healthcare data management. The pilot project is currently being implemented at 4 major hospitals. It is using Ethereum (an open-source, public, blockchain-based distributed computing platform and operating system [111]) to manage data access via virtual private networks. The project is built on the InterPlanetary File System (IPFS) to utilize encryption and reduce data duplication via off-chain cloud components, with cryptographic algorithms enabling user sharing [112]. Blockchains enable decentralized management, making them suitable for applications where healthcare stakeholders (e.g., hospitals, patients, payers, etc.) wish to collaborate without the control of a central management intermediary. Blockchains also provide immutable audit trails; enable a record of data ownership; ensure the robustness and availability of data; and increase the security and privacy of data [113].
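The immutability property can be illustrated with a minimal hash chain: each access event is hashed together with its predecessor, so later tampering is detectable. This sketch omits everything that makes a real deployment work (consensus, distributed storage, encryption), and the actors and events are invented.

```python
# Minimal sketch of an immutable audit trail: each record of data access is
# hashed with the previous record, so tampering breaks the chain. Real
# systems (e.g., the Ethereum/IPFS pilot) add consensus and encryption.
import hashlib, json, time

def add_block(chain, event: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"event": event, "time": time.time(), "prev": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    chain.append(block)

def verify(chain) -> bool:
    for i, block in enumerate(chain):
        body = {k: block[k] for k in ("event", "time", "prev")}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != digest:
            return False
        if i and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, {"actor": "dr_lee", "action": "read", "record": "pt-42"})
add_block(chain, {"actor": "payer_x", "action": "read", "record": "pt-42"})
print(verify(chain))                      # True
chain[0]["event"]["actor"] = "intruder"   # tamper with history
print(verify(chain))                      # False: the audit trail exposes it
```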

6.3.5 Internet of Things (IoT) and AI in hospital care [15] The “smart hospital” is a new type of hospital that can optimize, redesign, or build new clinical processes and management systems using an AI-digitized networking infrastructure of integrated assets. Smart hospitals rely on optimized and automated means, utilizing the Internet of Things and big data and combining connected devices with cloud computing, big data analytics, and AI. The smart hospital involves 3 essential layers: data, insight, and access. Data is collected daily and fed to analytics or machine learning software to derive a “smart” insight. This valuable insight moves a hospital a step beyond being merely digital; it makes it truly smart. This insight must be accessible to all potential users, including doctors, nurses, and facility personnel, through an interface like a desktop or a mobile device. This enables them to make critical decisions faster. There are 3 areas that any smart hospital addresses: operations, clinical tasks, and patient centricity:
• Efficiency in operations can be achieved by building automation systems, implementing intelligent asset maintenance and management solutions, and improving the internal logistics of mobile assets and control over people flow.
• Efficiency in clinical tasks is concerned with ways to improve doctors' and nurses' work efficiency, especially in the emergency, surgery, and radiology areas. Clinical efficiency also involves improving patient outcomes by ensuring patient engagement and monitoring.
• Patient centricity of smart hospitals means improving the patient experience, such as building a smart patient room, which allows voice-based interactive devices such as Amazon Echo with Alexa, or tablets, to call nurses or dim the lights.

6.3.6 Telehealth and AI in hospital care [114] Hospitals are using telehealth to improve access and fill gaps in care, provide services 24/7, and expand access to medical specialists. They offer several types of telehealth services to improve access to services and quality of care. Telehealth delivery platforms fall into 2 main categories: provider-to-provider and direct-to-consumer. One of the most frequent reasons hospitals use telehealth is to extend access to specialty care. Other reasons for embracing telehealth include efficient post-operation follow-up, lower hospital-readmission rates, better medication adherence, and positive care outcomes. Nine of the most frequent uses of telehealth in hospitals include the following services:
1. Pharmacy services;
2. Chronic care management;
3. Telestroke services;
4. Tele-ICU tools;
5. Specialty telemedicine consults;
6. Diagnostic screening for diabetes-related eye disease;
7. Sleep disorders;
8. Telepsychiatry;
9. Opioid-use disorder (OUD).

Telehealth tools for treating patients are a more effective and efficient way to use limited staff and resources. Regional growth and development of telehealth systems can be leveraged across care sites by connecting hospitals, physician offices, diagnostic centers, and long-term care through telehealth networks. Virtual care technology can improve the timing, quality, and impact of care for more patients by eliminating travel and bringing in specialized care as needed.

6.3.7 Chatbots and AI in hospital care Among the top 5 US hospitals (Mayo, Cleveland Clinic, Mass General, Johns Hopkins, and UCLA), the most popular AI application is chatbots. Their uses range from automating physician inquiries and routing physicians to the proper specialist to an array of patient communication services. Mayo Clinic is using a startup technology, Tempus [115], which focuses on developing personalized cancer care using a machine learning platform. This startup partnership also includes such prestigious hospitals as the University of Michigan, the University of Pennsylvania, and Rush University Medical Center. Microsoft introduced its AI digital assistant, Cortana [116], to the Cleveland Clinic to help the medical center “identify potential at-risk patients under ICU care.” Cortana, a type of command center first launched in 2014, is now integrated into Cleveland Clinic's hospital system. NVIDIA announced its affiliation with the Massachusetts General Hospital Clinical Data Science Center as a “founding technology partner” [117]. The Center aims to serve as a hub for AI applications in healthcare for the “detection, diagnosis, treatment and management of diseases.” Johns Hopkins Hospital teamed up with GE Healthcare Partners [118] to design the Judy Reitz Capacity Command Center, which receives “500 messages per minute.” Besides functioning as a chatbot for the hospital, the system integrates data from “14 different Johns Hopkins IT systems” across 22 high-resolution, touch-screen enabled computer monitors. UCLA researchers designed the Virtual Interventional Radiologist (VIR) [119], a chatbot that “automatically communicates with referring clinicians and quickly provides evidence-based answers to frequently asked questions.” The ultimate goal is to expand the functionality of the application for “general physicians interfacing with other specialists, such as cardiologists and neurosurgeons.”

6.3.8 Natural language processing (NLP) and AI in hospital care One of the biggest challenges in hospital care and administration is interpreting unstructured data (e.g., doctors' notes). One of the greatest strengths of natural language processing (NLP) is its ability to extract meaningful insights from unstructured data. Thus, the use of NLP in hospital care has become a critical ingredient [120]. NLP can analyze unstructured patient data based on clinical criteria, specific keywords, and text patterns. This accentuates the value healthcare NLP offers hospitals by eliminating the burdensome, time-consuming, and thus costly task of clinical staff having to read through medical notes. There are numerous other valuable uses of NLP's ability to extract useful insights from unstructured patient data. It can detect early signs of mental illness in patients by monitoring their social media posts and alerting a mental health professional; NLP accomplishes this through sentiment analysis and detection applied to social media [121]. NLP can identify and extract text from clinical documents, such as physician's notes and dictations. This feature allows healthcare organizations to surface key information from clinical notes, such as prescriptions and medical history. It can also indicate missing information that should be included in clinical reports. Natural language understanding (NLU) [122] is a feature of NLP that is used to assist physicians by converting their spoken words, in real time, into understandable dictated notes, along with discrete data elements: prescriptions, ICD-10 (International Statistical Classification of Diseases and Related Health Problems) codes, procedures, etc. This information is then populated directly into the appropriate section of the medical chart.
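One small piece of that NLU pipeline, pulling ICD-10 codes out of transcribed dictation so they can be routed to the right chart section, can be sketched with a pattern match. The regex below covers the general shape of ICD-10-CM codes; real engines rely on terminology services rather than regular expressions.

```python
# Hedged sketch of one NLU subtask: extracting discrete data elements (here,
# ICD-10 codes) from transcribed dictation. The pattern approximates the
# ICD-10-CM shape (letter, two digits, optional subcode).
import re

ICD10_PATTERN = re.compile(r"\b[A-TV-Z][0-9]{2}(?:\.[0-9A-Z]{1,4})?\b")

dictation = (
    "Assessment: essential hypertension I10, type 2 diabetes E11.9. "
    "Plan: continue lisinopril 10 mg daily."
)

codes = ICD10_PATTERN.findall(dictation)
print(codes)  # ['I10', 'E11.9'] -> route to the problem-list section
```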

6.3.9 Expert systems and AI in hospital care Today's hospitals are becoming “smart hospitals” (see also page 222 above), digitalizing many of their administrative and clinical functions, including through the use of diagnostic expert systems. OSHA (US Dept. of Labor Occupational Safety and Health Administration) includes expert systems among its “eTools,” which are “stand-alone,” interactive, Web-based training tools that utilize graphical menus, including expert system modules. These modules enable the user to answer questions and receive reliable advice on how OSHA regulations apply to their worksite. eTools do not create new OSHA requirements [123]. An expert system eliminates the need to consult a human expert. In a Clinical Documentation Improvement (CDI) program, by collecting clinical information, examining its knowledge base, and applying rules, an expert system can make documentation widely available throughout a hospital. The expert system approach has already been shown to be capable of providing several enhancements to a CDI program, where its rules are applied to the clinical data that has been collected. By examining the list of a patient's medications and laboratory results, the expert system can suggest additional diagnoses. An expert system cannot be expected to be better than the human expert, but it provides the ability to automate expertise in the hospital process. This becomes extremely helpful in improving efficiencies and in training new staff [124].

6.3.10 Robotics and AI in hospital care Hospital robots are proving to be very useful and functional, and are probably the next horizon in modern medicine. Robots in hospitals are now aiding patients in ways that only a few years ago were considered science fiction. PUMA (Programmable Universal Machine for Assembly) robots, general-purpose industrial robots known for their mobility and wide range of actions, were first used in 1985 to aid in a brain biopsy. The PROBOT, also a surgical robot, can remove soft tissue from a human body during open surgery. ROBODOC [125] was initially used to aid in hip surgeries and, afterwards, to help with bone surgeries in general. AESOP [126], the first surgical robot approved by the United States FDA, has more advanced features that give doctors as well as patients more options when it comes to surgery, including heart valve surgery. Other robots can now perform deep surgeries in very sensitive parts of the body, like the fallopian tubes, lungs, and brain. The most recent hospital robot is the da Vinci Surgical System [127]. It is currently the leading technology in the growing line of hospital robots used for surgery. What sets this robot apart is its ability to perform minimally invasive yet complex surgical procedures.

6.3.11 Population health (demographics and epidemiology) and AI in hospital care The passage of MACRA (Medicare Access and CHIP Reauthorization Act of 2015 [128]) was designed to replace fee-for-service hospital reimbursement with value-based care (a payment model that reimburses healthcare providers based on the quality of care they provide rather than the number of patients seen [129]). Participating hospitals are investing in health IT solutions to help support population health management, data analytics, and care coordination capabilities [130]. Results indicate that hospitals with value-based incentives were more likely to have adopted population health management and care coordination technologies than hospitals with no incentives. Population health management and analytics technologies have been shown to help hospitals with risk stratification and predictive modeling, enabling them to deliver preventive, targeted care [130]. “New and emerging technologies, like the Internet of Things (IoT) and blockchain, will likely play an increasing role as well in supporting the capabilities needed under new payment models.” “Many health systems, including Vanderbilt University, Boston Children's Hospital, and the Mayo Clinic, are already using these technologies to support key business and clinical decisions” [131].

6.3.12 Precision medicine/health (personalized health) and AI in hospital care As patient data in the form of genetic information and electronic health records (EHRs) dramatically increases, it is improving doctors' assessments of the individual patient and treatments tailored to their precise needs. This type of care is referred to as precision medicine (see Chapter 4, page 101), meaning drugs or treatments designed for small groups rather than large populations.

To achieve this level of care, doctors are using AI to develop precision treatments, with the aim of turning massive, available data sets into insight about what makes patients healthy at the individual level. Those insights could lead to new drugs, uncover new uses for old ones, suggest personalized combinations, and predict disease risk. Dr. Bertalan Meskó, director of the Medical Futurist Institute, suggested that “there is no precision medicine without AI” [132]. By applying AI and machine learning to these multiple data sources (genetic data, EHRs, sensor data, and environmental and lifestyle data), researchers are taking the first steps toward developing personalized treatments for diseases from cancer to depression. AI-based diagnostic tools have already entered hospitals, but AI-based treatments are still at the foundational stage [133]. There are still barriers to AI in precision medicine, one being that the technology simply isn't sophisticated enough. Another hurdle has to do with deficiencies in EHR data. The average hospital in the U.S. uses 16 different EHR platforms [134]. With different formats and permissions, an AI system may not have access to all the information, lacking the proper API it needs to suggest a personalized treatment [135].

6.3.13 Healthcare analytics and AI in hospital care The field of health care analytics covers a broad range of the healthcare industry, with insights into both the macro and micro levels. One of the principal areas addressed by health care analytics is hospital management. It helps hospital managers operate better by providing real-time information that can support decisions and deliver actionable insights. Health care analytics provides a combination of financial and administrative data alongside information that can aid patient care efforts, improve services, and refine existing procedures. Research and development, clinical data, and how patients and clients feel about services are crucial aspects of healthcare that, through analysis, can yield new, innovative solutions and treatments. Beyond administrators, healthcare analytics can help providers improve upon several areas. One major area where analytics can optimize efforts is managing hospital and foundation donations and grants. Often, for healthcare providers, donations are the basis of their yearly budgets, so organizing and tracking expenses and activity is vital for setting appropriate goals. Moreover, analytics can help track donor engagement, retention, and previous contributions. Finally, healthcare analytics allows hospitals to track physician records and patient histories and needs, ensuring the right doctor or professional is directed to the patients in most need. These systems can help improve patient satisfaction and expedite the healing process.

6.3.14 Public health and AI in hospital care [136] When one thinks of public health in relation to hospital care, Bellevue Hospital in New York City is the most recognized and respected, from its historical roots to its current activities. It is the oldest hospital in America, tracing its roots back to 1736. From its humble beginnings as a haven for the indigent, NYC Health + Hospitals/Bellevue has become a major academic medical institution of international renown.

It provides the latest state-of-the-art medical technology along with compassionate care and continued dedication to its public health roots. From its origin, the hospital never turned away a needy patient, regardless of ability to pay. Over the centuries, Bellevue has served as an incubator for major innovations in public health, medical science, and medical education. Often referred to as a national treasure, NYC Health + Hospitals/Bellevue defines the very best traditions of public medicine as a public service vital to the well-being of our civil society.

6.3.15 Access and availability and AI in hospital care A top priority for the health care system is access. According to most health system CEOs, that means providing patients with care where they want it, when they want it, and before they know they need it [137]. Using technology to support convenient access to care certainly isn't a new concept. But there's been a marked uptick in interest recently, said Brian Kalis, Accenture's managing director of digital health. “That trend has accelerated significantly over the past 12–24 months.” Nearly half of all patients have used a walk-in or retail clinic to receive healthcare services, according to a recent survey [138] from Accenture. Younger patients seem to be the greatest users, with 24% of “Gen Zers” and 13% of millennials expressing dissatisfaction with the convenience of traditional healthcare services. But most health systems don't use “true artificial intelligence” yet, said Catherine Jacobson, CEO of Milwaukee-based Froedtert Health. She does believe, however, that they are on the way there. “There are algorithms and things that use historical data, like predictive analytics, and that's really what most of us are using right now,” she said. There's also room for upstream interventions before the deployment of new technology. Emerging technologies require employees to learn new skills and adapt to changing components of their jobs. This may necessitate a change in the workforce, and that's part of what drives resistance to change [137].

6.4 Nursing care 6.4.1 Big data analytics and AI in nursing care New data-driven, intelligent innovations in healthcare bring the capability of adding value to nursing care. AI has entered healthcare in a big way, and nurses can harness it to enhance standard patient care processes and workflows, improve quality of care, impact cost, and optimize the patient and provider experience. AI computer systems can perform tasks that would otherwise require human intelligence. They can enhance and expedite a critical component of nursing care delivery, namely decision making. New nursing technologies collect and analyze healthcare data that can foretell the risk of future events that could hinder patient care. However, even that data can be incomplete, unclean, and scattered across disparate systems within organizations. Machine learning trained with the vast amounts of big data readily available from multiple sources can address such inconsistencies.

Because big data is difficult to aggregate and analyze, nurses have yet to grasp its full use, maximize its potential, and reap its many benefits. With a greater comprehension of AI, nurses can be at the forefront of embracing and encouraging its use in clinical practice [139].

6.4.2 Health information and records (EHR) and AI in nursing care A significant percentage of nurses (62%) are satisfied with their overall electronic health record (EHR) experience. This compares to physicians who report only a 16% level of satisfaction. The survey included 70,000 clinicians, including 28,000 nurses [140]. Nurses expressed greater satisfaction than providers in the EHR being helpful in terms of patient safety. Nurses indicated that EHR enables them to deliver high-quality care. Also, they reported that the EHR allowed them to deliver better patient-centered care. More than half of the nurses surveyed reported the need for improved integration with outside organizations. The same number agreed that the EHR improved efficiency and provided them with needed analytics, quality metrics, and reporting [141].

6.4.3 Research/clinical trials and AI in nursing care Nursing education and nursing research will change relative to the role of, and demand for, professional nursing practice with, and not for, robots in healthcare. These changes in routine nursing care will be dictated by the ability of robots to perform currently prescribed procedures and accomplish nursing tasks. Notwithstanding the efficiencies derived from robotic nursing tasks, there is inherent risk in the loss of the unique, personalized care human nurses provide. From low-fidelity machinery to high-fidelity technologies with AI assistance, technological advancements have changed the practice of nursing. Advances in technology have been made available to help nurses perform their jobs and care for patients more efficiently and safely. AI can surface clinically significant information buried in massive amounts of data, and this can aid clinical decision making for nurses [142]. More research is needed to determine the effect of these new technologies in the nursing field and how to hasten their adoption in nursing practice [143]. Nursing education must incorporate technology and AI in the curriculum, with an increased focus on nursing research investigating the effects of technology on the nursing profession and effective ways of adopting technology in nursing practice.

6.4.4 Blockchain and AI in nursing care To realize the full potential of blockchain in the health care sector, a standard, systematic, end-user approach is needed to create support tools for nurses' daily practice. Nurses designing blockchain solutions can serve as joint leaders in reforming health and social ecosystems, leading to a triple win for citizens, industry, and the service provider. Frontline nurses, together with a more holistic approach to value-based health and social care that places the patient/citizen (prevention) at the center of the process, might produce an active community partnership. Blockchain can support citizens'/patients' empowerment in the management of their health and social data by guaranteeing that citizens in the chain know how and where their data is being used [144]. Nurses' added value in blockchain relates to boosting the continuity of care, facilitating communication between the different actors involved to deliver the best outcomes for patients and citizens. In particular, nurses are vital to improving access and outcomes in a people-centered approach, ensuring the continuity of care across the primary and secondary health and social care sectors. Blockchain can enable nurses to deliver on access to care through AI-supported health and social care. In so doing, it needs to foster integration and continuity-of-care policies and support nurses in the delivery of a safe and high-quality level of care.

6.4.5 Internet of Things (IoT) and AI in nursing care The development of wearable technology and smart sensors on the Internet of Things (IoT) makes communication with a care team possible at any moment. Such communications are accomplished through a simple touch of a button or by tracking essential health data in a way that doesn't require multiple doctor's office visits. Patients can now wear devices that measure vital signs and upload the data to their caregivers. This changes how patients interact with healthcare professionals. Specifically, such devices can assist nurses, who spend the most time interacting with patients. IoT monitoring could eliminate many pricey doctor office visits, offering a low-cost, high-tech way to access care easily. The FDA has approved more than 100 health apps for medical use [145]. Nurses and other healthcare professionals will need to continue to monitor the usage of IoT devices as they proliferate rapidly. As the IoT grows, there are a few things for nurses to consider, according to AdvanceWeb.com [146]:
• How can nurses work to facilitate any necessary patient behavior modification through remote monitoring?
• How can nurses provide the best possible patient and caregiver training on these devices?
By addressing these considerations and adopting uniform policies for IoT devices, nurses can create an IoT care plan that has the potential to improve healthcare outcomes and potentially reduce costs [147].

6.4.6 Telehealth and AI in nursing care Telehealth nursing is growing rapidly as its services expand as an extension of healthcare providers' offices, hospitals, and health plans. These services include independent nursing practices. Telehealth nursing enables these services to be delivered remotely to improve efficiency and patient access to healthcare [148]. Numerous benefits of telehealth nursing have been identified, as nurses assist with patient retention, decrease on-call hours for healthcare providers, and offer versatility for use during any time interval, including around-the-clock, weekend, or after-hours care [149]. Telehealth nurses' roles also include the ability to guide patients regarding ED visits, clarify appropriate treatment options, educate about self-care at home, and assist with appointment scheduling [150]. Telehealth nursing programs generally require nurses to have 3–5 years of clinical experience. Nurses undergo training before taking on real-world patient calls. During their training, nurses learn concepts and descriptions to conduct assessments, analyze, and plan using proper decision support tools. These tools include algorithms and protocols that provide specifics to direct the telehealth nurse [151]. The Patient Protection and Affordable Care Act enacted continuity of care through patient-centered medical homes, which has been expanded by telehealth nursing. This has offered the ability for caregivers to call at their convenience, which has resulted in risk reduction and cost savings for providers and healthcare organizations, as well as meeting the accessible care standards set by the National Committee for Quality Assurance [152].

6.4.7 Chatbots and AI in nursing care Perhaps the most popular chatbot in healthcare is “Florence,” named after Florence Nightingale, the English social reformer and founder of modern nursing. It was developed in 2016 by David Hawig, a German entrepreneur and researcher. It is a personal health assistant that reminds users to take their medication or birth control pills and helps them keep track of their body weight, mood, and other health issues. To use it, one simply starts chatting with Florence inside a messaging platform, like Skype, Kik, or Facebook Messenger [153]. Another avatar-based virtual nurse assistant is called “Molly.” Molly communicates with patients in the persona of a nurse, offering clinical advice to assess their condition and suggest appropriate follow-up [154]. Using natural language processing (NLP; see below), these chatbots give healthcare professionals the chance to communicate with patients interactively and increase patient engagement. They also help ease the workload of doctors and nurses by performing basic administrative tasks. “Not all patients will receive their care or advice by a doctor, and many will be dealt with by excellent nurses or even by an administrative staff that may hand out a leaflet,” stated Nils Hammerla, Ph.D., Machine Learning Specialist at Babylon, in MobiHealthNews. “We see AI as a chance to improve the efficiency of the healthcare system, to support our doctors and other medical staff, and to improve accessibility all around the world to healthcare services” [155].

6.4.8 Natural language processing (NLP), and AI in nursing care Natural language processing (NLP) is an AI program that, when coupled with Automated Speech Recognition (ASR), helps computers better understand and process human (unstructured) language [156]. Utilizing machine learning, NLP and ASR enable scientists to develop algorithms for language translation, semantic understanding, and text summarization. This makes it easier for providers to understand and perform computations on volumes of text with less effort. Examples of NLP and ASR include virtual assistants, chatbots (mentioned above), and cell phone voice texting/messaging; nursing applications include extracting EHR text from notes in non-discrete fields, vocal charting, and speech-activated paging devices.

Assessment in the nursing process is the organized collection and verification of data that is analyzed and communicated to members of the healthcare team and patients. NLP and ASR can aid this step through hands-free input of data and mechanical extraction of patient-specific data from notes and non-discrete fields of the EHR to determine the patient's condition more accurately. Diagnosis defines the disease; planning is where nurses set care priorities and define patient goals; and intervention is when the planned actions are initiated and completed [157]. All of these stages are made more precise and are expedited by predictive analytics and machine learning. NLP and ASR collect the data and complete the process, ensuring the most accurate, highest quality of care for each patient throughout the care continuum [158].

6.4.9 Expert systems and AI in nursing care Clinical decision support system (CDSS) functionality offers nurses a means to promote and enhance care delivery by using rules-based tools. AI extends CDSS used in nursing care. The difference is that AI, particularly predictive analytics, adds breadth and precision to decision making for healthier care experiences for those giving and receiving care. The University of Pennsylvania School of Nursing (Penn Nursing) developed, validated, and tested a two-step clinical decision support (CDS) algorithm called Discharge Referral Expert System for Care Transitions (DIRECT). The DIRECT CDS helps clinicians identify patients most in need of PAC (post-acute care) and suggests whether skilled home care or facility level care is best. The researchers developed the DIRECT CDS using values of structured patient data drawn from the electronic health record and knowledge elicitation from clinical experts as they reviewed de-identified case studies of actual patients. “While the proportion of patients referred to PAC between the 2 phases did not change significantly, the algorithm may have identified those patients most in need. This resulted in significantly lower inpatient readmission rates for same-day, 7-, 14- and 30-day intervals,” explained Kathryn H. Bowles, Ph.D., RN, FAAN, FACMI [159]. “The DIRECT CDS indicates potential as a useful tool to optimize PAC decision-making and improve patient outcomes. It may also identify patients who need PAC but are unable to receive it because of policy or insurance barriers. Future studies examining the outcomes of these patients may have policy implications,” said Bowles.
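As a loose illustration of a two-step referral structure like DIRECT's, consider the sketch below. The published algorithm's actual inputs, rules, and thresholds are not reproduced here; every feature and cutoff is an invented stand-in.

```python
# Illustrative sketch only: a two-step rule structure in the spirit of the
# DIRECT discharge-referral CDS. All features and cutoffs are invented.
def needs_postacute_care(p) -> bool:
    # Step 1: does this patient need post-acute care (PAC) at all?
    return p["comorbidities"] >= 3 or p["adl_dependencies"] >= 2 or p["age"] >= 80

def recommend_setting(p) -> str:
    # Step 2: skilled home care vs. facility-level care.
    if not needs_postacute_care(p):
        return "routine discharge"
    if p["adl_dependencies"] >= 4 or not p["caregiver_at_home"]:
        return "facility-level care"
    return "skilled home care"

patient = {"age": 82, "comorbidities": 4, "adl_dependencies": 2,
           "caregiver_at_home": True}
print(recommend_setting(patient))  # -> skilled home care
```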

6.4.10 Robotics and AI in nursing care The U.S. Bureau of Labor Statistics reports that there is a shortage of nurses and that the need will increase by 15% between 2016 and 2026 [160]. There is now a robot designed to help with nurse tasks and ease the pressure of understaffing. The robot's name is Moxi, designed and built by Austin-based Diligent Robotics. Moxi will not replace nurses but rather will handle up to 30% of the tasks nurses do that don't involve patient interaction. These tasks include, but are not limited to, things like running errands around the floor or delivering specimens to the lab for analysis. “It's hard to argue that we're taking anyone's job. Everyone is trying to make the nurses they have go further,” said the robot's developers, Andrea Thomaz and Vivian Chu. Nurses can set up rules and tasks that give the robot a command for a particular errand when certain things change in a patient's record (see Research in Nursing Care above). That means nurses wouldn't have to remember specific tasks that otherwise would be part of their daily job. This reduces the cognitive load on the nurse. The preprogrammed nature of Moxi's tasks doesn't mean that the robot never interacts with people. It is programmed for human-robot social interaction and carefully designed to be non-threatening and transparent in its actions. Moxi's job is to take as many mundane tasks as possible off nurses' plates so that they can spend more time interacting with patients, but beta trials revealed that patients enjoy interacting with the robot as well [161]. The threat of humanoid robots replacing nurses in nursing practice has become a topic of serious discussion [162]. Nonetheless, there is already a robotic revolution happening in nursing, and these robots have made tasks and procedures more efficient and safer rather than replacing nurses [163].

6.4.11 Population health (demographics and epidemiology) and AI in nursing care The passage of the Affordable Care Act (ACA) refocused health care from individual, patient-specific, episodic care toward population health management, the care of groups of people with an emphasis on primary and preventive care. Population health management assists these groups in attaining and maintaining health, with shared accountability for the environmental, social, and community factors that contribute to chronic disease and cost. The 3 million nurses in the United States [164], because of their role, their education, and the respect for their profession, are well-positioned to help shape and improve our nation's health status and care infrastructure. The Robert Wood Johnson Foundation Catalyst for Change report [165] has created an urgent call for harnessing the power of nurses in our communities, schools, businesses, homes, and hospitals to build capacity for population health. Informatics Nurse Specialists (INSs) are trained, prepared, and well-positioned to fulfill roles across practice, research, education, and policy to support this call. INSs are integral in actively supporting the nursing profession to build population health in the 21st century, aligned with the Catalyst for Change white paper [165]. INSs are critical partners to lead population health, care coordination across settings of care, and inter-professional and community stakeholder collaboration.

6.4.12 Precision medicine/health (personalized health) and AI in nursing care As described previously in Chapter 4 (page 101), precision medicine is defined as “the emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle for each person” [166]. The focus on prevention, management, and alleviation of symptoms is perhaps nursing science's most significant contribution to precision health [167]. As such, the NINR-funded P20/30 Centers, guided by the National Institute of Nursing Research's (NINR) Symptom Science Model [168], are planning to align symptom science with precision health, including “omics” approaches (e.g., epigenetics/genomics and the microbiome). Behavioral science, including self-management, as well as sociocultural and physical environments, is also included. Nursing scientists who focus on symptom and self-management science are incorporating the concept of precision health into their ongoing research. The concept is also significant for the development, testing, and targeting of nursing interventions across the health care continuum [169]. The NINR also emphasizes the development of symptom science and the application of precision health methods to the self-management of chronic conditions to address these major health concerns [170]. To reach the goal of precision health, approaches must be applied throughout research translation, from basic science to clinical research and, ultimately, at the population level to improve health and prevent disease. Nurse scientists need to become increasingly more knowledgeable and facile in integrating precision health approaches as they develop symptom and self-management interventions [171].

6.4.13 Healthcare analytics and AI in nursing care Predictive analytics falls under the umbrella of AI healthcare analytics (see Chapter 4, page 97). This type of advanced analytics allows nurses to discover previously unknown patterns in multiple sources of clinical and operational data that can guide better decision making. Through the use of predictive data, nurses can gain actionable insights that enable more accurate, timely, and appropriate interventions, delivered in a prescriptive way, for both patients and nurses. Predictive analytics can alleviate the untoward effects of taxing patient care that causes nurse dissatisfaction and burnout. Innovations in technology, including predictive analytics applications, can increase nurse satisfaction and improve this facet of the Quadruple Aim (a patient-centered approach, improving population health, containing costs, and making a difference on a large scale) [139]. Intelligent computer systems also assist the nursing process and its critical, organized thinking, expediting decision making by synthesizing valuable nursing skills and knowledge. There are compelling reasons why nurses should care about AI innovations. Artificial intelligence and predictive analytics can evolve nurses' thinking about care delivery and operational tasks in functionally disruptive ways that serve the Quadruple Aim. With this shift, nurses can begin to advocate for the adoption and use of AI in patient care delivery [139].

6.4.14 Preventive health and AI in nursing care Nurse practitioners (NPs) are registered nurses with advanced education, certification, and the clinical skills to perform comprehensive physical assessments, diagnose patients' conditions, prescribe treatments and medications, and take charge of a patient's overall care. Moreover, beyond the clinical and personal skills synonymous with nursing, NPs have an added emphasis on prevention in their patient care [172].

By offering education and counseling, nurses significantly advance preventive health efforts nationwide. Preventive health refers to a collection of strategies that health care professionals encourage patients to implement to stay healthy and to reduce the risk of future disease. Preventive measures such as screenings, physical examinations, and immunizations often are performed in accordance with demographic factors like age, gender, and family history [173]. A study published by the U.S. National Library of Medicine [174] highlighted the importance of nursing professionals working on the frontlines of patient care to aid preventive health care efforts. Nurses achieve this primarily through the dissemination of information that patients can harness to keep themselves as healthy as possible.

6.4.15 Public health and AI in nursing care Public health nursing (PHN) is a distinct and well-recognized area of nursing. It is defined by the American Public Health Association, Public Health Nursing Section, as the practice of promoting and protecting the health of populations using knowledge from nursing, social, and public health sciences [175]. The combination of a clinical nursing background with knowledge from the public health and social sciences provides a sound basis for public health leadership positions. Often used interchangeably with community health nursing, this nursing practice includes advocacy, policy development, and planning, which addresses issues of social justice. Public health nursing practice, utilizing AI tools, focuses on population health (discussed above) to promote health and prevent disease and disability. Public health nurses work with the individuals and families that compose communities and with the systems that affect those communities. They work in a variety of settings such as health departments, schools, homes, community health centers, clinics, correctional facilities, worksites, mobile vans, and even dog sleds [176].

6.4.16 Access and availability and AI in nursing care

A low nurse-to-patient ratio [177] often leads to tired, stressed nurses, affecting the quality of patient care. Projections from the Health Resources and Services Administration's (HRSA) Health Workforce Simulation Model (HWSM) highlight the inequitable distribution of the nursing workforce across the United States [178]. The HWSM projected a national RN excess of about 8% of demand and a national LPN deficit of 13% by 2030. For RNs, the state-level projections show deficits in several states and significant variations in oversupply in others. Similarly, national estimates of LPNs in 2030 show maldistribution among states. These findings underscore the potential difficulties in ensuring adequate nursing workforce supply across the United States. Looking to the future, many factors will continue to affect demand for and supply of nurses, including demand for health services broadly and within specific health care settings [179]. Emerging care delivery models such as Accountable Care Organizations (ACOs) could change the way that RNs and LPNs deliver services. However, there is currently insufficient information to project the extent to which these new delivery models will materially affect the demand for nurses.

The value of nursing access and availability could not have been better demonstrated than by nurses' critical and invaluable role during the COVID-19 pandemic. Without their frontline clinical and supportive care, virus-infected patients, inundated hospitals, and the very health and well-being of nations throughout the world could not have survived. Their dedicated, compassionate service, at the risk of their own health and well-being, will be appreciated and remembered forever by a grateful humanity.

6.5 Home health care, nursing homes and hospice care

6.5.1 Big data analytics and AI in home health, nursing homes, and hospice care

Nursing informatics [180] is a specialized field of research that utilizes medical data to support the practice of nursing, focusing specifically on home healthcare and nursing home care. The goal is to help clinicians make better decisions about patient care and to help home care patients achieve better health outcomes. Protected data within the home health and health care industries present some barriers to adopting new big data analytics. However, the widespread use of EHRs is providing increasing, significant, and rapid change in access to data. In 2010, fewer than 50% of hospitals were using electronic records; today, it's nearly 90%. Legal, regulatory, and legislative issues confronting home health and hospice operators are very complicated and seriously require big data analytic support. Three problems are emerging that the National Association for Home Care & Hospice (NAHC) believes will continue to be critical concerns for some time to come, because of increased bureaucratic interference with service operators [181]: 1) the Patient-Driven Groupings Model (PDGM), which represents the most significant change to the payment system in the 21st century; the PDGM, devised by the Centers for Medicare & Medicaid Services (CMS), will subject home health providers to an 8.01% cut in the base rate; 2) the hospice "carve-in," wherein the Medicare Advantage benefits package would subject hospice care decisions to an additional layer of financial and utilization controls, fragmenting existing hospice benefits and diminishing their value; 3) the Report on Hospice Quality of Care, which provides for the Office of Inspector General to consider recommendations for increasing hospice responsibility related to the potential incidence of abuse.

6.5.2 Health information and records (EHR) and AI in home health, nursing homes, and hospice care

Home healthcare and hospice organizations are planning to expand interoperability initiatives by 30% in 2019 to improve patient data management, according to a Brightree survey [182]. The survey asked 675 home healthcare and hospice providers, as well as 440 of their referral sources, about their opinions on interoperability and electronic referrals. The study generated 4 insights: 1) Seventy percent of home healthcare and hospice organizations said that within the past 1–2 years, there has been an increase in the number of referral sources that request data to be sent electronically; 2) More than half (60%) of referring providers said they would be willing to switch to a new post-acute provider if that organization could accept electronic referrals; 3) Just 4% of home healthcare and hospice organizations said they were able to accept electronic referrals from a referral source's EHR system; 4) Thirty-one percent of post-acute care providers said they would switch EHRs if they found a vendor that could better support their interoperability needs.

6.5.3 Research/clinical trials and AI in home health, nursing homes, and hospice care

The National Association for Home Care & Hospice conducts in-depth research studies on varying aspects of the home care and hospice industry. The 2 most recent include a study conducted by the Alliance for Home Health Quality & Innovation in conjunction with the Cleveland Clinic, "Optimizing Home Health Care: Enhanced Value and Improved Outcomes" [183]; and a second, sponsored by the Alliance for Home Health Quality & Innovation in conjunction with the research firm Dobson DaVanzo & Associates, "The Clinically Appropriate and Cost-Effective Placement Project" [184]. "Optimizing Home Health Care: Enhanced Value and Improved Outcomes" demonstrated that as the baby-boom population ages, there is an increasing demand for home care as more people choose to age at home. The changing political and fiscal landscape serves as a signal to policymakers and other health care stakeholders as they look to reform post-acute care and make better use of home health. "The Clinically Appropriate and Cost-Effective Placement Project" examines how Medicare's use of home health can better meet beneficiary needs and improve the quality and efficiency of care provided within the U.S. health care system. The project analyzed a 5% sample of 3 years of Medicare fee-for-service claims data.

6.5.4 Blockchain and AI in home health, nursing homes, and hospice care

The aging population encounters daily difficulties and frustrations in navigating the health care system. They are required to repeat their history and symptoms over and over to providers. Poor reporting causes delays in care and undue anxiety at the complexity of the system. These problems present the scenario for blockchain technology. A company (Wellderly [185]), in conjunction with other industry partners, is building the world's first blockchain-based platform for elderly wellness and care. The platform coordinates health and well-being services to meet the needs of aging citizens. Some of these services include adopting a personalized and integrated approach for the elderly, with the following capabilities and benefits:

• Allow longitudinal data of the elderly to be securely stored on the blockchain, to which they can grant access for the service providers they have engaged;

• Link all product and service providers for eldercare with the elderly, their families, and caregivers to form a seamless ecosystem where relevant information is readily accessible to them;

• Motivate the elderly to actively participate in eldercare education and voluntarism with incentives to earn vouchers for senior citizen products and services.
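A minimal sketch of the hash-chained record idea behind such a platform follows: each care event links to the previous block's hash, and an authorization list stands in for granting providers access. All field names and events are hypothetical; a production platform would involve far more (consensus, key management, privacy layers).

# Minimal sketch: hash-chained elder-care records with an access list.
# Fields and events are hypothetical, for illustration only.
import hashlib, json, time

def make_block(prev_hash, event, authorized):
    body = {"prev": prev_hash, "ts": time.time(), "event": event,
            "authorized": authorized}
    # Hash is computed over the block contents before the hash field is added.
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

chain = [make_block("0" * 64, {"type": "genesis"}, [])]
chain.append(make_block(chain[-1]["hash"],
                        {"type": "home_visit", "note": "BP 142/88"},
                        ["visiting_nurse", "primary_physician"]))

# Any participant can verify the chain has not been altered.
for prev, block in zip(chain, chain[1:]):
    assert block["prev"] == prev["hash"], "chain broken"
print("chain verified,", len(chain), "blocks")

The design point is the one the bullet list makes: because every provider reads from the same tamper-evident chain, the patient never has to re-gather and re-send records, and access grants travel with the data.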

6.5.5 Internet of Things (IoT) and AI in home health, nursing homes, and hospice care

Due to its greater interconnectivity and data sharing, along with its operational advantages, the Internet of Things (IoT) is an ever-increasing force in business and health care. To be able to connect any person, particularly the elderly, with any device in virtually any location is a concept of enormous value in health care. The evolution of IoT is transforming the very nature of remote patient monitoring (RPM). RPM is designed to deliver care via a series of interconnected, home-based devices that capture and monitor patient data. With innovative technologies like smart sensors [186], healthcare facilities, nursing homes, and home care agencies now have the power not only to gather and access a wide variety of data but also to instantly leverage that data to deliver the best possible care while reducing time and wasteful costs. The benefits of IoT are already apparent in the improved outcomes enjoyed in progressive hospitals and health facilities. Such benefits are especially recognized throughout many homes and home care settings, where unobtrusive but highly responsive devices wire patients directly to their off-site caregivers [187].
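To make the RPM pipeline concrete, here is a minimal sketch of rule-based triage over incoming device readings; the metrics, thresholds, and device names are illustrative assumptions, not from the text.

# Minimal sketch: home devices post readings; simple rules escalate to an
# off-site caregiver. Thresholds and device names are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    device: str
    metric: str
    value: float

ALERT_RULES = {  # hypothetical acceptable ranges per metric
    "heart_rate": (50, 110),
    "spo2": (92, 100),
}

def triage(reading: Reading) -> Optional[str]:
    low, high = ALERT_RULES.get(reading.metric, (float("-inf"), float("inf")))
    if not (low <= reading.value <= high):
        return f"ALERT {reading.device}: {reading.metric}={reading.value}"
    return None

for r in [Reading("wrist_sensor", "heart_rate", 118),
          Reading("pulse_ox", "spo2", 96)]:
    msg = triage(r)
    if msg:
        print(msg)  # in a real system, pushed to the caregiver's dashboard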

6.5.6 Telehealth and AI in home health, nursing homes, and hospice care

Insurance companies are increasingly expanding coverage for telehealth and telecare programs. The Centers for Medicare & Medicaid Services (CMS) has also announced limited support of telemedicine [188]. This coverage is excellent news for home healthcare, nursing home, and hospice providers, since telehealth can virtually connect patients with doctors and show them any patient changes or issues. Telehealth has the greatest potential to transform home healthcare and hospice agencies by improving patient care and efficiencies and by streamlining a care provider's day. Many home care, nursing home, and hospice administrators report that telehealth and kiosks are already helping reduce costs for their agency's patient population. The technology can also lead to better communication internally and externally, as well as fewer silos (isolations of data) within an organization [189].

6.5.7 Chatbots and AI in home health, nursing homes, and hospice care

What if, at least initially, you didn't have to talk to another human about your end of life? What if your "end-of-life" conversation was with a machine? A team at Northeastern University in Boston is exploring this [190]. They've begun a trial in which they're introducing terminally ill patients to chatbots able to converse with humans. According to the research team, "Patients tend to be referred to palliative care much too late. Something like a third of patients moved to a hospice die within a week." Instead, perhaps people with a short life expectancy could use technology with AI to help prepare themselves logistically, emotionally, even spiritually for their deaths. After a voice greeting, the chatbot provides a choice of responses on the touchscreen. The interaction is tightly scripted to keep the conversation focused and avoid the communication breakdowns that can occur with even the most intelligent machines. That also protects the patient from revealing too much personal information. The chatbot also presents the option to expand the conversation beyond the person's physical condition, perhaps to discuss "end of life" planning and spiritual considerations. The program doesn't generate documents but enables family members or caregivers to see if, what, and when a patient is ready to talk [191].
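A minimal sketch of what such a tightly scripted interaction could look like as a simple state machine: at each step the patient picks from fixed touchscreen choices rather than typing free text. The script content is hypothetical, not from the Northeastern trial.

# Minimal sketch: a tightly scripted conversation as a state machine.
# Prompts and options are hypothetical.
SCRIPT = {
    "start": ("How are you feeling today?",
              {"Comfortable": "goals", "In pain": "symptoms"}),
    "symptoms": ("Would you like your care team notified about your pain?",
                 {"Yes": "end", "Not now": "goals"}),
    "goals": ("Would you like to talk about planning for the future?",
              {"Yes": "end", "Maybe later": "end"}),
    "end": ("Thank you. Your family or caregiver can review this later.", {}),
}

def run(choices):
    """Walk the script with a predetermined list of touchscreen choices."""
    state = "start"
    for choice in choices:
        prompt, options = SCRIPT[state]
        print(f"BOT: {prompt}  ->  USER: {choice}")
        state = options[choice]
    print(f"BOT: {SCRIPT[state][0]}")

run(["In pain", "Not now", "Yes"])

Constraining the patient to scripted choices is exactly what keeps the conversation focused and prevents the breakdowns (and over-disclosure) that free-text dialogue would risk.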

6.5.8 Natural language processing (NLP) and AI in home health, nursing homes, and hospice care

NLP can be applied to extensive datasets through data-mining of the EHR, enabling widespread implementation and population-based assessment at low cost. Because of these strengths, NLP methodology can be applied to study documentation in EHR notes against guidelines regarding recommended processes of care: goals-of-care conversations, clarifying code status, and assessment for nursing home, hospice, and palliative care consultation. Building a key-term library is critical for implementing this NLP concept. Researchers manually reviewed medical records to identify documentation of each of these processes of care. These key-term libraries were refined and validated by manual review of notes flagged by NLP, as well as manual review of records not flagged by NLP. This iterative process resulted in improvements in NLP performance over time and a "gold standard review." Clinical chart reviews, laboratory, and imaging studies were manually performed, and assessments for hospice and palliative care consultation were conducted. NLP was then performed, and its results were compared with findings from the gold standard chart review. The NLP libraries had high sensitivities and specificities, ranging from 93.8% to 100%, and the NLP search abstracted these records and provided a structured dataset in just 26 seconds. By comparison, manual review and data entry required over 20 hours to complete. This study suggests that established palliative care quality benchmarks are applicable in palliative surgery and can be rapidly and accurately implemented using NLP [192].
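The core mechanics, a key-term library flagging notes that are then validated against a manually reviewed gold standard, can be sketched in a few lines; the terms, notes, and labels below are illustrative, not the study's actual library.

# Minimal sketch: a key-term library applied to notes, scored against a
# manual "gold standard" review. Terms, notes, and labels are illustrative.
import re

PALLIATIVE_TERMS = [r"goals of care", r"\bDNR\b", r"hospice", r"comfort care"]

def flag(note):
    return any(re.search(t, note, re.IGNORECASE) for t in PALLIATIVE_TERMS)

notes = ["Discussed goals of care with family.",
         "Patient ambulating well, continue PT.",
         "Code status clarified: DNR.",
         "Pain controlled; no new issues."]
gold = [True, False, True, False]  # labels from manual chart review

flags = [flag(n) for n in notes]
tp = sum(f and g for f, g in zip(flags, gold))          # true positives
tn = sum(not f and not g for f, g in zip(flags, gold))  # true negatives
sensitivity = tp / sum(gold)
specificity = tn / (len(gold) - sum(gold))
print(f"sensitivity={sensitivity:.0%} specificity={specificity:.0%}")

The iterative refinement the study describes amounts to adjusting the term list whenever the flags disagree with the manual review, then re-scoring, which is how the libraries reached 93.8-100% sensitivity and specificity.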

6.5.9 Robotics and AI in home health, nursing homes, and hospice care

Continuing advancements in AI and related tools like robots are helping bring the more traditional concept of physical care into reality. The real challenge may be building and cost-effectively deploying such technologies. Advancements have already included combining proven communication capabilities, like video and chatbots, with a robot's ability to move around remotely, so the person with the robot doesn't have to control it or know how it works. These home robots can also serve as the physical interface for numerous digital technologies and interventions, such as socialization chatbots or cognitive exercises for lonely seniors. The growing number of seniors and chronic disease patients continues to drive demand for new in-home, nursing home, and consumer-facing care tools. It's anticipated that hardware-based AI will likely play an increasing role in the industry alongside its software-driven predecessors [193].

6.5.10 Population health (demographics and epidemiology) and AI in home health, nursing homes, and hospice care

The U.S. health care system is moving towards value-based payment models (pay for outcomes rather than fee-for-service), and Medicare Advantage (MA) and other health plans are incentivized to provide higher quality care at a reduced cost. This is done by eliminating non-beneficial and wasteful care [194]. Community-based palliative care programs are associated with decreased hospitalizations, nursing home admissions, and costs in the last months of life [195]. An MA study incorporated key components of population health, including proactive identification, multidisciplinary team care management, phone and home visits, emphasis on care coordination, collaboration with physicians and health plans, and leveraging a mobile platform to support workflows and reporting [196]. The clinical team also followed a consistent care process guided by AI-standardized assessments, intervention-based care paths, and real-time clinical dashboards. A population health, community-based palliative care program staffed by nurses and social workers was associated with lower costs, decreased hospitalizations and ICU days, and increased hospice utilization, while improving care quality and member satisfaction. Care that is primarily driven by the values, goals, and preferences of seriously ill individuals and their family members results in more compassionate, affordable, sustainable, and high-quality care [197].

6.5.11 Precision medicine/health (personalized health) and AI in home health, nursing homes, and hospice care

Precision medicine is an emerging approach to disease treatment and prevention that considers differences in people's lifestyles, environments, and biological makeup, including genes. The All of Us Research Program [198] partners with 1 million diverse people who share information about themselves over many years, intending to enable research to more precisely prevent and treat a variety of health conditions.

Participants are asked to share different types of health and lifestyle information through online surveys and electronic health records (EHRs), which will continue to be collected throughout the program. Participants will be asked to visit a local partner site to provide blood and urine samples and to have basic physical measurements taken, such as height and weight. In the future, participants may be invited to share data through wearable devices and to join follow-up research studies, including clinical trials. The Program is partnering with trusted community partners, such as the National Hispanic Council on Aging (NHCOA) [199], a Diverse Elders Coalition member organization, to ensure that its outreach efforts are culturally and linguistically competent. Factors such as the current political climate, which has created fear and unwillingness to reveal personal information to the U.S. government, have been considered by All of Us leadership, and protections have been developed.

6.5.12 Healthcare analytics and AI in home health, nursing homes, and hospice care

Useful data and analytics are critical in home health care. Metrics that need frequent checking include the status of claims in the billing process, which claims came back "Return to Provider" (RTP) or rejected, and the reason codes for those claims. If there is still uncertainty about taking necessary action based on your claim data, a comparative analytics solution [200] offers insight into critical metrics like claim processing time, denials, and the amount paid on claims. Beyond the challenges of Medicare billing, home health agencies and nursing homes also must deal with additional rules, deadlines, and possible penalties around Requests for Anticipated Payment (RAPs). Canceled RAPs and paid RAPs are just a few of the numbers to track to keep a better handle on your deadlines and takebacks. The agency must also assess the risk of Z-RAPs [201], where RAPs pay at 0% instead of 60% of the final payment. And of course, home health agencies depend heavily on hospital referrals, and a great way to foster those referrals is to keep your hospital readmission rates low. Regularly monitoring and benchmarking your re-hospitalization rates against CMS best practice standards will put you in a better position to catch and fix any problems before re-admissions get too high.
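A minimal sketch of the kind of claim-level metrics described above follows; the claim records, counts, and 15% benchmark are invented for illustration.

# Minimal sketch: RTP rate, mean days to payment, and a readmission-rate
# check against a benchmark. All records and the benchmark are illustrative.
claims = [
    {"id": 1, "status": "paid", "days_to_pay": 21},
    {"id": 2, "status": "RTP", "days_to_pay": None},
    {"id": 3, "status": "paid", "days_to_pay": 34},
    {"id": 4, "status": "denied", "days_to_pay": None},
]
rtp_rate = sum(c["status"] == "RTP" for c in claims) / len(claims)
paid = [c["days_to_pay"] for c in claims if c["status"] == "paid"]
print(f"RTP rate: {rtp_rate:.0%}, mean days to payment: {sum(paid)/len(paid):.1f}")

admissions, readmits_30d = 120, 22   # hypothetical agency counts
benchmark = 0.15                     # illustrative benchmark, not a CMS figure
rate = readmits_30d / admissions
if rate > benchmark:
    print(f"Readmission rate {rate:.1%} exceeds benchmark {benchmark:.0%}")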

6.5.13 Preventive health and AI in home health, nursing homes, and hospice care

A digital health company, CarePredict [202], has produced an AI-supported preventive healthcare solution for senior adults. The AI platform consists of lightweight sensors and wearables (IoTs), designed for seniors, that unobtrusively collect rich data sets on seniors' activities and behavior patterns. These activities include, but are not limited to, eating, drinking, walking, grooming, sleeping, bathing, and toileting. Deep learning models trained on these activity sets are used to surface insights such as signs and symptoms of self-neglect indicative of depression, unusual toileting patterns characteristic of a urinary tract infection (UTI), or increased fall risk due to malnutrition, gait changes, lack of rest, and dehydration. This platform serves as a monitoring solution for parents and grandparents who want to age in the comfort of their own home in a safe, smart, and sustainable way. At the same time, it empowers family members with constant visibility and unparalleled insights into the evolving health of their loved ones, allowing them to make the right decisions well in advance [202].

6.5.14 Public health and AI in home health, nursing homes, and hospice care

TRICARE [203] covers hospice care for military members and veterans in the United States, the District of Columbia, and U.S. Territories, administered through an AI-supported big data program under the following guidelines:

• The patient, primary physician, or authorized family can initiate hospice care.

• Hospice care will only start with a doctor's order.

• The patient must complete an "election statement" and file it with the regional contractor.

Hospice care is not covered in any other overseas areas (outside of U.S. Territories). There are 4 levels of care within the hospice benefit: 1) continuous home care; 2) general hospice inpatient care; 3) inpatient respite care; 4) routine home care.

Care within the 4 levels may include physician services, nursing care, counseling, medical equipment, supplies, medications, medical social services, and physical and occupational services. Also included are speech and language pathology and hospice short-term acute patient care related to the terminal illness. Because hospice care emphasizes supportive services, such as pain control and home care, the benefit allows for home health aides and personal comfort items, which are limited under TRICARE's main coverage programs. However, services for an unrelated condition or injury, such as a broken bone or unrelated diabetes, are still covered as a regular TRICARE benefit. TRICARE does not cover room and board unless the patient is receiving inpatient or respite care. Patients cannot receive other TRICARE services or benefits (curative treatments related to the terminal illness) unless the hospice care is formally revoked. No care for the illness is covered by TRICARE unless the hospice provides or arranges for the care.

6.5.15 Access and availability and AI in home health, nursing homes, and hospice care

As of 2016, there were 12,200 home health agencies, 15,600 nursing homes, and 4,300 hospice care agencies in the U.S. [204]. The National Center for Health Statistics: Vital and Health Statistics published comprehensive analytical and epidemiological studies in February 2019 [205]. In 2016, about 65,600 paid, regulated, long-term care services providers in 5 major sectors served more than 8.3 million people in the United States. This report provides information on the supply, organizational characteristics, staffing, and services offered by providers, and the demographic, health, and functional composition of, and adverse events among, users of these services. Service users include residents of nursing homes and residential care communities, patients of home health agencies and hospices, and participants of adult day services centers.

6.6 Concurrent medical conditions ("comorbidity," aka "multimorbidity") (See also, Chronic Illnesses, Chapter 7, page 411)

Throughout this section's discussion on comorbidities, you will begin to get a sense of the magnitude of the problem of concurrent medical conditions, especially in the aging population worldwide and now, among SARS-CoV-2 infected patients. These combined conditions encompass physical as well as mental disorders in patients. And because of their propensity towards the elderly, bodily injury also becomes an instigating complication. These confluent issues make comorbidities a demonstrable public health issue. Fig. 6–1 graphically demonstrates the age distribution of comorbidities, showing the number of comorbidities and their frequency across increasing age ranges.

FIGURE 6–1 Comorbidities by age. Combined conditions (“comorbidities”) encompass physical as well as mental disorders in patients with their greatest frequency for occurrence in the elderly population. This makes the issue of comorbidities a demonstrable public health issue. Source: European Respiratory Journal, Lancet 2007.

Effective January 2018, the National Library of Medicine designated specific MeSH (Medical Subject Headings, the National Library of Medicine's official vocabulary thesaurus) definitions for concurrent medical conditions. A classification term for multimorbidity, distinct from comorbidity, was assigned. Nicholson et al. [206] re-emphasize that this is more than a semantic difference. While both terms focus on the occurrence of multiple chronic conditions within the same individual, the term "comorbidity" refers to the combined effects of additional conditions relative to a chronic index condition (such as comorbidity in diabetes mellitus, stroke, depression, or cancer). In comparison, the term "multimorbidity" indicates that no single condition holds priority over any of the co-occurring conditions, from the perspective of either the patient or the health care professional.

6.6.1 Big data analytics and AI in concurrent medical conditions ("comorbidity")

Long-term post-acute care (LTPAC), skilled nursing facilities (SNFs), and home health agencies are major players in helping patients recover from serious events and in supporting medically frail individuals, especially the elderly with comorbidities. Unfortunately, many of these facilities are not integrated digitally with their physician and hospital partners, which leads to incomplete information and fragmented treatment plans. Managing complex, high-risk patients, such as elderly individuals with multiple comorbidities, requires payers, providers, and LTPAC organizations to share more data and collaborate more closely, which is a high priority among many stakeholders. Learning to use more readily available data, like demographics, ICD-10 codes, and ADT alerts, is a vital step toward eventually integrating much more complex and varied big data into comorbid population health management. Healthcare stakeholders that aim to create highly personalized preventive care and chronic disease management initiatives must work with their partners, AI health systems, and peers. This allows them to access and analyze big data that can meaningfully improve the delivery of quality patient care [207].

6.6.2 Health information and records (EHR) and AI in concurrent medical conditions ("comorbidity")

The EHR has proven unequivocally to be an effective tool in research studies used to discover patterns of disease susceptibility and comorbidity [208], and in building EHR data sets. Linking the EHR to other -omics data types within a network biology framework has also made significant contributions to better understanding various risk factors in disease etiology. These include genetics [209], environment, demography, and combinations thereof. A study was conducted to determine the relationship between disease comorbidities and commonly shared genetic architecture of disease. Records of pairs of conditions were combined from 2 different electronic medical record systems (Columbia and Stanford) and compared to an extensive database of published disease-associated genetic variants (VARIMED). Data on 35 disorders were available across all 3 sources, including medical records for over 1.2 million patients and findings from over 17,000 publications. Disease pairs were categorized as having predominantly clinical, genetic, or both kinds of manifestations. Confounding effects of age on disease incidence were controlled for by only comparing diseases falling in the same cluster of similarly shaped incidence patterns. This study, using AI, presented a method for integrating clinical EMR and genetics data to better understand the nature of disease comorbidity. It identified sets of disease pairs that deviate from the assumption of independent co-occurrence in 2 different EMR systems. By integrating the clinical observations with genetics, the study was able to categorize which disease pairs might be explained by shared genetics and which might have more of an environmental component [209].
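The statistical heart of such a study, testing whether a disease pair co-occurs more often than independence would predict, can be sketched with made-up counts:

# Minimal sketch: does a disease pair co-occur more than chance predicts?
# All counts are invented for illustration; they are not the study's data.
n_patients = 1_200_000
n_a = 60_000    # patients with disease A
n_b = 45_000    # patients with disease B
n_ab = 4_800    # patients with both A and B

expected_ab = n_a * n_b / n_patients   # expectation under independence
relative_risk = n_ab / expected_ab
print(f"expected={expected_ab:.0f} observed={n_ab} RR={relative_risk:.1f}")
# An RR well above 1 flags comorbidity beyond chance; whether shared genetics
# or environment explains it is then probed against databases like VARIMED.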

6.6.3 Research/clinical trials and AI in concurrent medical conditions ("comorbidity")

The previous section on EHRs and comorbidity showed how organic diseases can be paired within an individual based on common clinical and genetic factors. Mental disorders have also been studied, with some similar patterns emerging. Comorbidity in mental illness can include a situation where a person receives a medical diagnosis that is followed by the diagnosis of a mental disorder (or vice versa), or it can involve the diagnosis of a mental disorder that is followed by the diagnosis of another mental disorder. As with the EHR studies, these comorbidities with mental disorders were revealed in clinical trials through the use of big data analytics and machine learning algorithms. A large cross-sectional national epidemiological study in 2009 of comorbidity of mental disorders in primary care in Spain, published in the Journal of Affective Disorders [210], showed that among a sample of 7936 adult patients, about half had more than 1 psychiatric disorder. Furthermore, in the U.S. National Comorbidity Survey, 51% of patients with a diagnosis of major depression also had at least 1 anxiety disorder, and only 26% of them had no other mental disorder [211].

6.6.4 Blockchain and AI in concurrent medical conditions ("comorbidity")

As an example of blockchain usage with comorbidity, consider a patient sent by their physician to a specialist to check for cardiovascular comorbidities such as hypertension, and then sent on to another specialist to treat or prevent another disorder. Within the blockchain, such patients have a better chance of managing their disease and comorbidities because they would not need to take the time to gather all their health records from multiple doctors to send to their new specialist. All additional specialists in the patient's care would be added to the existing blockchain, from which the patient can access the same information as everyone else already participating in the chain. All the participants in the blockchain have the additional bonus of knowing that the information transmitted between different providers has undergone validation, with a reduction in lost data (e.g., indecipherable handwriting). Beyond ease of use, there are other benefits to be gained from the blockchain, such as enhanced patient-doctor communication, real-time emergency alerts, and increased preventive care through the empowerment of an informed patient [212].

With the aging demographic, there is a growing number of people living with comorbidities and non-communicable diseases and in need of complex care interventions. A nursing co-designed, blockchain-based, value-based healthcare system has been developed with the potential to strengthen continuity of care and nurses' advanced roles in care pathway coordination. Blockchain technology goes beyond the solutions provided by paper files or electronic health records (EHR) with a seamless and secure way to capture, track, and share a citizen/patient's entire health experience. It combines personal data on patients with comorbidities with primary care and public health data. This facilitates and enables the transition from fee-for-service payments, which prioritize the volume of medical actions over effective and efficient people-centered care, towards value-based reimbursement models that prioritize quality outcomes and continuity of care.

The European Commission has recognized the considerable potential of blockchain-inspired technologies for administrations, businesses, and society in general [213]. In this context, the EU has already allocated millions to blockchain-related projects, with potentially more to be committed from 2018 to 2020. The Commission recently launched the EU Blockchain Observatory and Forum [214], mapping existing initiatives on blockchain, monitoring related trends and developments, informing policy debates, and inspiring collective actions based on specific use-cases. Blockchain can significantly contribute to enabling nurses to deliver on access to health and social care through the digitalization of health and care. Engaging end-users with comorbidities and local frontline nurses in co-designing 'fit for purpose' health and social care systems is the first step forward on this path [215].

6.6.5 Telehealth and AI in concurrent medical conditions ("comorbidity")

Patients who benefit most from telehealth applications are usually (1) older, (2) affected by multimorbidity, and (3) on polypharmacy (i.e., using 5 drugs or more concurrently). This implies a challenging situation for the patient, caregiver, and healthcare professional [216]. Telehealth has considerable potential to generate real-world evidence when the special needs of the affected patients are considered while setting up a telehealth system. Cognitive and physical impairments should be considered, as older and multimorbid adults may, for example, have usability problems with common smartphone apps [217]. Data completeness on drug therapy includes all drugs that have been prescribed, dispensed, and taken. Linking the monitoring system to a comprehensive EHR system containing medication information can give healthcare professionals a comprehensive overview of the patient's current medication status [218]. Combining these data with patient-reported medication intake from telehealth systems can prevent adverse effects resulting from, e.g., drug-drug interactions, intolerance, or double prescriptions. Integrated telecare programs implemented for comorbid patients have shown improved clinical outcomes, self-management, and quality of life. However, different patient populations benefit in different ways from these care plans. Thus, continuous evaluation and service adaptation in a real-life environment, with clear outcome metrics, are required for best results [219].

6.6.6 Chatbots and AI in concurrent medical conditions ("comorbidity")

A project called CONSULT (Collaborative mObile decisioN Support for managing mULtiple morbiDiTies) was developed to explore the feasibility of employing a collaborative decision-support tool to help patients suffering from chronic diseases self-manage their treatment plans. By 'collaborative,' it is meant that the patient, caregivers, and medical professionals work as a team to decide on the best treatment plan for the patient [220]. The conversational component of the CONSULT system uses a chatbot integrated with the patient's EHR. Interactions with the chatbot are supported by argumentation-based dialogue [221]. Additionally, the patient may have questions regarding their current treatment plan (e.g., why a particular medication has been prescribed). All the explanations are generated by the argumentation engine and displayed on the personalized dashboard. The conversational component can also alert the patient to an irregularity in 1 or more of their recent measurements and initiate a conversation. The purpose of such a discourse is to find a possible solution and suggest that the patient review specific clinical findings, or to advise the patient to contact their health care provider. Patients with different characteristics in terms of risk factors, comorbidity, or demographic groups are participating in studies to evaluate the usability of the proposed system. Results have shown that argumentation is a promising method for explaining decisions to help patients choose a treatment plan together with their doctor. The use of argument and attack schemes specialized for the medical domain will be a next step toward generating better explanations [222].

6.6.7 Natural language processing (NLP) and AI in concurrent medical conditions ("comorbidity")

To effectively identify phenotype-genotype associations, extracting comorbidity information is essential because of comorbidities' confounding effects on clinical outcomes [223]. To address this challenge, an automated method was developed that accurately determines comorbidities from electronic medical records [224]. Using a modified version of the Charlson comorbidity index (CCI) [225], a reference standard of comorbidities was created through a manual review of 100 admission notes. The notes were processed using the MedLEE natural language processing system [226], and queries were written to extract comorbidities automatically from its structured output. Natural language processing (NLP) can be used to build a generalizable method for extracting comorbidities from electronic health records (EHR). The goal, therefore, was to develop an effective, automated, and generalizable NLP-based method that derives comorbidities from narrative records. The system was able to derive comorbidities automatically from narrative admission notes with high accuracy. This method has higher sensitivity and accuracy than determining comorbidities from claims data and has the additional advantage of utility in prospective studies that need to identify phenotypes from medical records and correct for the confounding effects of comorbidities [227].
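As a sketch of how NLP output might feed a comorbidity index, the code below sums classic CCI-style weights over extracted conditions; the weights shown follow the familiar Charlson pattern but stand in for the study's modified index.

# Minimal sketch: turning NLP-extracted comorbidities into a CCI-style score.
# Weights follow the classic Charlson pattern and are illustrative; a real
# implementation would use the study's modified index.
CCI_WEIGHTS = {
    "myocardial_infarction": 1, "congestive_heart_failure": 1,
    "diabetes": 1, "moderate_severe_liver_disease": 3,
    "metastatic_solid_tumor": 6,
}

def cci_score(extracted):
    """Sum weights for comorbidities the NLP system found in the note."""
    return sum(CCI_WEIGHTS.get(c, 0) for c in extracted)

nlp_output = {"diabetes", "congestive_heart_failure"}  # e.g., from an NLP system
print("CCI =", cci_score(nlp_output))  # CCI = 2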

6.6.8 Expert systems and AI in concurrent medical conditions ("comorbidity")

The application of expert systems in medicine has broadened and now covers many areas, especially the use of a reasoning process in decision-making. Through the self-learning ability of machine learning, the reliability and accuracy of these programs have improved significantly in multiple disease categories, including mental and physical illnesses, which are often found as comorbidities [228]. An example of an expert system used to manage pulmonary diseases, commonly presenting as comorbidities in chronically ill patients, was studied using clinical data from 189 patients. The study compared the diagnostic accuracy of practitioners and medical students using the expert system against diagnostic gold standards. The goal was to help improve the expert system's diagnostic accuracy while minimizing errors and costs in diagnosis and broadening practitioner knowledge [229]. It was concluded that the accuracy of the system is enhanced with the increasing total amount of data provided for each patient. In particular, additional objective and reliable data provide much better accuracy for the output of the expert system, as expected. Further improvement in the performance and accuracy of the system may be obtained by designing the program with AI self-learning ability as well [230].

6.6.9 Robotics and AI in concurrent medical conditions ("comorbidity")

Socially assistive robots are now being used for patients with comorbidities to help manage their chronic disease condition(s). A study was conducted with patients with chronic obstructive pulmonary disease (COPD) and at least 1 other disease condition. Adherence to medication and availability of rehabilitation were suboptimal in the patient group, increasing the risk of rehospitalization. The study aimed to investigate the effectiveness of a robot delivering telehealth care in increasing adherence to medication and home rehabilitation, improving quality of life, and reducing hospital readmission compared with a standard care control group. The study group consisted of 60 randomized patients who were provided with a robot at home for 4 months. The robot group did not show a significant reduction in rehospitalizations. However, 75% of the patients reported appreciating the robot's capacity to offer companionship, which may provide benefits over other kinds of platforms such as computers or iPads. The intervention did improve adherence to both medication and rehabilitation exercises [231].

6.6.10 Population health (demographics and epidemiology) and AI in concurrent medical conditions ("comorbidity")

The financial and clinical success of population health management programs takes much more into account than what happens to patients in their doctor's office. Patients must be assisted in overcoming socioeconomic barriers to truly improve their health. A Robert Wood Johnson Foundation study [232] found that funding must be directed to community improvements that can reduce downstream medical costs. A 20% increase in the median social-to-health spending ratio was equivalent to 85,000 fewer obese adults and more than 950,000 fewer adults with mental illness. The study added that this significantly reduced the associated spending for these conditions and their comorbidities. The World Health Organization defines social determinants as "the conditions in which people are born, grow, work, live, and age and the wider set of forces and systems shaping the conditions of daily life" [233]. Population health management programs that address these manageable pieces of the great American puzzle can successfully change many lives for the better. Some of the more obvious pieces include [234]:

• Safe and secure housing;

• English language proficiency and cultural understanding;

• Health literacy and educational level;

• Transportation access;

• Access to healthy, nutritious food choices;

• Public safety and interpersonal violence;

• Social support and caregiver availability.

6.6.11 Precision medicine/health (personalized health) and AI in concurrent medical conditions ("comorbidity")

AI in health care is projected to grow 10-fold [235]. As described throughout this text, AI's applications are almost endless, including robot-assisted surgery, virtual nursing assistants, dosage control, automated workflow administration, and more. And while these exciting technologies have enormous immediate value, they are only scratching the surface of AI's more significant contribution to health care: intelligent diagnostics and precision medicine. AI's machine learning process gets smarter as it learns from large volumes of high-quality data. As biometrics improves its data collection capabilities (with the IoT, pill-sized cameras, fitness bracelets, gene expression analysis, and beyond) and healthcare software and algorithms get better at organizing and analyzing data, we're due for a revolution in how diseases are identified and treated. AI will be able to compare a patient's health to an extensive database of genetics, environment, and behavior to optimize and improve treatments of disease entities and to identify paired entities associated with chronic illness and comorbidity [236].

6.6.12 Healthcare analytics and AI in concurrent medical conditions ("comorbidity")

Most chronic health conditions, such as diabetes, are accompanied by other complications and symptoms like hypertension, fatigue, etc. Finding associations between these conditions helps us identify comorbidity patterns. The field of data analytics can contribute immensely to health care by analyzing clinical, patient behavior, sentiment, and claims data. A study was conducted to analyze and obtain insight into comorbidity, or the coexistence of several diseases in patients, using clinical data. Most studies that address comorbidity from the data analytics perspective investigate the occurrence of symptoms/diseases among a large population of patients. This method can successfully capture the correlation among diseases, but it is highly prone to ignoring confounders or indirect correlations. This study strives to go beyond the conventional correlational perspective on lab test results and diseases by exploring the sufficient and necessary conditions of a disease based on the lab results. A novel angle was used, inferring the association patterns using binary Markov random fields (BMRF), which recreate the underlying comorbidity network of diseases through bootstrapping and a cofactor elimination technique. Using this method, it is possible to obtain a more realistic picture of comorbidity and disease correlation and to infer an accurate comorbidity relationship among patients in different age, gender, and race groups [237].
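While the study's binary Markov random fields are beyond a short example, the bootstrapping intuition, keeping only disease pairs whose association is consistently positive across resamples, can be sketched with a toy cohort; the data, the lift statistic, and the resample count are all illustrative, and this does not reproduce the BMRF method itself.

# Minimal sketch: bootstrapped check that a disease-pair association holds up
# across resamples. Toy data; not the study's BMRF approach.
import random

random.seed(0)
# Each patient: (has_hypertension, has_diabetes) in a toy cohort of 100
patients = [(1, 1)] * 30 + [(1, 0)] * 20 + [(0, 1)] * 15 + [(0, 0)] * 35

def lift(sample):
    """P(A and B) / (P(A) * P(B)); > 1 suggests positive association."""
    n = len(sample)
    p_a = sum(a for a, _ in sample) / n
    p_b = sum(b for _, b in sample) / n
    p_ab = sum(a and b for a, b in sample) / n
    return p_ab / (p_a * p_b)

positive = sum(
    lift(random.choices(patients, k=len(patients))) > 1.0
    for _ in range(1000)
)
print(f"association positive in {positive / 10:.1f}% of bootstrap resamples")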

6.6.13 Preventive health and AI in concurrent medical conditions ("comorbidity")

The Centers for Disease Control and Prevention (CDC) reports [238] that nearly half of adults in the United States with arthritis also have at least 1 other chronic condition. While heart disease is the most common, diabetes, obesity, high cholesterol, and chronic respiratory conditions are high on the list as well. The CDC revealed that in the United States:

• Forty-nine percent of adults with heart disease also had arthritis;

• Forty-seven percent of adults with diabetes also had arthritis;

• Thirty-one percent of adults who are obese have arthritis.

There is no concrete answer regarding why it is common for people with arthritis to have comorbidities. AI analysis, as well as speculation, has pointed to both non-modifiable and modifiable risk factors associated with arthritis and comorbidities. Researchers are increasingly concerned about the rise in comorbidity among people with arthritis [239]. As the U.S. population ages, they are looking at ways to mitigate the effects of treating multiple chronic conditions. Increasing physical activity, coordinating doctor appointments and tests, and properly managing medications are just a few of the suggestions [240].

6.6.14 Public health and AI in concurrent medical conditions ("comorbidity")

The risk factors for falls and fractures in people with schizophrenia-spectrum disorders are a complex, multifactorial public health problem, compounded by the increased frequency of arthritis in the at-risk population, as mentioned above. AI data analysis suggests that preexisting comorbid physical health conditions, particularly cardiovascular, metabolic, and osteoporosis conditions, are associated with future hospital admissions due to falls. An extensive evidence base has demonstrated that people with schizophrenia-spectrum disorders have considerably worse physical health compared to the general population [241]. Despite this, few studies have considered comorbid physical illnesses and falls. Many of the physical comorbidities recorded were associated with both falls and fractures, which is in line with the general falls/fracture literature. The key message is that people with schizophrenia-spectrum disorders should be screened for falls risk, particularly those who are older or who have medical comorbidities (particularly cardiovascular disease). Also, in line with the general population, older age, comorbid physical health disorders, and some physical health medications are predictors of falls and fractures in people with schizophrenia-spectrum disorders. Future bone health promotion interventions targeting the reduction of falls and fractures are indicated among people with schizophrenia-spectrum disorders [242].

6.6.15 Access and availability and AI in concurrent medical conditions ("comorbidity")

Over 92% of US older adults have at least 1 chronic disease or medical condition, and of those, 77% have at least 2. Low-income and uninsured adults in particular experience a higher burden of comorbidities, and the Medicaid expansion provision of the Affordable Care Act was designed to improve access to healthcare in this population group. A study was conducted to determine the distribution of low-income and uninsured adults in expanded versus non-expanded states and to evaluate the prevalence of comorbidities in both groups. As compared with non-expanded Medicaid states, states with expanded Medicaid had a higher proportion of adults with an income of at least $50,000 per year (39.6% vs. 35.5%) and a lower proportion of individuals with no health insurance coverage (15.2% vs. 20.3%). In non-expanded states, among the uninsured, there was a higher proportion of obese individuals (31.6% vs. 26.9%) and a higher average number of comorbidities (1.62 vs. 1.52) as compared to expanded states [243]. Overall, the prevalence of comorbidities was higher among participants in states that did not expand Medicaid compared with those that did. States that did not participate in the ACA Medicaid expansion program experience a higher burden of comorbidities as compared with states that participated. Differences in socioeconomic status or healthcare access did not account for this finding. Negative health behaviors in non-expanded states may widen this difference in the coming years as individuals in expanded states obtain better access to preventive and medical care. Only programs in non-expanded states designed to improve the prevention and management of disease will mitigate this adverse condition [244].

6.7 Medical/surgical robotics

6.7.1 Big data analytics and AI in medical/surgical robotics

Medical robots and AI robots (robots using AI) are rapidly growing in the medical industry as well as in people's daily lives. Telepresence robots, such as the RP-VITA [245], are affecting hospital-based health care, being used in a broad range of medical diagnostic and treatment applications. Beyond the hospital ecosystem, wearable robotic devices such as the ReWalk [246] exoskeleton are helping paralyzed patients become mobile in their home setting. Although robotic use of big data in health care is still in its early stages, algorithms implementing HIPAA-compliant data collection through cloud technologies will advance health care robotics and robotic process automation (RPA) [247]. Under health professionals' direction, cloud technologies will allow telepresence robots to collect a patient's health status indicators and, when necessary, provide reminders to enhance medication compliance. Cloud technologies can also enable therapeutic robots to collect statistics on pediatric patients at risk for developmental disabilities, monitor for early warning signs, and transmit suspicious information to clinicians. Surgical robots, such as the da Vinci [248], can connect to the cloud to assist surgeons in the operating room, mining large sets of open MRI data associated with patients with similar medical conditions. Robots will be able to access the personal cloud data stream from a patient's exercise band or smartphone GPS coordinates in case of an emergency. The continued growth of robotics and its use of big data and cloud computing technologies will continue to raise security and privacy issues. The mass of data collected by robots and health professionals from multiple sources will include sensitive, medical, and personal information. Currently, there are few formal standards regarding security and privacy protection as robotics and RPA grow in the healthcare space. FDA regulations regarding access to HIPAA-compliant data will be a focus of concern and federal administrative actions going forward. Solutions will include blockchain technologies (see below) and protocols for patients and doctors to participate in robot interaction, or sequestration of shared information until it is anonymized. Solutions will evolve as big data opportunities in robotic care expand.

6.7.2 Health information and records (EHR) and AI in medical/surgical robotics

As AI in healthcare continues to grow, it will continue to improve business operations and processes to increase operational efficiency and productivity and to reduce expenses. In that effort, robotic process automation (RPA) will be an ever-expanding approach to accomplishing such business objectives. RPA essentially mimics human behavior for repetitive, rule-based tasks, allowing humans to focus on activities that require cerebral thinking. RPA executes routine tasks in a constant, repetitive fashion, in a fraction of the time it takes a human. It also reduces the risk of human error synonymous with repetitive work. The robot accomplishes tasks through scripted commands and processes that have access to data sources and applications such as the EHR. Thus, the robot processes data through access to input screens, online application programming interfaces (APIs), and structured and unstructured data repositories. Certainly, RPA is not an automated solution for every EHR process, nor will it eliminate all human involvement and costs. But implemented correctly in EHR management, it is a highly effective and efficient tool producing functional excellence that delivers cost savings and streamlines processes. An example of a typical EHR function is pre-authorizations: RPA can gather information from websites and other disparate systems and integrate the data directly into the EHR, often even submitting the pre-authorization itself [249].
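A minimal sketch of the pre-authorization example as an RPA-style script follows: read from the EHR, submit to the payer, write the result back. The endpoints and field names are hypothetical; real integrations would use each vendor's actual API and credentials.

# Minimal sketch of an RPA-style pre-authorization flow. All URLs and field
# names are hypothetical placeholders, not a real vendor API.
import requests

PAYER_URL = "https://payer.example.com/api/preauth"   # hypothetical
EHR_URL = "https://ehr.example.com/api/patients"      # hypothetical

def preauthorize(patient_id, procedure_code):
    # 1. Pull demographics/coverage from the EHR (read step).
    patient = requests.get(f"{EHR_URL}/{patient_id}").json()
    # 2. Submit the pre-authorization to the payer portal (rule-based step).
    resp = requests.post(PAYER_URL, json={
        "member_id": patient["insurance_member_id"],
        "procedure": procedure_code,
    }).json()
    # 3. Write the authorization number back into the EHR record.
    requests.put(f"{EHR_URL}/{patient_id}/authorizations",
                 json={"auth_number": resp["auth_number"]})

# preauthorize("12345", "70553")  # would run against real endpoints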

6.7.3 Research/clinical trials and AI in medical/surgical robotics

Research in robotics is exploding as new uses for medical robots grow, especially in the field of surgery, along with novel uses of robots for diagnosis (e.g., micro-bots), exoskeleton robots for paralyzed and recuperating injured patients, and more. Some examples of where research is taking the field of robotics illustrate its magnitude:

• A team of researchers at the University of California at Berkeley published research on the application of an algorithm for automated suturing performed by robots [250];

• Johns Hopkins University announced that one of its researchers was part of a team that developed a robotic surgical system called STAR, the Smart Tissue Autonomous Robot. The system integrates 3D computer imaging and sensors to help guide the robot through the suturing process [251];

• A study presented at the 2016 World Congress on Engineering and Computer Science discussed using machine learning to evaluate surgeon performance in robot-assisted minimally invasive surgery [252];

• Researchers at the University of California, San Diego (UCSD) Advanced Robotics and Controls Lab are exploring machine learning applications to improve surgical robotics [253];

• To improve how clinical reports are processed, a team of researchers developed a clinical information extraction system called IDEAL-X. The IDEAL-X adaptive learning platform uses machine learning to understand how a user generates reports. It predicts patterns to improve the speed and efficiency of the process [254].

6.7.4 Blockchain and AI in medical/surgical robotics

Blockchain-based applications such as "smart contracts" are showing great potential to make distributed robotics operations more secure, autonomous, flexible, and even profitable.

Therefore, blockchain promises a way to bridge the gap between purely scientific domains and real-world applications. Moving beyond the classical view of distributed and decentralized AI in robotics improves our understanding of the possibilities of combining autonomous agents (either physical or virtual) with blockchain-based technologies. It also raises a plethora of questions that will have to be answered about the integration and interoperability of the 2 technologies [255]:

• What blockchain tools are available to increase the auditability, transparency, and reliability of AI and robotics?

• What kinds of algorithms are suitable for combining both technologies?

• Are there new models and methods to connect autonomous agents to blockchain-based innovations such as "smart contracts"?

• Is blockchain technology a suitable way to achieve emergent aggregations of autonomous and self-adaptive agents?

• Are distributed networks such as Bitcoin, Ethereum, EOS, Tezos, etc. a feasible way to integrate AI and robotics into our society?

• Are there new business models for AI and robotics based on cryptographic algorithms?

6.7.5 Internet of Things (IoT) and AI in medical/surgical robotics

Whereas the Internet of Things (IoT) and robots have similarities, they are quite different. IoT applications are specific; that is, they are made to deal with specific, narrowly-defined problems. They also often process data in vastly different ways than robots do: many IoT applications rely mostly on cloud computing, while robots more often process data locally [256]. Their integration can enhance IoT applications that have little intelligence on their own with intelligent robots that are more autonomous and capable of dealing with unique situations as they arise. Thus, looking at the future of health care, there will be ever-increasing use-cases where IoT and robotics meet [257]. By 2019 and beyond, there will be a 50% increase over 2017 in the use of robots in conjunction with IoT devices to carry out tasks such as the delivery of medication, food, and supplies overall. In other words, they will fulfill routine tasks, freeing up (human) resources [258].

6.7.6 Telehealth and AI in medical/surgical robotics

Telepresence robots (combining telehealth and robotic technology) are an innovative technology in health care. They are known by various terms such as "Skype on wheels," virtual presence robots, or remote presence robots. Telepresence robots, equipped with components such as cameras, speakers, microphones, and sensors, provide a platform for remote communications. They can be controlled using smartphones and tablets. With the help of these components, people can view, hear, and interact with the robot operator located at a remote location [259].

Telepresence robots are experiencing wide adoption in health care and the medical sector worldwide. Contributing factors include advancements in artificial intelligence technology, increasing smartphone usage, and the growing trend among enterprises of bringing automation into their operations. These factors are fueling the market growth of telepresence robots. However, the high cost of manufacturing robots, as well as of their installation and maintenance, is expected to hamper their market adoption in the coming years [260].

6.7.7 Chatbots and AI in medical/surgical robotics

The medical field is entering a new era of diagnosis, less invasive surgery, and reduced surgical risks. Robotics is revolutionizing health care at both the macro and micro levels. New bot technologies are undertaking diagnostic and therapeutic procedures that were previously non-existent or done with greater risk and far less accuracy.

Endoscopy (see Chapter 5, page 152) is a procedure using a small camera or surgical tool on a long wire, which is passed through an aperture, a bodily "tube," or a natural opening. Its goal is to search for abnormalities, foreign objects, or traces of disease and, when necessary, perform an intra-body surgical procedure. It is a somewhat uncomfortable and delicate procedure that is slowly being replaced by robotics. Slender, flexible robots are directed to the exact spot the doctor needs. They can be held there without the tremor of human hands while they perform anything from taking a biopsy to cauterizing a wound. An even more impressive robotic endoscopic technique is "capsule endoscopy," in which the patient swallows a pill-sized robot that travels along a particular organ system, gathering data and taking pictures that can be sent directly to a processor for diagnostics.

Newly developed medical robots (nanobots) use near-microscopic mechanical particles to localize a drug or other therapy to a specific target site within the body (see Fig. 6–2). This procedure could be used to deliver radiation to a tumor, or simply to reduce the side effects of a medication by confining it to the organ where it is needed. The particles get to the target in a variety of ways. New research has generated micro-bots with tiny, helical tails that can be directed by magnetic fields to spin themselves forward through blood vessels to a specific spot in the body [261]. Bioengineers at Boston Children's Hospital reported the first demonstration of a robot able to navigate autonomously inside the body. In an animal model of cardiac valve repair, the team programmed a robotic catheter to find its way along the walls of a beating, blood-filled heart to a leaky valve without a surgeon's guidance [262].

FIGURE 6–2 Nanorobotic technology. Medical robots (micro-bots, nanorobots, nanobots) use near-microscopic mechanical particles to localize a drug or other therapy to a specific target site within the body. Source: http://scitechconnect.elsevier.com

6.7.8 Natural language processing (NLP) and AI in medical/surgical robotics

Robotic process automation (RPA) tools can help healthcare companies retrieve data from digital and physical clinical records. The process of searching through a database for the correct documents and then routing them to the appropriate user can be automated, but it needs a human employee to supply login credentials so that the software can access the network or an EHR system [263]. This type of data extraction can be improved and built upon in ways for which RPA alone is not suited. Natural language processing (NLP) solutions may be useful for digitizing clinical documents and identifying them based on their data. RPA may then be able to recognize and transfer them faster and with fewer credentials. RPA software can be trained to detect metadata such as the filenames of scanned PDFs or specific ID numbers for EMR documents, as sketched below.
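
A toy version of metadata-based document routing might look like the following; the queue names and filename patterns are hypothetical, and a real system would combine such rules with NLP over the document text itself.

```python
import re

# Hypothetical routing rules: regex over scanned-document filenames/metadata.
ROUTING_RULES = [
    (re.compile(r"^lab[_-]", re.IGNORECASE), "laboratory_results"),
    (re.compile(r"^rad[_-]|x[-_]?ray", re.IGNORECASE), "radiology"),
    (re.compile(r"referral", re.IGNORECASE), "referrals"),
]

def route_document(filename: str) -> str:
    """Return the EHR work queue a scanned document should be routed to."""
    for pattern, queue in ROUTING_RULES:
        if pattern.search(filename):
            return queue
    return "manual_review"   # fall back to a human when no rule matches

for name in ["lab_cbc_2020-03-14.pdf", "rad_chest_xray.pdf", "misc_note.pdf"]:
    print(name, "->", route_document(name))
```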

6.7.9 Expert systems and AI in medical/surgical robotics

An expert system is made up of 2 components: a knowledge base and an inference engine. The knowledge base represents facts and rules, while the inference engine applies the rules to the known facts to derive new facts. Thus, the system "learns" and expands its knowledge base so that the next time, the same problem is "easier" to solve (see Expert Systems, Chapter 3, page 53). Robotic process automation (RPA) is a combination of 2 different fields of AI. In health care, it is the combination of machine learning and expert systems operating together to produce automated medical procedural or diagnostic results. The process can be applied to routine medical or surgical care; anything of a repetitive nature that can be automated is eligible for the RPA expert system application. The output from a simple expert system is piped into automated testing tool technology that then performs a medical procedure transaction instead of a test transaction. The result is a real medical procedure transaction that requires no further user input. This is why the term "robotic" is used, even though robotics is not involved [264].
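
The knowledge base/inference engine split described above can be illustrated with a minimal forward-chaining sketch; the facts and rules here are hypothetical stand-ins, not drawn from any clinical system.

```python
# Minimal forward-chaining inference engine: a knowledge base of facts and
# rules, and an engine that applies rules to known facts to derive new ones.
facts = {"repetitive_task", "rule_based", "structured_input"}

rules = [
    ({"repetitive_task", "rule_based"}, "automatable"),
    ({"automatable", "structured_input"}, "rpa_candidate"),
]

def infer(facts: set, rules: list) -> set:
    derived = set(facts)
    changed = True
    while changed:                      # keep firing rules until nothing new
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(infer(facts, rules))   # includes 'automatable' and 'rpa_candidate'
```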

6.7.10 Precision medicine/health (personalized health) and AI in medical/surgical robotics

When we think about robots, we think about them doing physical work, but that is not the only way for robots to help people. That is where socially assistive robotics comes in: a form of precision health that helps people in social, not physical, ways. This is the work of Maja Matarić, Ph.D., director of the University of Southern California's Robotics and Autonomous Systems Center [265], which was inspired by the need for accessible and useful care for large populations. "When people are engaging in difficult health-changing behaviors, they need companionship and encouragement, and that support is most effective when it comes from a physical agent, such as another person, pet, or a robot," Matarić explains. "We've used the humanoid robot Bandit, which has arms, to show stroke patients how to do rehabilitation exercises [266], elderly users how to do chair aerobics, and children with autism how to imitate movements [267]. In other cases, we needed the robot to be encouraging and fun, but smaller and inherently safer, so we used the owl-like robot, Kiwi, in the homes of children with autism [268]."

6.7.11 Healthcare analytics and AI in medical/surgical robotics

AI is playing a significant role in health care data, providing new and improved analytics. AI analytics are of use in the detection, diagnosis, and treatment of many diseases, and in health care they are helping to provide more targeted care for patients fighting medical conditions [269]. Different types of AI technologies help patient outcomes when used in health care. Robotics is being used for repetitive tasks such as analyzing the results of X-rays, CT scans, etc. AI robot prototypes are performing health care tasks: these technologies enable humanoid robots to figure out what is being said, and AI allows the robot to respond [270]. This allows for the use of AI humanoid robots in various health care markets. Health care facilities using AI cognitive technology are drawing on health data to deliver targeted, more personalized treatment [271]. Health care clinicians are using Google's DeepMind Health, which can assess and solve real-world health care problems. It is a type of AI technology that combines learning and neuroscience to create algorithms that mimic the human brain [272].

6.7.12 Preventive health and AI in medical/surgical robotics

A vast array of cyber-physical systems (CPS) is being carefully combined through the Internet of Things (IoT), intelligent sensing, and robotics to create digitized healthcare services and enterprises. Health engineering will lead to a revolutionized healthcare system that enables the participation of all people in the early prediction and prevention of diseases. But significant challenges can be foreseen in this emerging interdisciplinary field, including the acceptance of healthcare robotics applications in clinical practice. Research in the convergence of automation technology, artificial intelligence, biomedical engineering, and health informatics is essential to their successful application in health care. Multiple scenarios of health engineering in primary care, preventive care, predictive technologies, wearable technologies, hospitalization, home care, and occupational health will be needed to determine the future of digital technologies like AI in health care [273].

6.7.13 Public health and AI in medical/surgical robotics

Robots have an excellent ability to see patterns and make predictions from large data sets that would simply overwhelm humans. Because of this ability, epidemiology is a natural and logical target for AI robots and robotic process automation (RPA). These AI systems analyze data on disease outbreaks from doctors in the field and utilize machine learning to cross-reference those data with all available medical databases. From this analysis, the system can predict when and where an outbreak is happening, as well as how to keep it from spreading. One such system is AIME [274], which has been deployed against outbreaks of dengue fever in Malaysia. It provided a nearly 85% accurate prediction rate, saving thousands of lives and potentially millions of dollars. Similar systems are probably being employed in the COVID-19 pandemic, but have not been reported to date.
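
AIME's internal models have not been published in detail, so the following is only an illustrative stand-in: a logistic-regression classifier on synthetic district-week features (rainfall, temperature, recent case counts), showing the general shape of such an outbreak-risk predictor.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic features per district-week: rainfall (mm), mean temp (deg C),
# recent confirmed cases. Labels: 1 = outbreak followed, 0 = it did not.
X = rng.normal([120, 28, 15], [40, 2, 10], size=(500, 3))
y = (0.01 * X[:, 0] + 0.05 * X[:, 2] + rng.normal(0, 0.5, 500) > 2.0).astype(int)

model = LogisticRegression().fit(X, y)

# Risk for a new district-week: heavy rain, warm, rising case counts.
print(model.predict_proba([[180, 30, 35]])[0, 1])
```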

6.7.14 Access and availability and AI in medical/surgical robotics

The University of Pittsburgh School of Medicine and Carnegie Mellon University (CMU) have each been awarded 4-year contracts totaling more than $7.2 million from the U.S. Department of Defense. The goal is to create an autonomous trauma care system that fits in a backpack and can treat and stabilize soldiers injured in remote locations. The goal of TRAuma Care In a Rucksack (TRACIR) [275] is to develop AI technologies for autonomous trauma medicine. A multidisciplinary team of Pitt researchers and clinicians from the emergency medicine, surgery, critical care, and pulmonary fields will provide real-world trauma data and medical algorithms that CMU roboticists and computer scientists will incorporate into the creation of a hard and soft robotic suit into which an injured person can be placed. Monitors embedded in the suit will assess the injury, and AI algorithms will guide the appropriate critical care interventions and robotically apply stabilizing treatments, such as intravenous fluids and medications. "TRACIR could be deployed by drone to hikers or mountain climbers injured in the wilderness; people in submarines or boats could use it; it could give trauma care capabilities to rural health clinics or be used by aid workers responding to natural disasters," the researchers reported. "And, someday, it could even be used by astronauts on Mars" [275].

6.8 Stem cells and regenerative medicine

Among the categories of medical therapies and services discussed in this Chapter, perhaps the ones with the greatest potential for enhancing health and wellness in the coming years are these last 2: stem cells and regenerative medicine, and genetic (and immunogenomic) therapies. Both areas, in conjunction with AI, are likely to introduce "disruptive changes" to the future of medical care and, indeed, preventive health care, more so than any other technologies. As will be presented here and in Chapter 7, these technologies are already changing our approach to health care from a process of disease diagnosis and treatment to methods for identifying disease risks and correcting and preventing them. In this Chapter (6), we present the basic bioscience of stem cells (genetics and genomics bioscience having been presented in Chapter 5) and how applicable AI categories are assisting in the development of both sciences. Then, in Chapter 7 ("AI applications in prevalent disease categories"), we discuss how each of these technologies is treating human disorders; "preventing" disease ("the Holy Grail" of health care); and how AI is assisting both.

6.8.1 The basic bioscience of stem cells and regenerative medicine [276]

Stem cells are cells within the body originating during embryologic development (from totipotent to pluripotent embryonic stem cells). During early life and growth, these undifferentiated embryonic stem cells have the potential to develop into many different types of adult (somatic) stem cells found in the organs and tissues of the body. They also differentiate into red blood cells (erythrocytes), platelets, and white blood cells (leukocytes or WBCs), including neutrophils, basophils, eosinophils, macrophages, and monocytes, as well as WBCs associated with the immune system, including lymphocytes (T-cells, B-cells, natural killer cells) and plasma cells (Fig. 6–3).

FIGURE 6–3 Stages of human (and stem cell) development. Stem cells have the potential to develop into many different cell types during early (pluripotent) life and growth. Courtesy of Maharaj Institute of Immune Regenerative Medicine.

Adult stem cells serve as a repair system for the body. In some organs, such as the gut and bone marrow, they regularly divide to repair and replace worn-out or damaged tissues. In other organs, however, such as the pancreas and the heart, stem cells only divide under special conditions. Given their unique regenerative abilities, adult stem cells offer new potential for treating conditions such as immune disorders, cancers, diabetes, and heart disease. When these cells are used in cell-based therapies to treat disease (Chapter 7), it is referred to as regenerative or reparative medicine.

The more versatile human embryonic (pluripotent or PSC) stem cells can be harvested from embryos and used for reproductive purposes through in vitro fertilization. This method has met with some ethical and political resistance. However, in 2006 researchers made a (Nobel Prize-winning) breakthrough by identifying conditions that allow specialized adult cells to be "reprogrammed" genetically to assume a stem cell-like state. This new type of stem cell is called an induced pluripotent stem cell (iPSC), and it functions similarly to a natural pluripotent stem cell, with the ability to become any cell type of the body [277].

The clinical value of stem cells lies in the differentiation of embryonic (pluripotent) stem cells into differentiated adult stem cells. Whereas this process is essential in the repair and regeneration of normal healthy tissue, it also plays a more sinister role in cells differentiating into disease-oriented progenitors. Cancers, diabetes, congenital disabilities, and many other diseases and human disorders are generated through genetic and molecular processes producing differentiation of embryonic and adult stem cells from normal to abnormal. Beyond understanding their role in the production of abnormal conditions, stem cells have a number of positive uses in testing the effectiveness and safety of new medications, including anti-tumor therapies and anti-infectives, and in the analysis of a broad range of drugs on different cell types. Scientists must be able to precisely control the differentiation of stem cells into the specific cell type on which drugs can be tested.

But perhaps the most important potential application of human stem cells is the generation of cells and tissues that could be used for cell-based therapies (regenerative medicine or "stem-cell transplantation," discussed in Chapter 7, page 302). Stem cells, directed to differentiate into specific cell types, offer the possibility of a renewable source of replacement cells and tissues to treat diseases including macular degeneration, spinal cord injury, stroke, burns, heart disease, diabetes, osteoarthritis, and rheumatoid arthritis. These will all be covered in Chapter 7 under "Immunology and Autoimmune Disorders."

6.8.2 Big data analytics and AI in stem cells and regenerative medicine

A detailed molecular understanding of normal and disease development will facilitate the identification of new drug and cell therapy targets for disease treatment. Human pluripotent stem cells can provide a significant in vitro source of human cell types and, in a growing number of instances, also 3-dimensional multicellular tissues called organoids.

The use of stem cell technology to discover and develop new therapies will be aided by detailed molecular characterization of cell identity, cell signaling pathways, and target gene networks [278]. Big data or 'omics' techniques, particularly transcriptomics and proteomics, facilitate cell and tissue characterization using thousands to tens of thousands of genes or proteins. These gene and protein profiles are analyzed using existing and/or emergent bioinformatics methods, including a growing number of methods that compare sample profiles against compendia of reference samples. Bioinformatic methods that generate comprehensive and integrated combinations of signaling pathways and gene regulatory networks are starting to provide specific molecular disease hypotheses that can be investigated using human PSC-derived cell types. Thus, compendium-based big data approaches to stem cell research present significant opportunities for the development of novel cell and drug therapies [279].
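
As a minimal sketch of the compendium-comparison idea, the following ranks hypothetical reference cell-type profiles by Pearson correlation with a new sample. The compendium and data are synthetic; real pipelines (e.g., marker-gene or signature-based scoring) are considerably more involved.

```python
import numpy as np

genes = 1000
rng = np.random.default_rng(1)

# Hypothetical compendium: mean expression profiles of reference cell types.
compendium = {
    "cardiomyocyte": rng.normal(size=genes),
    "hepatocyte": rng.normal(size=genes),
    "neuron": rng.normal(size=genes),
}

# A new PSC-derived sample, here simulated as 'neuron-like' plus noise.
sample = compendium["neuron"] + rng.normal(scale=0.5, size=genes)

# Rank references by Pearson correlation with the sample profile.
scores = {name: np.corrcoef(sample, ref)[0, 1]
          for name, ref in compendium.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 2))   # expected: 'neuron'
```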

6.8.3 Research/clinical trials and AI in stem cells and regenerative medicine

The plasticity and pluripotent nature of bone marrow-derived cells have presented a dilemma in the scientific community for decades. Indeed, the true identity of the stem cells responsible for these properties remains to be delineated [280]. VSELs (very small embryonic-like stem cells) could be the ultimate stem cell with this plasticity [281], but more in vivo evidence needs to be generated to support this conclusion. There have been discussion and concerted efforts by the scientific community, but more preclinical studies are required to demonstrate the regenerative potential of VSELs. The question remains whether these stem cells should be expanded in vitro and then transplanted for therapeutic purposes, or whether AI strategies could evolve to manipulate these stem cells in vivo to bring about endogenous regeneration [282]. The second alternative seems promising, but currently more research needs to continue in both directions. A better understanding of VSELs will help us understand normal tissue stem cell biology and allow AI and machine learning to offer newer insights about their alteration with age and the development of various diseases, including cancers [283].

6.8.4 Blockchain and AI in stem cells and regenerative medicine

GIOSTAR Labs, a company dedicated to stem cell-based technologies and a subsidiary of GIOSTAR [284], recently launched the non-profit Stem Cells for All initiative. It has joined forces with the Association of Professional Ball Players of America (APBPA) in this innovative program. The intent is to provide stem cell treatment to athletes, families, and fans at deeply discounted prices. GIOSTAR will be one of the first to apply blockchain in a new way to ensure a successful outcome.

Many baseball players endure injuries that require extensive surgery that is demanding on the body, causes severe financial stress, and disables them from playing for periods of time, if not ending their careers. This pilot initiative will provide members of the APBPA free or deeply discounted therapies through GIOSTAR, which will also offer stem cell services to the underserved. Explains GIOSTAR Co-Founder Siddarth Bhavsar, "Federated permissioned blockchain based on HIPAA compatible nodes allows us to build trust via data integrity and permission collaboration from the ground up." He continues, "Since personalized precision medicine like stem cells entails an additional level of data complexity, an AI and NLP algorithms layer will bring greater structure and extract meaning from the data to improve accuracy" [285].

6.8.5 Internet of Things (IoT) and AI in stem cells and regenerative medicine

Interesting studies have been conducted that correlate data acquired from IoT activity trackers with quality of life (QOL). One such study prospectively investigated patients with advanced lung cancer who were given Fitbit activity trackers to measure daily step counts and who completed multiple validated QOL questionnaires [286]. It demonstrated that higher physical activity levels (as measured by daily step count) positively correlated with physical functioning, role functioning, emotional functioning, and global QOL in patients undergoing stem cell transplantation. Step counts were positively correlated with the National Cancer Institute's Patient-Reported Outcomes Common Terminology Criteria for Adverse Events and multiple QOL items from PROMIS (the Patient-Reported Outcomes Measurement Information System) [287].
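
The core statistic behind such findings is a simple correlation between step counts and QOL scores. A minimal sketch with invented example numbers (not data from the cited studies) follows.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-patient data: mean daily steps and a 0-100 QOL score.
steps = np.array([1200, 2500, 3400, 4100, 5200, 6100, 7500, 8200])
qol   = np.array([  35,   42,   50,   48,   61,   66,   72,   78])

r, p = pearsonr(steps, qol)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```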

6.8.6 3-D bioprinting and AI in stem cells and regenerative medicine

Innovation produces developments in seemingly disparate areas that can sometimes (and often do) intersect. For instance, 3D printing (mentioned in multiple "Top Listings" in previous chapters) of body parts is currently being reviewed by the U.S. Food and Drug Administration (FDA), which historically has not concerned itself with customized (i.e., individualized) therapeutic interventions [288]. 3D bioprinting is, in many ways, conceptually similar to 3D printing generally. Both entail a process whereby successive layers of material are added one on top of the other as a print-head moves back and forth, building up the output, and both are based on a digital model of the part to be reproduced. Building 3-dimensional body parts entails the use of different materials, which may include a person's own stem cells. Already, some medical applications allow for the creation of "customized prosthetics, implants, and anatomical models, tissue and organ fabrication" [289]. On the horizon, we may see the ability to create fully functioning organs.

6.8.7 Chatbots and AI in stem cells and regenerative medicine

Biobots modeled after sperm cells can now swim, which means they could one day seed stem cells to deliver drugs, perform minimally invasive surgery, and target cancer. The biobots are modeled after nature, in particular sperm cells, and are propelled by muscles and nerves derived from rats [290]. Research teams led by Taher Saif and Rashid Bashir worked together "to develop the first self-propelled biohybrid swimming and walking biobots powered by beating cardiac muscle cells derived from rats" [291]. "Our first swimmer study successfully demonstrated that the bots, modeled after sperm cells, could, in fact, swim," Saif said.

A future version of these sperm-inspired biobots could one day be swimming through your body to seed stem cells, target illnesses, or administer drugs internally. "The long-term vision is simple. Could we make elementary structures and seed them with stem cells that would differentiate into smart structures to deliver drugs, perform minimally invasive surgery, or target cancer?" Saif said. "Wounds are living environments, and the conditions change quickly as cells and tissues communicate and attempt to repair. An ideal treatment would sense, process, and respond to these changes in the wound state and intervene to correct and speed recovery," said BETR program manager Paul Sheehan in a statement [290].

6.8.8 Natural language processing (NLP) and AI in stem cells and regenerative medicine

Data on cancer stem cell surface molecular markers from 27 of the most common cancer diseases were analyzed using natural language processing and data mining techniques. The corpus searched comprised 8933 full-text, open-access, English-language scientific articles available on the Internet. Text mining was based on searching for 3 entities within 1 sentence: a tumor name, the phrase "cancer stem cells" or a synonym, and the name of a cluster of differentiation (CD) molecule. The result was a list of molecular surface markers most frequently mentioned in the context of particular tumor diseases. This study illustrates the interoperability of AI and machine learning, through data mining and NLP, in conducting research studies previously not obtainable through standard research protocols [292].
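
A toy version of that sentence-level co-occurrence rule is easy to sketch; the vocabularies below are tiny hypothetical stand-ins for the study's full tumor and CD-marker lists.

```python
import re

# Toy vocabularies; the study used 27 tumor types and full CD-marker lists.
TUMORS = {"glioblastoma", "melanoma", "breast cancer"}
CSC_PHRASES = {"cancer stem cells", "tumor-initiating cells"}   # synonyms
CD_MARKER = re.compile(r"\bCD\d{1,3}\b")

def mine(text: str):
    """Yield (tumor, marker) pairs co-occurring with a CSC phrase in one sentence."""
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        low = sentence.lower()
        tumors = [t for t in TUMORS if t in low]
        markers = CD_MARKER.findall(sentence)
        if tumors and markers and any(p in low for p in CSC_PHRASES):
            for t in tumors:
                for m in markers:
                    yield (t, m)

text = ("CD133 expression marks cancer stem cells in glioblastoma. "
        "Melanoma growth was unrelated to CD271 in this assay.")
print(list(mine(text)))   # [('glioblastoma', 'CD133')]
```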

6.8.9 Expert systems and AI in stem cells and regenerative medicine

Microarray data are used extensively in cell cultures because they provide a more comprehensive understanding of genetic variants among diseases. Gene expression samples have high dimensionality, so analyzing them manually is time-consuming; an automated system is needed. The fuzzy expert system (see Chapter 3, page 53) is superior to machine learning and statistical analyses in this scenario because it offers a precise classification. Knowledge acquisition is a significant concern in the expert system's fuzzy classification, and despite several existing approaches, a great deal of effort is still necessary to enhance the learning process.

An innovative Hybrid Stem Cell (HSC) algorithm was studied that utilizes Ant Colony Optimization and a stem cell algorithm to design a fuzzy classification system that extracts informative rules to form the membership functions from the microarray dataset. The HSC algorithm used a novel Adaptive Stem Cell Optimization (ASCO) to improve the points of the membership function, together with Ant Colony Optimization, and produced a near-optimal rule set. To extract the most informative genes from the large microarray dataset, a method called Mutual Information was used. Performance results on 5 microarray datasets showed that the proposed HSC algorithm produces a more precise fuzzy system than existing methodologies [293].
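The mutual-information gene-selection step named above can be sketched with scikit-learn on a synthetic microarray matrix; the fuzzy-rule induction and ant colony stages of the HSC algorithm are beyond a short example.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(2)

# Synthetic microarray: 60 samples x 500 genes; gene 0 carries the signal.
X = rng.normal(size=(60, 500))
y = (X[:, 0] > 0).astype(int)

mi = mutual_info_classif(X, y, random_state=0)
top_genes = np.argsort(mi)[::-1][:10]   # the 10 most informative genes
print(top_genes)                         # gene 0 should rank first
```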

6.8.10 Robotics and AI in stem cells and regenerative medicine

A team of researchers developed a microrobot that can deliver therapeutic stem cells precisely to very specific parts of the brain. Their work demonstrates that neural stem cells can be cultured and differentiated on their robot and that the device can travel from the carotid artery into various parts of the brain. This development may one day provide an approach for treating several brain-related disorders [294]. The device is fabricated using 3D laser lithography and is designed with a helical shape so it can travel through the body more easily. It can be manipulated by a magnet, allowing the researchers to move it through the body non-invasively using an external magnetic field. The device is also porous, which helps the attachment and proliferation of the stem cells. Professor Hongsoo Choi, who was involved in the research, said, "Through this research, we hope to increase the treatment efficiency and success rate for Alzheimer's and central neural diseases, which couldn't be approached through the existing method of stem cell treatment. Through continuous follow-up research with hospitals and related companies, we will do our best to develop a microrobot-based precise treatment system that can be used in actual hospital and clinical sites" [295].

6.8.11 Precision medicine/health (personalized health) and AI in stem cells and regenerative medicine

Recent technical advances now allow the direct manipulation of DNA sequences in cells (DNA editing; see Chapter 7, CRISPR, page 303), aided by machine learning. Mutations can be corrected, and genetic variations of interest can be introduced. Using stem cells with this molecular biology technique enables a discovery process wherein genetic analysis can identify a disease, and stem cells can then be created from the patient to correct their mutation. This is the essence of precision medicine. Cells can be differentiated into cell types relevant to a disease to assess the inferences genetic analysis could provide. Subsequent study of these cells can identify genes that modify mutations and reduce or eliminate the disease caused by the mutant gene, which could also lead to the identification of new drug targets.

To realize this level of precision medicine, research scientists are collaborating to conduct relevant genetic analyses and downstream stem cell-based vetting of these genes for disease mechanism and molecular interventions. Such a collaborative effort has been established as "The Stem Cell Core Lab" of Columbia University [296].

6.8.12 Healthcare analytics and AI in stem cells and regenerative medicine

Realizing the promises of stem cell and regenerative medicine to generate precise disease models and designer cell samples for personalized therapeutics will require harnessing pheno-genotypic, cell-level data quantitatively and predictively in the lab and clinic. Those requirements could be met by developing a Big-Data-driven stem cell bioscience strategy and community [297]. Stem cell research has been making increasing use of biological Big Data analytics and integration strategies to better define cell types or states and the differences among them. Recently, more sophisticated mathematical and computational methods, including machine learning, have begun to be used to address questions about cellular state and identity in a more specific, quantitative, data-driven, and predictive manner. What is needed is a well-defined, quantitative, Big-Data-driven stem cell bioscience and the joint development of a strong computational stem cell biology oriented toward clinical/industrial requirements. This will be pivotal in making the stem cell field quantitative, predictive, and therapeutically suitable, and in making the promise of stem cell therapeutics a reality [298].
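
Defining cell states from data is, at its simplest, an unsupervised clustering problem. A minimal synthetic sketch (k-means over an expression matrix) follows; real computational stem cell biology uses far richer models, so this is only the skeleton of the idea.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Synthetic single-cell expression matrix: 3 cell states, 50 genes each.
states = [rng.normal(loc=c, size=(100, 50)) for c in (0.0, 2.0, 4.0)]
X = np.vstack(states)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))   # roughly 100 cells per recovered state
```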

6.8.13 Preventive health and AI in stem cells and regenerative medicine

In November 1998, the world was introduced to human embryonic stem cells, the blank-slate cells that arise at the earliest stages of development and go on to become any of the scores of cell types that make up a human. With a capacity to replicate endlessly under the right laboratory conditions, the prospect of an inexhaustible supply of replacement cells for ailments such as Parkinson's, diabetes, heart disease, spinal cord injury, and a host of other dire conditions catapulted the cells into the biomedical spotlight, the public imagination, and the hope of preventive health care. Today, proven therapies based on AI analysis identifying and trading out diseased cells for healthy lab-grown cells remain a clinical aspiration. But a growing number of clinical trials, the widespread use of the cells in industry, and a swelling list of primary findings attributable to this "Swiss Army knife" of cells are contributing to a measured, steady realization of the promise that came with the first lab-grown cells 2 decades ago. Less publicized is the use of stem cells to develop high-throughput drug screens. Nearly every major pharmaceutical company now has an AI-based stem cell program where the cells are used to assess promising new drug candidates for safety and efficacy. Also growing is the scientific contribution of stem cells, both embryonic and induced pluripotent. Induced pluripotent cells are adult cells, such as skin cells, that have been genetically reprogrammed to mimic the qualities of embryonic stem cells [299].

6.8.14 Public health and AI in stem cells and regenerative medicine

As with any medical therapy, precautions against misuse or untoward events must always be weighed, especially with new, innovative therapies like stem cells and regenerative medicine. Transplanted human stem cells are dynamic biological entities that interact intimately with, and are influenced by, the physiology of the recipient. The capabilities to self-renew and differentiate that are inherent to human stem cells point both to their perceived therapeutic potential and to the challenge of assessing their safety. Assessing human stem cell safety requires the implementation of a comprehensive strategy. If you are considering stem cell treatments, you must check that the product you are considering is on the FDA's approved list of stem cell treatments [300]. Whether human stem cells are of embryonic, fetal, or adult origin, donor sources must be carefully screened. Routine AI-assisted testing is done for large numbers of infectious agents (bacteria, viruses, etc.) to guard against the inadvertent transmission of infectious diseases [301]. To ensure the integrity, uniformity, and reliability of human stem cell preparations intended for clinical use, it is essential to demonstrate that rigorously controlled, standardized practices and procedures are being followed in establishing and maintaining human stem cell lines in culture. Healthcare providers should discuss testing with their patients on a case-by-case basis [302].

6.8.15 Access and availability and AI in stem cells and regenerative medicine

Among the comparatively small number of "stem cell clinics," there are considerable variations in what the clinics offer. About 25% focus exclusively on stem cells and regenerative medicine, while many others are orthopedic and sports medicine clinics that have added stem cells and regenerative medicine to their services [303]. There are also differences in the degree of expertise of providers at stem cell clinics. For example, whereas specialists in orthopedics and sports medicine are more likely to restrict stem cell treatments to conditions related to their medical specialties, other providers listing specialties in cosmetic or alternative medicine are more likely to treat a much more extensive range of conditions with stem cells. Just because someone is board certified doesn't necessarily mean they are qualified to provide stem cell treatments. You need to ask what they are board-certified in and whether their medical expertise is well-matched to the condition you are seeking treatment for. More recently, the FDA has begun to tighten its guidelines and restrict the practices of these clinics [303].

6.9 Genetics and genomics therapies

Along with stem cells and regenerative medicine, genetics and genomics are perhaps the most profound and "disruptive" therapeutic health care technologies influenced by AI.

Although they are intimately related (including with immunology), the distinction between these biosciences is essential to highlight. All 3 biosciences are enormous in 2 regards. First, current and future research and therapeutic applications in these 3 areas have the greatest potential of dramatically affecting virtually every category of health and wellness. Second, relative to the immense sciences of stem cells, genetics, genomics, molecular biology, and immunology, the volume of clinical material and information to be addressed is overwhelming. Regarding the first consideration, the potential of genetic therapies: since the mapping of the human genome was completed in 2003 (the Human Genome Study [304]), the research, results, and applications of the genetic information produced have had a profound, albeit nascent, influence on medical diagnosis, therapeutics, and prevention. These applications of genetic therapies will be discussed with regard to AI categories in the balance of this Chapter (6). Then, in Chapter 7, we will cover specific genetics-related treatments and preventive applications in prevalent disease categories. The second consideration, "the volume of clinical material to be addressed... in the sciences of stem cells, genetics, genomics, molecular biology, and immunology," may prove to be the most compelling reason why AI has had such a profound influence and thus has changed (i.e., "disrupted") the future of health care. Having reviewed (in Chapter 5) "The Bioscience" and the diagnostic applications genetics and genomics have in health care and the biosciences, let's now discuss their enormous role in medical therapies and services. As we discussed back in Chapter 3, "machine learning allows the use of immeasurable amounts of new data whose volume or complexity would previously have made analyzing them unimaginable" [305]. That statement certainly describes the challenge of "bioinformatics," the science of collecting and analyzing complex biological data such as genetic codes [306]. Now, as we get into the AI categories influencing genetics and genomics (here and in Chapter 7), you will begin to see the profound effects of bioinformatics, the science derived from "data science" and "big data analytics."

6.9.1 Big data analytics and AI in genetics and genomics

Throughout Chapters 3-5, we have presented the many applications of data analytics and AI in health care. But their use in genetics and genomics may be considered their most profound contribution to current and future health care. Advancements in AI technologies are enabling the scientific community to create, store, and analyze, in hours, volumes of data that would have taken years to process only a short time ago [307]. Using machine learning (ML), data-analytical techniques are applied to multi-dimensional datasets, allowing predictive models to be constructed and insights to be gained from the data [308]. ML helps researchers study and understand complex cellular systems, such as genome or gene editing, allowing for the creation of models that learn from big data sets to generate predictable outcomes.

In our discussion of Big Data Analytics in Chapter 4 (page 83), there was an explanation and a table (Table 3-7) of the "Vs of Big Data" [309]. Also referred to as the "V Framework" [310], it has been used to evaluate the current state of data in genomics relative to other applications in the data sciences [311]. Examples of the evaluation include:

• Volume: One of the key aspects of genomics as a data science is the sheer amount of data being generated by sequencers. The data growth trend in genomics is greater than in other disciplines. Some researchers have suggested that if the genomics data generation growth trend remains constant, genomics will soon generate more data than applications such as social media, earth sciences, and astronomy [312].

• Velocity: There are 2 widely accepted interpretations of data velocity: (i) the speed of data generation; and (ii) the speed at which data are processed and made available. The sequencing of a human genome now takes less than 24 h, down from 2 to 8 weeks at the close of the Human Genome Study in 2003. Regarding the speed of data processing, in applications such as rapid diagnosis, epidemiology, and microbiome research, sequencing nucleic acids for fast, dynamic tracking of diseases and pathogens is now the preferred approach, with results available in as little as minutes to hours [313].

• Variety: Genomics data have a two-sided aspect. On one side are the sequencing data, ordered lists of nucleotides; in human genomics, these are mapped to the genome and used to generate coverage or variation data. On the other side are the complex phenotypic data with which the nucleotides are being correlated. Phenotypic data can consist of such diverse entities as unstructured free-text descriptions from electronic health records, quantitative metrics from laboratories, sensors, and electronic trackers, and imaging data.

In the past decade, there has been a dramatic change in the efficiency of DNA sequencing. Using traditional Sanger sequencing [314], the human genome project took more than a decade and cost $3 billion. Current next-generation sequencing (NGS) methods now allow a human genome to be sequenced for about $1000 in 24 hours [315]. Science and medicine are changing in ways never imagined as DNA sequencing generates volumes of data faster and at a lower cost (remember Moore's law from Section 1?), all thanks to AI. Next-generation sequencing, also called massively parallel sequencing, is a high-throughput method used to determine a portion of the nucleotide sequence of an individual's genome. This technique utilizes DNA sequencing technologies that are capable of processing multiple DNA sequences in parallel [316], exponentially increasing the rate at which biological data are generated. Whereas the first human genome was a $3 billion project requiring over a decade to complete, we are now close to being able to sequence and analyze an entire genome in a few hours for less than a thousand dollars. With this increase in computing and internet speed, what is needed is an infrastructure to generate, securely maintain, transfer, and analyze large-scale information in the life sciences. Such an infrastructure should also integrate omics data with other data sets, such as clinical data from patients (mainly from EHRs).

6.9.2 Health information and records (EHR) and AI in genetics and genomics therapies

Electronic health records contain patient-level data collected during and for clinical care. Data within the electronic health record include diagnostic billing codes, procedure codes, vital signs, laboratory test results, clinical imaging, and physician notes. With repeated clinic visits, these data are longitudinal, providing valuable information on disease development, progression, and response to treatment or intervention strategies. The nearly universal adoption of EHRs nationally has the potential to provide population-scale, real-world clinical data accessible for biomedical research, including genetic association studies [316]. For this research potential to be realized, high-quality, research-grade variables must be extracted from these clinical data warehouses. Electronic or "computable" phenotyping methods and approaches, along with associated evaluation metrics, are now emerging, and more sophisticated strategies are being developed, ensuring that the full potential of the EHR for precision medicine research, including genomic discovery, will be achieved as many envision [317]. A study applying machine learning tools to EHR data was conducted by researchers at the University of Wisconsin-Madison (UW-Madison) and the Marshfield Clinic. Through their EHR methodology, they identified genetic markers related to conditions such as depression, anxiety, mood disorders, sleep apnea, and a host of other conditions. In follow-up studies, EHRs were used to set up a double-blind methodology, where both clinicians and patients were blind to the genotype. This enabled researchers to assess whether premutation carriers differed in their patterns of clinical diagnoses from those who do not have the premutation [318]. These and numerous other studies reported in the literature demonstrate the valuable relationship and resource the EHR provides in genetic research. The continued identification of genotype-phenotype correlations in patient populations will lead to significant advances in population health and precision medicine.
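
At its simplest, a computable phenotype is a rule over coded EHR data. A minimal hypothetical sketch follows; real phenotyping algorithms combine billing codes with labs, medications, and notes, and are carefully validated before use in genetic association studies.

```python
import pandas as pd

# Hypothetical EHR extract: one row per diagnosis event.
ehr = pd.DataFrame({
    "patient_id": [1, 1, 2, 3, 3, 4],
    "icd10":      ["E11.9", "I10", "F32.9", "E11.9", "E11.65", "J45.909"],
})

# Toy computable phenotype: type 2 diabetes = any E11.* code on record.
t2d = ehr[ehr["icd10"].str.startswith("E11")]
cases = set(t2d["patient_id"])
controls = set(ehr["patient_id"]) - cases
print("cases:", cases, "controls:", controls)   # cases: {1, 3}
```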

6.9.3 Research/clinical trials and AI in genetics and genomics

The most fertile areas of genetics and genomics lie in health care research. Thanks to advances in ML coupled with big data analytics, there has been a virtual explosion of activity in cross-disciplinary applications of genomics, medicine, and AI. Integrating machine learning into the clinical workflow would help to remove gaps in the data available to healthcare professionals and allow the integration of other datasets (such as genetic information), a vital step in enhancing medical data for a better understanding of patient treatment and care [319]. A strong example of AI applied in research has been demonstrated at Rady Children's Institute for Genomic Medicine (RCIGM), which has utilized a machine-learning process and clinical natural language processing (CNLP) to diagnose rare genetic diseases in record time. This new method is speeding answers to physicians caring for infants in intensive care and opening the door to increased use of genome sequencing as a first-line diagnostic test for babies with cryptic conditions [320].

With the increasing emphasis and intensity of genomic research in areas such as genome sequencing, genetic editing (e.g., CRISPR; see Chapter 7, page 303), deep-genomic interpretation of genetic variations, pharmacogenomics, and newborn genetic screening tools, the future of genetic research and its health applications is boundless.

6.9.4 Blockchain and AI in genetics and genomics

The cost of genome sequencing has already fallen below $1000 per person [321]; soon it will be down to $100. This falling cost and the resulting increase in utilization will provide researchers and health professionals with ever-increasing data for genomic sequencing and EHR record analysis, leading to better disease assessments and precision outcome analysis. But there are also obstacles to overcome. First, bioscience needs vast genomic and other healthcare datasets to gather meaningful and potentially transformational information. Second, for the promise of precision medicine to be achieved, data must be sharable and interoperable across technological, geographic, jurisdictional, and professional boundaries. There must also be an integration of data that allows patients, health professionals, governments, researchers, and providers of health technology access, cooperation, collaboration, networking, and the ability to form partnerships. The world needs a centralized health data hub: an open marketplace where health and genomic data can be shared, borrowed, even sold. Of course, such a market would have to be secure. By utilizing blockchain technology and next-generation cryptography, trust could quickly be built around such an ecosystem, alleviating consumer hesitations about leaving personal data online or in the hands of corporations. By implementing an open, collaborative blockchain platform and marketplace, critical mass can be achieved faster for precision medicine to be realized. Such a data hub would have to be international, especially since many ethnic and geographical populations are still woefully underrepresented in public databases [322].

6.9.5 Internet of Things (IoT) and AI in genetics and genomics

Like all other dramatically expanding fields in AI and its related technologies, the Internet of Things (IoT) is moving into areas not even thought about as little as 5 years ago. Indeed, IoT applications (small connected devices, smartphones, wearables, etc.) are limited only by our imagination. The market for IoT devices is projected to grow from $5.8 billion in 2018 to $3.04 trillion in 2020 [323]. A novel and exciting IoT application in related genetic research is the use of wearables for reactive biospecimen acquisition and phenotyping. It is now possible to remotely monitor a participant's activity and sleep patterns and, based on that activity or sleep, ask the participant to perform a task (e.g., complete a test of cognition) or collect a biospecimen (e.g., a saliva or finger-prick blood sample). An IoT device and wearable could automate the rapid analysis of sleep data upon waking, coupling that with a push notification to the study participant's smartphone requesting that they collect a finger-prick blood sample [324].
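
That wake-triggered workflow is essentially an event rule; a minimal hypothetical sketch follows (the 0.80 efficiency threshold and the notification message are invented for illustration).

```python
from dataclasses import dataclass

@dataclass
class SleepEpoch:
    end_hour: float        # hour of day sleep ended
    efficiency: float      # fraction of time in bed spent asleep

def on_wake(epoch: SleepEpoch, send_push) -> None:
    """Rapid analysis of the night's sleep, then a triggered collection request."""
    if epoch.efficiency < 0.80:     # hypothetical trigger threshold
        send_push("Poor sleep detected: please collect a finger-prick "
                  "blood sample within 1 hour.")

# Demo: route the 'push notification' to the console.
on_wake(SleepEpoch(end_hour=6.5, efficiency=0.72), send_push=print)
```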

Uses of IoT devices and wearables in research, including genetics and genomics, are far-reaching. The goal should be to incorporate reliable, accurate, valid, and low-cost portable monitors easily into the study design. As IoT and wearable use continues to grow, the potential positive impact of these technologies on how we conduct human research should not be underestimated [325].

6.9.6 Telehealth and AI in genetics and genomics

A new concept, Genome Medical, is a service that helps make genomics part of the standard of care through a combination of telehealth technology and services. The service helps healthcare providers and their patients navigate and utilize genetic test results to understand disease risk, accelerate disease diagnosis, make informed treatment decisions, and lower the cost of care. This kind of care has the potential to give patients and clinicians a new level of genomic medicine by connecting them, via a telehealth platform, to top clinical genomics specialists. The service is delivered via telehealth and works with health systems, providers, employers, and health plans to meet ever-growing needs. Services are available to hospitals, health systems, employers, and consumers in all 50 states. It provides on-demand access to genetic experts for virtual visits and provider-to-provider consults, in addition to educational and training services such as patient risk assessment tools and consumer e-learning resources. Health care will see a future in which genomics is a seamless part of everyday care delivery. The current challenge is the gap between the clinical application of genomics and access to genetic expertise. Technologies like telehealth are expected to eliminate traditional barriers and allow immediate access to genetic experts for making informed professional judgments and decisions about genetics and genomics [326].

6.9.7 Chatbots and AI in genetics and genomics

Increasing numbers of patients are getting genetic tests, from assessing cancer risk to answering hereditary questions about passing a disease on to a child. But most doctors aren't adequately trained to interpret genetic results or counsel patients appropriately, and skilled genetic counselors are in short supply. As genetic testing becomes available to an ever-increasing population, so too does that population's access to mobile broadband technologies. One such technology comes from Clear Genetics [327], a company based in San Francisco that has collaborated with genetic counselors in a variety of patient-care settings to develop a chatbot named "GIA" (Genetic Information Assistant). This is a clinical-grade chatbot that can assist patients in need of genetic counseling, risk assessment, and testing. It guides a user through their results, collects information, reviews options for genetic testing, and answers questions about things like whether the test will be covered by insurance. If additional support is required, it can schedule a consultation with a human expert via video or in person.

Similar chatbot technology is provided by the Geisinger Clinic in Pennsylvania [328], which serves over 3 million patients. Geisinger and Clear Genetics have collaborated to develop chatbots for communication with patients enrolled in the MyCode Community Health Initiative [329], an extensive research biobank returning genetic test results for genes known to be associated with an increased risk for treatable and preventable heritable heart diseases and cancers. The tool is compliant with the Health Insurance Portability and Accountability Act of 1996 and can be used on any mobile device. It is designed to record the decision (consent, considering, decline) in the patient's electronic health record. At-risk patients can request a genetic counseling visit via the MyCode chatbot and ask questions. Geisinger patients who have interacted with the family-sharing tool and chatbot report positive experiences and trust that the privacy of their information is secured. They have welcomed a device that has detailed information and answers that wouldn't be on the tip of the tongue when talking with family members about genetic test results [330].

6.9.8 Natural language processing (NLP) and AI in genetics and genomics

An ambitious study used natural language processing (NLP) and machine learning approaches to build classifiers that identify genomic-related treatment changes in the free-text visit progress notes of cancer patients. Data from a total of 755 cancer patients who had undergone clinical next-generation sequencing (NGS) testing were analyzed, and an NLP system was implemented to process the free-text data and extract NGS-related information [331]. The project explored how word embedding and deep learning techniques can help extract information efficiently from free-text EHR documents (e.g., progress notes) and evaluate the effectiveness and actionability of genetic testing in assisting cancer patient treatment adjustment. The primary goals of the project were to (1) identify the sections of the progress report that discussed genomic testing results and treatment information; (2) predict whether there was a treatment change based on the extracted data, using deep learning and word embedding; and (3) compare the performance of 4 recurrent neural network (RNN)-based approaches and 5 conventional machine learning algorithms on the text classification task using clinical progress reports. The researchers successfully applied NLP and machine learning methods to extract information from clinical progress reports and classify them into treatment-change and no-treatment-change groups. The goal was to implement an automated annotation system for clinical progress reports that can improve annotation accuracy and reduce the time required. An automated genetic alteration interpretation system based on NLP and RNN methods could be developed to incorporate relevant information from text-based sources such as pathology reports and progress notes.
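
As a flavor of the "conventional machine learning" baselines such a study compares against (not the cited study's actual code or data), here is a minimal TF-IDF plus logistic-regression classifier over invented note snippets.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented progress-note snippets; 1 = treatment change, 0 = no change.
notes = [
    "NGS revealed EGFR L858R; started osimertinib, discontinuing prior regimen",
    "Genomic testing reviewed; no actionable alteration, continue current therapy",
    "KRAS G12C detected; plan to switch therapy at next visit",
    "Sequencing results pending; regimen unchanged this cycle",
]
labels = [1, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(notes, labels)
print(clf.predict(["NGS showed BRAF V600E; switching to targeted therapy"]))
```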

6.9.9 Expert systems and AI in genetics and genomics

A "decision support system" (DSS) is a system that supports users in unstructured decision-making situations. Most DSS activity to date has focused on innovation tools such as expert systems. With the advent of Electronic Health Record (EHR) systems, clinicians are becoming accustomed to a special kind of DSS called Clinical Decision Support (CDS) [332].

Over the years, these systems have evolved from performing simple medical data processing tasks to complex analytical tasks that support genetically guided cancer management, risk assessment with family history, and operation on massive amounts of clinical data [333]. Recent advancements in genomics have generated high interest in developing genomic decision support systems to improve clinical care [334]. Many types of genomic data are currently available, such as sequencing data, gene expression data, genotype data, and epigenetic data, and several efforts are underway to utilize these large data sets for clinical support. Expert systems [335] have also been developed to utilize rules, though most fields of clinical genomics research have not yet reached a level of maturity sufficient to produce such rules. Nonetheless, pharmacogenomics is one field that has proven genotype-phenotype rules can be used in genome CDS [336].
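To make the genotype-phenotype rule idea concrete, the following sketch encodes a heavily abbreviated rule of the kind pharmacogenomic CDS can use: a CYP2C19 diplotype maps to a metabolizer phenotype, which in turn triggers drug-specific advice. The diplotype table and advice strings are simplified paraphrases for illustration only, not a clinical rule set.

```python
# Simplified illustration of a genotype-phenotype rule of the kind used in
# pharmacogenomic clinical decision support. The diplotype-to-phenotype map
# and the advice text are abbreviated paraphrases, not a clinical rule set.

CYP2C19_PHENOTYPE = {
    ("*1", "*1"): "normal metabolizer",
    ("*1", "*2"): "intermediate metabolizer",
    ("*2", "*2"): "poor metabolizer",
    ("*1", "*17"): "rapid metabolizer",
}

RULES = {
    ("clopidogrel", "poor metabolizer"):
        "Reduced activation expected; consider an alternative antiplatelet agent.",
    ("clopidogrel", "intermediate metabolizer"):
        "Possibly reduced activation; consider alternatives per guideline.",
}

def advise(drug: str, diplotype: tuple[str, str]) -> str:
    """Map a diplotype to a phenotype, then fire any matching drug rule."""
    phenotype = CYP2C19_PHENOTYPE.get(tuple(sorted(diplotype)), "unknown")
    return RULES.get((drug, phenotype), f"No rule fires ({phenotype}); standard dosing.")

print(advise("clopidogrel", ("*2", "*2")))
```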

6.9.10 Robotics and AI in genetics and genomics

Nanotechnology is the science of engineering molecularly precise structures and, ultimately, molecular machines or “nanobots.” This technology is conducted at the nanoscale, about 1–100 nanometers; a nanometer is 1-billionth of a meter, the width of about 5 carbon atoms nestled side by side [337]. Medical nanorobotics is expected to have a revolutionary effect on regenerative medicine and, with diligent effort, its distinct influence will begin to appear in clinical treatment as early as the 2020s [338]. Already, a nanorobot called a “chromallocyte” is being researched as a mechanism to extract all existing chromosomes from a diseased or dying cell and insert fresh new ones in their place, a process called chromosome replacement therapy. The replacement chromosomes are manufactured beforehand, outside the patient’s body, using the patient’s genome as a blueprint. Each chromallocyte is equipped with a single copy of a digitally corrected chromosome set and then injected. Each device travels to its target tissue cell, replaces old or defective genes with new chromosome copies, then exits the cell and is removed from the body. Chromosome replacement therapy could therefore be an attractive prospect for correcting the accumulating genetic damage and mutations that lead to aging in every one of our cells [338].

6.9.11 Population health (demographics and epidemiology) and AI in genetics and genomics

The new millennium and the completion of the Human Genome Project in 2003 [304] introduced rapid developments in the core functions and essential services of public health in genomics, along with an understanding of the distinction between the terms “genetics” and “genomics” (see “Genetic and genomic therapies” below). These developments have enhanced our knowledge of how human genes interact with each other and with the environment to influence health, and they have yielded increasing recognition of the potential applications of genomic knowledge and related technologies for improving population health. Specifically, genomics can enable the stratification and subsequent screening of individuals and subgroups of populations based on their level of genetic risk for developing a disease.
This can then lead to the development of more targeted prevention approaches that reduce the burden of untoward psychological or social effects [339]. While there are expectations that genomic knowledge, tools, and technologies will benefit population health, they must be applied only when the benefits outweigh the potential harms. New tools and technologies introduced prematurely, without evidence demonstrating that they are valid and useful, risk posing harm to individuals, families, and the broader health system. It is therefore essential that existing and emerging knowledge, tools, and technologies be evaluated to determine which are beneficial to population health and how they could be appropriately implemented. Public health genomics bridges this gap between new scientific discoveries and technologies on the one hand, and the application of genomic knowledge to benefit population health on the other [340]. Public health genomics (see below) has been successfully integrated into existing paradigms for the provision of traditional public health services. The continued alignment of genomics with public health promises to deliver more precise, personalized health care to benefit the population. A national, coordinated approach providing centralized governance of decision-making is required to ensure responsible delivery, universality, and equity of access [341].
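The genetic risk stratification mentioned above can be illustrated with a toy polygenic score: risk-allele counts weighted by per-variant effect sizes, with individuals then bucketed into screening tiers. All weights, genotypes, and tier cutoffs below are fabricated; real scores aggregate thousands to millions of variants and require careful validation across populations.

```python
# Sketch of genetic risk stratification: a toy polygenic score sums
# risk-allele counts weighted by per-variant effect sizes, then buckets
# individuals into screening tiers. Weights and genotypes are made up.

import numpy as np

weights = np.array([0.12, 0.30, 0.08, 0.22])      # per-variant effect sizes (synthetic)
genotypes = np.array([                             # risk-allele counts (0, 1, 2)
    [0, 1, 2, 0],   # person A
    [2, 2, 1, 1],   # person B
    [0, 0, 1, 0],   # person C
])

scores = genotypes @ weights
tiers = np.digitize(scores, bins=[0.3, 0.8])       # 0 = low, 1 = medium, 2 = high

for person, (s, t) in zip("ABC", zip(scores, tiers)):
    print(f"{person}: score={s:.2f}, tier={['low', 'medium', 'high'][t]}")
```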

6.9.12 Precision medicine/health (personalized health) and AI in genetics and genomics

The National Institutes of Health started the Big Data to Knowledge (BD2K) Initiative and the “Precision Medicine Initiative,” intending to develop genetically guided, personalized precision medicine for improved prevention, early detection, and treatment of common complex diseases [342]. The program aims to do this by using AI machine learning to gather and link the electronic health records and data of a group of 1 million Americans. The program will categorize and capture entire genome sequences, cell populations, proteins, metabolites, RNA, and DNA, as well as behavioral data [343]. In this era of bioinformatics, the wealth of data that diagnostic tests generate has become a new source of option value, like oil-exploration leases, powering the value and strategy of businesses. Take 23andMe as an example: using genotyping chips, the company offers tests for the genetic blueprint of its customers’ ancestry and health or trait markers. So far, it has gathered data from more than 2 million customers [344], who have the choice of allowing their data to be used for biomedical research. The global market for personalized medicine has proliferated since the inception of the Precision Medicine Initiative, announced by Barack Obama during his 2015 State of the Union address. Market research estimated the 2016 global market at $44 billion in revenues, and these revenues are forecast to more than triple, to $140 billion, by 2026 [345].
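Underlying this kind of cohort program is a simple operation repeated at enormous scale: joining genomic results to EHR-derived phenotypes through a shared participant identifier. The sketch below shows that join on synthetic data; the tables, column names, and records are invented and bear no relation to the initiative’s actual data model.

```python
# Minimal sketch of linking genomic results to EHR-derived phenotypes by a
# shared participant ID, the basic join behind large cohort programs.
# All identifiers and data here are synthetic.

import pandas as pd

ehr = pd.DataFrame({
    "participant_id": ["P001", "P002", "P003"],
    "diagnosis": ["type 2 diabetes", "hypertension", "none"],
    "hba1c": [8.1, 5.6, 5.2],
})

variants = pd.DataFrame({
    "participant_id": ["P001", "P003"],
    "gene": ["TCF7L2", "BRCA1"],
    "variant": ["rs7903146 T/T", "c.68_69delAG"],
})

# Left join keeps every EHR record, attaching genomic results where present.
linked = ehr.merge(variants, on="participant_id", how="left")
print(linked)
```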

6.9.13 Healthcare analytics (and bioinformatics) and AI in genetics and genomics

Although bioinformatics has been mentioned numerous times up to this point in this book, its arguably most significant application is in genetics and genomics.
By definition, bioinformatics refers to the use of computer science, particularly AI and machine learning for statistical modeling and algorithmic processing, to understand biological data. It applies AI, modern computing, and big data (healthcare) analytical techniques to biological information [346]. Machine learning (ML) is particularly useful in bioinformatics for prediction and pattern detection based on large datasets. Within genetics and genomics, the most significant applications of AI and ML include DNA sequencing (NGS), protein classification, and the analysis of gene expression on DNA microarrays. The process of gene finding (locating regions of the DNA that encode genes [347]) consists of a combination of extrinsic and intrinsic searches. In the extrinsic search, “the target genome is searched for DNA sequences that are similar to extrinsic evidence” in the form of known gene-encoding sequences previously discovered and labeled. Gene prediction algorithms attempt to identify segments of the DNA that could potentially host gene-encoding sequences [348]. Gene expression is the process by which information from a gene is used in the synthesis of a functional gene product, most often a protein [349]. Machine learning is utilized for the analysis, pattern identification, and classification of gene expression. In cancer research, the advent of microarrays and RNA sequencing, coupled with state-of-the-art machine learning techniques, has demonstrated the potential for detecting and classifying tumors at a molecular level [350].
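A minimal sketch of that last application follows: given a samples-by-genes expression matrix, a random forest learns to separate tumor samples from normal ones. The data are randomly generated stand-ins for microarray or RNA-seq measurements, with ten genes artificially made informative, so the example is purely illustrative.

```python
# Sketch of ML on gene expression: classify tumor vs. normal samples from a
# (samples x genes) expression matrix. Data are randomly generated stand-ins
# for microarray or RNA-seq measurements.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_genes = 60, 500
X = rng.normal(size=(n_samples, n_genes))          # expression levels
y = rng.integers(0, 2, size=n_samples)             # 0 = normal, 1 = tumor
X[y == 1, :10] += 1.5                              # make 10 genes informative

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```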

6.9.14 Preventive health and AI in genetics and genomics

The concept of “preventive genomics” includes the clinical aspects of extensive use of AI in DNA sequencing (NGS), interpretation, and reporting to help accelerate precision medicine research. A “Preventive Genomics Clinic” at Brigham and Women’s Hospital in Boston is the first of its kind in the U.S. [351]. The goal of the clinic is to interpret disease-associated genes for healthy adults and children seeking to understand and mitigate their risk of future disease. For over 2 decades, NIH studies in translational genomics have shown more potential medical benefits and fewer risks than previously considered. Genomic technology can thus be offered in a clinical context, under the care of genetics experts, to individuals who wish to be proactive about their health. After a patient has been examined to rule out genetic risks, the patient can choose from a menu of gene panels, receiving quality genome sequencing through machine learning techniques capable of interpreting and reporting on about 3700 disease-associated genes [351]. Patients then have the opportunity to participate in an NIH-funded follow-up study in which researchers will track outcomes for years. A goal of the clinic and its follow-up studies is to go beyond the typical diagnostic application of genetic testing and help patients make individualized decisions based on their specific needs. This will accelerate the integration of DNA sequencing into day-to-day medical care, allow patients to communicate with health professionals regarding their “genetic health,” and help achieve precision and preventive health.
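The panel-based reporting step, restricting a patient’s variants to a chosen gene panel and flagging those with known classifications, reduces in essence to a filter over curated lookup tables. The sketch below is a hypothetical miniature of that step; the panel contents and the classification table are abbreviated illustrations, not the clinic’s actual curated resources.

```python
# Hypothetical sketch of panel-based reporting: filter a person's variants to
# a chosen gene panel and flag those classified in a lookup table. Panel
# contents and classifications are abbreviated for illustration.

PANELS = {
    "cardiac": {"MYH7", "MYBPC3", "KCNQ1"},
    "cancer": {"BRCA1", "BRCA2", "MLH1", "MSH2"},
}

CLASSIFICATIONS = {            # (gene, variant) -> assertion
    ("BRCA2", "c.5946delT"): "pathogenic",
    ("MYH7", "c.2389G>A"): "uncertain significance",
}

def report(variants: list[tuple[str, str]], panel: str) -> list[str]:
    """Keep only variants in the chosen panel, annotated with their call."""
    lines = []
    for gene, variant in variants:
        if gene in PANELS[panel]:
            call = CLASSIFICATIONS.get((gene, variant), "not classified")
            lines.append(f"{gene} {variant}: {call}")
    return lines

print(report([("BRCA2", "c.5946delT"), ("TTN", "c.100A>G")], panel="cancer"))
```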

6.9.15 Public health and AI in genetics and genomics

The CDC Office of Public Health Genomics tracks epidemiologic study results in its weekly horizon scan [352] and Advanced Molecular Detection Clips [352], and deposits relevant publications in the Public Health Genomics Knowledge Base [352]. Many computer tools and AI and machine learning technologies have contributed significantly to the base of genomic knowledge that has emerged over the past 10 years. However, only a limited amount of this knowledge has so far been fully translated into healthcare and public health practice [353]. The literature cites 2 reasons why this may be so. First, in genomic studies, most genetic variants that have been identified appear to contribute only small increases in relative risk for common diseases, and at the same time they explain little about the relationship between disease and genetic inheritance [354]. Second, among the tools and techniques based on genomic knowledge that have been developed, there is limited evidence regarding their validity and utility [355]. Public health genomics bridges this gap between new scientific discoveries, computing, and AI technologies on the one hand, and the application of genomic knowledge to benefit population health on the other [356]. The integration of genomic knowledge and technologies into healthcare is revolutionizing the way we approach clinical and public health practice. Building upon the work of public health genomics over the last 20 years, precision public health now enables the integration of genomics into public health strategies within the broader context of other determinants of health, such as socioeconomic, behavioral, and environmental factors. This will lead to more precise individual and population-based interventions [357] and, ultimately, improved population health outcomes [358]. The continued alignment of genomics with public health promises to deliver more precise, personalized health care to benefit the population.

6.9.16 Access and availability and AI in genetics and genomics

As deep genomics expands, relevant ethical and governance questions begin to arise. How will predictive health data be used, and who will have access to it? Do pharmaceutical companies have the right to profit from your health information without giving you any benefits in return? Would it threaten your sense of self-worth if those handling your health data knew a great many biological details about your body? Is it ethical for a prospective employer to ask what your health will look like in the next decade? Answers to these questions are not easy to capture, but their impact on society is profound. A recent report [359] on ethics, design, and AI argues that “as computational power increases with time and AI improves, information that was thought private will be linked to individuals at a later stage in time. Furthermore, as data is stored in terms of summaries rather than as raw observations, and may help training algorithms, keeping track of data usage and potential risks to privacy may be increasingly complex.” Technological advances such as blockchain, developed for digital currency systems, allow individuals to hold and secure digital assets without a central authority. Such technologies are also being used to create new digital property systems that include personal medical data property [360].
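The property this paragraph leans on, holding and securing digital assets without a central authority, rests on tamper-evident, hash-chained records. The toy sketch below (with invented record fields) shows only that chaining-and-verification core; a real blockchain adds distribution and consensus, which the illustration omits.

```python
# Toy illustration of the tamper-evidence idea behind blockchain-based
# personal-data property systems: each access record is chained to the
# previous one by a hash, so retroactive edits are detectable. A real system
# adds distribution and consensus; this sketch omits both.

import hashlib
import json
import time

def make_block(prev_hash: str, record: dict) -> dict:
    """Bundle a record with the previous hash and stamp it with its own."""
    body = {"prev": prev_hash, "time": time.time(), "record": record}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

chain = [make_block("0" * 64, {"event": "genesis"})]
chain.append(make_block(chain[-1]["hash"],
                        {"who": "research_lab_A", "what": "read exome", "consent": True}))

# Verification: recompute each hash and check every link to its predecessor.
ok = all(
    b["hash"] == hashlib.sha256(json.dumps(
        {k: b[k] for k in ("prev", "time", "record")}, sort_keys=True
    ).encode()).hexdigest()
    and (i == 0 or b["prev"] == chain[i - 1]["hash"])
    for i, b in enumerate(chain)
)
print("chain valid:", ok)
```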

This raises the broader question of how personal data ecosystems would fit into an AI-powered health economy. There is also an urgent need to discuss, concurrently, how the convergence of AI and personal health will challenge the purpose and content of relevant legislation such as the Health Insurance Portability and Accountability Act (HIPAA) and the Genetic Information Non-Discrimination Act (GINA). Models of genomic data protection must be considered a critical biosecurity issue. The insights underlying the functioning of our genomes, and their impact on our health and on research, will be of strategic importance in biotechnology and biodefense [361]. It is important, if not urgent, that the genomics and AI research community start discussions with diverse fields of expertise in biosafety, biosecurity, and biotechnological governance [362].

The volume of information presented in this chapter has been enormous, yet it is a mere fraction of the published literature on the topics and categories addressed herein. As the author, I had to fight the temptation to add more and more on each subject as I conducted the research. I may have erred in some areas with more than is needed in a “guidebook” of this nature, or gone to excess in others. But in any area where you, as the reader, may have felt a bit overwhelmed by more than you thought you needed on a specific item, I can assure you that I only scratched the surface. On the other hand, if you were left wanting greater coverage of a topic, again, I can assure you that there is much more available. To that end, I have tried to include substantial literature citations (footnotes) to support your further research and reading on any topic discussed.

References

[1] Health. The difference between primary, secondary, and tertiary health care. eInsure; January 27, 2017. [2] Jee K, Kim GH. The potentiality of big data in the medical sector: focus on how to reshape the healthcare system. Healthc Inform Res 2013;19(2):79–85. [3] Tan SS, Gao G, Koch S. Big data and analytics in healthcare. Methods Inf Med 2015;54(6):546–7. [4] Shortliffe EH. Artificial intelligence in medicine: weighing the accomplishments, hype, and promise. IMIA and Georg Thieme Verlag KG. Yearb Med Inf 2019. Available from: https://doi.org/10.1055/s-0039-1677891. [5] Wu PY, Cheng CW, Kaddi CD, Venugopalan J, Hoffman R, Wang MD. Omic and electronic health record big data analytics for precision medicine. IEEE Trans Biomed Eng 2017;64(2):263–73. [6] Garets D, Davis M. Electronic medical records vs. electronic health records: yes, there is a difference. Zhongguo Yiyuan 2007;11(5):38–9. [7] Poulymenopoulou M, Malamateniou F, Prentza A, Vassilacopous G. Challenges of evolving PINCLOUD PHR into a PHR-based health analytics system. Paper presented at the Proceedings of the European, Mediterranean & Middle Eastern Conference on Information Systems EMCIS 2015. [8] Morgan L. Artificial intelligence in healthcare: how AI shapes medicine. Datamation; March 8, 2019. [9] FDA. Novel drug approvals for 2018. https://www.fda.gov/drugs/developmentapprovalprocess/druginnovation/ucm592464.htm; 2018. [10] Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. Commun ACM 2017;60:84–90.

[11] Topol EJ. High-performance medicine: the convergence of human and artificial intelligence. Nat Med 2019;25:44 56. [12] Esteva A, et al. A guide to deep learning in healthcare. Nat Med 2019;25:24 9. [13] Shah P, Kendall F, Khozin S, et al. Artificial intelligence and machine learning in clinical development: a translational perspective, vol. 2. npj Digital Medicine; July 26, 2019. Article number: 69. [14] Quarré F. The advent of AI and blockchain in health care. Forbes Technology Council. Forbes; January 30, 2019. [15] Sciforce. IoT in healthcare: are we witnessing a new revolution? Medium; March 7 2019. [16] Kuziemsky C, Maeder AJ, John O, et al. Role of artificial intelligence within the telehealth domain. Georg Thieme Verlag KG Stuttgart. Official 2019 Yearbook Contribution by the members of IMIA Telehealth Working Group. Yearb Med Inf 2019;28(01):035 40. [17] Yaghoubzadeh R, Kramer M, Pitsch K, et al. Virtual agents as daily assistants for elderly or cognitively impaired people. In: International workshop on intelligent virtual agents 2013 August 29. Berlin, Heidelberg: Springer; 2013. p. 79 91. [18] Bickmore TW, Utami D, Matsuyama R, et al. Improving access to online health information with conversational agents: a randomized controlled experiment. J Med Internet Res 2016;18(1). [19] Shaked NA. Avatars and virtual agents relationship interfaces for the elderly. Health Technol Lett 2017; (03):83. [20] Riccardi G. Towards healthcare personal agents. In: Proceedings of the 2014 workshop on roadmapping the future of multimodal interaction research, including business opportunities and challenges 2014 November 16. ACM; 2014. p. 53 6. [21] Chapman W. Healthcare NLP: the secret to unstructured data’s full potential. Health Catalyst; April 2, 2019. [22] Glen Coppersmith G, Leary R, Crutchley P, et al. Natural language processing of social media as screening for suicide risk. Biomed Inform Insights 2018;. Available from: https://doi.org/10.1177/ 1178222618792860. [23] , https://aws.amazon.com/comprehend/medical/.; 2019. [24] Kannan A. The science of assisting medical diagnosis: from expert systems to machine-learned models follow. Medium; April 15, 2019. [25] Tutorialspoint.com. Artificial intelligence expert systems. ,https://www.tutorialspoint.com/artificialintelligence/artificial-intelligence-expert-systems.htm.; 2018. [26] Kermany DS, Goldbaum M, Cai W, et al. Identifying medical diagnoses and treatable diseases by imagebased deep learning. Cell 2018;172(5):1122 31. [27] Ravuri M, Kannan A, Tso GJ, et al. From expert systems to machine-learned diagnosis models. Proc Mach Learn Res 2018;85:1 16. [28] MedTech. AI-powered medical robots are becoming reality! ,https://www.medicaltechoutlook.com/.; July 5, 2019. [29] Da Vinci Surgical System | Robotic Technology - Da Vinci Surgery. ,https://www.davincisurgery.com/ da-vinci-systems/about-da-vinci-systems.; 2019. [30] Prodromakis T. Five ways nanotechnology is securing your future. Elsevier SciTech Connect; March 29, 2016. [31] Research Briefs. Paging Dr. Robot: how robotics is changing the face of medicine. CB Insights; July 24, 2019. [32] Rigby MJ. Ethical dimensions of using artificial intelligence in health care. AMA J Ethics 2019. [33] Lough S. Predictive health platform, optimizes health and disease risk management, improves population health outcomes. ,www.bannerhealth.com.; March 7, 2019. [34] Insights Team. How machine learning is crafting precision medicine. Forbes Insights; February 11, 2019.

[35] Mesko B. The role of artificial intelligence in precision medicine. Expert Rev Precis Med Drug Dev 2017;2(5):239-241. Available from: https://doi.org/10.1080/23808993.2017.1380516. [36] FDA permits marketing of artificial intelligence-based device to detect certain diabetes-related eye problems. U.S. Food and Drug Administration; April 11, 2018. [37] Keown S. Using algorithms to personalize patient care. Dr. Elizabeth Krakow, a blood-cancer specialist, and epidemiologist. Fred Hutch News Service; October 1, 2018. [38] Kaisler S, Armour F, Espinosa JA, et al. Big data: issues and challenges moving forward. In: IEEE Computer Society 46th Hawaii International conference on system sciences. IEEE; 2013. p. 995 1004. Available from: https://doi.org/10.1109/HICSS.2013.645. [39] Perrey J, Spillecke D, Umblijs A. Smart analytics: how marketing drives short-term and long-term growth. McKinsey Quarterly; 2013. [40] Wang L, Alexander CA. Big data in medical applications and health care. Am Med J 2015;6:1 8. Available from: https://doi.org/10.3844/amjsp.2015.1.8. [41] Shah R, Echhpal R, Nair S. Big data in healthcare analytics. Int J Recent Innov Trends Comput Commun 2015;10:134 8. [42] Hansen MM, Miron-Shatz T, Lau YS, et al. Big data in science and healthcare: a review of recent literature and perspectives. Contribution of the IMIA Social Media Working Group, Yearb. Med Inf 2014;9:21 6. Available from: https://doi.org/10.15265/IY-2014-0004. [43] Schultz T. Turning healthcare challenges into big data opportunities: a use-case review across the pharmaceutical development lifecycle. Bull Association Inform Sci Technol 2013;39:34 40. Available from: https://doi.org/10.1002/bult.2013.1720390508. [44] Yang J, Aloe AM, Feeley TH. Risk information seeking and processing model: a meta-analysis. J Commun 2014;. Available from: https://doi.org/10.1111/jcom.12071. [45] Bellazzi R. Big data and biomedical informatics: a challenging opportunity. Yearb Med Inf 2014;9:8 13. [46] Terris M. Evolution of public health and preventive medicine in the United States. Am J Public Health 1975;65:161 9. [47] Manolio TA. Bringing genome-wide association findings into clinical use. Nat Rev Genet 2013;14:549 58. [48] Gambhir SS, Ge TJ, Vermesh O, et al. Toward achieving precision health. Sci Transl Med 28 2018;10 (430). Available from: https://doi.org/10.1126/scitranslmed.aao3612. [49] Shaban-Nejad A, Lavigne M, Okhmatovskaia A, et al. A knowledge-based platform to support integration, analysis, and visualization of population health data. Ann N Y Acad Sci 2017; 1387:44 53. [50] Shaban-Nejad A, Brownstein JS, Buckeridge DL. Public health intelligence and the internet. Cham: Springer International Publishing AG; 2017. [51] Wilk S, et al. Comprehensive mitigation framework for the concurrent application of multiple clinical practice guidelines. J Biomed Inf 2017;66:52 71. [52] Xu JQ, Murphy SL, Kochanek KD, Arias E. Mortality in the United States, 2015. NCHS data brief, no 267. Hyattsville, MD: National Center for Health Statistics; 2016. [53] Hsu J. Will artificial intelligence improve health care for everyone? Undark Magazine Smithsonian.com; July 31, 2019. [54] Makary M. Medical errors now third leading cause of death in United States. BMJ 2016;353:i2139. [55] To PRESS RELEASE NO: 2018/092/HD. World Bank and WHO: half the world requires access to essential health services, 100 million still pushed into extreme poverty because of health expenses. The World Bank; December 13, 2017. [56] Physician Burnout Report. 
https://cdn1.sph.harvard.edu/wp-content/uploads/sites/21/2019/01/; 2018.

[57] Bate A, Juniper J, Lawton AM, Thwaites RM. Designing and incorporating a real-world data approach to international drug development and use: what the UK offers. Drug Discov Today 2016;21(3):400–5. [58] Bate A, Pariente A, Hauben M, Begaud B. Quantitative signal detection and analysis in pharmacovigilance. Mann’s Pharmacovigil 2014;331–54. [59] Trifiro G, Sultana J, Bate A, et al. From big data to smart data for pharmacovigilance: the role of healthcare databases and other emerging sources. Drug Saf 2017;41(3):1–7. Available from: https://doi.org/10.1007/s40264-017-0592-4. [60] Oliveira AL. Biotechnology, big data, and artificial intelligence. Biotechnol J 2019. Available from: https://doi.org/10.1002/biot.201800613. [61] Harrer S, Shah P, Antony B, et al. Artificial intelligence for clinical trial design. Trends Pharmacol Sci 2019. Available from: https://doi.org/10.1016/j.tips.2019.05.005. [62] Press C. Review evaluates how AI could boost the success of clinical trials. ScienceDaily 2019. [63] Plotnikov V, Kuznetsova V. The prospects for the use of digital technology “Blockchain” in the pharmaceutical market. In: MATEC web of conferences. London: EDP Sciences 2018;vol. 193. [64] Markov A. Use of blockchain in pharmaceutics and medicine. https://miningbitcoinguide.com/technology/blokchejn-v-meditsine/; October 16, 2018. [65] Sylim P, Liu F, Marcelo A, et al. Blockchain technology for detecting falsified and substandard drugs in distribution: pharmaceutical supply chain intervention. JMIR Res Protoc 2018;7:e10163. [66] Trujllo G, Guillermo C. The role of blockchain in the pharmaceutical industry supply chain as a tool for reducing the flow of counterfeit drugs [Ph.D. thesis]. Dublin: Dublin Business School; 2018. [67] Siyal AA, Junejo AZ, Zawish M, et al. Applications of blockchain technology in medicine and healthcare: challenges and future perspectives. Cryptography; January 2, 2019. [68] Hickey KT, Riga TC, Mitha SA, et al. Detection and management of atrial fibrillation using a remote control. Nurse Pract 2018;43(3):24–30. [69] Özdemir V. The big picture on the “AI Turn” for digital health: the internet of things and cyber-physical systems. OMICS: A J Integr Biol 2019;23(6), Review Articles; June 5, 2019. Available from: https://doi.org/10.1089/omi.2019.0069. [70] Thuemmler C, Bai C. Health 4.0: how virtualization and big data are revolutionizing healthcare. Cham, Basel: Springer; 2017. [71] Markarian A. Pharma 4.0. Pharm Technol 2018;42(4):24. [72] Kastner P, Morak J, Modre R, et al. Innovative telemonitoring system for cardiology: from science to routine operation. Appl Clin Inf 2010;1(2):165–76. [73] Schreier G, Schwarz M, Modre-Osprian R, Kastner P, Scherr D, Fruhwald F. Design and evaluation of a multimodal mHealth based medication management system for patient self-administration. Conf Proc IEEE Eng Med Biol Soc 2013;2013:7270–3. [74] Kropf M, Modre-Osprian R, Hayn D, et al. Telemonitoring in heart failure patients with clinical decision support to optimize medication doses based on guidelines. Conf Proc IEEE Eng Med Biol Soc 2014;2014:3168–71. [75] Dharwadkar R, Deshpande NA, Pune A. Pharmabot: a recommendation survey on general medicines. Int J Innovative Res Computer Commun Eng 2018;6(6).

[76] Comendador BEV. Pharmabot: a pediatric generic medicine consultant chatbot. J Autom Control Eng 2015;3:137 40. [77] Keane J. Medxnote is building “the perfect use case” for chatbots in healthcare. TechEU; September 11, 2017. [78] Ben Abacha A, Zweigenbaum P. A hybrid approach for the extraction of semantic relations from MEDLINE abstracts. In: 12th international conference on computational linguistics and intelligent text processing. Tokyo, Japan. ,https://rd.springer.com/book/10.1007.; 2011.

[79] Fong A, Hettinger AZ, Ratwani RM. Exploring methods for identifying related patient safety events using structured and unstructured data. J Biomed Inf 2015;58:89 95. [80] Kim S, Liu H, Yeganova L, et al. Extracting drug-drug interactions from literature using a rich featurebased linear kernel approach. J Biomed Inf 2015;58:23 30. [81] Lopez Pineda A, Ye Y, Visweswaran S, et al. Comparison of machine learning classifiers for influenza detection from emergency department free-text reports. J Biomed Inf 2015;58:60 9. [82] Segura-Bedmar I, Martinez P. Pharmacovigilance through the development of text mining and natural language processing techniques. J Biomed Inf 2015;58:288 91. [83] Schmider J, Kumar K, LaForest C, et al. Use of artificial intelligence in adverse event case processing. Innov Pharmacovigilance: ClPharmacol Ther 2019;105(4). [84] Fong DJ. Artificial intelligence in pharmacy: are you ready? Wolters Kluwer; January 22, 2018. [85] Paul M, Andreassen S, Tacconelli E, et al. Improving empirical antibiotic treatment using TREAT, a computerized decision support system: cluster randomized trial. J Antimicrob Chemother 2006;58:1238 45. [86] Robotics Online Team. How collaborative robots are being used in pharma. Robotics Industry Assoc; May 30, 2019. [87] Vyas M, Thakur S, Riyaz B, et al. Artificial intelligence: the beginning of a new era in pharmacy profession. Asian J Pharmaceutics 2018;12(2):72. [88] Chisolm-Burns MA, Kim Lee J, Spivey CA. US pharmacists’ effect as team members on patient care: systematic review and meta-analysis. Med Care 2010;48(923):33. [89] Sanborn MD. Population health management and the pharmacist’s role. Am J Health-System Pharm 2017;74(18):1400 1. Available from: https://doi.org/10.2146/ajhp170157. [90] van Dulmen S, Sluijs E, van Dijk L, et al. Patient adherence to medical treatment: a review of reviews. BMC Health Serv Res 2007;7:55. [91] Viswanathan M, Golin CE, Jones CD, et al. Interventions to improve adherence to self-administered medications for chronic diseases in the United States: a systematic review. Ann Intern Med 2012;157 (11):785 95. [92] Zullig LL, Blalock DV, Dougherty S, et al. The new landscape of medication adherence improvement: where population health science meets precision medicine. Patient Prefer Adherence 2018;12:1225 30. Available from: https://doi.org/10.2147/PPA.S165404. [93] Harpaz R, Vilar S, Dumouchel W, et al. Combing signals from spontaneous reports and electronic health records for detection of adverse drug reactions. J Am Med Inf Assoc 2013;20 (3):413 19. [94] Islam MS, Hasan M, Wang X, et al. A systematic review on healthcare analytics: application and theoretical perspective of data mining. Healthc Jun 2018;6(2):54. Available from: https://doi.org/10.3390/ healthcare6020054. [95] Big Data in the Healthcare & Pharmaceutical: A 1 $7 Billion Opportunity by 2021. Research and Markets; July 31, 2018. [96] Faggella D. 7 applications of machine learning in pharma and medicine. Business Intelligence and Analytics; January 30, 2019. [97] Fong DJ. Artificial intelligence in pharmacy: are you ready? Wolters Kluwer; January 22, 2018. [98] Hartman M, Martin AB, Espinosa N, et al. National health expenditures account team. National health care spending in 2016. [99] , https://aspe.hhs.gov/pdf-report/observations-trends-prescription-drug-spending.; March 8, 2016. [100] Emanuel EJ. The real cost of the US health care system. JAMA 2018;319(10):982 5.

[101] Bennett D. Artificial intelligence in life sciences: marketing, sales, and more. Human intelligence amplified; June 11, 2019. [102] , http://www.cloudmedxhealth.com/.; 2019. [103] Sheil E. Cleveland clinic, IBM collaborate to establish model for cognitive population health management and data-driven personalized healthcare. Cleveland Clinic, Newsroom; December 22, 2016. [104] HopkinsMedicine.org; 2019. [105] AI for Care Variation, Utilization & Hospital Ops. KenSci; 2019. [106] Obermeyer Z, Emanuel EJ. Predicting the future—big data, machine learning, and clinical medicine. N Engl J Med 2016;375:1216 19. [107] Garcia Vidal C, Puerta P, Sanjuan G, et al. Predicting multidrug-resistant Gram-negative infections in hematological patients with high-risk febrile neutropenia using neural networks. Eur Congr Clin Microbiol Infect Dis 2019;13 16. [108] Garcia-Vidal C, Sanjuan G, Puerta-Alcalde P. Artificial intelligence to support clinical decision-making processes. EBioMedicine 2019;46:27 9 EBioMedicine. 2019 Aug; 46: 27 29. [109] Ni Y, Bermudez AA, Kennebeck S, et al. A real-time automated patient screening system for clinical trials eligibility in an emergency department: design and evaluation. JMIR 2019;7(3). [110] News Release. Artificial intelligence solution improves clinical trial recruitment. EurekAlert AAAS; July 24, 2019. [111] Wikipedia. 2019. [112] Cyran MA. Blockchain as a foundation for sharing healthcare data. Blockchain Health Today 2018;1:13. [113] Kuo TT, Kim HE, Ohno-Machado L. Blockchain distributed ledger technologies for biomedical and health care applications. J Am Med Inf Assoc 2017;24(6):1211 20. [114] American Hospital Association. Telehealth: a path to virtual integrated care. ,www.AHA.org/center.; 2019. [115] , https://www.tempus.com/about-us.html.; 2019. [116] Gauher S, Boyle Uz F. Cleveland clinic, to identify at-risk patients in ICU using cortana intelligence. Cleveland Clinic; September 26, 2016. [117] , http://nvidianews.nvidia.com/news/nvidia-massachusetts-general-hospital-use-artificial-intelligence-to-advance-radiology-pathology-genomics.; 2019. [118] Insights. Command center to improve patient flow infographic. Johns Hopkins Medicine; March 1, 2016. [119] Lee E, Seals K. Virtual interventional radiologist. UCLA Health; March 3, 2017. [120] Burns E. Natural language processing (NLP). TechTarget; May 2019. [121] Coppersmith G, Leary R, Crutchley P, et al. Natural language processing of social media as screening for suicide risk. Biomed Inform Insights 2018;10. Available from: https://doi.org/10.1177/ 1178222618792860. [122] Haughn M. Natural language understanding (NLU). TechTarget; October 2017. [123] OSHA. Hospital eTools. U.S. Department of Labor. OSHA QuickTakes; 2019. [124] Elion J. Emerging technologies for clinical documentation improvement. Becker’s Healthcare; August 23, 2010. [125] ROBODOC. ThinkSurgical.com; 2019. [126] Kraft BM, Jäger C, Kraft K, et al. The AESOP robot system in laparoscopic surgery: increased risk or advantage for surgeon and patient? Surg Endosc 2004;18(8):1216 23. [127] About DaVinci systems. Intuitive; 2019. [128] CMS.gov. MACRA. Centers for Medicare & Medicaid Services; June 14, 2019.

[129] CMS.gov. What are the value-based programs? Centers for Medicare & Medicaid Services; July 16, 2019. [130] Burill S, Boozer Cruse C. Beyond the EHR: shifting payment models call for hospital investment in new technology areas. DeLoitte Insights; January 11, 2019. [131] Kent J. Value-based hospitals more likely to adopt population health tools. Health IT Analytics; January 17, 2019. [132] Bertalan M. The role of artificial intelligence in precision medicine. Expert Rev Precis Med Drug Dev 2017;2(5):239 41. Available from: https://doi.org/10.1080/23808993.2017.1380516. [133] Torkamani A, Topol E. Your genome, on-demand how your detailed genetic profile can predict your risk of diseases and improve your health. MIT Technology Review; October 23, 2018. [134] Sullivan T. Why EHR data interoperability is such a mess in 3 charts. Healthcare TI News; May 16, 2018. [135] Insights Team. How machine learning is crafting precision medicine. Forbes Insights; February 11, 2019. [136] Oshinsky D. Bellevue: three centuries of medicine and mayhem at America’s most storied hospital. Doubleday; 2017. [137] Cohen JK. CEO power panel: patient access is the next frontier for health systems. Modern Healthcare 2019;May 18. [138] Meyer H. Younger patients more dissatisfied with traditional healthcare. Modern Healthcare; February 12, 2019. [139] Carroll W. Artificial intelligence, nurses, and the quadruple aim. Online J Nurs Inform (OJNI) 2018;22(2). [140] Davis T, Bice C. Nurses have significantly higher levels of EHR satisfaction than providers. KLAS; March 28, 2019. [141] Clarke M. Nurses have significantly higher levels of EHR satisfaction than providers. Health Leaders; April 15, 2019. [142] Dilsizian SE, Siegel EL. Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment. Curr Cardiol Rep 2014;16(1):441. Available from: https://doi.org/10.1007/s11886-013-0441-8. [143] de Veer AJE, Fleuren MAH, Bekkema N, et al. Successful implementation of new technologies in nursing care: a questionnaire survey of nurse-users. BMC Med Inf Decis Mak 2011;11:67. [144] De Raeve P. Blockchain supports nurses in the continuity of health and social care. Open Access Government; August 8, 2018. [145] FDA is rapidly approving medical IoT devices. Sure; February 17, 2019. [146] , http://community.advanceweb.com/blogs/nurses-18/archive/2015/10/14/nurses-will-benefit-fromthe-internet-of-things.aspx.. [147] The Internet of Things and Nursing. Jacksonville University; 2019. [148] American Telehealth Association. Telehealth nursing fact sheet: ATA telehealth nursing special interest group; 2018. [149] Smith W. Three benefits of a nurse telephone triage line. Team Health Medical Call Center; 2014. [150] AmBetter. 24/7 Nurse advice line. ,http://www.ambetterhealth.com/benefits-services/nurse-line. html.; 2019. [151] American Academy of Ambulatory Care Nursing. Telehealth nursing practice scope and standards of practice. ,http://www.aaacn.org/tnp-scope-standards-practice.; 2019. [152] The Patient Protection and Affordable Care Act. ,http://www.gpo.gov/fdsys/pkg/PLAW-111publ148/ pdf/PLAW-111publ148.pdf.; 2010.

[153] Hawig D. The rise of AI-powered chatbots in the healthcare industry. IdeaConnection; May 5, 2019. [154] Hand C. How chatbots are changing communication in healthcare. Inc 5000 Healthcare; January 21, 2019. [155] Lovett L. AI triage chatbots trekking toward a standard of care despite criticism. Mobile Health News; November 2, 2018 [156] Seif G. An easy introduction to Natural Language Processing: using computers to understand human language. Retrieved January 20, 2019. [157] Johansen M, O’Brien J. Decision making in nursing practice: a concept analysis. Nurs Forum 2016;51:40 8. [158] Carroll WM. Artificial intelligence, critical thinking, and the nursing process. J Nurs Informatics: OJNI; Chic 2019;23(1). [159] Bowles KH, Ratcliffe SJ, Holmes JH, et al. Using a decision support algorithm for referrals to post-acute care. JAMDA 2019;20(4):408 13. [160] U.S. Bureau of Labor Statistics, Occupational Outlook Handbook. Registered Nurses; September 4, 2019. [161] Schwab K. A hospital introduced a robot to help nurses. They didn’t expect it to be so popular. Fast Company; July 8, 2019. [162] Eriksson H, Salzmann-Erikson M. The digital generation and nursing robotics: an ethnographic study about nursing care robots posted on social media. Nurs Inq 2017;24(2). Available from: https://doi.org/ 10.1111/nin.12165. [163] Andrew J, Pepito AB, Locsin R. Can nurses remain relevant in a technologically advanced future? Int J Nurs Sci 2019;6(1):106 10. [164] U.S. Department of Labor, Bureau of Labor Statistics. Occupational Outlook: Registered Nurses. ,https://www.bls.gov/ooh/healthcare/registered-nurses.htmtab-1.; 2017 [accessed 28.12.17]. [165] Robert Wood Johnson Foundation Catalyst for change Harnessing the Power of Nurses to Build Population Health for the 21st Century 2017. [166] CDC. What is precision medicine? National Institute of Health (NIH). Genetics Home Reference (GHR); May 28, 2019. [167] Cashion AK, Gill J, Hawes R, et al. National Institutes of Health Symptom Science Model sheds light on patient symptoms. Nurs Outlook 2016;64:499 506. [168] Cashion AK, Grady PA. The National Institutes of Health/National Institute of Nursing Research intramural research program and the development of the National Institutes of Health Symptom Science Model. Nurs Outlook 2015;63:484 7. [169] Cashion AK, Grady PA. Response to the commentary: precision health: using omics to optimize selfmanagement of chronic pain in aging: from the perspective of the NINR Intramural Research Program. Res Gerontology Nurs 2018;11:14 15. [170] NINR. The NINR strategic plan: advancing science, improving lives. ,https://www.ninr.nih.gov/sites/ www.ninr.nih.gov/files/NINR-StratPlan2016-reduced.pdf.; 2016. [171] Hickey KT, Bakken S, Byrne MW, et al. Precision health: advancing symptom and self-management science. Nurs Outlook 2019;67(4):462 75. [172] Howley EK. Can nurse practitioners help ease the growing physician shortage? US News; November 15, 2018. [173] Preventive Care Services. Guideline Number: CDG.016.26. United Healthcare; July 1, 2019. [174] Chiverton PA, Votava KM, Tortoretti DM. The future role of nursing in health promotion. Am J Health Promot 2003;18(2):192 4.

[175] The Definition and Practice of Public Health Nursing. APHA; November 11, 2013. [176] Public Health Nursing. American Public Health Association; 2019. [177] U.S. Department of Health and Human Services Health Resources and Services Administration Bureau of Health Workforce National Center for Health Workforce Analysis. Supply and Demand Projections of the Nursing Workforce: 2014 2030; July 21, 2017. [178] U.S. Department of Health and Human Services, Health Resources and Services Administration, National Center for Health Workforce Analysis. The Future of the Nursing Workforce: National- and State-Level Projections, 2012 2025. Rockville, Maryland; 2014. [179] Institute of Medicine (US). Committee on the future health care workforce for older Americans. Retooling for an aging America: building the health care workforce. National Academies Press; 2008. [180] Orsini M. Utilizing big data to improve patient outcomes in home healthcare share. Axxess; July 2, 2019. [181] Threlkeld T. NAHC Policy Staff. 3 Significant issues facing homecare & hospice. Home Care; August 29, 2019. [182] Brighttree. 60% of referral sources would switch to a home health and hospice provider that accepts electronic referrals, survey reveals. BusinessWire; July 24, 2019. [183] Optimizing Home Health Care. Enhanced value and improved outcomes. Clevel Clin J Med 2019;. [184] Dobson A, El-Gamil A, Heath S, et al. Use of home health care and other care services among medicare beneficiaries clinically appropriate and cost-effective placement (CACEP) project working paper series. Alliance for Home Health Quality & Innovation; 2012. [185] Seng Tan J. Global blockchain platform for elderly wellness and care: white paper. Wellderly; April 29, 2018. [186] Access and leverage smart sensor analytics. Care Innovations, LLC; 2019. [187] 5 Ways healthcare IoT & remote patient monitoring are transforming care delivery. Care Innovations; 2019. [188] Medicare.gov. Telehealth. Official US Government site for Medicare; 2019. [189] Erstling A. 6 Healthcare innovation trends in 2019| what to pursue and what to watch. Formula; January 23, 2019. [190] , http://news.northeastern.edu/2017/09/professor-designs-chatbot-to-help-comfort-patients-in-lastyears-of-their-lives/.. [191] Rieland R. Can a chatbot help you prepare for death? Smithsonianmag.com; September 29, 2017. [192] Lilley EJ, Lindvall C, Lillemoe KD, et al. Measuring processes of care in palliative surgery a novel approach using natural language processing. Ann Surg 2018;267(5):823 5. [193] Muoio D. How chatbots and robots can fill healthcare’s unmet needs. MobiHealthNews; September 25, 2018. [194] Emanuel E. The status of end-of-life care in the United States: the glass is half full. JAMA 2018;320:329. [195] Lustbader D, Mudra M, Romano C, et al. The impact of a home-based palliative care program in an accountable care organization. J Palliat Med 2016;20:23 8. [196] Coalition to Transform Advanced Care (CTAC): Clinical and Payment Models. Washington, DC. ,www.ctac.org/models.; March 2019. [197] Yosick L, Crook RE, Gatto M, et al. Effects of a population health community-based palliative care program on cost and utilization. J Palliat Med 2019;22(9). [198] The future of health, begins with you. All of Us Research Program. U.S. Department of Health & Human Services; 2019. [199] NHCOA is a member of the Combined Federal Campaign (CFC) #44522 © 2016. [200] eSolutions.com. Titan; 2019.

[201] eSolutions. Home health & hospice billing; 2019. [202] CarePredict @Home. CarePredict; January 8, 2019. [203] www.tricare.mil is an official website of the Defense Health Agency (DHA), a component of the Military Health System. Last updated June 28, 2019. [204] Center for Disease Control and Prevention. National Center for Health Statistics; March 11, 2016. [205] National Center for Health Statistics, Vital and Health Statistics. Long term care providers and services users in the United States, 2015–2016. Series 3, Number 43. February 2019. [206] Nicholson K, Makovskic TT, Griffith LE, et al. Multimorbidity and comorbidity revisited: refining the concepts for international health research. J Clin Epidem 2019;105:142–6. [207] Bresnick J. Identifying big data sources for population health management. Health IT Analytics; January 2, 2018. [208] Glicksberg BS, et al. An integrative pipeline for multi-modal discovery of disease relationships. Pac Symp Biocomput 2015;407–18. [209] Bagley SC, Sirota M, Chen R, et al. Constraints on biological mechanism from disease comorbidity using electronic medical records and database of genetic variants. PLoS Comput Biol 2016;12(4):e1004885. [210] Roca M, Gili M, Garcia-Garcia M, et al. Prevalence and comorbidity of common mental disorders in primary care. J Affect Disord 2009;119(1–3):52–8. Available from: https://doi.org/10.1016/j.jad.2009.03.014. [211] Cuncic A. Facts about comorbidity. VeryWellMind; September 13, 2019. [212] Schumacher A. Towards a global, blockchain-based precision medicine ecosystem. Blockchain & Healthcare, 2017 Strategy Guide Method; June 2017. [213] https://ec.europa.eu/digital-single-market/en/content/mid-term-review-digital-single-market-dsm-good-moment-take-stock. [214] https://ec.europa.eu/digital-single-market/en/news/european-commission-launches-eu-blockchain-observatory-and-forum. [215] De Raeve P. Blockchain technology: supporting continuity of care. Health Europa; April 27, 2018. [216] Stegemann S. Developing drug products in an aging society: from concept to prescribing. Cham: Springer; 2016. [217] Schreier G, Schwarz M, Modre-Osprian R, Kastner P, Scherr D, Fruhwald F. Design and evaluation of a multimodal mHealth based medication management system for patient self-administration. Conf Proc IEEE Eng Med Biol Soc 2013;2013:7270–3. [218] Ebner H, Modre-Osprian R, Kastner P, et al. Integrated medication management in mHealth applications. Stud Health Technol Inf 2014;198:238–44. [219] Porath A, Irony A, Borobick AS, Nasser S, et al. Maccabi proactive Telecare Center for chronic conditions: the care of frail elderly patients. Isr J Health Policy Res 2017;6(1):68. [220] Kökciyan N, Chapman M, Balatsoukas P, et al. A collaborative decision support tool for managing chronic conditions. UK Engineering & Physical Sciences Research Council (EPSRC); 2019. [221] Sklar E, Azhar MQ. Argumentation-based dialogue games for shared control in human-robot systems. J Human-Robot Interact 2015;4:120–48. [222] Young AP, Kökciyan N, Sassoon L, et al. Instantiating metalevel argumentation frameworks. In: Proceedings of the 7th international conference on computational models of argument. 2018, pp. 97–108. [223] Smoller JW, Lunetta KL, Robins J. Implications of comorbidity and ascertainment bias for identifying disease genes. Am J Med Genet 2000;96:817–22. [224] Salmasian H, Freedberg DE, Friedman C. J Am Med Inf Assoc 2013;20:e239–42.

[225] De Groot V, Beckerman H, Lankhorst GJ, et al. How to measure comorbidity. A critical review of available methods. J Clin Epidemiol 2003;56:221–9. [226] Friedman C, Shagina L, Lussier Y, et al. Automated encoding of clinical documents based on natural language processing. J Am Med Inf Assoc 2004;11:392–402. [227] Kansagara D, Englander H, Salanitro A, et al. Risk prediction models for hospital readmission: a systematic review. JAMA 2011;306:1688–98. [228] Singla J, Grover D, Bhandari A. Medical expert systems for the diagnosis of various diseases. Int J Computer Appl 2014;93(7):36–43. [229] Nohria R. Medical expert system: a comprehensive review. Int J Computer Appl 2015;130(7):44–50. [230] Bursuk E, Demirci S, Korpinar MA, et al. Expert system in medicine and its application at pulmonary diseases. Med Sci Discovery 2016;3(11):342–9. [231] Broadbent E, Garrett J, Jepsen N, et al. Using robots at home to support patients with chronic obstructive pulmonary disease: pilot randomized controlled trial. JMIR 2018;20(2). [232] Bradley EH, Canavan M, Rogan E, et al. Variation in health outcomes: the role of spending on social services, public health, and health. Health Aff 2016;35(5). [233] Social determinants of health. Healthy housing for a sustainable and equitable future: the WHO Housing and health guidelines. World Health Organization; November 27, 2018.

[234] Bresnick J. What are the social determinants of population health? Health IT Analytics; August 19, 2017. [235] Belcher K. From $600 M to $6 Billion, artificial intelligence systems poised for dramatic market expansion in healthcare. Frost & Sullivan; January 5, 2016. [236] Delaney K. Do the math: what happens when you add the next big thing to the next big thing? Accenture; January 19, 2018. [237] Olathe. Project: analysis of medical care data base to identify comorbidity patterns. K State University; March 27, 2019. [238] The National Center for Chronic Disease Prevention and Health. What can people who have arthritis and comorbidities do? Centers for Disease Control and Prevention; May 16, 2018. [239] Theis KA, Brady TJ, Helmick CG. No one dies of old age anymore: a coordinated approach to comorbidities and the rheumatic diseases. Arthritis Care Res 2016;. Available from: https://doi.org/10.1002/acr.23114. [240] Eustice C. Overview of comorbidity and arthritis. VeryWellHealth; May 22, 2019 [241] Correll CU, Detraux J, De Lepeleire J, et al. Effects of antipsychotics, antidepressants, and mood stabilizers on risk for physical diseases in people with schizophrenia, depression, and bipolar disorder. World Psychiatry: J World Psychiatr Assoc (WPA) 2015;14(2):119 36. [242] Stubbs B, Mueller C, Gaughran F, et al. Predictors of falls and fractures, leading to hospitalization in people with schizophrenia spectrum disorder: a large representative cohort study. Schizophrenia Res 2018;201:70 8. [243] Lankarani MM, Assari S. Association between number of comorbid medical conditions and depression among individuals with diabetes; race and ethnic variations. J Diabetes Metab Disord 2015;14:56. [244] Akinyemiju T, Jha M, Xavier J, et al. Disparities in the prevalence of comorbidities among US adults by state Medicaid expansion status. Prev Med 2016;88:196 202. [245] FDA clears the first autonomous telemedicine robot for hospitals. Kurzweil Accelerating Intelligence; January 28, 2013. [246] About ReWalk Robotics. ReWalk.com; 2019. [247] 4 Ways RPA in healthcare can streamline critical processes. HealthSystem Blog; February 22, 2019. [248] Davincisurgery.com. Intuitive; 2019.

[249] Cullity K. Robotics for the business of healthcare: are you ready for RPA? Culbert Healthcare Solutions; August 21, 2018. [250] Schulman J, Gupta A, Sibi Venkatesan A. A case study of trajectory transfer through non-rigid registration for a simplified suturing scenario. Department of Electrical Engineering and Computer Sciences, University of California at Berkeley. IROS; 2013. [251] Hirsch A. Johns Hopkins scientist programs robot to perform ‘soft tissue’ surgery. HUB; May 6, 2016. [252] Fard MJ, Ameri S, Chinnam RB, et al. Machine learning approach for skill evaluation in robotic-assisted surgery. In: Proceedings of the world congress on engineering and computer science 2016, vol. I. WCECS 2016; October 19–21, 2016. [253] Stuart Campbell. The impact of robotics on neurosurgery. Renishaw; October 11, 2016. [254] Zheng S, Lu JJ, Ghasemzadeh N, et al. Effective information extraction framework for heterogeneous clinical reports using online machine learning and controlled vocabularies. JMIR Med Inform; June 5, 2017. [255] Castello Ferrer E. Decentralized AI and robotics using blockchain technology. MIT Research Topic; 2019. [256] Eastward G. How robots are extending the scope of IoT applications. Innovation Enterprise; June 13, 2019. [257] Internet of Things (IoT) in healthcare: benefits, use cases, and evolutions. I-Scoop; 2018. [258] Allocate A, Burghard C, Clap M. Worldwide healthcare IT 2017 predictions. IDC FutureScape; November 2016. [259] Telepresence Robots Market Research Report: Global Forecast to 2023. Market Research Future; May 2019. [260] Telepresence Robots Market 2019 Global Industry Size, Trends, Gross Margin Analysis, Development Status, Sales Revenue by Leading Players Forecast to 2023. Reuters Plus; February 11, 2019. [261] Tomlinson Z. 15 Medical robots that are changing the world. Interesting Engineering; October 11, 2018. [262] Boston Children’s Hospital. A first in medical robotics: autonomous navigation inside the body. Robotic catheter, using a novel sensor informed by AI and image processing, makes its way to a leaky heart valve. ScienceDaily; April 24, 2019. [263] Mejia N. Robotic process automation (RPA) in healthcare: current use-cases. Emerj; May 30, 2019. [264] ITSC Blog. Robotic process automation (RPA): it’s not about robotics! Information Technology and Supply Chain; January 30, 2019.

[265] Matarić MJ. Robotics and autonomous systems center. USC; 2015. [266] Tapus A, Tapus C, Mataric MJ. User-robot personality matching and assistive robot behavior adaptation for post-stroke rehabilitation therapy. Intell Serv Robot: Multidiscip Collaboration Socially Assistive Robotics 2008;1:169–83. [267] Greczek J, Kaszubski E, Atrash A, et al. Graded cueing feedback in robot-mediated imitation practice for children with autism spectrum disorders. IEEE; October 20, 2014. Available from: https://doi.org/10.1109/ROMAN.2014.6926312. [268] Armitage H. Countdown to big data in precision health: robots that are here to help. Stanford Medicine; April 8, 2019. [269] Mills T. AI in health care: the top ways AI is affecting the health care industry. Forbes Technology Council; June 11, 2019. [270] Singh Dang S. Artificial intelligence in humanoid robots. Cognitive World; February 25, 2019. [271] Szwartz G. The AI will see you now: how cognitive computing is transforming medicine. Wired; 2018.

[272] Wiggers KL. Google’s DeepMind is using neural nets to explore dopamine’s role in learning. AI; May 14, 2018. [273] Special Issue. Enabling technologies in health engineering and informatics for the new revolution of healthcare 4.0. IEEE J Biomed Health Inform 2019;. [274] Jeter Hansen A. Artificial intelligence in medicine—predicting patient outcomes and beyond. Stanford Medicine Scope; May 8, 2018. [275] Poropatich R. Pitt, CMU receive Department of Defense Contracts to create autonomous robotic trauma care system. Carnegie Mellon University. News 2019;. [276] NIH Stem Cell Information Home Page. In stem cell information [World Wide Web site]. Bethesda, MD: National Institutes of Health, U.S. Department of Health and Human Services; September 22, 2019. [277] Scudellari M. How iPS cells changed the world. Nature 2016;. [278] Berg J. Gene-environment interplay. Science 2016;354(6308):15. [279] Kabir H, O’Connor MD. Stems cells, big data, and compendium-based analyses for identifying cell types, signaling pathways, and gene regulatory networks. Biophys Rev 2019;(1):41 50. [280] Bhartiya D. The need to revisit the definition of mesenchymal and adult stem cells based on their functional attributes. Stem Cell Res Ther 2018;9:78. Available from: https://doi.org/10.1186/s13287-0180833-1. [281] Ratajczak MZ, Ratajczak J, Kucia M. Very small embryonic-like stem cells (VSELs). Circ Res 2019;124:208 10. Available from: https://doi.org/10.1161/CIRCRESAHA.118.3. [282] Bhartiya D, Shaikh A, Anand S, Patel H, Kapoor S, Sriraman K, et al. Endogenous, very small embryonic-like stem cells: a critical review, therapeutic potential and a look ahead. Hum Reprod Update 2016;23:41 76. Available from: https://doi.org/10.1093/humupd/dmw030. [283] Bhartiya D. Clinical translation of stem cells for regenerative medicine a comprehensive analysis. CIRCRESAHA; March 14, 2019. Available from: https://doi.org/10.1161/118.313823. [284] Global Institute of Stem Cell Therapy and Research. Goldstar.com; 2019. [285] Coleman L. Here’s how blockchain, sports and science are disrupting the healthcare game. Forbes Magazine; August 25, 2019. [286] Garg S, Williams NL, Ip A, et al. Clinical integration of digital solutions in health care: an overview of the current landscape of digital technologies in cancer care. JCO Clin Cancer Infomatics 2018;2:1 9. [287] Bennett AV, Reeve BB, Basch EM, et al. Evaluation of pedometry as a patient-centered outcome in patients undergoing hematopoietic cell transplant (HCT): a comparison of pedometry and patient reports of symptoms, health, and quality of life. Qual Life Res 2016;25:535 46. [288] Lahm Jr RJ, Lockwood FS. Innovations to come could help address the small business health care access and affordability problem. In: Institute for global business research conference proceedings, vol. 2. Number 1. April 6, 2018. [289] Dodziuk H. Applications of 3D printing in healthcare. Pol J Thorac Cardiovasc Surg 2016;13(3):283 93. Available from: https://doi.org/10.5114/kitp.2016.62625. [290] Aydin O, Zhang X, Nuethong S, et al. Neuromuscular actuation of biohybrid motile bots. PNAS 2019;. Available from: https://doi.org/10.1073/pnas.1907051116. [291] Saif T, Bashir R. Researchers build microscopic biohybrid robots propelled by muscles, nerves. National Science Foundation; September 17, 2019. [292] SuvorovYa RE, Kim SA, Gisina M, et al. Surface molecular markers of cancer stem cells: computation analysis of full-text scientific articles. Bull Exp Biol Med 2018;166(1):135 40. 
[293] Vijay SAA, Ganesh Kumar P. Fuzzy expert system based on a novel hybrid stem cell (HSC) algorithm for classification of microarray data. J Med Syst 2018;42(4):61. Available from: https://doi.org/10.1007/ s10916-018-0910-0.

Chapter 6 • Current AI applications in medical therapies and services

289

[294] Sungwoong J, Kim S, Ha S, et al. Magnetically actuated microrobots as a platform for stem cell transplantation. Sci Robot 2019;4(30):eaav4317. Available from: https://doi.org/10.1126/scirobotics.aav4317. [295] Choi H. Introduced a new paradigm of cell transplantation with scaffold microrobots. DGist Research News; June 6, 2019. [296] Precision Medicine at Columbia University. Stem Cells. Columbia University; 2019. [297] Hockemeyer D, Jaenisch R. Induced pluripotent stem cells meet genome editing. Cell Stem Cell 2016;18:573 86. [298] Del Sol A, Thiesen HJ, Imitola J, et al. Big-data-driven stem cell science, and tissue engineering: vision and unique opportunities. Cell Stem Cell 2017;. Available from: https://doi.org/10.1016/j.stem. [299] Tenneille EL, Kujak A, Rauti A, et al. 20 Years of human pluripotent stem cell research: it all started with five lines. Cell Stem Cell 2018;23(5):644. Available from: https://doi.org/10.1016/j. stem.2018.10.009. [300] Biologics. Approved cellular and gene therapy products. U.S. Food and Drug Administration; March 29, 2019. [301] NIH Stem Cell Information Home Page. In stem cell information [World Wide Web site]. Bethesda, MD: National Institutes of Health, U.S. Department of Health and Human Services; September 24, 2019. [302] Bacterial infections after the use of stem cell products. Center for Disease Control and Prevention; May 9, 2019. [303] Frow EK, Brafman DA, Muldoon A, et al. Characterizing direct-to-consumer stem cell businesses in the Southwest United States. Stem Cell Rep 2019;. Available from: https://doi.org/10.1016/j. stemcr.2019.07.001. [304] The Human Genome Project. National Human Genome Research Institute; Update on January 9, 2019. [305] Mullainathan S, Spiess J. Machine learning: an applied econometric approach. J Economic Perspect 2017;31(2):87 106. Available from: https://doi.org/10.1257/jep.31.2.87. [306] Westry T. New bioinformatics program is the first of its kind in the state. UAB News; July 18, 2018. [307] Hulsen T, Jamuar SS, Moody AR, et al. From big data to precision medicine. Front Med 2019. Available from: https://doi.org/10.3389/fmed.2019.00034. [308] Camacho DM, Collins KM, Powers RK, et al. Next-generation machine learning for biological networks. Cell 2018;173(7):1581 92. Available from: https://doi.org/10.1016/j.cell.2018.05.015. [309] Bresnick J. Understanding the many V’s of healthcare big data analytics. HealthITAnalytics; June 5, 2017. [310] Wamba SF, Akter S, Edwards A, et al. How “big data” can make a significant impact: findings from a systematic review and a longitudinal case study. Int J Prod Econ 2015;165:234 46. [311] Navarro FCP, Mohsen H, Yan C, et al. Genomics and data science: an application within an umbrella. Genome Biol 2019;20. [312] Stephens ZD, Lee SY, Faghri F, et al. Big Data: astronomical or genomic? PLoS Biol 2015;13:e1002195. [313] Quick J, Loman NJ, Duraffour S, et al. Real-time, portable genome sequencing for Ebola surveillance. Nature 2016;530:228 32. [314] DePalma A. Sanger sequencing: still the gold standard? LabManager; November 5, 2018. [315] Chow E. Next-generation sequencing. iBiology; November 2018. [316] Kohane IS. Using electronic health records to drive discovery in disease genomics. Nat Rev Genet 2011;12(6):417 28. Available from: https://doi.org/10.1038/nrg2999. [317] Pendergrass SA, Crawford DC. Using electronic health records to generate phenotypes for research. December 5, 2018. Available from: https://doi.org/10.1002/cphg.80.

290

Foundations of Artificial Intelligence in Healthcare and Bioscience

[318] Movaghar A, Page D, Brilliant M, et al. Data-driven phenotype discovery of FMR1 premutation carriers in a population-based sample. Sci Adv. 21 2019;5(8):eaaw7195. [319] Taylor B. Machine learning in genomics research and healthcare. Medium; January 28, 2019. [320] Clark M, Hildreth A, Batalov S, et al., Diagnosis of genetic diseases in seriously ill children by rapid whole-genome sequencing and automated phenotyping and interpretation. Science Translational Medicine; 2019. [321] Tirrell M. Unlocking my genome: was it worth it? Personalized Medicine. CNBC; December 10, 2015. [322] Schumacher A. Blockchain-powered DNA data marketplace will revolutionize precision medicine. Dataconomy; January 16, 2019. [323] Haghi M, Thurow K, Stoll R. Wearable devices in medical internet of things, scientific research, and commercially available devices. Healthc Inf Res 2017;23:4 15. [324] Mantua J, Gravel N, Spencer RM. Reliability of sleep measures from four personal health monitoring devices compared to research-based actigraphy and polysomnography. Sensors (Basel), 16. 2016. p. 646. [325] Kiana T, Michael A. Wearable technology and wearable devices, everything you need to know. Bloomberg; March 26, 2014. [326] Alderson L. Telehealth bringing Personalized Medicine closer. HealthManagement.org 2019;19:3. [327] , https://www.cleargenetics.com/.. Clear Genetics; 2019. [328] Schmidlen T. Collaborative development of chatbots as an innovative tool in the delivery of scalable genomic counseling. Geisinger Clinic. PAGC Annual Spring Meeting Philadelphia, PA; May 4, 2018. [329] MyCode Community Health Initiative. Geisinger Health; 2019. [330] Dimic K. Chatbots in precision medicine. Medium; June 23, 2019. [331] Guan M, Cho S, Petro R, et al. Natural language processing and recurrent network models for identifying genomic mutation-associated cancer treatment change from patient progress notes. JAMIA Open 2019;2(1):139 49. [332] Cowie MR. Electronic health records to facilitate clinical research. ClRes Cardiology 2017;106:1 9. [333] Sen A, Banerjee A, et al. Clinical decision support: converging toward an integrated architecture. J Biomed Inform 2012;45(5):1009 17. [334] Khoury MJ. Knowledge integration at the center of genomic medicine. Genet Med 2012;14(7):643 7. [335] Tsoukas LI, Liaros A. The use of the expert system of composite risk factors in breast cancer screening. Stud Health Technol Inform 1997;43:859 63. [336] Sena A, Kawamb A, Dattab A. Emergence of DSS efforts in genomics: past contributions and challenges. Decis Support Syst 2019;116:77 90. [337] Poutintsev F. Nanorobots might hold the key to a radically extended life span. MediumPredict; August 4, 2018. [338] Nanotechnology and radically extended life span. ,http://www.lifeextension.com/magazine/2009/1/ Nanotechnology-Radically-Extended-Life-Span/Page-01.; 2017 [accessed 31.10.17]. [339] Cleeran E, Van der Heyden J, Brand A, Van Oyen H. Public health in the genomic era: will Public Health Genomics contribute to significant changes in the prevention of common diseases? Arch Public Health 2011;69:8. Available from: https://doi.org/10.1186/0778-7367-69-8. [340] Khoury M, Bowen M, Burke W, et al. Current priorities for public health practice in addressing the role of human genomics in improving population health. Am J Prev Med 2011;40:486 93. Available from: https://doi.org/10.1016/j.amepre.2010.12.009. [341] Molster CM, Bowman FL, Bilkey GA. The evolution of public health genomics: exploring its past, present, and future. 
Front Public Health 2018;6:247. Available from: https://doi.org/10.3389/ fpubh.2018.00247.

Chapter 6 • Current AI applications in medical therapies and services

291

[342] Big Data to Knowledge. National Institutes of Health. U.S. Department of Health and Human Services; August 2, 2019. [343] Mouratidis Y. AI unlocks the mysteries of clinical data. Forbes; April 2019. [344] Herper M. 23andMe rides again: FDA clears genetic tests to predict disease risk. Forbes; April 6, 2017. [345] Global precision medicine market to reach $141.70 billion by 2026, reports BIS Research. BIS Research, Cision PR Newswire; December 15, 2017. ,prnewswire.com.. [346] Bioinformatics. Techopedia; 2019. [347] Sleator RD. An overview of the current status of eukaryote gene prediction strategies. Gene. 2010;461 (1 2):1 4. Available from: https://doi.org/10.1016/j.gene.2010.04.008. [348] Martins PVL. Gene prediction using Deep Learning. U.Porto; July 22, 2018. [349] Steiling K, Christenson S. Tools for genetics and genomics: gene expression profiling. UpToDate. Wolters Kluwer; February 21, 2019. [350] Blair E, Tibshirani R. Machine learning methods applied to DNA microarray data can improve the diagnosis of cancer. SIGKDD Explorations 2019;5(2):48. [351] Press Release. New preventive genomics clinic launches at the Brigham Brigham Health; August 19, 2019. [352] Public Health Genomics Knowledge Base. Public health genomics and precision health knowledge base (v6.0). CDC.gov; October 3, 2019. [353] Boccia S, McKee M, Adany R, et al. Beyond public health genomics: proposals from an international working group. Eur J Public Health 2014;24:877 9. [354] Burke W, Burton H, Hall A, et al. Extending the reach of public health genomics: what should be the plan for public health in an era of genome-based and “personalized” medicine? Genet Med 2010;12:785 91. [355] Bowen M, Kolor K, Dotson W, et al. Public health action in genomics is now needed beyond newborn screening. Public Health Genomics 2012;15:327 34. [356] Khoury M, Bowen M, Burke W, et al. Current priorities for public health practice in addressing the role of human genomics in improving population health. Am J Prev Med 2011;40:486 93. [357] Khoury MJ, Bowen MS, Clyne M, et al. From public health genomics to precision public health: a 20year journey. Genet Med 2017;20:574 82. [358] Weeramanthri TS, Dawkins HJS, Baynam G, et al. Editorial: precision public health. Front Public Health 2018;6:121. [359] Ethically Aligned Design. IEEE Standards. N.p., 2017. Web; January 27, 2017. [360] Kish LJ, Topol EJ. Unpatients—why patients should own their medical data. Nature.com. N.p.; 2016. [361] Peril and Promise, Emerging Technologies and WMD, Emergence and Convergence Workshop Report, Center for the Study of Weapons of Mass Destruction, National Defense University; October 13 14, 2016. [362] Pauwels E, Vidyarthi A. Who will own the secrets in our genes? A U.S. Intelligence and Genomics. The Wilson Center; November 2017.

China Race in Artificial

7 AI applications in prevalent diseases and disorders

Notwithstanding the relevance and importance of the information in Chapters 1–6 of this book, I would not be at all surprised if you consider the information in this Chapter 7 as the most pertinent to your needs and interests. The reason is that “now things get personal.” It’s virtually certain that “something” in this Chapter, probably multiple things, will relate directly to your career activities, your personal health, and the health and wellness of your loved one(s). Indeed, even if you escape every possible relationship to any of the diseases or disorders (hopefully) we’ll be discussing, by your very existence, you will have an intimate relationship with some of the major categories such as immunology, genetics, nutrition, and aging, just to name a few. In the spirit of the “guidebook” approach that we have attempted to follow throughout this book, we will continue with that format of identifying major categories (in the case of this Chapter 7, “prevalent diseases and disorders”) and then addressing subdivisions within each category. The major categories that we will discuss in this Chapter include all of the diseases (pathologies and pathophysiologies) and disorders (abnormalities without a direct pathological basis) that affect the human organism. These categories will include:

1. Immunological and autoimmune disease;
2. Genetics and genomics disorders;
3. Cancers;
4. Vascular (cardiovascular and cerebrovascular);
5. Diabetes (Types 1 and 2);
6. Neurological and sensory disorders and diseases;
7. Musculoskeletal system and arthritis;
8. Integumentary system (skin, hair, nails) and exocrine glands (glands with ducts);
9. Endocrine (hormone) glands;
10. Digestive and excretory system;
11. Renal system and urinary system;
12. Respiratory (pulmonary) system;
13. Reproductive system;
14. Physical injuries, wounds and disabilities;
15. Infectious diseases (bacterial, viral, etc.);
16. Human development, aging, degeneration, and death;
17. Chronic disease;
18. Mental and behavioral disorders;
19. Nutrition and exercise (preventive care).


The order of clinical categories in the list above is not random. I have attempted to prioritize it, with the first 2 categories (immunology and genetics) representing universal considerations in current and future health care as well as the 2 areas in health care most influenced by AI. After that, categories 3 through 7, while also influenced heavily by AI, represent the most prevalent diseases for which we are at risk. By virtue of the importance of their AI applications and their morbidity and mortality, we will address them in some depth. Categories 8 through 13 include all of the remaining systems of the human body, which have significant implications in health and wellness as well as numerous AI applications. And finally, categories 14 through 19 represent relevant and universal components in health and wellness that are, again, influenced substantially by AI applications. Given that this book is a guidebook focused more on AI applications and influences in health care (and, de facto, disease care), exhaustive descriptions of each of the subdivisions for each category mentioned above should be considered beyond the scope of the book. I will, however, provide limited but adequate background information (anatomy, physiology, pathology, clinical descriptions) on each subject to allow for maximal understanding by readers at almost any level of medical knowledge and experience. The AI portions of the discussions should be understandable by relying on the information provided in Chapters 1–3 and, in fact, throughout the entire book. Finally, as I mentioned above, “now things get personal.” Whereas the information in the previous Chapters was strictly technical (hopefully, relatively understandable) and objective, in this Chapter 7 I will try to “soften” the discussion with a bit more personal, casual style. By its very nature, the discussion will continue to include a fair amount of medical and scientific terminology, but I will “soften” it with corresponding, non-technical descriptions where possible. Please don’t misinterpret any such explanations as condescending or patronizing. They are meant entirely in the spirit of providing a comfortable level of discussion for all.

7.1 Immunology and autoimmune disease

In deciding the order in which prevalent disease categories should be approached, it becomes clear very quickly that immunology and genetics are the principal categories among the collective aggregate of topics to be discussed. Both permeate every aspect of human health and disease; both are at the forefront of current and future health care; both are unquestionably the most “disruptive” forces in disease diagnosis, treatment, and current and future research efforts; and finally, both are indelibly tied to current and future AI technologies. I have arbitrarily chosen immunology to start with, I guess, because of its already established, ubiquitous relationship to virtually all (and yes, I mean all) of the conditions of human disease, health, and wellness. Not to say that genetics and genomics are not also intimately associated with all as well, but immunology had a bit of a chronological head start as a human health science. Genetics and genomics can point to their rise in health care with the completion of the Human Genome Project in 2003 [1]. Modern immunology, on the other hand, became deeply involved in human disease and health in 1975 (earlier in some aspects, but let's not quibble) with an expanded understanding of T-lymphocytes and


monoclonal antibodies [2]. That knowledge base, as well as those of genetics and AI, has grown to support a level of health and wellness today that we could not have approached even a short time ago, especially when considering the COVID-19 pandemic.

7.1.1 Pathogenesis and etiologies of immunology and autoimmune disease

I always like to start a discussion on immunology with a maxim and a paradox that together capture the essence of immunology. The maxim is: “Immunology is a battle of ‘self’ versus ‘non-self.’” And the paradox is: “Immunology is our best friend and our worst enemy.” Some simple explanations will make these otherwise ambiguous statements understandable. In this wonderful world of ours, there is “you”...and everything else. Now, think of “you” as “self” and everything else as “non-self.” Whether that “non-self” is a substance, chemical, infectious agent (pathogen, e.g., virus), toxin (airborne, ingested, contact), even a non-substance like physiological, mental, emotional or physical (injury) stress, virtually anything external to “you,” your body (“self”) interprets these “non-self” entities as “foreign” or, technically, an “antigen.” Let’s assume you’re in good health, well-nourished, in good physical condition with a sound mind in a sound body. Notwithstanding a world of antigens that you have to deal with 24/7, the power of a healthy (innate) immune system is awesome. Kind of like a “yin and yang” metaphor where “self” is good and “non-self” (antigen) is evil. Your body does not like non-self antigens. So, it uses its natural (“innate”) defenses, like anatomical and chemical features (skin, tears, mucous membranes, anti-infectious barriers, enzymes, cytokines) and a series of cellular elements (antibodies and white blood cells [WBCs]). It also uses humoral components (blood-related serum and fluids that carry WBCs, B and T lymphocytes) and chemical components like cytokines. Together, in an exquisitely complex process, your body recognizes an antigen and forms an “antigen-presenting complex” (the antigen bound to a specialized WBC [macrophage] and a TH [T-helper] lymphocyte); creates and binds it to antibodies (generated by B cells); processes it through a series of cellular (T and B cell generated antibodies) and humoral (cytokine) functions; and eliminates it. This protection from the constant 24/7 barrage of antigens is called our “adaptive immune response” [3]. Indeed, it is “our best friend.” Fundamentally, that is what your innate (or natural) immune system and immunology are all about, i.e., a battle of self versus non-self antigens. Wouldn’t it be great if our body won all the battles? Need I say, life doesn’t work that way. Sometimes, the innate immune system can’t quite handle the load. Maybe, for some reason, it’s compromised (“immunocompromised”) or weakened or suppressed (“immunosuppression”). Perhaps the antigen is not being removed effectively (persistent cause), or it keeps reoccurring (continuing re-exposure) as the innate system tries to eliminate it. Or maybe the antigen is too abundant or too pernicious (virulent) for the innate immune response to overcome it. In such conditions, after a few days to a week of feeling “not so great,” the adaptive immune system begins to demonstrate its next, more aggressive “go-to” response.


Working with the innate response (using all of its tools plus additional cellular and humoral weapons), adaptive immunity processes the antigen in a more specific approach. It utilizes highly specialized TH (“helper”), Ts (“suppressor”), and Tc (“cytotoxic”) lymphocytes to produce a progressive attack by an array of neutralizing B lymphocytes, antibodies, and cytokines. It also sets up an identification process (T and B memory cells), which, along with the antibodies, provides lifetime protection (we think?) against the specific antigen. All in all, this acquired immune system is a great defender and protector (a “friend”)...to a point. We can look at acquired immunity as a race to eliminate the “bad guys” (antigens), a competition which, in most cases (given an otherwise healthy person), it wins. If, however, the underlying health of the patient (note how I move from “person” to “patient” here) is not adequate to sustain the activity of the acquired immune response, things could deteriorate or “dysregulate.” As we described above, the immune response uses a lot of tools, i.e., cellular components and chemicals, in amounts not normal (“pathophysiological”) to your body. Their regular (“physiological”) activity is working diligently to resolve the “pathological” (disease) process in your body, but those efforts are also producing abnormal byproducts called “pro-inflammatory cytokines.” Accumulation of those byproducts is a basis for the pathophysiological process called “acute inflammation.” (Note: All inflammation is characterized in medical terminology by the suffix “-itis.” Any condition mentioned in this text, heretofore or hereafter, under any disease category with the suffix “itis” should be considered an inflammation, acute or chronic.) Another type of acute inflammation is caused by a specific form of an antigen called an “allergen,” which can activate a specialized kind of immune protein (immunoglobulin E or IgE antibody) response, producing the infamous “Type 1 immediate, allergic or hypersensitivity response.” Examples of this response include hay fever, eczema, hives, asthma, food allergy, and reactions to dust, pollen, insect bites and stings, sometimes reaching severe levels called anaphylaxis and anaphylactic shock. Like its antigen cousin, the allergen can be inhaled, ingested, or enter through the skin. After a susceptible person is exposed to an allergen, the body starts producing a large quantity of IgE antibodies. This results in the recurrence of the allergic response (sometimes with increasing intensity) with each re-exposure to the allergen. Included among the mediators of these Type 1 reactions is histamine, which, along with the other inflammatory mediators, produces itching. Ultimately, an active immune process, in conjunction with the identification of the cause (diagnosis) and removal of that cause, is the “cure.” Done promptly, health and wellness will result. But, if not resolved within a reasonable period of time (weeks to months at most), the immune system advances to a condition called “chronic inflammation,” different from acute inflammation in its clinical symptoms (often none at all, versus those of acute inflammation) and its cellular pathology. It should also be noted that, though poorly understood, chronic inflammation can develop spontaneously, that is, without an antigenic cause or a preceding episode of acute inflammation.
According to most medical experts, chronic inflammatory disease is the progenitor or originating cause of all (emphasis on all) the major human disease categories [4] (Fig. 7–1). The clinical basis for this belief lies in the destructive nature of the persistent inflammatory


mediators and cellular components damaging tissue throughout the body, especially the blood vessel walls (perivasculitis) supporting virtually every organ system. Over time, leakage of WBCs from the weakened blood vessel walls (infiltration or diapedesis) occurs, disrupting normal tissue function (protein synthesis) and even breaking down the tissue’s DNA, producing loss of bodily integrity and function and manifesting as recognizable chronic diseases (discussed later in this Chapter). Thus, per our stated paradox, immunology (specifically, chronic inflammation) could be “our worst enemy.” Sadly, COVID-19 is a painful example of this phenomenon. Antigens, by definition, are “foreign,” but as we learn more about the immune system, it has become apparent that “foreign” may not be entirely synonymous with “non-self.” When, for some unknown reason, the body incorrectly identifies itself (self) as foreign (called “autoantigenicity”), it initiates an acquired immune response directed at “itself.” Stated another way, the immune system has the potential to produce an “autoimmune response” [5]. Autoimmune disease develops after further immune system dysregulation, in both the innate and acquired immune systems [6]. There are several theories (Table 7–1) as to the cause of this idiosyncratic response. But whatever the cause, the response has created an entirely separate and extensive disease

FIGURE 7–1 Chronic inflammation. Through its inflammatory mediators and cellular components that damage tissue throughout the body, especially the blood vessel (perivasculitis) walls (with infiltration and diapedesis) supporting virtually every organ system, chronic inflammatory disease is considered the progenitor or originating cause of all (emphasis on all) the major human disease categories.


Table 7–1 Theories on etiologies and pathogenesis of chronic inflammation and autoimmune disease.

1. A prolonged inflammatory process from failure to eliminate an antigen;
2. Part of the patient’s genome;
3. Environmental factors to which the patient is exposed over time;
4. Increasing release of pro-inflammatory cytokines;
5. An abnormal immune response to “self”:
   a. Disruption of homeostasis (Yin–Yang);
   b. Innate autoantigens from the inflammatory process;
   c. “Rogue” antigen-presenting cells (APCs).

Table 7–2 Listing of prevalent autoimmune diseases.

• Ankylosing spondylitis
• Lupus
• Rheumatoid arthritis
• Juvenile arthritis
• Scleroderma
• Dermatomyositis
• Behcet’s disease
• Celiac disease
• Crohn’s disease
• Ulcerative colitis
• Sjogren’s syndrome
• Reactive arthritis
• Mixed connective tissue disease
• Raynaud’s phenomenon
• Giant cell arteritis
• Temporal arteritis
• Polymyalgia rheumatica
• Polyarteritis nodosa
• Polymyositis
• Takayasu arteritis
• Granulomatosis with polyangiitis
• Vasculitis
• Alopecia areata
• Antiphospholipid antibody syndrome
• Autoimmune hepatitis
• Type 1 diabetes
• Graves’ disease
• Guillain-Barre syndrome
• Hashimoto’s disease
• Hemolytic anemia
• Idiopathic thrombocytopenic purpura
• Inflammatory bowel disease
• Multiple sclerosis
• Myasthenia gravis
• Primary biliary cirrhosis
• Psoriasis
• Vitiligo

Source: Autoimmune disease. Johns Hopkins Health; 2019.

category referred to as “autoimmune disease.” Table 7–2 reveals an impressive (and ominous) list of prevalent autoimmune diseases.

7.1.2 Clinical presentations in immunology and autoimmune disease

The signs and symptoms of immune diseases vary widely based on their underlying etiologies (acute, allergy, chronic inflammation) as well as the variety of disorders associated with the autoimmune disease categories. In the case of acute inflammation, indications of the cause are frequently apparent in the patient’s history. Determination and removal of the antigenic (“non-self”) source of the inflammation is effectively 90% of the cure. Such removal can range from simple hygiene to an antibiotic for an infectious antigen (external or internal) to antihistamines and decongestants (for allergy) to injury repair and stress reduction therapies. Somewhat ironically, it is in large part the healing process associated with the immune response (i.e., inflammation) that produces the acute or, in the case of chronic inflammation, extended symptoms.


The clinical presentation of acute inflammation is usually of rapid onset (hours to days for acute inflammation and minutes to hours for allergic IgE responses). The classic inflammatory response, as mentioned above, was described originally in 4 simple words in the first century AD by the Roman scholar Celsus [7]:

• Rubor (redness or hyperemia from vasodilation);
• Calor (warmth or fever, cytokine-induced);
• Dolor (pain from nociceptor nerve fibers); and
• Tumor (swelling or edema/induration from fluid and cellular infiltration).

(Sometimes also mentioned: functio laesa, or loss of function, in chronic inflammation.)

With the allergic and hypersensitivity response, symptoms can also include itching, sneezing, and congestion (from histamine release). And, as we mentioned above, in their most severe form, allergy or hypersensitivity can produce a life-threatening condition called anaphylaxis and anaphylactic shock [8]. In the case of chronic inflammation, signs and symptoms can vary considerably. As opposed to acute inflammation, chronic inflammation can be difficult to identify without a history of precipitating acute inflammation. A chronically ill patient (chronic illnesses are discussed separately, later in this Chapter) can be assumed to have chronic inflammation. As mentioned previously, “...chronic inflammatory disease is the progenitor or originating cause of all (emphasis on all) the major human disease categories [4,9]” (Fig. 7–1). Non-specific symptoms associated with chronic inflammation include: [10]

• Body pain;
• Fever (often diagnostically referred to as “fever of unknown origin” or FUO);
• Constant fatigue and insomnia;
• Depression, anxiety, and mood disorders;
• Gastrointestinal complications like constipation, diarrhea, and acid reflux;
• Weight gain;
• Frequent infections.

In the case of immunosuppression or immunocompromised disease conditions, the symptoms are usually indirect, as in increased illnesses, risk of infection, blood disorders, digestive problems, and delayed growth and development [11]. Regarding the array of autoimmune diseases (Table 7–2), more than 80 have been identified, affecting more than 50 million Americans (according to the American Autoimmune Related Diseases Association, AARDA), of whom 75% are women [12]. Table 7–2 (above) listed the more prevalent autoimmune diseases, and Table 7–3 identifies the top 10, which are undoubtedly quite familiar to most. Relative to the specific conditions (to be discussed separately throughout this Chapter), symptoms range from no symptoms at all to general malaise to severe illness and


Table 7–3 Ten (10) most common autoimmune diseases.

• Rheumatoid arthritis
• Systemic lupus erythematosus (SLE)
• Inflammatory bowel disease (IBD)
• Crohn’s disease
• Multiple sclerosis (MS)
• Type 1 diabetes mellitus
• Guillain-Barre syndrome
• Psoriasis
• Graves’ disease
• Myasthenia gravis

Source: Autoimmune diseases: types, symptoms, causes, and more. HealthLine; 2019.

risk of death (as in the case of many COVID-19 patients). The non-specific symptoms most associated with autoimmune disease include: [13]

• Fatigue;
• Achy muscles;
• Swelling and redness;
• Low-grade fever;
• Trouble concentrating;
• Numbness and tingling in the hands and feet;
• Hair loss;
• Skin rashes.

7.1.3 Current treatment approaches and AI applications in immunology and autoimmune disease

Inasmuch as acute inflammation can occur anywhere, internally or externally, the variety of treatment considerations is extensive, but the first treatment is always the removal of the cause (the antigen). Short of that, treatment is palliative and directed to the involved site (joint, muscle, internal organ, skin, etc.) to reduce the inflammatory process (assuming the antigen has been removed) and mitigate the pain. Towards these ends, cold (wet or ice) compresses are enormously valuable to produce vasoconstriction (part of the inflammatory process being vasodilation of the associated blood vessels [i.e., rubor] to supply inflammatory cells to the site). Other palliative measures include analgesics, anti-inflammatories, nutritional and vitamin supplements (more on nutrition later in this Chapter), omega-3 sources, compression, stress reduction, and exercise. The most popular forms of acute anti-inflammatory therapy include analgesic pain relievers such as the nonsteroidal anti-inflammatory drugs (NSAIDs: aspirin, ibuprofen, or naproxen), corticosteroids (such as prednisone, or prednisolone for topical application), and acetaminophen (Tylenol). It should be noted, however, that steroidal and non-steroidal medications only suppress symptoms (a “masking effect”) but are not curative.


Chronic inflammation and autoimmune diseases often require treatment directed at the tissue(s) and organ system(s) being adversely affected. Some of those organ-specific treatments are used in conditions related directly and indirectly to chronic inflammation and autoimmunities, cancers, type 1 diabetes, and many other autoimmune diseases. More generalized treatment falls under the category of immunosuppressive and immunomodulating (suppressing or stimulating) therapies, sometimes referred to as “non-specific therapies.” These therapies include types of drugs that are used to suppress immune system autoantigenicity in autoimmune diseases or to boost the immune system response in cancers and anti-tumor therapies. Two main categories of these immune-modulating drug classes are the “disease-modifying anti-rheumatic drugs (DMARDs)” [14], including methotrexate, sulfasalazine, and leflunomide, and the “biologic drugs” [15], such as etanercept, adalimumab, abatacept, and many, many others. Some of the more popular non-specific immunotherapeutic agents include cytokines like interferons; interleukins; anti-TNF (tumor necrosis factor, a pro-inflammatory cytokine) agents; monoclonal antibodies (drugs with the name suffix “-mab”); immunoglobulin antibodies; gene-based delivery systems; checkpoint inhibitors; and other immune system modulators (e.g., hydroxychloroquine, which received considerable attention relative to its “potential” inhibitory effects on coronavirus, somewhat disproven as of May 2020). While the biologics include a large number of drug options, all are attempting to regulate (increase or decrease) the immune response. The reason for the large variety of drugs is the extensive number of pro-inflammatory mediators in the chronic inflammatory and autoimmune process [16], with each of the biologics having a distinct biochemical effect on different mediators. This gives treating physicians the option of “experimenting” with a variety of biologics to get a maximal drug effect. It also confuses the hell out of the public when they watch a TV commercial promoting a biologic drug for a specific autoimmune condition (e.g., rheumatoid arthritis) on 1 station. Then they change channels and see the same drug being promoted for an entirely different condition (e.g., psoriasis). The drugs are specific for individual mediators that occur in all of the autoimmune diseases, and thus, the drugs are non-specific for any 1 disease. As described above (and in Fig. 7–1), autoimmune disease may be organ-specific in its cellular damage (e.g., Crohn’s disease, Graves’ disease, etc.), or its harm may be disseminated among multiple organ systems throughout the body (e.g., systemic lupus erythematosus, giant-cell arteritis, rheumatoid arthritis). Thus, treatments for autoimmune disease beyond the drug classes mentioned above must be targeted, organ-specific therapies, or treatments disseminated throughout the body via cellular and genetic pathways. This is also the case in cancer therapies (to be discussed next under “Cancers”). Thus, multiple treatment options and approaches are common to both autoimmune diseases and cancers (effectively an autoimmune disease itself). Among the treatment options common to autoimmune diseases and cancers, besides the drug categories mentioned above, stem cell transplantation and immunogenomic (CRISPR-Cas9 and CAR-T cell) therapies have been well received. They are rapidly approaching the standard of care (“precision medicine”) for targeted, organ-specific treatments as well as disseminated and genomic therapies. The bioscience of both stem cells and genomics has been discussed in some depth in Chapters 5 and 6. A quick review by the


reader of the “Basic Bioscience” discussions for each of these topics might be valuable as we now begin to apply those technologies and AI’s influences in their clinical applications, treatments, and current and future research. Effectively, cellular therapies (stem cell transplantation and CAR-T cell replacement therapy) and genetic therapies (gene replacement and CRISPR-Cas9 gene editing) have similar applications, albeit with different therapeutic goals, in autoimmunologic diseases, genetic disorders, cancers, and numerous other congenital, acquired, and chronic conditions to be discussed in this Chapter. In the spirit of efficiency (and hopefully best understanding), we will describe each of the 3 principal therapies (stem cell transplantation, CRISPR-Cas9, and CAR-T) step-by-step in this immunologic section. Then we will revisit them downstream, in the “Research and Future Considerations” subdivisions of the subsequent disease categories in this Chapter wherein they, along with any other relevant therapies, are considered necessary relative to the specific disease. But it is important to note that these innovative and “disruptive” medical and cellular therapies for autoimmune diseases piggyback on the successes of genetic and cancer treatments, and vice versa [17].

7.1.3.1 Stem cell transplantation

Hematopoietic stem cell therapy (the hematopoietic stem cell was previously described in Chapter 6, Figure 6–3, as a “blood stem cell” that can develop into all types of blood cells found in the peripheral blood and the bone marrow [18]) is now being used effectively to grow new cellular and immunological based strategies for patients with malignancy and hematological disorders produced or provoked by immunologic or autoimmunologic causes (see below). The objective of stem cell transplantation therapy is to destroy the mature, long-lived, and auto-reactive immune cells and generate a new, properly functioning immune system. The patient’s stem cells are used in a procedure known as autologous (from “one’s self”) hematopoietic stem cell transplantation (Fig. 7–2). First, patients receive injections of a growth factor, which coaxes large numbers of hematopoietic stem cells to be released from the bone marrow into the bloodstream. These cells are harvested from the blood, purified away from mature immune cells, and stored. After sufficient quantities of these cells are obtained, the patient undergoes a regimen of cytotoxic (cell-killing) drug and/or radiation therapy, which eliminates the mature immune cells. Then, the hematopoietic stem cells are returned to the patient via a blood transfusion into the circulation, where they migrate to the bone marrow and begin to differentiate into mature immune cells. The body’s immune system is then restored [19]. Stem cells can be readily harvested from bone marrow and adipose tissue (and other bodily tissues) and converted into undifferentiated induced pluripotent stem cells (iPSCs, discussed below) suitable for transplantation into diseased and degenerated organs and body structures (e.g., diabetes, osteoarthritis, etc.). These cells then regenerate and begin to replace the abnormal cells with new, normal cells and even, potentially, with functioning organs (organ morphogenesis) [20]. Currently, muscle and bone tissue are particularly responsive to stem cell regeneration. All medical treatments have benefits and risks, but unproven stem cell therapies can be particularly unsafe. The FDA will continue to help with the development and licensing of new stem cell therapies where the scientific evidence supports the product’s safety and effectiveness [21].


FIGURE 7–2 Genetic modification & stem cell therapy. The patient’s own stem cells are used in a procedure known as autologous (from “one’s self”) hematopoietic stem cell transplantation. Source: National Institute of Medicine, NIH.

The use of stem cells in combination with other gene therapies expands the potential value of stem cell therapies to new heights. Biomedical research can generate cell lines that can then act as disease models. Induced pluripotent stem cells (iPSCs) can be modified using CRISPR-Cas9 technology (discussed next) for disease modeling, gene correction therapy, antiviral therapy, and antitumor therapies [22]. One strategy is to knock out disease-relevant genes in human PSCs using CRISPR-Cas9 to explore the pathogenic mechanism in the derived cells. These cells could then be used as disease models for drug therapies [23]. (Notice the increasing overlap between these 2 biosciences, immunology and genetics.)

7.1.3.2 CRISPR-Cas9 (gene editing)

One of the effective ways of treating autoimmune disease is to identify the “signature” of offending genes (their “gene expression,” or the number of RNA molecules they are producing), which is abnormal in autoimmune disease (and cancer). This identification is accomplished using a


FIGURE 7–3 CRISPR-Cas9. CRISPR guide RNAs target specific spots in the genome for the Cas9 enzyme (“genetic scissors”) to cut (scissor), forming a double-strand break. A machine learning algorithm predicts which types of repairs will be made at a site targeted by a specific guide RNA. Possibilities include an insertion of a single base pair, a small deletion, or a larger change known as a microhomology deletion [25].

technique called “single-cell RNA sequencing” (scRNA-seq) or, more specifically, TIDE (Tumor Immune Dysfunction and Exclusion) for autoimmune genes [24]. With this information, a procedure called CRISPR-Cas9 (clustered regularly interspaced short palindromic repeats-associated protein 9), an RNA-guided genome editing technology, is being used to re-engineer T cells. It is worth noting that the 2020 Nobel Prize in Chemistry was awarded to 2 molecular biologists, Emmanuelle Charpentier of the Max Planck Unit for the Science of Pathogens (formerly of the Max Planck Institute for Infection Biology) and Jennifer Doudna of the University of California, Berkeley, for the development of this revolutionary genome editing technique, often referred to as “genetic scissors” (Press Release: The Nobel Prize in Chemistry 2020. The Royal Swedish Academy of Sciences; October 7, 2020). The CRISPR-Cas9 system (Fig. 7–3) creates a small piece of RNA with a short “guide” sequence that attaches (binds) to a specific target sequence of DNA identified by AI in a genome. The RNA also binds to the Cas9 enzyme and is used to recognize the DNA sequence. The Cas9 enzyme, acting as a “scissor,” cuts the DNA at the targeted location. Once the DNA is cut, the cell uses its repair machinery to add or delete pieces of genetic material, or to make changes to the DNA by replacing an existing segment with a customized DNA sequence [26]. It was first thought that the stitching back together of the genetic material after the CRISPR-Cas9 procedure was random [27]. But subsequent studies using a trained machine learning algorithm called inDelphi to predict the repairs made to DNA snipped with Cas9 showed that guide RNAs could induce a single, predictable repair genotype in the human genome in more than 50% of editing products, proving that the edits aren’t random. The model was trained on a library of 41,630 guide RNAs and the sequences of the targeted loci before and after repair, a dataset that totaled more than 1 billion repairs in various cell types [28]. The algorithm was then


able to use the sequences that determine each repair to predict Cas9 editing outcomes, the researchers reported in Nature Biotechnology [25]. Further explanation of this CRISPR procedure, RNA sequencing, and the AI applications that make it possible can be found in Chapter 8, “Immunogenetic considerations regarding SARS-CoV-2 (COVID-19),” page 459.
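To make this kind of repair-outcome prediction concrete, here is a minimal sketch in Python of the general approach: one-hot encode the DNA sequence around a Cas9 cut site and train a classifier to output a probability distribution over repair classes (e.g., a 1-base-pair insertion, a small deletion, or a microhomology deletion). The toy data, labels, and model choice are illustrative assumptions for exposition; this is not the published inDelphi code.

```python
# Hypothetical sketch of repair-outcome prediction from sequence context
# (illustrative only; not the published inDelphi model).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

BASES = "ACGT"
CLASSES = ["1bp_insertion", "small_deletion", "microhomology_deletion"]

def one_hot(seq: str) -> np.ndarray:
    """One-hot encode a DNA sequence (length L -> flat vector of length 4L)."""
    vec = np.zeros((len(seq), 4))
    for i, base in enumerate(seq):
        vec[i, BASES.index(base)] = 1.0
    return vec.ravel()

# Toy stand-in for a guide RNA library: random 20-nt windows around a cut
# site, each labeled with an (invented) dominant repair class.
rng = np.random.default_rng(0)
seqs = ["".join(rng.choice(list(BASES), size=20)) for _ in range(2000)]
labels = rng.integers(0, len(CLASSES), size=len(seqs))  # placeholder labels

X = np.array([one_hot(s) for s in seqs])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Predict a probability distribution over repair classes for a new target.
for cls, p in zip(CLASSES, model.predict_proba(X_test[:1])[0]):
    print(f"{cls}: {p:.2f}")
```

On real data, the labels would come from sequencing the repaired loci (the before-and-after library described above), and the linear model would typically be replaced by a deeper network; the point is only the shape of the problem: sequence context in, distribution over repair genotypes out.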

7.1.3.3 CAR-T cell (gene replacement)

Cancer immunotherapy is a rapidly growing field that has recently demonstrated clinical efficacy in the treatment of solid tumors and hematological malignancies [29]. Numerous clinical approaches have been developed to redirect and/or augment immune function against tumor cells. The application of adoptive cell transfer therapy (ACT therapy) for the treatment of malignant cancers has been expanded by the use of T lymphocytes engineered to express chimeric antigen receptors (CARs) [30]. Chimeric antigen receptor T cells (CAR-T cells) are T cells that have been genetically engineered to give them the new ability to target a specific protein. The receptors are “chimeric” because they combine both antigen-binding and T-cell activating functions into a single receptor. The premise of CAR-T immunotherapy is to modify T cells to recognize cancer cells in order to more effectively target and destroy them [31]. CAR-T-cell therapy (Fig. 7–4) begins by removing a patient’s lymphocytes and transducing them with a DNA plasmid vector (a DNA molecule distinct from the cell’s DNA, used as a tool to clone, transfer, and manipulate genes [32]) that encodes specific tumor antigens. These modified and targeted lymphocytes are then reintroduced to the patient’s body through a single infusion to attack tumor cells. Known as autologous CAR-T-cell therapy, this treatment has been in development for more than 25 years, resulting in 4 generations of improving therapy that has generated responses for up to 4 years in some studies [33]. There are currently 2 FDA-approved CAR-T products used in B cell malignancies [34]. Based upon the high rates of initial cancer remission and durable responses in many patients receiving CAR-T cell therapy, the ACT field has expanded, with CAR-T cell therapy now being applied against numerous other B cell-associated antigens, with encouraging clinical response data being reported [35]. Again, as previously described regarding the combination of stem cells with CRISPR-Cas9, so too can CAR-T cell therapies be expanded in combination. CRISPR-Cas9 has been used to increase the antitumor efficiency of CAR-T cells by disrupting a programmed death protein [36]. Worth noting here are the costs of CAR-T therapy as well as of the other immunotherapies discussed. Notwithstanding the significant benefits these therapies provide, the costs are exorbitant. FDA-approved CAR-T cell therapy and the CRISPR-Cas9 procedure range from $373,000 to $875,000 for a single treatment [37]. Depending on the type of stem cell procedure, prices can range from $5,000 to $25,000 per procedure [38]. Gene therapies are subject not only to the regulatory structure of the FDA but also to the Office of Biotechnology Activities and the Recombinant DNA Advisory Committee. Excessive regulatory oversight creates an elongated and expensive route to approval. Gene therapies provide those with rare, serious, and possibly terminal conditions with the ability to improve


FIGURE 7–4 Immunogenics (immunotherapy) Chimeric Antigen Receptor T cells (CAR-T). CAR-T-cell therapy begins by removing a patient’s lymphocytes and transducing them with a DNA plasmid vector (a DNA molecule distinct from the cell’s DNA used as a tool to clone, transfer, and manipulate genes) that encodes specific tumor antigens. These modified and targeted lymphocytes are then reintroduced to the patient’s body through a single infusion to attack tumor cells. Source: National Institute of Medicine, NIH.

their quality of life significantly. By one estimate, an approved gene therapy drug costs nearly $5 billion (five times the average cost of FDA approval) to develop [39]. Some insurers are beginning partial coverage of FDA-approved gene therapies, but experimental treatments receive no third-party coverage other than limited humanitarian exemptions.

7.1.4 Research and future AI considerations in immunology and autoimmune disease

The diagnosis of autoimmune disease can be challenging for 2 reasons. First, many of the associated diseases share similar symptoms. Second, as described previously, the disease process may be organ-specific or disseminated among multiple body systems. Thus, their


diagnostic evaluation includes a thorough history, physical examination, laboratory testing, and imaging based on suspected organ-system involvement(s). The cause(s) of autoimmune diseases remain unknown, but research has strongly suggested a “pathogenesis” (“natural history”) of the disease’s progression over time and the damage it produces. The path of the disease includes 5 likely causes (see Table 7–1). They are: (1) a prolonged, untreated inflammatory process; (2) part of the patient’s genome; (3) environmental factors to which the patient was exposed; (4) increasing release of pro-inflammatory cytokines; and (5) inherent abnormalities in the patient’s immune system [40]. A person’s genes are what “predispose” them or provide the genetic susceptibility to dysregulate the immune system, which in turn yields chronic inflammation and, in effect, creates the pathological damage to cells, tissues, and organ systems synonymous with all diseases. Finally, the environmental factors (e.g., smoking, pollution), along with the inherited alleles that an individual possesses for a specific gene (the genotype), combine to produce the “phenotype trigger” and the clinical manifestations of the disease state. This recipe for a disease is responsible for immune and autoimmune diseases, for congenital and acquired genetic diseases, for cancers, and, in fact, for just about all the conditions to be discussed in the balance of this Chapter. Thus, throughout our discussions of specific disease entities, you will recognize common denominators in their diagnosis and treatments (as mentioned previously in the discussion on “non-specific” drug therapies in autoimmune disease). What will change for each disorder are the clinical manifestations (phenotypes) of the individual disease categories based on the cellular, tissue, and organ system(s) involved [41].

Unraveling the genetic and environmental underpinnings of autoimmune disease has become a major focus at the National Institute of Environmental Health Sciences of the National Institutes of Health [42,43]. The process of identifying the offending genetic sites (likely multiple mutations) in a person’s genome is overwhelming. As you will recall from Chapter 6, the potential for those mutations in the sequencing of the 4 base compounds within the 20,000 to 25,000 genes in the human genome exceeds 2.5 × 10²⁰ possibilities spread among the 37.2 trillion somatic cells. Thanks to AI, more specifically big data analytics and deep learning (convolutional neural networks, CNNs), genetic loci for immune disorders (immunodeficiencies, autoimmune diseases) are now being identified in a timely diagnostic manner (days to weeks versus months to years) [44]. This application of AI to better identify genetic mutations and their associated disease states, combined with the now FDA-approved cellular and gene therapies presented above, has created new horizons in the treatments, management, cures, and prevention of disease.

Finally, for the first time, scientists at the Human Vaccines Project are combining systems biology with artificial intelligence to understand one of the most significant remaining frontiers of human health, the human immune system [45]. Perhaps the most exciting application of AI in immunology is found in the Human Vaccines Project. Researchers are comprehensively sequencing the human immune system, a system billions of times larger than the human genome. The goal is to encode the genes responsible for circulating B cell receptors.
This can provide potential new antibody targets for vaccines and therapeutics that work across populations. The Project seeks to define the genetic underpinnings of people’s ability to respond and


adapt to an immense range of diseases [46]. The SARS-CoV-2 COVID-19 pandemic will certainly expedite further progress in this critical area of clinical research. The study specifically looks at 1 part of the adaptive immune system: the circulating B cell receptors that are responsible for the production of antibodies, considered the primary determinant of immunity in people. The receptors form unique sequences of nucleotides known as receptor “clonotypes,” whereby a small number of genes can lead to an incredible diversity of receptors (B idiotype cells and the antibody idiotype-specific regulatory circuit), allowing the immune system to recognize almost any new pathogen. This Project marks a crucial step toward understanding how the human immune system works, setting the stage for developing next-generation health products, drugs, and vaccines through the convergence of genomics and immune monitoring technologies with machine learning and artificial intelligence [47].
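As a toy illustration of the computational side of “sequencing the immune system,” the sketch below counts receptor clonotypes in a small batch of synthetic B cell receptor sequences and computes a simple diversity measure. The sequences and the choice of Shannon entropy as the diversity index are illustrative assumptions, not the Human Vaccines Project’s actual pipeline, which operates on millions of reads with dedicated repertoire-analysis tools.

```python
# Hypothetical sketch of clonotype counting and repertoire diversity
# (illustrative only). Uses only the Python standard library.
from collections import Counter
import math

# Synthetic stand-ins for B cell receptor (e.g., CDR3) sequences.
reads = [
    "CARDYW", "CARDYW", "CAKGGNSW", "CARDYW",
    "CTTVYW", "CAKGGNSW", "CARDPFDYW", "CTTVYW",
]

# A "clonotype" here is simply a unique receptor sequence; count how many
# times each one appears in the sample.
clonotypes = Counter(reads)
total = sum(clonotypes.values())

# Shannon entropy of the clonotype frequencies: higher values indicate a
# more diverse repertoire (more distinct receptors, more evenly used).
entropy = -sum((n / total) * math.log2(n / total) for n in clonotypes.values())

print(f"{len(clonotypes)} clonotypes across {total} reads")
for seq, n in clonotypes.most_common():
    print(f"  {seq}: {n} ({n / total:.0%})")
print(f"Shannon diversity: {entropy:.2f} bits")
```

Machine learning enters downstream of this kind of bookkeeping, for example by relating clonotype frequencies across many donors to vaccine response or disease status.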

7.2 Genetic and genomic disorders

As was stated previously, immunology, genetics, and genomics permeate every aspect of human health and disease. They are at the forefront of current and future health care; they are unquestionably the most “disruptive” forces in disease diagnosis (discussed in Chapters 5 and 6), treatment, and current and future research efforts; and finally, they are intimately tied to the present and future technologies of AI. The introduction to this section on therapies for genetic and genomic disorders is an ideal place to revisit the concept of “precision” or “personalized” medicine (or health). The business and administrative aspects of Precision Health were covered in depth in Chapter 4; its diagnostic considerations were addressed in Chapter 5; and its applications in medical therapies were discussed in Chapter 6. Now it’s time to make Precision Medicine (Personalized Health)...personal. What does it mean to you, your health, wellness, and your life? Precision health (or precision medicine) and personalized health (or personalized medicine) are somewhat interchangeable terms, but also somewhat confusing. “Personalized” could be misinterpreted to imply that treatments and preventions are being developed uniquely for each individual. Rather, precision medicine identifies which approaches will be effective for which patients based on genetic, environmental, and lifestyle factors. Thus, the National Research Council preferred the term “precision medicine” to “personalized medicine” [48]. The Precision Medicine Initiative is a long-term research project involving the National Institutes of Health (NIH) and multiple other research centers. It aims to understand how a person’s genetics, environment, and lifestyle can help determine the best approach to prevent or treat disease. The long-term goals of the Precision Medicine Initiative focus on bringing precision medicine to all areas of healthcare on a large scale. To this end, the NIH has launched a study known as the “All of Us” Research Program, which involves a group (cohort) of at least 1 million volunteers from around the United States. Participants are providing genetic data, biological samples, and other information about their health [49].


7.2.1 Description and etiology of genetic and genomic disorders

Genetic disorders are diseases caused in whole or in part by disturbances in the DNA gene sequences (genetic code) of base compound pairs (adenine paired with thymine and guanine paired with cytosine). These disturbances (“mutations”) can occur in 1 gene (monogenic disorder); in multiple genes (multifactorial inheritance disorder); by a combination of gene mutations and environmental factors; or by damage to chromosomes (changes in the number or structure of entire chromosomes, the structures that carry genes; see Chapter 5, Figure 5–3). The human genome holds the key to nearly all diseases.

Congenital disorders (also known as birth defects) may be hereditary or may occur during fetal development and are present at birth (or identified during infancy). They can be caused by inherited mutations from the parents and are present at birth (e.g., cystic fibrosis, sickle cell anemia, Marfan’s syndrome). Inherited mutations are also called germline mutations because they originate in the parent’s egg or sperm cells, which are called germ cells. When an egg and a sperm cell unite, the resulting fertilized egg cell receives DNA from both parents. If this DNA has a mutation, the child that grows from that fertilized egg will have the mutation in each of their cells [50]. Most inherited genetic diseases are recessive, which means that a person must inherit 2 copies of the mutated gene to inherit a disorder. This is 1 reason that marriage between close relatives is discouraged; 2 genetically similar adults are more likely to give a child 2 copies of a defective gene.

Other genetic disorders are caused by “acquired mutations” in a gene or group of genes and occur during a person’s lifetime. Such mutations are not inherited from a parent but instead occur either randomly or due to some environmental exposure (e.g., smoking, lead poisoning, etc.). Mutations range in size, affecting anywhere from a single DNA building block (base pair) to a large segment of a chromosome that includes multiple genes. Acquired mutation rates in individuals vary by their genome, by heredity, by type, by generation, and by the environment. The human genome germline mutation rate is low, at approximately 0.5 × 10⁻⁹ per base pair per year [51] (across a genome of roughly 3 billion base pairs, that works out to on the order of 1 to 2 new germline mutations per year, or a few dozen per generation). But given constant lifelong cell division and DNA replication, the rate of “acquired mutations” in the human genome (with about 37 trillion somatic [body] cells) is in the trillions. Fortunately, only an infinitesimal number of them (less than 60) override “apoptosis” (a normal cell’s ability to self-destruct when something goes wrong) to produce genetic disorders and disease. Nonetheless, the risk exists, as borne out by the rate of cancers, which are directly associated with accumulated cell mutations [52]. Cancer usually results from a series of mutations within a single cell. Often, a faulty, damaged, or missing