Industry 4.0 and Healthcare: Impact of Artificial Intelligence (Advanced Technologies and Societal Change) 9819919487, 9789819919482

This book presents different stages of the Industrial Revolution in artificial intelligence and its impact on Industry 4.0 a…


English Pages 257 [253] Year 2023


Table of contents :
Contents
1 An Artificial Intelligence-Based Model for the Detection of Heart Disease Using Machine Learning
Introduction
Machine Learning
Literature Review
Related Work
General Structure for Heart Disease Classification
Research Methodology
Evaluation of Performance and Results
Analysis of Performance Metrics
Conclusion
Future Scope
References
2 Recent Advancements in AI-Assisted Drug Design and Discovery Systems
Introduction
Construction of AI Model for Drug Delivery
Input Data Preparation
Data Format in Drug Domain
Data Quality and Its Measurement
Drug Discovery Using Computation and Machine Learning
Synthesis Analysis
AI Applications in the Production of Pharmaceuticals
AI Applications for Drug Design
Conclusions
Appendix
References
3 Designing Dense-Healthcare IOT Networks for Industry 4.0 Using AI-Based Energy Efficient Reinforcement Learning Protocol
Introduction
IoT-WSN Architecture
Literature Review
Proposed RL-Based Routing Protocol Design
Validation of EER-RL [5] Protocol
Impact of the Transmissions Probability
Effect of Transmission Distance of Node
Evaluation of Learning Rates
Impact of Higher Node Densities
Conclusions and Future Scopes
References
4 The Impact of Artificial Intelligence on Healthcare
Introduction
Significance of Artificial Intelligence in Healthcare
Related Work
Traditional System Versus AI-Based Healthcare System
Artificial Intelligence
Potential Applications of AI in Healthcare Systems
Conceptual Framework
Methodology
Data Analysis
AI-Based Approach for COVID-19 Treatment
Challenges in AI-Based System
AI, Ethics and Laws and Regulations in Healthcare
Release Burden
Decision Support
Acceptance of AI
The Issue of Data Quality
Conclusions
References
5 Brain Tumor Segmentation of MR Images Using SVM and Fuzzy Classifier in Machine Learning
Introduction
Purpose of Image Processing
Methods of Image Processing
Image Classification Techniques
Literature Survey
The Proposed Methodology
Brain Tumor Analysis Methodology
Classification
The Assessment of a Brain Tumor is Predominantly Based on a Fuzzy Methodology
The Design Changes from a Clear Progression to a Linguistic One
Support Vector Machine (SVM)
Result and Discussion
Classification Validation Results
Validation 1
Validation 2
Validation 3
Validation 4
Validation 5
Segmentation Outcomes Examination
Comparative Assessment of Distinct Segmentation Procedures
Convergence Graph Analysis
Conclusion
References
6 Artificial Intelligent Model for Riot and Violence Detection that Largely Affect Societal Health and Local Healthcare System
Introduction
Artificial Intelligence for Riot and Violence Detection
Basic Concepts of AI for Riot and Violence
Classification of Riot and Violence Detection Techniques
Impact of Artificial Intelligence on Healthcare System
Impact of Riot and Violence on Health and Healthcare System
Impact of Riot and Violence on Health
Impact of Riot and Violence on Local Healthcare System
Conclusion
References
7 How Artificial Intelligence is Transforming Medicine: The Future of Pharmaceutical Research
Introduction
General and Specific Pharmaceutical Screenings Using Artificial Intelligence
How Can We Create Trustworthy and Efficient AI-Assisted Healthcare Systems?
Create and Expand
Participant Involvement and Co-Creation
Human-Centred AI
Fresh Experiments
Assess and Confirm
Size and Ubiquity
Watch Over and Keep Up
Applications of Artificial Intelligence (AI) in Several Pharmaceutical Business Sectors
Pinpoint Diagnostics
Individualized Medicine
Artificial Intelligence Used in Conventional Computational Drug Design
New Therapeutic Methods
AI for Market Research and Forecasting
AI for Prediction of Product Cost
AI Role in Development of Nanomedicine
AI's Role in Emergency Notifications
Limitations and Hindrances
Charges for Maintenance and Service
The Use of Energy
Security and Confidentiality of Data
The Management of Data
Discovery of Novel Illnesses
Future Scope
Conclusion and Significant Recommendations
References
8 Impression of Big Data Analytics and Artificial Intelligence for Healthcare—A Study
Introduction
Big Data and Healthcare
Big Data for Healthcare
Artificial Intelligence
AI for Healthcare
Big Data in Healthcare
Role of Big Data Analytics in Healthcare
Big Data Architecture in Healthcare
Big Data Challenges and Opportunities in Healthcare
AI and Healthcare
AI Role in Renovating Healthcare
A Case Study: Public Health System and AI
Converging AI with Big Data Analytics for Healthcare System
AI Will Change the Future
Conclusion
References
9 Artificial Intelligence Based Querying of Healthcare Data Processing
Introduction to Healthcare Case Study
Process Flow of Case Study
Hadoop and Spark Architecture for Huge Data Storage and Effective Analytics
Hadoop and Spark Integration
Spark Architecture
Conclusion
References
10 A Disaster Management System Powered by AI and Built for Industry 4.0
Introduction
What's Disaster?
Vulnerability
Risk
Hazards
What is DM? [3]
Program for DM Risk in Brief
DM in India
The Institutional and Legal Environment
Cyclone
Flood
Disaster Mitigation and Prevention
Preparedness
History of Disaster and DM in India
DM Plan for Surat [4]
Metropolitan City, India Disaster Risk Management Profile
Management of Earthquake
Policy for DM in Gujarat (GSDMP) [5]
The Bhopal Gas Tragedy
DM Plan for GIR
India Tsunami [6]
Dispute and Problems
Challenges for Current DMS
Ideologies
Working
Solution
Advantages of Integrated Disaster Management
Essential Ingredients for Effective Disaster Management
Conclusion
References
11 A Vision for Industry 4.0 Utilising AI Techniques and Methods
Introduction
Innovative Manufacturing
Intelligent Manufacturing (IM) Models Driven by Data
Intelligent Manufacturing Systems (IMS)
Collaboration Between Humans and Machines
Primary Industrial 4.0 Challenges with Artificial Intelligence
Embedded Systems with ICT Methodologies
Key Elements of Industry 4.0 with AI: ABCDE
Conclusion
References
12 Artificial Intelligence and Optimization Strategies in Industrial IoT Applications
Introduction
Review of Related Works
Operational Optimization Strategy
Linking Return with Investment to AIoT Solutions in Industry
Discussion
Case Study
Case Study: Complex Process KPIV Exploring by AIoT
Case Study: Computer Vision
Conclusion
References

Advanced Technologies and Societal Change

Ashish Mishra Jerry Chun-Wei Lin   Editors

Industry 4.0 and Healthcare
Impact of Artificial Intelligence

Advanced Technologies and Societal Change

Series Editors
Amit Kumar, School of Electrical and Electronic Engineering, Bioaxis DNA Research Centre (P) Ltd., Hyderabad, Telangana, India
Ponnuthurai Nagaratnam Suganthan, School of EEE, Nanyang Technological University, Singapore, Singapore
Jan Haase, NORDAKADEMIE Hochschule der Wirtschaft, Elmshorn, Germany

Editorial Board
Sabrina Senatore, Department of Computer and Electrical Engineering and Applied Mathematics, University of Salerno, Fisciano, Italy
Xiao-Zhi Gao, School of Computing, University of Eastern Finland, Kuopio, Finland
Stefan Mozar, Glenwood, NSW, Australia
Pradeep Kumar Srivastava, Central Drug Research Institute, Lucknow, India

This series covers monographs, both authored and edited, conference proceedings and novel engineering literature related to technology enabled solutions in the area of Humanitarian and Philanthropic empowerment. The series includes sustainable humanitarian research outcomes, engineering innovations, material related to sustainable and lasting impact on health related challenges, technology enabled solutions to fight disasters, improve quality of life and underserved community solutions broadly. Impactful solutions fit to be scaled, research socially fit to be adopted and focused communities with rehabilitation related technological outcomes get a place in this series. The series also publishes proceedings from reputed engineering and technology conferences related to solar, water, electricity, green energy, social technological implications and agricultural solutions apart from humanitarian technology and human centric community based solutions. Major areas of submission/contribution into this series include, but not limited to: Humanitarian solutions enabled by green technologies, medical technology, photonics technology, artificial intelligence and machine learning approaches, IOT based solutions, smart manufacturing solutions, smart industrial electronics, smart hospitals, robotics enabled engineering solutions, spectroscopy based solutions and sensor technology, smart villages, smart agriculture, any other technology fulfilling Humanitarian cause and low cost solutions to improve quality of life.

Ashish Mishra · Jerry Chun-Wei Lin Editors

Industry 4.0 and Healthcare
Impact of Artificial Intelligence

Editors Ashish Mishra Department of Computer Science Gyan Ganga Institute of Technology and Sciences Jabalpur, India

Jerry Chun-Wei Lin Silesian University of Technology Gliwice, Poland

ISSN 2191-6853    ISSN 2191-6861 (electronic)
Advanced Technologies and Societal Change
ISBN 978-981-99-1948-2    ISBN 978-981-99-1949-9 (eBook)
https://doi.org/10.1007/978-981-99-1949-9

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023

This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Singapore Pte Ltd. The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721, Singapore

Contents

1  An Artificial Intelligence-Based Model for the Detection of Heart Disease Using Machine Learning
   Vishal Paranjape, Neelu Nihalani, and Nishchol Mishra ........................... 1

2  Recent Advancements in AI-Assisted Drug Design and Discovery Systems
   Kamal Nayan, Karan Kumar Paswan, Vinamra Bhushan Sharma, Yogendra Kumar, and Saurabh Tewari ........................... 19

3  Designing Dense-Healthcare IOT Networks for Industry 4.0 Using AI-Based Energy Efficient Reinforcement Learning Protocol
   Susheel Kumar Gupta, Sugan Patel, Praveen Kumar Mannepalli, and Sapna Gangrade ........................... 37

4  The Impact of Artificial Intelligence on Healthcare
   Shivshankar Rajput, Praveen Bhanodia, Kamal K. Sethi, and Narendra Pal Singh Rathore ........................... 59

5  Brain Tumor Segmentation of MR Images Using SVM and Fuzzy Classifier in Machine Learning
   Ashish Mishra, Meena Tiwari, Jyoti Mishra, and Bui Thanh Hung ........................... 81

6  Artificial Intelligent Model for Riot and Violence Detection that Largely Affect Societal Health and Local Healthcare System
   Mahaveer Jain, Praveen Bhanodia, and Kamal K. Sethi ........................... 113

7  How Artificial Intelligence is Transforming Medicine: The Future of Pharmaceutical Research
   Pankaj Sharma, Vinay Jain, and Mukul Tailang ........................... 133

8  Impression of Big Data Analytics and Artificial Intelligence for Healthcare—A Study
   Sonali Vyas, Dinesh Bhatia, and Sunil Gupta ........................... 151

9  Artificial Intelligence Based Querying of Healthcare Data Processing
   UmaPavan Kumar Kethavarapu, Praveen Kumar Mannepalli, Bhavanam Lakshma Reddy, Pusapati Siva Prasad, Ashish Mishra, and Sateesh Nagavarapu ........................... 173

10 A Disaster Management System Powered by AI and Built for Industry 4.0
   Raj Kumar Singh, Ishan Srivastava, and Vandana Dubey ........................... 185

11 A Vision for Industry 4.0 Utilising AI Techniques and Methods
   L. Bhagyalakshmi, Rajeev Srivastava, Himanshu Shekhar, and Sanjay Kumar Suman ........................... 207

12 Artificial Intelligence and Optimization Strategies in Industrial IoT Applications
   Yu-Chung Wang and Jerry Chun-Wei Lin ........................... 223

Chapter 1

An Artificial Intelligence-Based Model for the Detection of Heart Disease Using Machine Learning

Vishal Paranjape, Neelu Nihalani, and Nishchol Mishra

Introduction

Cardiovascular diseases (CVDs) are today the cause of a vast number of deaths. Strokes and cardiac attacks account for four out of five CVD deaths, and one-third of these are premature deaths of people below the age of 70 [1]. There are several risk factors for CVD: an unbalanced diet, smoking, stress, alcohol, fast food, and an inactive lifestyle. Our present work is based on the concept of artificial intelligence: machine learning is a subset of artificial intelligence, and deep learning in turn a subset of machine learning. The major promise of artificial intelligence lies in the fact that tasks which humans do well can be performed with full precision by machines. A 2016 survey by the World Health Organization found that over 17 million individuals died of cardiovascular disease, accounting for over 30% of deaths worldwide; the same survey found that more than 70% of these deaths occur in underprivileged and medium-income countries. The good news is that heart disease can be prevented by avoiding some critical factors, such as poor dietary habits and insufficient physical exercise. Machine learning is a sub-field of artificial intelligence that helps make accurate predictions which earlier only humans could make with their experience and expertise. The present work predicts heart disease using certain clinical parameters. To manage their general state of health and avoid sudden cardiac failure, people at high risk of cardiovascular disease need prompt detection and predictive mechanisms, and machine learning is one of the best-known approaches to such prediction. AI has shown promising outcomes in healthcare: in a 2012 study in the Journal of Clinical Analysis [2], ML played a vital role in automatically detecting intricate patterns in radiology applications and helped radiologists make smart decisions.
Moreover, in 2015 [3], researchers showed that machine learning is essential to improving our understanding of cancer progression. Clinical diagnosis is a task in which a doctor tries to classify a disease by evaluating the qualities of a variety of features. Traditional approaches to diagnosing heart disease, including ECG, blood pressure, and cholesterol tests, are costly and time-consuming. So, to reduce the death rate, it is necessary to design a computerized heart disease diagnosis system. The number of physicians is low compared to the number of patients, which is why a system that efficiently diagnoses heart disease well in advance is essential: it gives more precise results than traditional methods and reduces cost [4]. Researchers are continually striving to identify more reliable smart models for the successful treatment of this disorder, and a variety of smart models based on professional ML methods have been established over time. To make medical diagnosis easy and cheap, we chose to build a fusion model to classify the disease effectively. Decision-level fusion has been implemented using the summation of the scores of separate models [5]; the decision score of each separate model is taken from the trained models, and because the fusion occurs at the decision stage, it is referred to as decision-level fusion. The supervised algorithms, decision tree and logistic regression, are each fused with a deep neural network to develop a better fusion model that can predict heart disease by analyzing the data. This work contributes a fusion model that achieves better accuracy than the individual machine learning algorithms, and in our work there was a notable improvement in accuracy after fusion.

This paper is organized as follows. The concern for the issue of heart disease is discussed in the first segment. The second segment contains the previous work. Materials and methodology are presented in the third segment. Experimental outcomes of our suggested architecture for identifying heart disease, together with a comparative analysis with previous work, are revealed in part four. After that, we end with the conclusion and prospects.

The majority of people today live an unhealthy and fast-paced lifestyle that, according to studies, causes heart attacks. The heart is the organ that pumps blood, with the required amount of oxygen and other vital nutrients, to different parts of the body through vessels. Every organism's survival depends largely on the heart's proper functioning; if the heart's pumping activity is impaired, key organs such as the brain and kidneys suffer negative consequences, and when the heart stops working, the individual dies within minutes. Heart disease, myocarditis, cardiac attack, cardiac cancer, and other diseases are all linked to our unhealthy lifestyles. In coronary heart disease, the circulatory system cannot supply enough blood to the heart because of cholesterol and fat deposits within the arterial wall. When an artery is blocked by the coagulation of blood on the heart's wall, it is called a heart attack. Angina is a condition in which the heart's blood flow is disrupted, resulting in chest pain. Coronary artery disease, stroke, high blood pressure, and other conditions may also cause cardiac disease. People's lifestyles today have become very hectic, and most people living in metro cities rarely find time for exercise to keep themselves fit and healthy; as a result, predicting heart disease for people over 40 years of age would help greatly in changing their lifestyle and restoring their fitness, providing a healthier society (Fig. 1.1).

V. Paranjape (B) · N. Nihalani · N. Mishra
RGPV, Bhopal, India
e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023
A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_1
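The decision-level fusion described above (separately trained models each emit a per-class decision score, the scores are summed, and the class with the highest fused score wins) can be sketched in a few lines. The score vectors below are made-up numbers for illustration only, not results from this chapter, and the model labels in the comments are assumptions:

```python
# Hypothetical decision-level fusion sketch: each trained model outputs
# one score per class; the fused decision sums the scores per class and
# takes the arg-max class. All numbers below are invented.

def fuse_decisions(score_lists):
    """Sum per-class scores from several models; return (winning class, fused scores)."""
    n_classes = len(score_lists[0])
    fused = [sum(scores[c] for scores in score_lists) for c in range(n_classes)]
    return max(range(n_classes), key=fused.__getitem__), fused

# Scores for classes [no disease, disease] from three separate models
dt_scores  = [0.40, 0.60]   # decision tree
lr_scores  = [0.55, 0.45]   # logistic regression
dnn_scores = [0.30, 0.70]   # deep neural network

label, fused = fuse_decisions([dt_scores, lr_scores, dnn_scores])
print(label, fused)  # class 1 (disease) wins: fused = [1.25, 1.75]
```

With these invented scores the fused vector is [1.25, 1.75], so the fused decision is class 1 (disease) even though one individual model preferred class 0; this is exactly the smoothing effect summation-based fusion is meant to provide.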


Fig. 1.1 Death rate of highly affected countries (Source: WHO)

The use of data generated by machines or decision support systems may be one way to cut costs in health-care facilities. Medical practitioners use disease diagnosis techniques to make decisions from scientific data and their clinical expertise when diagnosing new and uncertain cases. This decision-making process can be automated to improve cost, facility, speed, precision, and reliability (Fig. 1.2). The paper further discusses the research work done by several researchers. Next, we propose our model for the diagnosis of heart disease using ML. The last section deals with the conclusion and future work.

Machine Learning

Machine learning is the science of teaching machines to think and work in domains where humans perform well. "A computer program is said to learn from experience E with respect to particular classes of tasks T and measurement of performance P, if its performance in tasks T, as measured by P, improves with experience E" [6]. ML first appeared in the 1950s but emerged as a new and innovative field in the 1990s [7]. A number of industries deploy ML algorithms. Experience teaches humans as and when they are exposed to new situations; depending on how learning is used, the data may or may not be labelled. The types of learning in ML are supervised, unsupervised, semi-supervised, and reinforcement learning [8]. The present work focuses on supervised machine learning, where a known outcome is available: we train our model on training data using different machine learning algorithms, apply the model to testing data to obtain predictions, and finally measure the accuracy of the model for each algorithm. Learning is called supervised when algorithms are provided with training data and reliable results [9]. The machine learning technique is applied to the training data, the resulting model makes predictions on the testing data, and the predicted value is then compared with the actual value.

Fig. 1.2 Major risk factors
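As a toy sketch of this supervised workflow (train on labelled records, predict on held-out records, then measure accuracy), the snippet below implements a 1-nearest-neighbour classifier in plain Python. The (age, resting blood pressure) records and their labels are invented for illustration; a real experiment would use a full feature set and library implementations of the algorithms:

```python
# Minimal supervised-learning loop: fit (here: memorize) training data,
# predict on test data, compare predictions with the known labels.
import math

def nearest_neighbour_predict(train, x):
    """Return the label of the training record whose features are closest to x."""
    return min(train, key=lambda rec: math.dist(rec[0], x))[1]

def accuracy(train, test):
    """Fraction of test records whose predicted label matches the true label."""
    hits = sum(nearest_neighbour_predict(train, feats) == label
               for feats, label in test)
    return hits / len(test)

# Toy records: ((age, resting blood pressure), label), label 1 = disease
train = [((55, 140), 1), ((60, 160), 1), ((35, 110), 0), ((40, 118), 0)]
test  = [((58, 150), 1), ((38, 115), 0)]

print(accuracy(train, test))  # 1.0 on this tiny separable set
```

Both held-out records fall nearest to a training record of the correct class, so the reported accuracy is 1.0; the point is the train/predict/score loop itself, not the numbers.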

Literature Review

Related Work

This chapter covers a variety of machine learning classifiers as well as past work on heart disease. For the purpose of prediction in our project, we can utilize different algorithms, also known as classifiers. The present study deals mainly with predicting whether or not a patient has heart disease by applying algorithms to our dataset. We use six machine learning algorithms, since this helps us make more accurate and dependable predictions [10]. A true and valid prediction can only be made when we have more than one algorithm for comparison: a single algorithm may show a high level of accuracy and yet not be the best or most suitable method for our situation. When we employ several algorithms or classifiers, we can compare them with one another, and if one classifier provides accuracy that is not even close to the accuracy supplied by the other methods, we can deduce that something is amiss: either the method is not up to the task, or we committed a coding error [11]. As a result, any prediction-based system should employ multiple algorithms. We have decided to employ the following algorithms in our project: 1. decision tree [12], 2. Naive Bayes [12], 3. support vector machine (SVM), 4. random forest, 5. XGBoost, and 6. KNN. Below, we go over each of these algorithms in detail. We also focus on past research and, on the basis of its limitations, motivate the improvements made in the present work.

We now review some of the previous research on heart disease prediction. Harshit Jindal et al. [13] highlighted that there is a tremendous increase in heart disease in today's world, so predicting it earlier would help greatly in saving lives. Their work focuses on detecting people who may be more prone to heart disease based on certain parameters; prediction becomes easier when the patient's medical history and other details are taken into consideration. Various ML techniques, such as logistic regression and KNN, are used to predict and classify patients with heart disease, and a useful technique was applied to govern the utilization of the model. The model implemented with KNN and logistic regression showed a higher level of accuracy than previously existing algorithms, and a great deal of stress has been reduced by using the model to calculate the likelihood that the classifier will correctly and accurately diagnose cardiac sickness.

Pronab Ghosh et al. [14] observed that cardiovascular disease (CVD) is one of the most common major conditions that damage people's health, and that early detection may help to avoid or ameliorate the disease, lowering mortality rates. Using machine learning algorithms to identify risk variables is a promising method; an efficient model is presented for obtaining accurate cardiac disease prediction, with the Relief and LASSO approaches used to find suitable features.

Manoj Diwakar et al. [15] noted that the diagnosis of disease is the most important aspect of healthcare. As prevention is better than cure, people who learn about the danger early can stay alert and take proper care of themselves, and an accurate and timely diagnosis can help greatly in the medical field. Since the diagnosis of heart disease is difficult and the disease is among the most disastrous prevailing today, early prediction would save many lives. The researchers use a fusion classification approach to detect heart disease and help health-care practitioners in its detection. A lot of work has been done in medical science, but efficient prediction of conditions such as cardiac arrest will help overcome problems in our society and help humans live a stress-free life.

Prasanta Kumar Sahoo et al. [16] suggested that stress is a major cause of illness in today's world and can lead to cardiac failure. A major cause of cardiac arrest today is COVID-19, which has affected humans severely: the coronavirus causes inflammation of the heart muscle, and there has been an abrupt increase in heart disease because of the mental stress it induces, particularly in urban areas. Heart disease has emerged as the leading cause of death in both urban and rural areas. One of the most common types of


cardiac disease is CAD (coronary artery disease). In the medical sector, forecasting cardiac disease has become a difficult and time-consuming task that necessitates patient health records and, in some situations, genetic information. SVM, Naive Bayes, logistic regression, decision tree, and KNN are among the algorithms used in that project; the best result was obtained with SVM, which reached an accuracy of 85.2%.

Rishabh Magar et al. [17] observed that most deaths in today's world are caused by heart disease. Doctors and experts cannot always predict it easily, but machines, on the basis of certain features, can predict the disease accurately. To enhance and improve medical efficiency, an automated system should be deployed in medical diagnosis; machine learning algorithms are used to efficiently find patients suffering from heart disorders. In their work, hidden patterns are revealed using various classification algorithms to efficiently predict the causes of heart disease.

Galla Siva Sai Bindhika et al. [18] suggested that heart disease is widespread today and that accurately identifying the patients suffering from it is a major challenge, so that proper care and medication can be given well in advance. On the basis of large amounts of data, hybrid machine learning (ML) has proved an effective decision-making technique in the health-care industry and hospitals, and recent breakthroughs in the Internet of Things (IoT) have also used machine learning techniques. Various studies reveal that machine learning accomplishes only a fragment of what can be done to predict heart illness; the authors propose a novel strategy for detecting significant traits using machine learning algorithms, which enhances cardiovascular disease prediction accuracy, along with a prediction model using a range of attributes and classification techniques.

Ramatenki Sateesh Kumar et al. [19] noted that the heart is a major organ of the human body; as per WHO reports, 31% of deaths are caused by irregular functioning of the heart, with unhealthy eating and working habits among the major causes of heart disease. For efficient prediction of heart disease, they use an ensemble of classifiers: K-nearest neighbor [17], support vector machine [18], MK-NN, and CART [19].

Sangya Ware et al. [20] discussed that heart disease is a very dangerous illness that is becoming more widespread these days. For the detection of cardiac disease, they design a simple, lightweight ML methodology; different ML algorithms are utilized, and the results are compared using various performance criteria on data from the UCI data repository, which offers several datasets such as the Switzerland, Hungarian, and Cleveland datasets. The present work uses the Cleveland dataset for training and testing, which has 303 patient records with 14 attributes. Efficient preprocessing has been done to remove noise and missing data.


Ibomoiye Domor Mienye et al. [21] revealed that an early diagnosis of it is really fruitful. In this study, for predicting the risk of heart disease a new machine learning strategy is proposed. Using a mean-based splitting methodology, the methodology randomly partitions the dataset into smaller sections. After that, classification and regression trees are used to model the various partitions (CART). A. Ann Romalt et al. [22] According to recent studies, heart disease is a major disease-causing deaths in the U.S. In a year, almost a billion individuals will die from heart failure. We can diagnose this ailment using machine learning prediction. For prediction, the machine learning system makes use of a few key features. Because all attributes in a dataset cannot be used, feature selection is critical in prediction. Only the features that are relevant are used. In this research, our focus is an exploration of feature selection strategies for detecting heart disease. Mangesh Limbitote et al. [23] told that the major task of the heart is the purification and circulation of blood. A major cause of death worldwide is due to cardiac arrest. Some symptoms are documented, such as chest pain, a quicker heartbeat, and difficulty breathing. This information is examined on a regular basis. Current treatment and overview of heart disease are given in this article. The most relevant ML algorithms for predictions of heart disease are presented in this article. We’re working on the most accurate algorithm possible. This will make it easier for doctors to deal with the cardiac problem. Akhand Pratap Singh et al. [24] told that heart disease is one of the most dreadful diseases for humans on the planet, and it has a tremendous impact on people’s lives. When a person gets cardiac arrest there is a struggle in the heart to pump blood in all parts of the body. An early diagnosis can prevent it. 
Machine learning approaches have been demonstrated to be useful in analyzing and making predictions from the significant amount of data supplied by the medical industry, and the diagnosis of cardiac disease using machine learning has been published in several research publications. A review of recent efforts connected to the use of machine learning in the prediction of heart disease is presented in this study. This review serves as a foundation for comprehending the domain's complexity, the researchers' tools and methodologies, and the efficiency currently obtained by diverse methodologies. Senthilkumar Mohan et al. [25] noted that the majority of deaths today are caused by heart disease. A major problem is diagnosing heart disease, so ML plays a vital role here, efficiently determining and predicting heart disease and helping the healthcare industry considerably. ML approaches have also been used prominently in several domains of IoT, and several researchers have applied ML techniques to heart disease prediction. One such work suggested that since heart-related disorders are among the most common diseases nowadays, a mechanism for their early detection would help greatly; the dataset based on risk factors deployed the Naive Bayes algorithm, and because the decision tree also gives accurate predictions for large datasets, a decision tree was used for the correct prediction of heart disease (Table 1.1).


V. Paranjape et al.

Table 1.1 Comparative Study of Different Research Papers

Author | Year | Journal | Method used
Harshit Jindal et al. [13] | 2021 | IOP Conf. Series: Materials Science and Engineering | Logistic Regression & KNN
Pronab Ghosh et al. [14] | 2021 | IEEE Access | DTBM, RFBM, ABBM, GBBM
Manoj Diwakar et al. [15] | 2020 | Elsevier | Classification Techniques
Prasanta Kumar Sahoo et al. [16] | 2020 | SSRN | SVM, Naïve Bayes, Logistic Regression
Rishabh Magar et al. [17] | 2020 | JETIR | Data mining techniques
Galla Siva Sai Bindhika et al. [18] | 2020 | IRJET | HRF Technique with Linear Model
Ramatenki Sateesh Kumar et al. [19] | 2020 | IJRTE | KNN, SVM, MK-NN and CART
Sangya Ware et al. [20] | 2020 | IJRTE | SVM Algorithm
Ibomoiye Domor Mienye et al. [21] | 2020 | Elsevier | CART & WAE
A. Ann Romalt et al. [22] | 2020 | Journal of Critical Reviews | Feature Selection
Mangesh Limbitote et al. [23] | 2020 | IJERT | SVM, ANN, Naive Bayes
Akhand Pratap Singh et al. [24] | 2020 | Journal of Xi'an University of Architecture & Technology | ML methods discussed
Senthilkumar Mohan et al. [25] | 2019 | IEEE Access | HRF Technique with Linear Model
Rajesh N. et al. [26] | 2018 | IJET | Combination of algorithms

General Structure for Heart Disease Classification

The dataset in our proposed work is taken from the well-known Kaggle dataset repository and comprises several features such as age, sex, and type of chest pain. Data cleaning, preprocessing, and exploratory data analysis (EDA) are performed on the dataset [27]. The data is then used to train the model: several ML algorithms such as logistic regression, Naïve Bayes, and SVM are deployed, and their accuracy is evaluated. After this performance evaluation, Random Forest (a bootstrap-aggregating ensemble) is used to obtain a high-performance model for efficient prediction [28].
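The pipeline described above can be sketched with scikit-learn. This is a minimal illustration, not the authors' exact code: synthetic data stands in for the Kaggle CSV, and the model choices simply mirror the algorithms named in the text.

```python
# Sketch of the proposed pipeline: train several baseline classifiers,
# compare their accuracy, and include a bagged ensemble (Random Forest).
# Synthetic data stands in for the Kaggle heart-disease CSV.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

X, y = make_classification(n_samples=303, n_features=13, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```

On real data, the same loop would follow the cleaning and EDA steps described in the text.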


Table 1.2 Attributes of the Dataset

Attribute | Explanation
age | Age
sex | 1: male, 0: female
cp | Chest pain type, 1: typical angina, 2: atypical angina, 3: non-anginal pain, 4: asymptomatic
trestbps | Resting blood pressure
chol | Serum cholesterol in mg/dl
fbs | Fasting blood sugar > 120 mg/dl
restecg | Resting electrocardiographic results (values 0, 1, 2)
thalach | Maximum heart rate achieved
exang | Exercise-induced angina
oldpeak | ST depression induced by exercise relative to rest
slope | The slope of the peak exercise ST segment
ca | Number of major vessels (0–3) colored by fluoroscopy
thal | 3 = normal; 6 = fixed defect; 7 = reversible defect

Dataset

A well-known data repository named Kaggle was used to select the dataset for heart disease prediction; it comprises the 13 attributes listed in Table 1.2. These attributes are the ones used to predict whether a person is suffering from heart disease, and some of them play a vital role in the prediction and training of our model. Table 1.2 explains the attributes taken for our research work, and Table 1.3 details their data types and value ranges.

Tools Used

We use the following open-source tools for our study:

1. Python 3.5
2. NumPy
3. Pandas
4. Seaborn
5. SciPy and Scikit-learn

1. Python: Python is one of the most user-friendly languages, widely used in machine learning and in several other domains, with easy-to-use documentation and strong community support [29].
2. NumPy: NumPy (Numerical Python) is a powerful Python package for scientific computing, used for Fourier transforms, linear algebra, N-dimensional arrays, etc. [30].


Table 1.3 Details of Heart Disease Dataset Attributes

S. no. | Attribute | Data type | Description | Value range
1 | age | int64 | Age in years | 29–77
2 | sex | int64 | Gender instance | 0 and 1
3 | cp | int64 | Chest pain type | 0, 1, 2, 3
4 | trestbps | int64 | Resting blood pressure in mm Hg | 94 to 200
5 | chol | int64 | Serum cholesterol in mg/dl | 126 to 564
6 | fbs | int64 | Fasting blood sugar > 120 mg/dl | 0, 1
7 | restecg | int64 | Resting ECG results | 0, 1, 2
8 | thalach | int64 | Maximum heart rate achieved | 71 to 202
9 | exang | int64 | Exercise-induced angina | 0, 1
10 | oldpeak | float64 | ST depression induced | 1 to 6.2
11 | slope | int64 | Slope of peak | 1, 2
12 | ca | int64 | Number of major vessels | 0 to 4
13 | thal | int64 | Defect types | 0 to 3
14 | target | int64 | Diagnosis of heart disease | 0, 1

3. Pandas: Pandas is a package widely used in machine learning that works on data frames. It provides a complete set of data analysis tools, competing with R, and supports operations such as reading CSV and Excel files [31].
4. Seaborn: Data visualization is essential for drawing insights from raw data, and Seaborn is the Python package used for it here. Seaborn and Matplotlib are used to derive conclusions from datasets [32]; Seaborn produces more informative plots than plain Matplotlib and is tightly coupled with NumPy and Pandas.
5. SciPy: SciPy sits above NumPy and provides fundamental mathematical functions; together with Scikit-learn it is widely used in ML and is very easy to use [33].

Research Methodology

The present chapter deals with the various approaches used in our work, as depicted by the flow chart in Fig. 1.3. The steps of our research methodology are as follows:

1. The initial step is the collection of data. We collected the dataset from Kaggle, where it is openly available.
2. The next step is data preprocessing, in which the data is cleansed by removing unnecessary values, including missing, null, and corrupted values.


Fig. 1.3 Steps of our proposed research work


3. After the data is cleansed, it is divided into two sets, training data and testing data. Missing values must be dealt with before we start to construct the training model. Using the training data, we build a model for prediction.
4. We chose the SVM algorithm as it is efficient and achieved high accuracy; we then calculate the accuracy of the model.
5. The final step is predicting the disease. The final output is numerical: 1 = Yes; 0 = No.

At the very beginning, we studied the available dataset from the UCI repository to compare different ML algorithms on the basis of performance metrics, and then established the scope, purpose, and available resources for the study. We went through several research papers to survey prior work and found that most researchers applied a single machine learning technique, whereas in our work we use an ensemble-based learning technique, which improves the prediction accuracy of our model. We conducted our experiments on the Google Colab platform, where we performed the data preprocessing, including the exploratory data analytics and other tasks.

Exploratory Data Analysis

In this section, we go through the various techniques for data exploration.

Checking the datatypes using the info() method

Snapshot 1: Dataset Information

From the above figure, we can see that only one attribute (oldpeak) has a float data type; all the other attributes have integer values. Our dataset has 303 rows, all non-null, meaning there are no missing values, and its total memory usage is 33.3 KB.
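The inspection described above can be reproduced with Pandas. The tiny frame below is a stand-in for the real 303-row Kaggle CSV (which would be loaded with `pd.read_csv("heart.csv")`, a hypothetical path), so the sketch is self-contained.

```python
# Inspecting the dataset as the text describes: dtypes via info()
# and missing values via isnull(). A small synthetic frame stands in
# for the real 303-row Kaggle CSV.
import pandas as pd

df = pd.DataFrame({
    "age": [63, 37, 41], "sex": [1, 1, 0], "cp": [3, 2, 1],
    "oldpeak": [2.3, 3.5, 1.4], "target": [1, 1, 1],
})
df.info()                             # dtypes, non-null counts, memory usage
n_missing = df.isnull().sum().sum()   # total number of missing cells
print("missing values:", n_missing)
float_cols = df.select_dtypes("float").columns.tolist()
print("float columns:", float_cols)   # only oldpeak is float, as in the text
```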


Checking for the Null values in our dataset

Snapshot 2: Heatmap Showing Null Values

From the above figure we see that the heatmap is a completely filled solid, which depicts that there are no missing values in the dataset.

Checking the correlation between columns in our dataset

Snapshot 3: Correlation Matrix

The correlation graph above depicts the dependence of one attribute on another. We can see that cp (chest pain) has a high correlation with the target (heart disease); similarly, the thalach and slope features correlate strongly with the target. Negative values of correlation indicate an inverse relationship between two features.
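The ranking described above can be computed directly with Pandas. The columns below are synthetic stand-ins for the real features (with Seaborn installed, `sns.heatmap(df.corr())` would render the matrix shown in the snapshot); this is an illustration, not the chapter's actual data.

```python
# Ranking features by their correlation with the target, as in Snapshot 3.
# Synthetic columns stand in for the real dataset: "cp" is built to be
# strongly related to the target, "chol" to be unrelated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
target = rng.integers(0, 2, 200)
df = pd.DataFrame({
    "cp": target + rng.normal(0, 0.5, 200),       # strongly related feature
    "thalach": target + rng.normal(0, 1.0, 200),  # weaker relation
    "chol": rng.normal(0, 1.0, 200),              # unrelated feature
    "target": target,
})
corr = df.corr()["target"].drop("target").sort_values(ascending=False)
print(corr)   # cp ranks highest, chol close to zero
```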


The first step of preprocessing and EDA comprises finding missing values via heat maps: a solid fill in the heat map reveals that there are no missing values in our dataset. From the age-distribution graph we conclude that people in the age group of 55–65 years are most prone to a heart attack (Figs. 1.4 and 1.5). The chest-pain graph shows that chest pain of type '0' (typical angina) carries the rarest chance of being affected by cardiac arrest. We then divide our dataset into training and testing sets with an 80:20 ratio, i.e., training data makes up 80% of the data and testing data 20%. The result of the split is as follows:

Training Dataset | Testing Dataset
(242, 13) | (61, 13)
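The 80:20 split above can be reproduced with scikit-learn's `train_test_split`; the (242, 13) and (61, 13) shapes fall out of the 303-row dataset automatically. This is a sketch with placeholder arrays, and `random_state` is an arbitrary choice.

```python
# The 80:20 split described above: 303 samples with 13 features give a
# (242, 13) training set and a (61, 13) test set, matching the text.
import numpy as np
from sklearn.model_selection import train_test_split

X = np.zeros((303, 13))   # placeholder features for the 303 patient records
y = np.zeros(303)         # placeholder target column
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
print(X_train.shape, X_test.shape)   # (242, 13) (61, 13)
```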

Fig. 1.4 Age distribution via Bar Graph

Fig. 1.5 Bar graph chest pain (cp) wrt target


Evaluation of Performance and Results

Analysis of Performance Metrics

Accuracy: Accuracy measures the fraction of correct predictions; it is the number of correct results divided by the total number of predictions:

    Accuracy = (TP + TN) / (TP + TN + FP + FN)    (1.1)

Precision: Precision measures the correctness of the model's positive predictions:

    Precision = TP / (TP + FP)    (1.2)

Recall: Recall depicts the model's ability to find positive cases; it is the number of true positives divided by the total number of actual positives:

    Recall = TP / (TP + FN)    (1.3)

F1 Score: The F1 measure combines precision and recall (Table 1.4, Figs. 1.6 and 1.7):

    F1 = (2 × Precision × Recall) / (Precision + Recall)    (1.4)
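Eqs. (1.1) to (1.4) translate directly into code. The confusion-matrix counts below are made up for illustration and are not taken from the chapter's experiments.

```python
# Eqs. (1.1)-(1.4) computed from confusion-matrix counts (TP, TN, FP, FN).
# The counts below are illustrative only.
def metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + tn + fp + fn)          # Eq. (1.1)
    precision = tp / (tp + fp)                          # Eq. (1.2)
    recall = tp / (tp + fn)                             # Eq. (1.3)
    f1 = 2 * precision * recall / (precision + recall)  # Eq. (1.4)
    return accuracy, precision, recall, f1

acc, prec, rec, f1 = metrics(tp=27, tn=27, fp=3, fn=4)
print(f"accuracy={acc:.3f} precision={prec:.3f} recall={rec:.3f} f1={f1:.3f}")
```

In practice these values come from `sklearn.metrics` (e.g., `precision_score`, `recall_score`, `f1_score`) applied to the test-set predictions.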

The figures below demonstrate that the best accuracy for heart disease prediction in our model is achieved by the ensemble-based strategy, Random Forest, with an accuracy of 88.52 percent.

Table 1.4 Analysis of algorithms on different parameters

Algorithm | Precision | Recall | F1 Score
Naïve Bayes | 84 | 90 | 87
SVM | 88 | 88 | 88
KNN | 75 | 71 | 73
Decision Tree | 78 | 93 | 85
XG Boost | 84 | 87 | 86
Random Forest | 91 | 88 | 89


Fig. 1.6 Comparison of accuracy metrics (precision, recall, and F1 score in percent for each algorithm)

Fig. 1.7 Comparison of accuracy percentage of all algorithms (Random Forest achieves the highest accuracy, 88.52%; the remaining algorithms range from 70.49% to 86.89%)

Conclusion

The present work predicts whether a person is suffering from heart disease. The main task was comparing the accuracy of different algorithms and analyzing the reasons for the variation. We took the dataset, comprising 303 instances with 14 attributes, from the well-known UCI data repository and divided it into training and testing sets [5]. Six machine learning algorithms were used to enhance the prediction accuracy of heart disease. After applying all the techniques, we found that the ensemble-based approach gives the maximum accuracy of 88.52%. With more data or a larger number of attributes, it may be possible to increase the accuracy further.

Future Scope

In our proposed work we applied supervised machine learning classification algorithms to a small dataset; in future, given a larger dataset, deep learning techniques could be applied to enhance the efficiency of our model. In the present work an ensemble-based technique comprising Random Forest has been used.

References

1. Ayatollahi, H., Gholamhosseini, L., Salehi, M.: Predicting coronary artery disease: a comparison between two data mining algorithms. BMC Public Health 19, 1–9 (2019)
2. Wang, S., Summers, R.M.: Machine learning and radiology. Med. Image Anal. 16, 933–951 (2012)
3. Kourou, K., Exarchos, T.P., Exarchos, K.P., Karamouzis, M.V., Fotiadis, D.I.: Machine learning applications in cancer prognosis and prediction. Comput. Struct. Biotechnol. J. 13, 8–17 (2015)
4. Karayılan, T., Kılıç, Ö.: Prediction of heart disease using neural network. In: 2017 International Conference on Computer Science and Engineering (UBMK), pp. 719–723. IEEE (2017)
5. Matin, A., Mahmud, F., Ahmed, T., Ejaz, M.S.: Weighted score level fusion of iris and face to identify an individual. In: 2017 International Conference on Electrical, Computer and Communication Engineering (ECCE), pp. 1–4. IEEE (2017)
6. Liaw, A., Wiener, M.: Classification and regression by randomForest. R News 2(3), 18–22 (2002)
7. Minh, L.Q., Duong, P.L.T., Lee, M.: Global sensitivity analysis and uncertainty quantification of crude distillation unit using surrogate model based on Gaussian process regression. Ind. Eng. Chem. Res. 57(14), 5035–5044 (2018)
8. Kabir, H.D., Khosravi, A., Hosen, M.A., Nahavandi, S.: Neural network-based uncertainty quantification: a survey of methodologies and applications. IEEE Access 6, 36218–36234 (2018)
9. Qiu, X., Zhang, L., Ren, Y., Suganthan, P.N., Amaratunga, G.: Ensemble deep learning for regression and time series forecasting. In: 2014 IEEE Symposium on Computational Intelligence in Ensemble Learning (CIEL), pp. 1–6. IEEE (2014)
10. Goetz, T.: The Decision Tree. Rodale (2010)
11. Gold, J.C., Cutler, D.J.: Heart Disease. Enslow Publishers (2000)
12. Healey, J.: Heart Disease. Spinney Press (2005)
13. Jindal, H., Agrawal, S., Khera, R., Jain, R., Nagrath, P.: Heart disease prediction using machine learning algorithms. IOP Conf. Ser.: Mater. Sci. Eng. (2021). https://doi.org/10.1088/1757-899X/1022/1/012072
14. Ghosh, P., Azam, S., Jonkman, M., Karim, A., Shamrat, F.M.J.M., Ignatious, E., Shultana, S., Beeravolu, A.R., De Boer, F.: Efficient prediction of cardiovascular disease using machine learning algorithms with relief and LASSO feature selection techniques. IEEE Access (2021). https://doi.org/10.1109/ACCESS.2021.3053759
15. Diwakar, M., Tripathi, A., Joshi, K., Memoria, M., Singh, P., Kumar, N.: Latest trends on heart disease prediction using machine learning and image fusion. Mater. Today Proc. (2020). https://doi.org/10.1016/j.matpr.2020.09.078
16. Sahoo, P.K., Jeripothula, P.: Heart failure prediction using machine learning techniques. SSRN (2020). ORCID 0000-0002-5164-1010
17. Magar, R., Memane, R., Raut, S.: Heart disease prediction using machine learning. JETIR 7(6) (2020). ISSN 2349-5162. www.jetir.org
18. Siva Sai Bindhika, G., Meghana, M., Sathvika Reddy, M., Rajalakshmi: Heart disease prediction using machine learning techniques. IRJET 7(4) (2020)
19. Sateesh Kumar, R., Sameen Fatima, S., Thomas, A.: Heart disease prediction using ensemble learning method. IJRTE 9(1) (2020). ISSN 2277-3878
20. Ware, S., Rakesh, S.K., Choudhary, B.: Heart attack prediction by using machine learning techniques. Int. J. Recent Technol. Eng. (IJRTE) 8(5) (2020). ISSN 2277-3878
21. Mienye, I.D., Sun, Y., Wang, Z.: An improved ensemble learning approach for the prediction of heart disease risk. Inform. Med. Unlocked (2020). https://doi.org/10.1016/j.imu.2020.100402
22. Ann Romalt, A., Mathusoothana S. Kumar, R.: An analysis on feature selection methods, clustering and classification used in heart disease prediction: a machine learning approach. J. Crit. Rev. 7(6) (2020). ISSN 2394-5125
23. Limbitote, M., Mahajan, D., Damkondwar, K., Patil, P.: A survey on prediction techniques of heart disease using machine learning. IJERT 9(6) (2020). ISSN 2278-0181. http://www.ijert.org
24. Pratap Singh, A., Singh, B.: A review on heart disease prediction using machine learning. J. Xi'an Univ. Archit. Technol. 12(3) (2020). ISSN 1006-7930
25. Mohan, S., Thirumalai, C., Srivastava, G.: Effective heart disease prediction using hybrid machine learning techniques. IEEE Access (2019). https://doi.org/10.1109/ACCESS.2019.2923707
26. Rajesh, N., Maneesha, T., Hafeez, S., Krishna, H.: Prediction of heart disease using machine learning algorithms. Int. J. Eng. Technol. 7(2.32), 363–366 (2018)
27. Sheen, B.: Heart Disease. Thomson/Gale (2004)
28. Silverstein, A., Silverstein, V.B., Nunn, L.S.: Heart Disease. Lerner (2006)
29. Python programming documentation [online]. https://www.python.org/about/. Accessed 26 February 2017
30. NumPy documentation [online]. http://www.numpy.org/. Accessed 26 February 2017
31. Pandas documentation [online]. http://pandas.pydata.org/. Accessed 26 February 2017
32. Waskom, M.: An introduction to Seaborn [online]. http://seaborn.pydata.org/introduction.html. Accessed 26 February 2017
33. Pedregosa, F.: Scikit-learn: machine learning in Python [online]. http://jmlr.csail.mit.edu/papers/v12/pedregosa11a.html. Accessed 26 February 2017

Chapter 2

Recent Advancements in AI-Assisted Drug Design and Discovery Systems Kamal Nayan, Karan Kumar Paswan, Vinamra Bhushan Sharma, Yogendra Kumar, and Saurabh Tewari

Introduction

Several approaches have been created to streamline and improve biomedical procedures such as medication repurposing and drug discovery. Platforms based on artificial intelligence (AI) are beneficial for finding and developing medications [1]. Over the past ten years, clinical-level AI research has been conducted on both nanomedicines and non-nanomedicines. On the other hand, low effectiveness, off-target delivery, long timelines, and high cost impose barriers and difficulties on medication design and development. Additionally, the pipeline for developing new drugs is hampered by complicated and substantial data sets from genomics, proteomics, microarray data, and clinical trials [2]. Machine learning (ML) techniques have become essential in researching and developing new drugs, and deep learning algorithms and artificial neural networks (ANNs) have advanced the field further. In addition, various cutting-edge modeling algorithms have benefited from revolutionary data mining, curation, and management strategies. Deep learning developments offer an exceptional chance for real drug discovery and design, eventually benefiting humanity. Time and production cost are the main issues with drug development and design; inefficiency in target delivery and inadequate dosing can likewise hinder drug delivery and development [3]. AI algorithms used in computer-aided drug design may

K. Nayan · K. K. Paswan · V. B. Sharma Rajiv Gandhi Institute of Petroleum Technology, Uttar Pradesh, Jais, Amethi 229304, India Y. Kumar Indian Institute of Technology, Chennai 60036, India S. Tewari (B) Gyan Ganga Institute of Technology and Sciences, Madhya Pradesh, Bargi Hills, Jabalpur 482001, India e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023 A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_2


K. Nayan et al.

Fig. 2.1 Schematic diagrams of single-AI classifier methods. a Linear discriminant analysis model, b K-nearest neighbors, c decision tree and d support vector machine (SVM)

eventually be able to eliminate the challenges and obstacles associated with traditional medication development. Figure 2.1 shows schematic diagrams of popular AI classifier methods. ML, which encompasses supervised, unsupervised, and reinforcement learning, is a subset of AI. Small-molecule drug research and development (R&D) faces numerous challenges, including long cycle times, failed clinical trials, and economic issues. Even with ample funding, small-molecule drug R&D has yet to produce the required productivity and efficacy [4]. Lengthy approval procedures for incorporating novel chemicals and techniques, market saturation, and reluctance to invest in developing and developed markets are all issues that must be addressed. This review examines the obstacles that must be overcome from drug discovery to initial clinical trials. In recent years, the amount of information available to scientists has become unprecedented, far exceeding the capabilities of many traditional practices. AI is not a new idea in finding new drugs, particularly in generalized structure–activity relationship modeling. Analogous to Hammett's formula relating equilibrium


constants and reaction rates for benzene and its derivatives, the concept here is to take experimental findings and move forward using regression. Deep learning, a subfield of AI, has potential for future drug developments [5]. Recently, several ML methods have reemerged; some of these might be viewed as domain-specific AI that has been effectively used in the design and discovery of drug medicines. This study comprehensively analyzes AI techniques and their uses in the pharmaceutical industry for drug development and delivery. The existing literature on AI-aided drug delivery is reviewed after introducing the fundamental ideas behind the various ML algorithms, together with a few application comments [6]. This covers de novo drug design, drug repurposing, physicochemical property estimation, pharmacokinetic behavior forecasting, and associated fields, as well as structure-based and ligand-based virtual screening.

Construction of AI Model for Drug Delivery

A precise characterization of the problem is required to apply an AI algorithm systematically in drug development. The approach typically includes problem identification, AI architecture design, data preparation, model training and assessment, and interpretation and explanation of the outcomes, as shown in Fig. 2.3. More particularly, as shown in step 1 of Fig. 2.3, before making any technical or architectural decisions one must understand the problem at hand, as the chosen ML solution must be appropriate for it: first, determine whether the problem is a generative or a discriminative task. Designing a suitable model architecture comes next, as shown in step 2 of Fig. 2.3; this requires choosing a proper technique and determining good initial values for its hyperparameters. The most popular algorithms for discriminative tasks include Random Forest (RF), ANNs, decision trees, K-Nearest Neighbours (K-NN), and Support Vector Machines (SVM). For generative problems, AI techniques such as the deep Boltzmann machine, adversarial autoencoder, deep belief network, generative adversarial network, and variational autoencoder are available. ANNs are typically chosen for these tasks due to their generalizability and flexibility to imitate any input–output relationship. Different AI algorithms require different hyperparameters: in SVMs, for instance, the kernel selection, kernel parameters (as in the Radial Basis Function), and the error penalty C; in a neural network, the number and variety of neurons and layers, regularization parameters, the learning rate and its decay, and the presence of connections between adjacent layers or neurons together make up the architecture [7].
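Step 2 above can be sketched concretely: the two SVM hyperparameters the text names (the RBF kernel parameter and the error penalty C) can be tuned by cross-validated grid search. This is an illustration on synthetic data, not a drug-delivery model; the grid values are arbitrary assumptions.

```python
# Step 2 of the workflow: choose an algorithm and initial hyperparameters.
# An SVM's RBF kernel parameter (gamma) and error penalty C, the two
# hyperparameters named in the text, are tuned by cross-validated search.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1.0, 10.0], "gamma": ["scale", 0.01, 0.1]},
    cv=3,
)
grid.fit(X, y)
print("best hyperparameters:", grid.best_params_)
print("cross-validated accuracy:", round(grid.best_score_, 3))
```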

22

K. Nayan et al.

After selecting a rough design, the collected data is processed, as indicated in step 3 of Fig. 2.3. The quantity, characteristics, and representativeness of the original data significantly influence how well an AI model works. Following the creation of the initial architecture and data sets, models may be trained and evaluated, as shown in steps 4 and 5 of Fig. 2.3. The training process searches for parameters that minimize prediction error. The final AI model should be capable of explaining the primary connection between molecular representations and the practitioner's goals; if it is not, examining specific cases may help the practitioner adapt their own "environment" to get the desired result.
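The training step described above, a search for parameters that minimize prediction error, can be shown in miniature with plain gradient descent on a linear model. This is a deliberately tiny sketch with made-up data; real drug-delivery models are far larger, but the loop has the same shape.

```python
# Step 4 in miniature: "training" searches for parameters that minimize
# prediction error. Plain gradient descent fits a linear model here.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])            # ground-truth parameters
y = X @ true_w + rng.normal(scale=0.01, size=100)

w = np.zeros(3)
for _ in range(500):                           # minimize mean squared error
    grad = 2 * X.T @ (X @ w - y) / len(y)      # gradient of the MSE loss
    w -= 0.1 * grad                            # gradient-descent update
print("recovered weights:", np.round(w, 2))    # close to [1.5, -2.0, 0.5]
```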

Input Data Preparation

Those new to using AI methods in drug development projects often hope to increase overall model performance by focusing on the most recent AI approach. However, as training data provides the foundation for all subsequent progress, it is typically more effective to concentrate on the data first [8]. Regardless of the model used, having more high-quality data frequently improves generalization performance. The process of preparing data is time-consuming and difficult, and understanding the source and significance of the training data is critical [9]. This includes understanding the classes and difficulty levels of the units characterized, the volume of data, and, more precisely, how the data are distributed in chemical space [10]. The question is how thoroughly the universe of likely inputs we would wish to use for predictions has been covered. Preprocessing techniques must be selected based on various criteria, including whether additional data is required, whether the labeled data is sufficient, what type of representation would most effectively encode the represented elements, and others. Notably, the same rules do not apply to every project. We offer some direction on these topics in the next sections (Table 2.1).

Table 2.1 Popular open-source clinical datasets utilized for drug delivery studies

S. no. | Dataset name | References
1 | CDC's Antibiotic Resistance Isolate Bank | [11]
2 | EPIC Care Everywhere Network | [12]
3 | EuResist Integrated Database | [13]
4 | NCBI Reference Sequence Database | [14]
5 | Office of Cyber Infrastructure and Computational Biology (OCICB) | [15]
6 | Pathosystems Resource Integration Center (PATRIC) | [16]
7 | Quotient Bioresearch | [17]
8 | RENAGENO | [18]
9 | ReSeqTB knowledgebase | [19]
10 | Stanford University's HIV Drug Resistance | [20]


Data Format in Drug Domain

AI systems in drug development models employ input datasets as input data X and output data Y. The diverse data structures utilized for modeling drug-related applications can be organized as: (a) input data types: sequence data (e.g., time series data, SMILES data, biomedical texts for drug interactions, data about biomolecular structures), fixed-size vectors (e.g., bit strings, real-valued vectors), and structural graphs of molecules; (b) output data types: binary values, fixed vectors (e.g., biometric vectors), sequential data (e.g., SMILES strings, amino acid sequences), integer values, and real values (estimation or regression tasks). The most common input data type used in creating pharmaceuticals is a fixed-length input vector (e.g., molecular descriptors, fingerprints). However, such representations have significant drawbacks. To encode every potential substructure without overlap, such vectors frequently must be very large, leading to models with numerous learnable parameters that must learn from relatively sparse records [21]. For instance, Unterthiner et al. employed a fingerprint vector with roughly 43,000 entries, and it is challenging to determine similarity in a chemical space with so many dimensions. Numerous varieties of graph fingerprinting have been developed to address this problem: chemical structure graphs are used as inputs to a differentiable neural network that creates the fingerprints. The other drawback is the inability to easily establish a bijection between input vectors and molecular structures [22].
Although an input vector may be produced from a molecular structure with relative ease, reconstructing a molecule back from its vector is challenging, mainly because a single vector representation is likely to correspond to a wide variety of different chemical structures. Combining SMILES strings as molecular representations with deep generative AI models is one option; such combinations have been thoroughly studied recently, e.g., for anti-BT agents, i.e., new molecules with desirable features [23]. In this instance, there are two possible outputs: a SMILES string and an identification vector [24]. Most other AI-assisted drug development initiatives produce a numeric value, frequently from experimental biochemical data, where binary values correspond to binary classification, integer values to multiclass classification (or clustering), and real-valued numbers to regression tasks.
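The fixed-length fingerprint idea, and why it is not bijective, can be illustrated with a deliberately toy sketch: hashing character n-grams of a SMILES string into a bit vector. This is NOT a real chemical fingerprint (a cheminformatics toolkit such as RDKit computes proper substructure fingerprints); `toy_fingerprint` is a hypothetical helper invented for this illustration only.

```python
# Toy illustration of a fixed-length fingerprint: hash every character
# n-gram of a SMILES string into a bit vector. Purely illustrative; it
# only shows why fixed-size vectors are convenient yet not bijective
# (different molecules can map to overlapping or colliding bits).
import hashlib

def toy_fingerprint(smiles: str, n_bits: int = 64) -> list[int]:
    bits = [0] * n_bits
    for n in (1, 2, 3):                  # character n-grams as "substructures"
        for i in range(len(smiles) - n + 1):
            h = hashlib.md5(smiles[i:i + n].encode()).digest()
            bits[int.from_bytes(h[:4], "big") % n_bits] = 1
    return bits

aspirin = toy_fingerprint("CC(=O)Oc1ccccc1C(=O)O")
ethanol = toy_fingerprint("CCO")
print(sum(aspirin), sum(ethanol))   # number of set bits per molecule
```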

Data Quality and Its Measurement

The representativeness and cardinality of the original data should be given special consideration. If the data set does not accurately reflect the fundamental learning objective, the training process may overfit and suffer from a poorly conditioned standard error [25]. Both issues raise the possibility that a given model may not be


Fig. 2.2 Schematic diagrams of ensemble-learning-classifiers methods

genuinely able to generalize to new information. But how does one decide where to get more information? For interested readers, here are some suggestions (Fig. 2.2). First, evaluate whether the result on the training set is satisfactory. If the training objective is not met satisfactorily, even when model regularization (such as dropout) is disabled, the selected design is likely to benefit from some modification or reconstruction. Spending time here is recommended over seeking alternative data right away because, in general, even a poorly optimized design should be capable of fitting the training data. If no progress is apparent after making the necessary adjustments or trying a different architecture (if required), it may be essential to check the training data for noise and faults, to verify that the training distributions are balanced, and to confirm that the input–output relationship is reasonable [26]. Another suggested sanity check is retraining the model using scrambled Y-data. As a sign

2 Recent Advancements in AI-Assisted Drug Design and Discovery Systems


that the model has found essential correlations in the unscrambled condition, the trained model should perform poorly on the scrambled data. If performance on the training set is satisfactory, performance on the test set should be analyzed next. Extending the data set is one of the most efficient strategies when there is a significant performance gap between the training and test sets. Essential factors to consider include the projected volume of data needed to significantly improve test-set performance and the relative expense of collecting that extra data from different sources. To further understand the generalization ability and robustness of the model, retrain it while feature-stratifying the input data, with varying extents of each stratum chosen equally or randomly (or with a preset bias). More high-quality raw data at a lower price is always preferable [27]. As is frequently the case with drug development based on biological data, a more common scenario is that such procedures are extremely expensive or ineffective. The training set can then be extended with fabricated (synthetic) data points, or transfer learning techniques can be applied to improve the model’s understanding of related data. Depending on the task, various methods can be used to improve the model, including hyper-parameter tuning, normalization (particularly if the model is overfitting), or the use of more specialized ANN architectures. If performance on the test set remains noticeably poorer than on the training sample despite adjusting the regularization hyperparameters, enhancing the learning process may be the simplest way to lower the generalization error; collecting more training data may not be viable in this situation [28] (Fig. 2.3). For instance, the GAN approach can create plausible artificial molecular structures from a set of mostly unlabeled and few labeled data [29].
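The Y-scrambling sanity check described above can be sketched in a few lines: a model trained on shuffled labels should score near chance, while the same model on the true labels should do noticeably better. The leave-one-out 1-nearest-neighbour stand-in model and the toy activity data below are assumptions for illustration, not a substitute for a real ANN.

```python
import random

# Sketch of the Y-scrambling sanity check: a model fit on shuffled labels
# should perform near chance, while the same model on true labels should
# do noticeably better. The 1-NN "model" and toy data are placeholders.

def nn_accuracy(features, labels):
    """Leave-one-out 1-NN accuracy on 1-D features (toy stand-in model)."""
    correct = 0
    for i, x in enumerate(features):
        nearest = min((j for j in range(len(features)) if j != i),
                      key=lambda j: abs(features[j] - x))
        correct += labels[nearest] == labels[i]
    return correct / len(features)

random.seed(0)
X = [i / 10 for i in range(40)]
y = [int(x > 2.0) for x in X]       # clean input-output relationship

scrambled = y[:]
random.shuffle(scrambled)           # destroy the relationship

true_acc = nn_accuracy(X, y)
scrambled_acc = nn_accuracy(X, scrambled)
print(true_acc, scrambled_acc)      # true labels should score much higher
```

If the scrambled run scores about as well as the true run, the model is memorizing rather than finding real input–output structure.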
It must be emphasized again that the initial training data must be of sufficient quantity and quality. The current condition in the medical research sector, however, is poor data availability and quality. For example, there may be only 2,000 “reliable” values for binding affinities in the literature, even though databases provide 10,000 values. Even assuming that all 10,000 values can be used, this quantity of information is negligible when matched against the enormous volumes of data used to build visual prototypes or models that predict our possible future purchases. The distribution of active and inactive compounds in databases obtained from high-throughput screening data is usually out of balance, although specific data sampling procedures can rebalance how modeling activities are distributed [30]. It is important to remember that data sets used for ML-driven drug discovery may not always enable the learning system to identify a solution, unlike data in computer vision, where the statistics are trusted and capture the fundamental problem. We must acknowledge that our knowledge of the subject and the related data sets are insufficient and rife with mistakes. Additionally, nearly all drugs have several natural targets and effects, and the patient’s unique genomic profile influences the relative importance of each. This indicates that, due to unidentified contributing elements and many-to-many nonlinear interactions, we face fundamentally ill-posed problems in certain areas of drug development. The source of a chemical impacts the level of assurance with which it can be declared inactive against a specific target [31]. The Directory of Useful Decoys (DUD), for instance, categorizes substances as “inactive” without experimental verification. The



Fig. 2.3 The generalized workflow for AI model development for drug-related applications

use of bootstrapping, noise-adapted neural networks, or semi-supervised methodologies that add details about active and unidentified materials during training, as well as noise-suited neural networks that assess the probability that a given label is inappropriate and use this information to weight the data appropriately, can all help to solve this problem partially. Figure 2.4 shows the workflow for deciding on data collection and AI architecture improvement approaches.
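The label-weighting idea above can be sketched minimally: each training example carries an estimated probability that its label is correct, and its loss contribution is scaled accordingly. The reliability values and per-sample losses below are invented for illustration; in practice they might come from assay confidence or from the model’s own disagreement estimates.

```python
# Sketch of noise-aware sample weighting: scale each sample's loss by an
# estimated probability that its label is correct, so probably-mislabeled
# data (e.g. DUD-style unverified "inactives") contribute less to training.

def weighted_mean_loss(losses, label_reliability):
    """Average per-sample losses, down-weighting probably-mislabeled data."""
    weighted = [l * w for l, w in zip(losses, label_reliability)]
    return sum(weighted) / sum(label_reliability)

per_sample_loss = [0.2, 0.1, 2.5, 0.3]   # third sample looks suspicious
reliability = [1.0, 1.0, 0.1, 0.9]       # unverified label gets low trust

plain = sum(per_sample_loss) / len(per_sample_loss)
robust = weighted_mean_loss(per_sample_loss, reliability)
print(plain, robust)   # the weighted loss is far less dominated by sample 3
```

The same weights can be passed to most training frameworks as per-sample loss weights, leaving the rest of the pipeline unchanged.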

Drug Discovery Using Computation and Machine Learning

For the current drug discovery process, computational approaches are crucial in synthesizing therapeutically significant compounds. These techniques fall into two main types: ligand-based and structure-based. Structure-based approaches require an understanding of the structures of both the target and the ligand; molecular dynamics and protein–ligand docking techniques for determining the binding free energy are among them [32]. Ligand-based approaches rely solely on information about the ligand, anticipating the biological response from past data on known active and inactive ligands. These techniques usually use similarity searches, activity-cliff



Fig. 2.4 The workflow to decide on data collection and AI architecture improvement approaches

analysis, and quantitative structure–activity relationships. Both methods use empirical scoring functions or force fields to determine the energy of molecular systems. It can be challenging to parameterize precise and transferable force fields when one wants to filter millions of potential ligands. This parametrization must condense complex quantum-physical behavior into a small set of factors and calls for a straightforward analytic functional form. Consequently, over 25 years, a “zoo” of force fields has been created for several disciplines, such as applications involving proteins, different materials, nucleic acids, carbohydrates, and other small drug-like compounds [33]. The regimes where specific force fields work systematically, and where they fail, are difficult to pinpoint. This reality may be summed up in two straightforward claims: quantum mechanical methods can be incredibly exact but have prohibitive processing costs, whereas force fields are fast but may perform weakly beyond their fitting set. The long-standing discrepancy between the precision of quantum mechanical methodologies and the speed of classical force fields has long been debated among academic and commercial computational chemists [34]. Figure 2.5 shows the application stages of ML models for drug delivery.
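The similarity searches used by ligand-based methods can be illustrated with a minimal Tanimoto ranking: compounds are represented as fingerprints (sets of on-bits) and ranked by similarity to a known active. The 8-bit fingerprints and compound names below are toy assumptions standing in for real ECFP-style fingerprints.

```python
# Sketch of a ligand-based similarity search: compounds are represented as
# bit-set fingerprints and ranked by Tanimoto similarity to a known active.
# The toy fingerprints below stand in for real ECFP-style fingerprints.

def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprints given as sets of on-bits."""
    if not fp_a and not fp_b:
        return 1.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

library = {
    "cmpd_1": {0, 1, 2, 5},
    "cmpd_2": {0, 1, 2, 3, 5},
    "cmpd_3": {4, 6, 7},
}
query = {0, 1, 2, 3}   # fingerprint of a known active ligand

ranked = sorted(library, key=lambda name: tanimoto(library[name], query),
                reverse=True)
print(ranked)   # cmpd_2 is most similar to the query, cmpd_3 least
```

Ranking a library this way is cheap enough to run over millions of compounds, which is precisely why ligand-based filters are applied before any expensive force-field or quantum calculation.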



Fig. 2.5 Application stages of ML models for drug delivery

Synthesis Analysis

Recently developed ML-based approaches have been used for synthesis planning. The computer program Chematica was used to plan total syntheses of medicinally essential compounds without human aid. Grzybowski et al. employed Chematica to build decision trees using over 50,000 rules of synthesis. Reaction rules are encoded in graphs that connect millions of potential molecules through various synthetic pathways in order to identify the most efficient synthetic routes [35]. Numerous studies have shown that retrosynthetic routes can also be found using Monte Carlo tree search and symbolic AI without professional human guidance [36]. Most recorded organic chemical reactions were used to train this neural network. In a double-blind test, synthetic chemists found the computer-generated routes comparable to those described in the literature.
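The core idea of rule-based retrosynthesis can be sketched as a recursive search: reaction rules map a product to candidate precursor sets, and the search succeeds when every branch ends in a purchasable building block. The molecule names, rules, and stock list below are entirely hypothetical; real planners such as Chematica add learned scoring and Monte Carlo tree search on top of this skeleton.

```python
# Toy sketch of rule-based retrosynthetic search: reaction rules map a
# product to precursor sets, and a recursive search looks for a route that
# ends in purchasable building blocks. All names and rules are hypothetical.

RULES = {  # product -> list of precursor tuples (one per known reaction)
    "target": [("intermediate_A", "intermediate_B")],
    "intermediate_A": [("block_1", "block_2")],
    "intermediate_B": [("block_3",)],
}
PURCHASABLE = {"block_1", "block_2", "block_3"}

def find_route(product):
    """Return a list of (product, precursors) steps, or None if no route."""
    if product in PURCHASABLE:
        return []
    for precursors in RULES.get(product, []):
        route = [(product, precursors)]
        sub_routes = [find_route(p) for p in precursors]
        if all(r is not None for r in sub_routes):
            for r in sub_routes:
                route.extend(r)
            return route
    return None

route = find_route("target")
print(len(route))   # 3 retrosynthetic steps back to purchasable blocks
```

The combinatorial explosion that makes real planning hard comes from having many competing rules per product, which is where the tree-search and neural-scoring machinery cited above earns its keep.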



AI Applications in the Production of Pharmaceuticals

Finding suitable, biologically effective drug molecules in the vast chemical space, often estimated at around 10^60 molecules, is the most challenging and discouraging stage of drug innovation and progress. Drug development is also a costly and time-consuming procedure [37]. Most unfortunately, nine out of ten therapeutic compounds fail to complete phase II clinical trials and regulatory approval [38]. AI-based technologies and methods can be used to overcome these drawbacks of drug discovery and development. Every step of the drug development process uses AI, including designing small molecules, calculating dosages and efficacy, forecasting toxicity and bioactive properties, predicting bioactive components, predicting protein interactions and their folding and misfolding, performing structure- and ligand-based virtual screening with quantitative structure–activity relationship (QSAR) modeling, and repurposing drugs.

AI Applications for Drug Design

De novo drug design is an iterative technique that creates novel molecules from the 3D structures of receptors; its goal is to generate new chemical entities. De novo drug design has not, however, found widespread application in drug discovery, although recent developments in AI have given the field something of a revival [39]. Virtual screening has emerged as a critical tool in the drug development process, since it performs profitable in silico analyses over a vast array of compounds and increases the effectiveness of prospective drug leads. ML, a branch of AI, is a technique for organizing a virtual screen for drug leads that frequently entails accumulating an extensive collection of substances, including both active and inert compounds, to train the model. After the model is constructed, it is tested to determine whether it is accurate enough before being used to detect novel drugs in a database it has never seen before. This part examines how AI has benefitted the de novo approach to drug design. Table 2.2 in the appendix shows the performance comparison of various ML algorithms utilized for drug delivery purposes, particularly for anti-microbial cases. Table 2.3 provides a list of useful tools and software that can be utilized for ML and deep learning applications in drug delivery [51].
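The accumulate–train–test–screen loop just described can be sketched end to end. The nearest-centroid “model” and the two-descriptor compound vectors below are illustrative stand-ins for a real ML pipeline; compound names and descriptor values are invented.

```python
# Sketch of the virtual-screening workflow: train on known actives and
# inactives, check accuracy on held-out compounds, then screen an unseen
# library. The nearest-centroid model and toy descriptors are stand-ins.

def centroid(vectors):
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train(actives, inactives):
    return {"active": centroid(actives), "inactive": centroid(inactives)}

def predict(model, compound):
    return min(model, key=lambda label: dist2(model[label], compound))

# Training set: descriptor vectors of known active / inert compounds.
model = train(actives=[[0.9, 0.8], [1.0, 1.1]],
              inactives=[[0.1, 0.0], [0.0, 0.2]])

# Held-out sanity check before screening a fresh database.
assert predict(model, [0.95, 0.9]) == "active"
assert predict(model, [0.05, 0.1]) == "inactive"

# Screen an unseen library and keep predicted actives as candidate leads.
library = {"lead_42": [0.8, 0.9], "dud_7": [0.2, 0.1]}
hits = [name for name, desc in library.items()
        if predict(model, desc) == "active"]
print(hits)   # ['lead_42']
```

In a real screen the descriptors would be molecular fingerprints or QSAR features and the model an SVM, random forest, or ANN, but the train/validate/screen structure is the same.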



Conclusions

AI techniques have been incorporated into the initial phases of drug innovation and delivery. Numerous instances are available in the research literature of practical applications of deep learning for molecular characteristic estimation, target identification for drug delivery, toxicity forecasting, lead optimization, structuring of clinical trials, and more. Various international Kaggle competitions have also been organized in which drug-related data were provided to test the potential of different AI algorithms; this helps to develop new tools and techniques for the pharmaceutical sciences. AI algorithms are data-dependent models. However, collecting sufficient, good-quality, uniformly formatted, and task-specific data for drug delivery remains challenging. Sharing data on open-source platforms is essential for pharmaceutical research to develop and test existing and newer intelligent algorithms, yet only a few companies currently practice data-sharing policies. Lead optimization is also an issue in the initial stage of drug discovery, conventionally performed based on the experience of medicinal chemists. The main challenge is to promote good drug quality, recognition of drug candidates, simultaneous optimization of their properties, and, in some instances, identification of mutual incompatibility of drug components. With AI, we may be able to select the safest compounds rapidly during drug synthesis in the future. Recruitment of patients for clinical trials is another task that AI can perform: AI can collect patient data from wearable equipment, analyze it to detect health issues, and send alarms to patients. Various AI models have been applied and reported for drug delivery, and related research publications are mushrooming rapidly. These models claim supremacy on their own data sets using diverse intelligent algorithms, which creates confusion when selecting the best AI algorithm for any drug-related application. Therefore, researchers should develop standard guidelines for evaluating AI models for the drug delivery and pharmaceutical domains. AI has shown great potential in drug delivery and has provided deeper insights into drug synthesis, which will lead to advanced drug research in the future.

Appendix

See Tables 2.2 and 2.3.

Table 2.2 The performance of ML algorithms utilized for drug delivery. [The table compares logistic regression, Naive Bayes, decision trees, K-nearest neighbor, SVM, random forest, boosting, and ANN models across three tasks: AMR prediction, drug combination, and treatment outcome prediction. Reported performance scores include, for AMR prediction, AUC 0.67–0.79, Accuracy 0.86–0.9, and Precision 0.93 / Recall 0.83 [40–43]; F-measure 0.569–0.678 [44]; Accuracy 0.597–0.954 [45]; Accuracy 0.6619–0.7364 [46]; AUC 0.74–0.85 [15]; AUC 0.838–0.95 [47]; AUC 0.8546–0.9808 [48]; Accuracy 0.721–0.8583 [49]; AUC 0.56–0.79 for drug combination [50]; and AUC 0.59–0.94 for treatment outcome prediction [51].]


Table 2.3 The list of useful tools and software that can be utilized for ML and deep learning applications for drug delivery [51]

1. QSAR-Co-X: https://github.com/ncordeirfcup/QSAR-Co-X
2. Cloud 3D-QSAR: http://agroda.gzu.edu.cn:9999/ccb/server/cloud3dQSAR/
3. ChemDes: http://www.scbdd.com/chemdes
4. OntoQSAR: https://github.com/rafaelgsilveira/OntoQSAR
5. ChemGrapher: https://doi.org/10.1021/acs.jcim.0c00459
6. ANFIS: https://github.com/topics/anfis?l=matlab
7. DrugNet: http://genome2.ugr.es/drugnet/
8. RepCOOL: https://doi.org/10.1186/s12967-020-02541-3
9. GIPAE: https://www.hindawi.com/journals/bmri/2019/2426958/
10. DrPOCS: https://www.computer.org/csdl/journal/tb/2019/01/08350090/17D45XeKgra
11. HeteroDualNet: https://doi.org/10.3389/fphar.2019.01301
12. RCDR: https://ieeexplore.ieee.org/document/8734933
13. GRTR: https://doi.org/10.1007/978-3-319-94968-0_2
14. SAEROF: https://github.com/HanJingJiang/SAEROF
15. WGMFDDA: https://doi.org/10.1007/978-3-030-60796-8_47
16. HNet-DNN: https://doi.org/10.1021/acs.jcim.9b01008
17. DeepConv-DTI: https://github.com/GIST-CSBL/DeepConv-DTI
18. DeepH-DTA: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8545313/
19. Neg Stacking: https://github.com/Hawash-AI/deepH-DTA
20. SPIDR: https://doi.org/10.1186/s12859-018-2153-y
21. DeepPurpose: https://github.com/kexinhuang12345/DeepPurpose
22. DTI-CDF: https://github.com/a96123155/DTI-CDF
23. Pred-binding: https://doi.org/10.3109/14756366.2016.1144594
24. Chembench: https://chembench.mml.unc.edu
25. mCSM-lig: http://structure.bioc.cam.ac.uk/mcsm_lig
26. CSM-lig: http://structure.bioc.cam.ac.uk/mcsm_lig
27. mCSM-AB: http://structure.bioc.cam.ac.uk/mcsm_ab
28. dendPoint: http://biosig.unimelb.edu.au/dendpoint
29. MDCKpred: http://www.mdckpred.in/
30. Vienna LiverTox: https://livertox.univie.ac.at/
31. Ambit-SMIRKS: http://ambit.sourceforge.net/smirks
32. COSMOfrag: https://doi.org/10.1021/ci0501948
33. RosENet: https://pubmed.ncbi.nlm.nih.gov/23808195/
34. MDeePred: https://github.com/cansyl/MDeePred
35. ProTox-II: http://tox.charite.de/protox_II
36. ADMETlab: http://admet.scbdd.com/
37. lazar: https://doi.org/10.3389/fphar.2013.00038
38. TargetNet: http://targetnet.scbdd.com
39. PSBP-SVM: http://server.malab.cn/PSBP-SVM/index.jsp
40. IDDkin: https://github.com/CS-BIO/IDDkin
41. SMPDB 4.2: http://www.smpdb.ca/
42. DruGeVar: http://drugevar.genomicmedicinealliance.org
43. DrugPathSeeker: https://research.ibm.com/publications/drugpathseeker-interactive-ui-for-exploring-drug-adr-relation-via-pathways
44. SNF-NN: https://doi.org/10.1186/s12859-020-03950-3
45. DeepDrug: https://github.com/wanwenzeng/deepdrug

References

1. Eppe, M., Nguyen, P.D., Wermter, S.: From semantics to execution: integrating action planning with reinforcement learning for robotic causal problem-solving. Front. Robot. AI 6, 123 (2019)
2. Hamet, P., Tremblay, J.: Artificial intelligence in medicine. Metabolism 69, S36–S40 (2017)
3. Duch, W., Swaminathan, K., Meller, J.: Artificial intelligence approaches for rational drug design and discovery. Curr. Pharm. Des. 13(14), 1497–1508 (2007)
4. Fukushima, K.: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biol. Cybern. 36(4), 193–202 (1980)
5. Kim, S., Chen, J., Cheng, T., Gindulyte, A., He, J., He, S., Bolton, E.E.: PubChem in 2021: new data content and improved web interfaces. Nucl. Acids Res. 49, 1388–1395 (2021)
6. Baldi, A.: Computational approaches for drug design and discovery: an overview. Syst. Rev. Pharm. 1(1), 99 (2010)
7. Kalaiarasi, C., Manjula, S., Kumaradhas, P.: Combined quantum mechanics/molecular mechanics (QM/MM) methods to understand the charge density distribution of estrogens in the active site of estrogen receptors. RSC Adv. 9(69), 40758–40771 (2019)
8. Carpenter, K.A., Huang, X.: Machine learning-based virtual screening and its applications to Alzheimer’s drug discovery: a review. Curr. Pharm. Des. 24(28), 3347–3358 (2018)
9. Provenzano, C., Cappella, M., Valaperta, R., Cardani, R., Meola, G., Martelli, F., Falcone, G.: CRISPR/Cas9-mediated deletion of CTG expansions recovers normal phenotype in myogenic cells derived from myotonic dystrophy 1 patients. Mol. Ther. Nucleic Acids 9, 337–348 (2017)
10. Mustapha, I.B., Saeed, F.: Bioactive molecule prediction using extreme gradient boosting. Molecules 21(8), 983 (2016)
11. Drouin, A., Letarte, G., Raymond, F., Marchand, M., Corbeil, J., Laviolette, F.: Interpretable genotype-to-phenotype classifiers with performance guarantees. Sci. Rep. 9(1), 4071 (2019). https://doi.org/10.1038/s41598-019-40561-2
12. Ramon, E., Belanche-Muñoz, L., Pérez-Enciso, M.: HIV drug resistance prediction with weighted categorical kernel functions. BMC Bioinform. 20(1), 1–13 (2019)
13. Chen, M.L., et al.: Beyond multidrug resistance: leveraging rare variants with machine and statistical learning models in Mycobacterium tuberculosis resistance prediction. EBioMedicine 43, 356–369 (2019). https://doi.org/10.1016/j.ebiom.2019.04.016
14. Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
15. Rishishwar, L., Petit, R.A., Kraft, C.S., Jordan, I.K.: Genome sequence-based discriminator for vancomycin-intermediate Staphylococcus aureus. J. Bacteriol. 196(5), 940–948 (2014). https://doi.org/10.1128/JB.01410-13
16. Coelho, J.R., et al.: The use of machine learning methodologies to analyse antibiotic and biocide susceptibility in Staphylococcus aureus. PLoS ONE 8(2), e55582 (2013). https://doi.org/10.1371/journal.pone.0055582
17. Goodman, K.E., Lessler, J., Harris, A.D., Milstone, A.M., Tamma, P.D.: A methodological comparison of risk scores versus decision trees for predicting drug-resistant infections: a case study using extended-spectrum beta-lactamase (ESBL) bacteremia. Infect. Control Hosp. Epidemiol. 40(4), 400–407 (2019). https://doi.org/10.1017/ice.2019.17
18. Raposo, L.M., Arruda, M.B., de Brindeiro, R.M., Nobre, F.F.: Lopinavir resistance classification with imbalanced data using probabilistic neural networks. J. Med. Syst. 40(3), 69 (2016). https://doi.org/10.1007/s10916-015-0428-7
19. Bhattacharyya, R.P., et al.: Simultaneous detection of genotype and phenotype enables rapid and accurate antibiotic susceptibility determination. Nat. Med. 25(12), 1858–1864 (2019). https://doi.org/10.1038/s41591-019-0650-9
20. Sauer, C.M., et al.: Feature selection and prediction of treatment failure in tuberculosis. PLoS ONE 13(11), e0207491 (2018). https://doi.org/10.1371/journal.pone.0207491
21. Wicht, K.J., Combrinck, J.M., Smith, P.J., Egan, T.J.: Bayesian models trained with HTS data for predicting β-haematin inhibition and in vitro antimalarial activity. Bioorg. Med. Chem. 23(16), 5210–5217 (2015)
22. Rogers, D., Brown, R.D., Hahn, M.: Using extended-connectivity fingerprints with Laplacian-modified Bayesian analysis in high-throughput screening follow-up. J. Biomol. Screen. 10(7), 682–686 (2005)
23. Speck-Planche, A., Kleandrova, V.V., Luan, F., Cordeiro, M.N.D.S.: Chemoinformatics in multi-target drug discovery for anti-cancer therapy: in silico design of potent and versatile anti-brain tumor agents. Anti-Cancer Agents Med. Chem. 12(6), 678 (2012)
24. Xia, X., Maliski, E.G., Gallant, P., Rogers, D.: Classification of kinase inhibitors using a Bayesian model. J. Med. Chem. 47(18), 4463–4470 (2004)
25. Ouyang, X., Handoko, S.D., Kwoh, C.K.: CScore: a simple yet effective scoring function for protein–ligand binding affinity prediction using modified CMAC learning architecture. J. Bioinform. Comput. Biol. 9(1), 1–14 (2011)
26. Srivastava, R.K., Greff, K., Schmidhuber, J.: Training very deep networks. Adv. Neural Inf. Process. Syst. 28 (2015)
27. Liu, B., Ramsundar, B., Kawthekar, P., Shi, J., Gomes, J., Luu Nguyen, Q., Pande, V.: Retrosynthetic reaction prediction using neural sequence-to-sequence models. ACS Cent. Sci. 3(10), 1103–1113 (2017)
28. Schneider, G., Clark, D.E.: Automated de novo drug design: are we nearly there yet? Angew. Chem. Int. Ed. 58(32), 10792–10803 (2019)
29. Asanuma, D., Sakabe, M., Kamiya, M., Yamamoto, K., Hiratake, J., Ogawa, M., Urano, Y.: Sensitive β-galactosidase-targeting fluorescence probe for visualizing small peritoneal metastatic tumours in vivo. Nat. Commun. 6(1), 1–7 (2015)
30. Jain, A., Zamir, A.R., Savarese, S., Saxena, A.: Structural-RNN: deep learning on spatio-temporal graphs. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 5308–5317 (2016)
31. Sanchez-Lengeling, B., Aspuru-Guzik, A.: Inverse molecular design using machine learning: generative models for matter engineering. Science 361(6400), 360–365 (2018)
32. Sellers, B.D., James, N.C., Gobbi, A.: A comparison of quantum and molecular mechanical methods to estimate strain energy in druglike fragments. J. Chem. Inf. Model. 57(6), 1265–1275 (2017)
33. Popova, M., Isayev, O., Tropsha, A.: Deep reinforcement learning for de novo drug design. Sci. Adv. 4(7), eaap7885 (2018)
34. Segler, M.H., Preuss, M., Waller, M.P.: Planning chemical syntheses with deep neural networks and symbolic AI. Nature 555(7698), 604–610 (2018)
35. Li, L., Snyder, J.C., Pelaschier, I.M., Huang, J., Niranjan, U.N., Duncan, P., Burke, K.: Understanding machine-learned density functionals. Int. J. Quantum Chem. 116(11), 819–833 (2016)
36. Pilania, G., Mannodi-Kanakkithodi, A., Uberuaga, B.P., Ramprasad, R., Gubernatis, J.E., Lookman, T.: Machine learning bandgaps of double perovskites. Sci. Rep. 6, 19375 (2016)
37. Pilania, G., Mannodi-Kanakkithodi, A., Uberuaga, B., et al.: Machine learning bandgaps of double perovskites. Sci. Rep. 6, 19375 (2016)
38. Margolis, R., Derr, L., Dunn, M., Huerta, M., Larkin, J., Sheehan, J., Green, E.D.: The National Institutes of Health’s Big Data to Knowledge (BD2K) initiative: capitalizing on biomedical big data. J. Am. Med. Inform. Assoc. 21(6), 957–958 (2014)
39. Parmar, C., Barry, J.D., Hosny, A., Quackenbush, J., Aerts, H.J.: Data analysis strategies in medical imaging. Clin. Cancer Res. 24(15), 3492–3499 (2018)
40. Cohen, J.D., Li, L., Wang, Y., Thoburn, C., Afsari, B., Danilova, L., Papadopoulos, N.: Detection and localization of surgically resectable cancers with a multi-analyte blood test. Science 359(6378), 926–930 (2018)
41. Wang, H.-Y., et al.: Rapid detection of heterogeneous vancomycin-intermediate Staphylococcus aureus based on matrix-assisted laser desorption ionization time-of-flight: using a machine learning approach and unbiased validation. Front. Microbiol. 9, 2393 (2018). https://doi.org/10.3389/fmicb.2018.02393
42. Huang, T.-S., Lee, S.S.-J., Lee, C.-C., Chang, F.-C.: Detection of carbapenem-resistant Klebsiella pneumoniae on the basis of matrix-assisted laser desorption ionization time-of-flight mass spectrometry by using supervised machine learning approach. PLoS ONE 15(2), e0228459 (2020). https://doi.org/10.1371/journal.pone.0228459
43. Zhang, C., et al.: Systematic analysis of supervised machine learning as an effective approach to predicate β-lactam resistance phenotype in Streptococcus pneumoniae. Brief. Bioinform. 21(4), 1347–1355 (2020). https://doi.org/10.1093/bib/bbz056
44. Moradigaravand, D., Palm, M., Farewell, A., Mustonen, V., Warringer, J., Parts, L.: Prediction of antibiotic resistance in Escherichia coli from large-scale pan-genome data. PLoS Comput. Biol. 14(12), e1006258 (2018). https://doi.org/10.1371/journal.pcbi.1006258
45. Feretzakis, G., et al.: Using machine learning techniques to aid empirical antibiotic therapy decisions in the intensive care unit of a general hospital in Greece. Antibiotics 9(2) (2020). https://doi.org/10.3390/antibiotics9020050
46. Haga, H., et al.: A machine learning-based treatment prediction model using whole genome variants of hepatitis C virus. PLoS ONE 15(11), e0242028 (2020). https://doi.org/10.1371/journal.pone.0242028
47. Oonsivilai, M., et al.: Using machine learning to guide targeted and locally-tailored empiric antibiotic prescribing in a children’s hospital in Cambodia. Wellcome Open Res. 3, 131 (2018). https://doi.org/10.12688/wellcomeopenres.14847.1
48. Macesic, N., Bear Don’t Walk, O.J., Pe’er, I., Tatonetti, N.P., Peleg, A.Y., Uhlemann, A.-C.: Predicting phenotypic polymyxin resistance in Klebsiella pneumoniae through machine learning analysis of genomic data. mSystems 5(3) (2020). https://doi.org/10.1128/mSystems.00656-19
49. Kouchaki, S., et al.: Application of machine learning techniques to tuberculosis drug resistance analysis. Bioinformatics 35(13), 2276–2282 (2019). https://doi.org/10.1093/bioinformatics/bty949
50. Mason, D.J., et al.: Prediction of antibiotic interactions using descriptors derived from molecular structure. J. Med. Chem. 60(9), 3902–3912 (2017). https://doi.org/10.1021/acs.jmedchem.7b00204
51. Gupta, R., Srivastava, D., Sahu, M., et al.: Artificial intelligence to deep learning: machine intelligence approach for drug discovery. Mol. Divers. 25, 1315–1360 (2021). https://doi.org/10.1007/s11030-021-10217-3

Chapter 3

Designing Dense-Healthcare IOT Networks for Industry 4.0 Using AI-Based Energy Efficient Reinforcement Learning Protocol

Susheel Kumar Gupta, Sugan Patel, Praveen Kumar Mannepalli, and Sapna Gangrade

Introduction

Artificial intelligence (AI) is the future of designing the next-generation Industry 4.0 built on IoT networks, and machine learning (ML) is an essential part of intelligent sensor networks. In recent times, sensor-based monitoring and management systems have been used extensively in healthcare-based IoT networks [1]. The introduction of adaptive routing within SDN has been suggested by Sendra et al. [1] as a solution to the routing issue; the Minnie emulator is used for this, and an SDN-based intelligent routing algorithm is proposed. In electronic hospital devices, sensors are utilized to transform diverse stimulus signals for digital processing. Sensors can monitor, at the bedside or remotely, a patient’s vital signs as well as other health parameters, and they can also bring artificial intelligence to networks of medical equipment. There are many application areas for IoT-based wireless sensor networks (WSN) in the medical and healthcare fields, especially in intensive care units (ICU) and hospital wards, and early works have discussed the unique difficulties that arise with the widespread use of wide-area wireless sensor networks.

S. K. Gupta (B) LNCT Bhopal (MP), Bhopal, India e-mail: [email protected]

S. Patel Chandigarh University Mohali Punjab, Ludhiana, India e-mail: [email protected]

P. K. Mannepalli G. H. Raisoni Institute of Engineering & Technology Nagpur, Nagpur, India e-mail: [email protected]

S. Gangrade Osmania University Hyderabad, Hyderabad, India e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023 A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_3




The technology is used in numerous sectors and for numerous purposes, such as medical care observation, earth sensing, ambient air pollution monitoring, military activity monitoring, and infrastructure monitoring and verification. The sensors most frequently used in healthcare are shown in Fig. 3.1: oxygen sensors for breathing-rate monitoring, temperature sensors for fever monitoring, pressure sensors for uses such as monitoring the pressure of oxygen cylinders, and ECG sensors for heart monitoring. Hearing-related issues and noise attenuation are handled using digital hearing aids containing sound and audio sensors. Diabetic patients may measure their blood sugar range at home using glucose sensors, and brain activity and behavior can be monitored using gesture sensors. A huge number of sensor nodes have recently been adapted for patient health assessment in healthcare. Wireless sensors used by the Healthcare-based Internet of Things (HC-IoT) can save patient and healthcare administration information on a Web server. The use of IoT makes it possible to store the data directly on cloud servers and makes it accessible globally from across the world. As can be seen in Fig. 3.2, one main use of WSN in the healthcare industry is remote patient health monitoring. The data from the sensors is also utilized to forecast and identify medical conditions, and data collected from elderly users likewise enables early detection of their health problems. There are numerous medical imaging techniques that use visual sensors to assess the condition of internal body components, and hospitals

Fig. 3.1 Most frequently being used sensors for health care monitoring



Fig. 3.2 WSN installed for patient health monitoring in healthcare

frequently use WSN for remote patient health monitoring. Figure 3.2 depicts an instance of how WSN is used in healthcare. The figure illustrates three distinct applications of WSN: in operation theaters (OTs), in patient rooms or ICUs, and as a WSN for doctors’ reporting.

Contribution of Work Energy efficiency is essential to maintaining a fully functional network for as long as possible. This chapter covers a dense reinforcement-learning-based energy-efficient routing case study (DEER-RL), building on EER-RL [5], for highly dense LPWAN-IoT network implementations in healthcare. For extremely dense IoT networks, the desired performance is accomplished through learning and optimal parameter selection.

• Thanks to reinforcement learning (RL), sensors can react to changes in network properties; energy efficiency is evaluated in terms of transmission probability, node density, and hop count as the transmission range is increased up to three times.
• The protocol is built on the foundations of cluster head (CH) selection and efficient cluster formation, with the mobile communication system as the basis for learning.
• Performance under network scalability is examined as the network node density is expanded up to four times. Performance is evaluated after the network is randomly modeled employing normal and uniform probability distributions.

By adjusting optimal design characteristics, the performance of the proposed protocol is assessed in order to improve its lifetime and, consequently, its energy efficiency (EE) and scalability. Modern EE routing technologies are used as baselines against which the further improved performance is shown.


S. K. Gupta et al.

Fig. 3.3 The RL basic model used in this study

Why use RL? As shown in Fig. 3.3, reinforcement learning (RL) lets devices adapt to dynamic changes in the network, including mobility and energy density, and make better routing decisions. This chapter proposes the design and evaluation of energy-efficient (EE) routing using RL. RL learns only from reward signals (real numbers) and requires no supervisor; the learning algorithm helps the nodes make appropriate decisions in a timely manner. RL is a subset of ML designed to improve performance in an unknown environment through a trial-and-error methodology driven by reward allocation, as shown in Fig. 3.3 [1]. In IoT networks, RL can address issues such as dynamic changes in the network topology caused by device mobility, energy consumption, and other transmission characteristics like signal strength, distance, and bandwidth, which can change over time and affect network performance.
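The trial-and-error reward loop of Fig. 3.3 can be made concrete with a minimal tabular Q-learning agent. The five-node line network below is a toy stand-in for a routing path toward a sink; the rewards and environment are hypothetical illustrations, not the EER-RL formulation.

```python
import random

N_STATES = 5          # nodes 0..4 on a line; node 4 is the sink
ACTIONS = [-1, +1]    # hop to the previous / next node
ALPHA, GAMMA, EPS = 0.5, 0.95, 0.1

Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(state, a_idx):
    """One hop: cost -1 per transmission, +10 on reaching the sink."""
    nxt = min(max(state + ACTIONS[a_idx], 0), N_STATES - 1)
    if nxt == N_STATES - 1:
        return nxt, 10.0, True
    return nxt, -1.0, False

random.seed(0)
for _ in range(500):                      # episodes of trial and error
    s, done = 0, False
    while not done:
        # epsilon-greedy exploration: mostly exploit, sometimes explore
        if random.random() < EPS:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda i: Q[s][i])
        s2, r, done = step(s, a)
        # tabular Q-learning update: Q <- Q + alpha * (r + gamma * max Q' - Q)
        target = r + (0.0 if done else GAMMA * max(Q[s2]))
        Q[s][a] += ALPHA * (target - Q[s][a])
        s = s2

# greedy action per node after learning; 1 means "hop toward the sink"
policy = [max((0, 1), key=lambda i: Q[s][i]) for s in range(N_STATES)]
print(policy)
```

With only reward signals and no supervisor, the learned greedy policy forwards every packet toward the sink, which is the behaviour the routing protocol relies on at a much larger scale.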

IoT-WSN Architecture
For sending data to cloud servers, IoT-based sensor networks have been highly popular over the past two decades. A WSN is a system of tiny, autonomous devices called sensors that measure distinct physical or environmental conditions, such as temperature, sound, vibration, weight, and movement at various locations, process the data, and deliver the sensed information to clients [2]. These sensors gather data from the environment and transmit it to a base station. A base station establishes a connection to the actual space in which the accumulated data is handled, organized,


Fig. 3.4 Basic architecture of the IoT-WSN

and shown to assistive apps [2]. These sensor nodes are numerous in WSNs, and they can communicate data directly to an externally placed base station (BS) or among themselves. Different applications may deploy a huge number of sensors to detect various events such as the weight and movement of a sensing unit. The basic IoT-WSN architecture is shown in Fig. 3.4. A wide-area IoT-WSN may contain millions of nodes. Healthcare WSNs are highly heterogeneous in terms of the sensors deployed, and the node count determines the node density. WSNs are frequently used for remote monitoring of a patient's health over periods ranging from a few seconds to a few months or even years. Since replacing batteries is not possible due to significant network costs, nodes must optimize their energy utilization.
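To see why energy optimization dominates protocol design, consider the first-order radio model commonly used in WSN analyses; the constants below mirror the simulation parameters reported later in this chapter (50 nJ/bit electronics energy, 100 pJ/bit/m² amplifier energy). This is an illustrative sketch, not the chapter's exact simulator.

```python
E_ELEC = 50e-9     # transmitter/receiver electronics energy (J/bit)
E_AMP = 100e-12    # amplifier energy (J/bit/m^2)

def tx_energy(bits: int, d: float) -> float:
    """First-order radio model: electronics cost plus a d^2 amplifier cost."""
    return E_ELEC * bits + E_AMP * bits * d ** 2

def rx_energy(bits: int) -> float:
    """Receiving pays only the electronics cost."""
    return E_ELEC * bits

# Sending a 4000-bit packet over 50 m costs far more than receiving it:
e_tx = tx_energy(4000, 50.0)
e_rx = rx_energy(4000)
print(e_tx, e_rx)
```

Because the amplifier term grows with the square of the distance, transmission range and hop selection dominate a node's energy budget, which is why the routing layer is the main lever for energy efficiency.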

IoT-WSN Architecture
The next-generation IoT network has been used extensively in health care; this section therefore deals with certain open challenges of such HC-IoT networks. Numerous approaches have been developed in the past to improve the effectiveness of healthcare systems, and they can be used effectively in a variety of patient and hospital contexts. However, there remain difficulties that, if not handled properly, could have a negative impact on newly started activities and the desired outcomes. Among these difficulties are:
• Next-generation healthcare Industry 4.0 is expected to deploy highly dense IoT networks. Routing consumes the most power, since it allows sensors to communicate over the IoT network. With increasing node density, enhancing the EE of the routing protocol on IoT networks becomes a challenging task.
• Due to IoT devices' low memory and power capacities, the networks' computing capabilities are limited.
• Calibrating the sensors for minimal power consumption and accuracy is a challenging undertaking.


• Which components of the healthcare system should be improved or eliminated in order to raise the overall system's efficiency?
• The communication range between the nodes is a critical factor for the analysis of the network.
• Additionally, early studies have looked at several security loopholes and real-world attacks on WSNs, such as intrusion, node-capture attacks, black-hole attacks, and selective-forwarding attacks. Potential defenses are put forth in the form of agreements or frameworks for the secure sharing of information between trusted nodes, along with security measures aimed at achieving a secure and robust association.
• Numerous approaches have been developed in the past to improve the effectiveness of the healthcare system, and they may be used effectively in a variety of remote health monitoring contexts. However, several challenges still need to be resolved, and the shortcomings of the current routing protocols remain an open problem for future research.
• The basic LEACH protocol now in use has low energy efficiency, which needs to be increased in new protocols.
• The primary drawback of the stable election protocol (SEP) is the impossibility of dynamic selection of the cluster heads (CHs) for the two distinct classes of network nodes. The network dies earlier because the nodes farthest from the higher-energy nodes die more quickly.
• The fundamental restriction of distributed DEEC protocols is that advanced nodes are permanently penalized. This often occurs when the advanced nodes' leftover energy decreases to within the range of the regular nodes, reducing the distributed network's total lifespan.
• Fuzzy-based routing does not suit a global network context, as most fuzzy protocols are application-specific.
• EE and adaptability to network dynamics are lagging in most of the existing protocols.
With the growth of IoT networks and their expected future scalability, it is high time to design learning-based intelligent routing protocols.
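The LEACH limitation noted above stems from its purely probabilistic cluster-head rotation, which ignores residual energy. As a minimal sketch of that mechanism (standard LEACH, not the protocol proposed in this chapter): a node elects itself CH in round r when a uniform draw falls below the threshold T(n).

```python
def leach_threshold(P: float, r: int, eligible: bool) -> float:
    """LEACH cluster-head election threshold T(n) for round r.

    P is the desired CH fraction; a node is 'eligible' only if it has
    not served as CH during the last 1/P rounds.
    """
    if not eligible:
        return 0.0
    # Threshold grows over each 1/P-round epoch so that every eligible
    # node is elected exactly once per epoch in expectation.
    return P / (1.0 - P * (r % round(1.0 / P)))

# With P = 0.1 the threshold climbs from 0.1 toward 1.0 across the
# ten-round epoch, guaranteeing eventual election of remaining nodes.
thresholds = [leach_threshold(0.1, r, True) for r in range(10)]
print(thresholds)
```

Because T(n) never consults the node's remaining energy, low-energy nodes are elected as often as fresh ones; this is exactly the inefficiency that energy-aware and RL-based schemes such as EER-RL aim to remove.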

Literature Review
A number of studies have been carried out over the last two decades to improve the EE and lifetime performance of IoT-WSN networks. This section first classifies the different AI-based routing methodologies for designing wide-area WSN systems. A broad classification of AI-based WSN routing protocols for establishing EE communication is presented in Fig. 3.5. Broadly, AI-based WSN routing methods are classified as the


Fig. 3.5 Classification of AI-based hierarchical WSN routing protocols: AI-based routing is divided into fuzzy protocols, machine learning protocols (including clustering protocols), and reinforcement learning protocols (including energy-efficient RL routing)

fuzzy logic-based routing protocols [3], ML-based routing and clustering [4], and RL-based EE routing algorithms [5] capable of adapting to dynamic network changes. This section sequentially reviews the research in each of these categories. Sandra Sendra et al. [1] give a fresh plan for putting a smart routing protocol into an SDN topology. Based on the RL process, the intelligent routing protocol selects the optimal data transmission routes according to the best criteria and in light of the network status. A succinct overview of the various energy-efficient routing methods used for WSN applications has been provided by Amee Vishwakarma et al. [3]; their review addresses the clustering-based routing strategies. V. Kazadi et al. [5] introduce an energy-efficient routing protocol based on reinforcement learning, with which devices can adapt to changes in energy level, routing decisions, and mobility. To evaluate the performance, MATLAB simulation has been used with devices distributed randomly across the field. The results show that, compared with other protocols, EER-RL considerably reduces the protocol runtime and also minimizes energy consumption. Maja L. et al. [10] propose a theoretical approach for a routing protocol providing energy efficiency, which seeks a minimum-energy path when node mobility is taken into consideration; the nodes can be static or mobile. The protocol builds on the Routing Protocol for Low-Power and Lossy Networks (RPL). The simulation results show better performance in terms of energy consumption and duty cycle, and future work targets the small degradation observed in the packet delivery ratio. A. Habib et al. [11] presented different routing protocols based on RL algorithms.
These protocols are reviewed and compared with each other to test their performance. Their design also faces some issues, and challenges arise in implementation; an overview of these drawbacks is given for enhancing the quality features and network performance of WSNs. Hence, it can be concluded from this research that resolving security issues, handling multiple routing metrics, and achieving faster convergence should be main considerations.


A. Habib et al. [12] present a survey comparing the performance of routing protocols on the basis of their key features. Optimization of network lifetime in RL routing protocols is achieved by managing routing efforts, and the protocols are used along with RL algorithms. The RL-based routing algorithms for wireless sensor networks are also examined with respect to the issues faced in their design; data redundancy leads to such issues, so the freshness of data delivered through routing must be ensured. Future work will discuss the merits and demerits of routing protocols in WSNs. S. E. Bouzid et al. [13] presented an approach in which energy consumption and wireless sensor network lifetime are the main concerns. An RL-based routing protocol is applied for wireless networks using dynamic path selection; the protocol consists of two main processes, a discovery process and a continuous learning process, demonstrated through different simulations and comparisons. The results show high energy efficiency and a longer lifetime, and future work discusses implementing the RL protocol with wireless network services. Anju A. et al. [14] survey routing protocols based on reinforcement learning in wireless sensor networks. These protocols have been developed to meet the demands of sensor network development. The prediction and control problem is addressed with a Q-learning approach, which upgrades the service requirements for achieving adaptivity and finding an optimal path in a resource-constrained environment. The future scope of this research emphasizes QoS for enhanced network lifetime. Wenjing Guo et al. [15] propose an RL-based routing protocol that takes advantage of algorithms for finding the optimal routing path.
The optimal path is searched for data transmission, and a performance evaluation framework for the routing protocol is constructed. This balances the total energy consumption and improves the packet delivery ratio, resulting in better energy efficiency and network lifetime. The simulation results show that, compared with Q-Routing, the MRL-SCSO reinforcement protocol maximizes the network lifetime and improves energy efficiency. N. Kaur et al. [16] presented a routing protocol in which neighbouring nodes are selected along with cluster heads. An appropriate cluster head is selected after analyzing environmental conditions and network coverage, which is computed from the distance between the cluster head and the base station. These cluster-head nodes minimize energy consumption and extend network lifetime. A battery is essential to meet the power needs of the processors, transmitters, and receiver cluster heads. The simulation results show energy savings of 7.41%, 3.27%, etc. for 100, 200, etc. nodes, respectively. K. Anitha [17] presents a survey on optimizing routing protocols with RL techniques. The major part of the research is a detailed study of protocols, presented in an elaborated way for researchers on sensor networks, and supports a number of application-specific routing protocols. An advanced learning strategy implementing routing optimization is introduced to enhance and improve communications. The final results show that the lifetime of WSNs is improved without compromising QoS.


K. Tuyls et al. [18] study sensor nodes that plan their active time periods based on interactions with neighbouring nodes, without needing external synchronization. This is achieved by proper data and message transmission, resulting in the successful exchange of data during data-collection processes. The desynchronization reduces packet collisions and radio interference. This results in short-range communication by removing wake-up cycles and also reduces the data-retrieval latency. Hence, the duty cycle of the system is reduced, and the lifetime is improved efficiently. Anna F. et al. [19] presented an RL-based routing protocol for wireless networks and evaluated the results and benefits obtained by implementing it on a ScatterWeb node. This successful implementation demonstrates the feasibility of a machine learning-based protocol on real hardware networks and also increases performance. An exact measurement of the packet delivery ratio is evaluated and shown to influence network performance. The results reveal common implementation challenges in the transmission process, display the improved performance, and quantify the routing cost. Khurram K. et al. [20] propose a routing scheme known as the reinforcement learning-based protocol for wireless sensor networks using fuzzy logic. This protocol controls the use of a node's Q-value and determines the proper transmission of the MRS signal from source to destination. The paper also proposes a novel routing scheme for OppNets in which the nodes meet the required computational time. The simulation results show that the proposed fuzzy logic-based Q-learning protocol improves the overhead ratio, the delivery ratio, and the average latency. S. Wang et al. [21] presented an efficient RL-based routing protocol that aims to investigate the management and resources of WSN networks.
The routing algorithm builds on the relationship between energy priority and distance priority, and the related protocol becomes an effective method to improve the utilization rate. In particular, Q-learning is applied to reduce power consumption. The simulation results show that the proposed routing scheme increases the network lifetime and reduces the transmission delay. B. Patel et al. [22] propose research based on sensor nodes in a wireless sensor network system, using a hierarchical routing protocol and a Q-routing algorithm. The aim is an energy-efficient protocol that implements proper routing strategies and compares energy efficiency, employing a hybrid topology (a mixture of heterogeneous and homogeneous topologies). The applied algorithm accounts for many challenging factors and effects, overcoming problems and delivering maximum computational power, an efficient energy system, scalability, and data aggregation. The final results show that the shortest path developed using this algorithm increases network lifetime and energy efficiency. W. Guo et al. [23] propose a routing algorithm for wireless sensor networks that is efficient and effective in overcoming the drawbacks of previous research. The protocol uses a reinforcement learning technique; such algorithms balance energy consumption equally and improve the packet delivery ratio, thus reducing the extra cost required. Along with this,


energy-aware routing (EAR) becomes the most acceptable and easily accessible routing algorithm, giving an idea of minimum energy use and maximum network utilization. The algorithm uses a single path rather than multiple paths, reducing energy consumption and improving communication at the forwarding node. A. Zhang et al. [24] have studied a deep-learning-based routing algorithm for wireless sensor networks. This work proposes an effective method for routing along with network topologies. The nodes make routing decisions properly with respect to the network, using a strategy based on proper traffic flow. This multi-hop network topology scheme proposes a strategy for adapting to changing network states and improves the feasibility of the network. The whole mechanism is based on enhanced nodes, efficient energy collection, and traffic handling for several nodes. The simulation of this technique is completed with a deep learning model and includes the experimental parameters for wireless sensor networks. A summary is given in Table 3.1.

Proposed RL-Based Routing Protocol Design
To create an effective protocol design, an evaluation of highly dense network performance under RL-based protocols with optimum parametric variables is proposed. The process of the proposed RL-based optimum-parameter routing protocol evaluation is shown in Fig. 3.6. This chapter evaluates the routing performance of RL under network node distributions drawn from uniform and normal probability distributions. The RL agent is responsible for the selection of the cluster head (CH) in the network. Examples of node distributions using normal and uniform distributions are shown in Fig. 3.7. It can be clearly observed from Fig. 3.7 that the normal (Gaussian) distribution places nodes unequally in the network, whereas the uniform distribution yields a rather equal node density across the network.
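The two deployments compared in Fig. 3.7 can be generated as follows (a Python sketch standing in for the chapter's MATLAB setup; the 100-node, 100 m × 100 m field matches the parameters used here, while the Gaussian spread σ = XM/6 is an assumption for illustration).

```python
import random

random.seed(42)
N, XM, YM = 100, 100.0, 100.0

# Uniform deployment: roughly equal node density across the field.
uniform_nodes = [(random.uniform(0, XM), random.uniform(0, YM)) for _ in range(N)]

def clamp(v, lo, hi):
    return min(max(v, lo), hi)

# Normal (Gaussian) deployment: nodes cluster around the field centre,
# leaving the edges sparse; samples are clamped to the field boundary.
normal_nodes = [
    (clamp(random.gauss(XM / 2, XM / 6), 0, XM),
     clamp(random.gauss(YM / 2, YM / 6), 0, YM))
    for _ in range(N)
]

def central_fraction(nodes):
    """Fraction of nodes inside the central 50 m x 50 m quarter of the field."""
    return sum(25 <= x <= 75 and 25 <= y <= 75 for x, y in nodes) / len(nodes)

print(central_fraction(uniform_nodes), central_fraction(normal_nodes))
```

Under this sketch roughly a quarter of the uniform nodes, but a clear majority of the Gaussian nodes, land in the central quarter of the field, illustrating the unequal density that slows initial cluster formation under the normal distribution.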

Proposed RL-Based Routing Protocol Design
In this chapter, the simulation results of the basic existing RL-based routing protocols are first validated.
Parametric Optimization
Using MATLAB simulation, 100 devices are deployed at random over a 100 m × 100 m sensing field, following the normal distribution, to assess the results of the designed EER-RL. The parameters used in the study to optimize the protocol performance are shown in Table 3.2.


Table 3.1 Summary table of the literature review

Author | Methodology | Approach
V. Kazadi et al. [5] | Energy-efficient routing protocol based on reinforcement learning | Use of lightweight RL to minimize energy consumption and protocol runtime
Maja L. et al. [10] | Routing Protocol for Low Power and Lossy Networks | To reduce total energy consumption and total traffic control
A. Habib et al. [11] | Carefully designed routing protocol for improving network performance of WSNs | Resolving security issues, multiple routing metrics, and faster convergence
A. Habib et al. [12] | RL algorithms for designing routing protocols in WSN | Optimization of network lifetime
S. E. Bouzid et al. [13] | Q-Routing and RLBR routing protocol | Routing protocol with distributed reinforcement learning
Anju A. et al. [14] | Development of routing protocols for WSNs | Q-learning approach
Wenjing Guo et al. [15] | BEER, Q-Routing, MRL-SCSO | To cut down total energy consumption and balance energy consumption
N. Kaur et al. [16] | Cluster-head and neighbouring-node selection | RL approach based on reward points for observing environmental conditions
K. Anitha [17] | Reinforcement learning technique | Detailed study of networking protocols for sensor networks
K. Tuyls et al. [18] | Synchronization in exchange of messages | RL approach in the OMNeT++ sensor network simulator
Anna F. et al. [19] | Multicast routing protocol (FROMS) in a ScatterWeb testbed | Observations on properties and pitfalls of WSN implementations along with potential solutions
Khurram K. et al. [20] | Q-Learning Routing (FQLRP) and FCSG routing | A novel routing scheme for OppNets called RLFGRP
S. Wang et al. [21] | 3D underwater wireless sensor network (UWSN) and Q-learning approach | To investigate resource management in hierarchical networks
A. B. Patel et al. [22] | Implementation of routing strategies using Q-routing algorithms | To develop an energy-efficient shortest path
W. Guo et al. [23] | Use of Energy Aware Routing and Improved Energy Aware Routing | To obtain an improved network lifetime and packet delivery ratio
A. Zhang et al. [24] | An efficient routing strategy to cope with network topology changes | To improve transmission delay, average routing length, and energy efficiency


Fig. 3.6 Proposed block diagram of the RL-based routing evaluation methodology

Validation of EER-RL [5] Protocol
The basic validation results for the number of alive operational nodes and the energy consumption are plotted in Fig. 3.8. The validation process uses 100 nodes in a 100 m × 100 m area with the sink at the centre of the network.

Impact of the Transmission Probability
This experiment compares the different performance test parameters for 100 nodes in a 100 m × 100 m area using the EE-RL-based protocol while adjusting the probability of transmission over the range p = [0.5, 0.6, 0.7]; the comparison of alive and dead nodes along with energy consumption is shown in Fig. 3.9. It can be observed from Fig. 3.9c that increasing the probability of data transmission minimizes the energy consumption of the network. This in turn maximizes the overall network lifetime and the stability period of the network, as reflected in Fig. 3.9a. Overall, this chapter settles on the compromise between a higher probability and better network performance by setting p to a maximum value of 0.7. The improvement in the stability performance of the proposed EE-RL-based optimum-parametric routing protocol is presented in Fig. 3.9d: increasing the transmission probability extends the stability period of the network routing protocol significantly, by around 25%. The comparison of latency for different state-of-the-art routing protocols with the proposed RL-based protocol is given in Fig. 3.9e, where the improvement achieved by the proposed RL-based routing is clearly observed.
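One way to picture the role of p is as a weight that trades residual energy against hop-count proximity in a cluster-head fitness score. The scoring below is a hypothetical illustration with made-up candidate nodes; the exact EER-RL formulation in [5] may differ.

```python
def ch_score(residual_energy, initial_energy, hops, max_hops, p=0.7):
    """Hypothetical CH fitness: p weights normalized residual energy,
    (1 - p) weights closeness to the sink in hop counts."""
    energy_term = residual_energy / initial_energy
    hop_term = 1.0 - hops / max_hops
    return p * energy_term + (1.0 - p) * hop_term

# Hypothetical candidates: (residual energy in J, hops to the sink),
# all starting from 0.5 J in a network with at most 10 hops.
candidates = {"n1": (0.40, 3), "n2": (0.48, 5), "n3": (0.25, 1)}
scores = {n: ch_score(e, 0.5, h, 10) for n, (e, h) in candidates.items()}
best = max(scores, key=scores.get)
print(best, scores)
```

With p = 0.7 the energy-rich node wins even though it is farther from the sink, which is consistent with the observed longer stability period; lowering p would shift the choice toward nearer nodes instead.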

Effect of Transmission Distance of Node
To identify the effect of transmission range on the designed EE-RL protocol, the transmission range Rc of the nodes is adjusted over the range


Fig. 3.7 Example of RL-based EE routing protocol clustering for the basic 100 nodes in a 100 m × 100 m field: a node distribution using a uniform probability distribution, b node distribution using a normal probability distribution. The normal distribution takes a longer initial network settling and cluster-formation time, and the uniform distribution is therefore preferred in this chapter for the rest of the result evaluation

Table 3.2 The range of simulation parameters used for the study

Variable | Description | Range
Xm, Ym | Sensor field dimensions | 100 m × 100 m
n | Number of nodes in the network | 100–400
Sx, Sy | Sink node location | (50, 50)
ETx | Transmitter energy | 50 × 10−9 J/bit
ETr | Receiver energy | 50 × 10−9 J/bit
Eamp | Amplifier energy | 100 × 10−12 J/bit/m²
EDA | Data aggregation energy | 5 × 10−9 J/bit
Rc | Range of transmission | [20, 40, 60]
α | Learning rate | [0.5, 1, 1.5, 2]
γ | Discount factor | 0.95
p | Energy probability parameter | 0.5–0.7
Q1 | Hop-count probability | 1 − p = [0.5, 0.4, 0.3]

Fig. 3.8 Validation of the basic EER-RL EE routing protocol: a number of alive operational nodes, b energy consumption per round

of Rc = 10, 15, and 20. The results of the Rc evaluations for alive and dead nodes are shown in Fig. 3.10. The optimum result is found for Rc = 20.

Evaluation of Learning Rates
To identify the impact of the learning rate on the designed EE-RL routing protocol, the rate of learning is varied in the simulation. The value of the learning


Fig. 3.9 Results comparison of the various performance evaluation parameters for 100 nodes in a 100 m × 100 m area, varying the probability of transmission as p = [0.5, 0.6, 0.7] using the EE-RL-based protocol: a comparison of operating nodes, b comparison of dead node counts, c comparison of energy consumption per round, d improvement in stability period with increasing probability of data transmission, e comparison of latency for different routing protocols


Fig. 3.10 Results evaluation for 100 nodes in a 100 m × 100 m area, varying the range of transmission as Rc = [10, 15, 20] using the EE-RL-based protocol: a comparison of operating nodes, b comparison of dead node counts


rate α is selected in the range α = 0.5, 0.7, and 1. The results of the learning-rate evaluations for alive and dead nodes are shown in Fig. 3.11. The optimum result is found for α = 1.

Fig. 3.11 Results comparison for dense nodes, varying the learning rate as α = [0.5, 0.7, 1] using the EE-RL-based protocol: a comparison of operating nodes, b comparison of energy consumption


It can be observed from Fig. 3.11 that lowering the learning rate may significantly improve the overall network stability period and latency, because in the first 2000 or so rounds the energy consumption is lower with the smaller learning rates. At the same time, in the later rounds it is preferable to adopt the higher learning rates to minimize energy consumption. It is also proposed to use a lower learning rate at higher node densities for performance improvement. It was identified that slightly reducing the transmission distance early on can help simulation under higher node densities. In the future, adaptive learning rates based on artificial-learning analysis could further improve the performance of dense networks.
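The effect of α acts through the tabular update Q ← Q + α(target − Q): a small α averages noisy rewards but tracks slowly, while α = 1 replaces the old estimate outright. A minimal numeric sketch (not the chapter's simulator) makes the trade-off visible:

```python
def run_updates(alpha, target=10.0, steps=20):
    """Repeatedly apply Q <- Q + alpha * (target - Q) starting from Q = 0,
    returning the trace of Q after each update."""
    q = 0.0
    trace = []
    for _ in range(steps):
        q += alpha * (target - q)
        trace.append(q)
    return trace

slow = run_updates(0.5)   # geometric approach toward the target
fast = run_updates(1.0)   # jumps straight to the target in one step
print(slow[0], fast[0], slow[-1])
```

With a fixed target, α = 1 converges in a single update while α = 0.5 halves the remaining gap each step; with noisy targets the picture reverses, since small α smooths the noise. This mirrors the observation that small α helps in the early rounds while larger α adapts faster later on.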

Impact of Higher Node Densities
The size of IoT networks is increasing continuously due to the deployment of huge numbers of nodes, and the stability performance of the network may be expected to change as the node density increases. To represent the impact of higher node densities, the RL-EE protocol is simulated using 50 and 100 nodes within the same 100 m × 100 m area. The results for the operational nodes are plotted in Fig. 3.12. It can be observed from Fig. 3.12 that increasing the node density requires more energy but may also offer a better lifetime compared with low densities. Thus, it is advisable to increase the energy of the network when considering network scalability.

Conclusions and Future Scopes
In this study, we presented D-EER-RL, an EE routing protocol for the IoT based on dense clusters and reinforcement learning. The goal of this effort was to decrease energy consumption and extend the network lifetime by determining the optimal path for data transfer. The first experiment compared uniform and normal probability node distributions; for EE-RL-based routing, the normal distribution takes a significantly longer time and is more difficult to cluster, so uniform node distribution is used in this work. To develop a successful protocol design, an evaluation of highly dense network performance under RL-based protocols with ideal parametric variables is provided. The chapter assesses the routing effectiveness of RL under network node distributions drawn from uniform and normal probability distributions, with the decision on the network's cluster head (CH) resting with the RL agent. The chapter first defines the many design difficulties for RL-based networks.


Fig. 3.12 Result simulations showing the dense deployment, with 50 nodes (red) and 100 nodes (blue) deployed in the small network area: a number of alive nodes for the dense WSN example, b energy comparison for the dense WSN example


It is found that when node density rises, network stability performance decreases. This chapter conducted a number of experiments to assess the IoT-WSN's performance utilizing an RL-based protocol design. When the energy probability p is varied between 0.5 and 0.7, a larger probability gives better stability performance. The transmission distance is varied in an experiment to determine the best network lifetime performance; it has been determined that extending the range from 10 to 20 can increase lifetime stability by about 15%. Simulated learning-rate evaluation results show that reducing the learning rate may greatly reduce the overall network stability-period latency. With appropriate tuning, the network's stability performance may still improve as node density rises. An adaptive learning-rate-based routing protocol may be used in the future to increase efficiency in terms of both energy and lifetime, and additional speed improvement can be achieved using optimization approaches.

References
1. Sendra, S., Rego, A., Lloret, J., Jimenez, J.M., Romero, O.: Including artificial intelligence in a routing protocol using software defined networks. In: ICC 2017: WT04, 5th IEEE International Workshop on Smart Communication Protocols and Algorithms (SCPA 2017)
2. Ahmed Hamza, M., Mesfer Alshahrani, H., Al-Wesabi, F.N., Al Duhayyim, M., Mustafa Hilal, A., Mahgoub, H.: Artificial intelligence based clustering with routing protocol for internet of vehicles. Comput. Mater. Contin. (CMC) 70(3) (2022)
3. Vishwakarma, A., Dutta, P.: A review on various energy efficient clustering protocols of WSN. J. Emerg. Technol. Innov. Res. (JETIR) 4(8) (2017)
4. Al-Kiyumi, R., et al.: Fuzzy logic-based routing algorithm for lifetime enhancement in heterogeneous wireless sensor networks. IEEE Trans. Green Commun. Netw. (2018)
5. Kazadi Mutombo, V., Lee, S., Lee, J., Hong, J.: EER-RL: energy-efficient routing based on reinforcement learning. Mob. Inf. Syst. 2021, 1–12 (2021). https://doi.org/10.1155/2021/5589145
6. Ding, Q., Zhu, R., Liu, H., Ma, M.: An overview of machine learning-based energy-efficient routing algorithms in wireless sensor networks. Electronics 10, 1539 (2021). https://doi.org/10.3390/electronics10131539
7. Yadav, A.K., Sharma, P., Yadav, R.K.: A novel algorithm for wireless sensor network routing protocols based on reinforcement learning. Int. J. Syst. Assur. Eng. Manag. 13, 1198–1204 (2022)
8. Nurmi, P.: Reinforcement learning for routing in ad-hoc networks. In: Proceedings of the 5th International Symposium on Modeling and Optimization in Mobile, Ad-Hoc, and Wireless Networks (WiOpt) (2007)
9. Chettibi, S., Chikhi, S.: A survey of reinforcement learning based routing protocols for mobile ad-hoc networks. In: Recent Trends in Wireless and Mobile Networks, Communications in Computer and Information Science, vol. 162, pp. 1–13. Springer (2011)
10. Lazarevska, M., Farahbakhsh, R., Shakya, N.M., Crespi, N.: Mobility supported energy efficient routing protocol for IoT based healthcare applications. In: 2018 IEEE Conference on Standards for Communications and Networking (CSCN)
11. Habib, A., Yeasir Arafat, M., Moh, S.: Routing protocol based on reinforcement learning for wireless sensor networks: a comparative study. J. Adv. Res. Dyn. Control. Syst. (2019). https://www.researchgate.net/publication/331588735



12. Habib, A., Arafat, Y., Moh, S.: A survey on reinforcement-learning-based routing protocols in wireless sensor networks. In: The 8th International Conference on Convergence Technology (2018). https://www.researchgate.net/publication/325617283
13. Bouzid, S.E., Serrestou, Y., Raoof, K., Omri, M.N.: Efficient routing protocol for wireless sensor network based on reinforcement learning. In: 5th International Conference on Advanced Technologies (2020)
14. Arya, A., Malik, A., Garg, R.: Reinforcement learning based routing protocols in WSNs: a survey. Int. J. Comput. Sci. Eng. Technol. (IJCSET) 4(11) (2013)
15. Guo, W., Yan, C., Lu, T.: Optimizing the lifetime of wireless sensor networks via reinforcement learning-based routing. Int. J. Distrib. Sens. Netw. 15 (2019)
16. Kaur, N., Kaur Aulakh, I.: An energy efficient reinforcement learning based clustering approach for wireless sensor network. EAI Endorsed Trans. Scalable Inf. Syst. (2021). https://doi.org/10.4108/eai.25-2-2021.168808
17. Anitha, K.: A survey of optimization of routing protocol in wireless sensor network using reinforcement learning technique. IJARIIE 8(3) (2022). ISSN(O) 2395-4396
18. Mihaylov, M., Le Borgne, Y.-A., Tuyls, K., Nowe, A.: Decentralised reinforcement learning for energy-efficient scheduling in wireless sensor networks. Int. J. Commun. Netw. Distrib. Syst. 9 (2012)
19. Forster, A., Murphy, A.L., Schiller, J., Terfloth, K.: An efficient implementation of reinforcement learning based routing on real WSN hardware. In: IEEE International Conference on Wireless and Mobile Computing (2008). https://doi.org/10.1109/WiMob.2008.99
20. Khalida, K., Woungang, I., Dhurandher, S.K., Singh, J.: Reinforcement learning-based fuzzy geocast routing protocol for opportunistic networks. Internet of Things (2021). https://doi.org/10.1016/j.iot.2021.100384
21. Wang, S., Shin, Y.: Learning for magnetic induction underwater sensor networks. IEEE Access 7 (2019)
22. Patel, A.B., Shah, H.B.: Reinforcement learning framework for energy efficient wireless sensor networks. Int. Res. J. Eng. Technol. (IRJET) 2(2) (2015)
23. Guo, W., Yan, C., Gan, Y., Lu, T.: An intelligent routing algorithm in wireless sensor networks based on reinforcement learning. Trans Tech Publications (2014). https://doi.org/10.4028/www.scientific.net/AMM.678.487
24. Zhang, A., Sun, M., Wang, J., Li, Z., Cheng, Y., Wang, C.: Deep reinforcement learning-based multi-hop state-aware routing strategy for wireless sensor networks. Appl. Sci., MDPI (2021). https://www.mdpi.com/journal/applsci

Chapter 4

The Impact of Artificial Intelligence on Healthcare

Shivshankar Rajput, Praveen Bhanodia, Kamal K. Sethi, and Narendra Pal Singh Rathore

Introduction

Nowadays, a big responsibility of healthcare systems is to take care of people's health: healthy individuals, societies and countries depend on it. A good healthcare system helps to detect the right disease and to prevent it in time. Modern diseases are critical and infect many people, so a modern healthcare system is required [1, 2], and artificial intelligence has a major role to play in it. Artificial intelligence fulfils many expectations in the medical field, such as smart instruments and smart devices, and AI-based healthcare systems proved very helpful in the COVID-19 scenario. Artificial intelligence [3] is transforming how healthcare is delivered. Numerous hospitals and health-related institutions have amassed significant datasets, covering age, gender, education, income and ethnicity, for record-keeping. When doctors lack proper knowledge of a disease and experiment on patients, patients can suffer side effects; artificial intelligence removes this type of problem and is helpful in various sectors of healthcare, such as drug research, diagnosis and patient-monitoring care systems. In the healthcare industry, artificial intelligence helps to gather previous records, such as doctors' prescriptions, test reports and prior diseases, from digital health records for disease prevention and diagnosis. In medicine, artificial neural network and deep learning algorithms are used to predict and assess the precise treatment guidelines that are most likely to be effective for a patient based on a variety of patient behaviours, traits and treatment options (Fig. 4.1). AI is much needed to meet today's healthcare needs and to enable better treatment: the traditional system can be transformed with the help of AI to get better results. Today's research hospitals and institutions benefit from ever-improving technological developments that have transformed their procedures, goods and services. The digital transformation [4] involves merging current systems with new technology and applying it to existing systems.

S. Rajput (B) · P. Bhanodia · K. K. Sethi · N. P. S. Rathore
Acropolis Institute of Technology and Research, Indore, India
e-mail: [email protected]
P. Bhanodia, e-mail: [email protected]
K. K. Sethi, e-mail: [email protected]
N. P. S. Rathore, e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023
A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_4
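As a toy illustration of the record-gathering step described above, consolidating a patient's digital health records into a single history, the shape of input a prediction model would consume, might look like this (all patient identifiers, record types and values are hypothetical):

```python
from collections import defaultdict

# Hypothetical records pulled from a digital health record store.
records = [
    {"patient": "P001", "type": "prescription", "value": "metformin"},
    {"patient": "P001", "type": "lab report", "value": "HbA1c 7.9%"},
    {"patient": "P002", "type": "lab report", "value": "BP 140/90"},
    {"patient": "P001", "type": "diagnosis", "value": "type 2 diabetes"},
]

# Consolidate each patient's history into one view.
history = defaultdict(list)
for rec in records:
    history[rec["patient"]].append((rec["type"], rec["value"]))

print(len(history["P001"]))   # 3 records gathered for patient P001
```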

Fig. 4.1 Specific keywords


Significance of Artificial Intelligence in Healthcare

(a) It provides higher satisfaction than the traditional system, delivering accurate results using various technologies.
(b) It improves the efficiency of operations and smooths the conduct of the clinical and diagnosis processes.
(c) It stores previous health records in one place, which helps in diagnosis.

After studying this chapter, the reader should be able to answer the following questions:

(i) How can artificial intelligence influence the accuracy of disease decisions?
(ii) AI data is increasing day by day, so how can ethics and rules be followed to use the data properly?
(iii) In what respects is an AI-based healthcare system better?

Related Work

This section first gives a general definition of artificial intelligence and discusses how diagnosis results can be improved under different rules. Aspects of AI's strengths and advantages are then discussed, along with some of its drawbacks and issues with regard to healthcare, including the ethical and legal questions that AI has to deal with.

Traditional System Versus AI-Based Healthcare System

The globe is undergoing a revolution, and numerous transformations are occurring that touch both modern technologies and human lives [5]. Even though life prospects have been evolving over the past few decades, a significant change is inevitable. An estimated 1.5 billion people, more than twice as many as in 2019, will be over the age of 65 by 2050. According to projections from the UN and WHO, the majority of illnesses will, by 2025, be caused by chronic, coexisting conditions. Healthcare professionals frequently need to pay constant attention to patients with these conditions, and the ageing of the population will also have a significant impact on healthcare costs. The following tasks can be performed better with AI:

• Magnetic resonance imaging (MRI) and computed tomography (CT) scans.
• AI-based laboratory tests.
• Digital-era tests made possible by AI.


Artificial Intelligence

Because we lack a definition for “intelligence”, there is no standard definition of “artificial intelligence” either. The term is frequently used to describe new systems that possess human-like abilities such as reasoning, learning, simplifying or drawing on prior knowledge; huge amounts of data are used for this. An overview of AI is given in Fig. 4.2.

Concept of Machine Learning

Machine learning (ML) is a part of artificial intelligence. It is employed to comprehend information and to analyse concepts and patterns, which can then be used to explain and diagnose a wide range of medical issues. Machine learning is a technique in which a programme trains on and learns from data, based on statistics and probability, without conventional programming. Learning algorithms give ML the capacity to understand by identifying patterns in huge databases [6]. Each data point in a database is a key that may be examined to extrapolate information that would otherwise not be available for some time. These algorithms' learning processes can be divided into several categories. One type, supervised learning, involves connecting input variables to a predetermined output; because the data must be labelled manually, this method takes time. In medicine, supervised learning is useful for establishing connections between a patient's characteristics (the input) and the desired result (the output). After the labelled data has been used as a foundation, the learning algorithm can be fed unlabelled data to make predictions. There are various approaches, but ultimately it comes down to giving the programme a large amount of data and teaching it to learn the outcome that is statistically most accurate. ANNs, NLP and DL are the most popular learning and classification models in ML. All of these are emerging technologies, broadly used in healthcare systems and in various other fields such as motor vehicles, home appliances and fraud detection.
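The supervised-learning idea described above, mapping labelled patient characteristics to an outcome, can be sketched with a deliberately tiny nearest-neighbour classifier. The features, labels and risk categories are invented for illustration; no real clinical model is trained on four samples:

```python
import math

# Labelled training data: (age, resting heart rate) -> outcome label.
train = [
    ((45, 72), "low risk"),
    ((50, 75), "low risk"),
    ((68, 95), "high risk"),
    ((71, 99), "high risk"),
]

def predict(x, k=3):
    """Classify a new patient by majority vote of the k nearest
    labelled examples (the simplest form of supervised learning)."""
    nearest = sorted(train, key=lambda t: math.dist(x, t[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

print(predict((69, 97)))   # prints: high risk
```

Once trained on labelled data, the same function is applied to unlabelled inputs, which is exactly the labelled-then-unlabelled workflow described in the text.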

Fig. 4.2 Understanding artificial intelligence


Fig. 4.3 Classification of artificial intelligence

NLP is the field of computer science that helps computers understand and process human language. With the use of NLP, written material may be handled and interpreted even if it follows a range of construction pathways or does not follow logical and consistent grammatical standards. In the healthcare sector, NLP is utilised to transform unstructured laboratory text data into structured data, as seen in Fig. 4.3. As an illustration, consider data extraction from digital sources such as medical histories and clinical prescriptions: because these are typically unstructured and difficult for computer programmes to understand, NLP can help extract the pertinent information to match the narrative. Artificial neural networks (ANN) and deep learning are two related types of machine learning that employ more layers before the output is produced. Highly complex and diverse structured data may be handled using DL. If the system is given millions of X-ray images, each image must be labelled with the intended response, such as the existence of a nodule or tumour; this is referred to as having sufficient training examples. Once the system has received sufficient training, the algorithm can successfully recognise a nodule in a picture. DL is especially suitable for data such as photos, audio and video. One research study proposed a deep learning (DL)-based model for predicting breast cancer that connected data from EHRs and mammograms.
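The unstructured-to-structured transformation attributed to NLP above can be hinted at with simple pattern matching. A real clinical NLP pipeline uses trained language models; the note text and field names here are invented:

```python
import re

# A hypothetical unstructured clinical note.
note = "Pt presents with chest pain. BP 140/90 mmHg, pulse 88 bpm. Hx: hypertension."

# Pull structured fields out of the free text.
structured = {
    "blood_pressure": re.search(r"BP (\d+/\d+)", note).group(1),
    "pulse_bpm": int(re.search(r"pulse (\d+)", note).group(1)),
    "history": re.search(r"Hx: ([^.]+)", note).group(1),
}
print(structured)
```

The resulting dictionary is the kind of structured record that downstream models and databases can consume.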

Potential Applications of AI in Healthcare Systems

Artificial intelligence has the potential to provide a sophisticated approach to care practises and processes. Heart disease is the biggest cause of mortality worldwide. In a study that spanned 9 years, 243 hospitals and 600,000 heart-attack patients, nearly one-third of the cases were found to have been initially misdiagnosed [7]. A number of issues affected the survival rate, among them staffing, a lack of specialised medical facilities and delays in getting care. Dr. Mike Knapton, Associate Medical Director at the British Heart Foundation, said: “It's critical to make the appropriate diagnosis as soon as possible; if the diagnosis is delayed, the treatment will also be delayed.” Reliable cardiac anomaly-detection algorithms, trained on huge databases and known to be more accurate, must be created so that clinicians can diagnose heart problems. Several studies have published novel methods that use deep learning models to categorise heartbeats. Radiologists and cardiologists deal with a large amount of imaging data due to the growth of digital imaging with MRI, CT and ultrasound [8]. A radiologist in an emergency room may review up to 300 imaging studies, each containing hundreds of images, which causes a variety of effects including cognitive fatigue. Visual analysis of a patient's images is performed, but it relies on scant clinical data; and because there are so many images to examine, less time is spent on the case's clinical details. This affects the patient's diagnosis and causes high rates of diagnostic error. For radiologists and cardiologists specifically, IBM has created an automated cognitive assistant with clinical decision-making capabilities. Its name is Medical Sieve, and it uses artificial intelligence to gather clinical, textual and imaging data on patients from EHR systems. The acquired clinical data is filtered, abnormalities are discovered, and condensed versions of the important data from the patient's records are produced. After comparing the anomalies with the patient's records, the results are put into a second stage, which compares them against vast related patient data and patient-independent clinical knowledge. This method allows for accurate and efficient clinical judgement and diagnosis.
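The heartbeat-classification work cited above uses deep learning on raw ECG signals. As a stand-in, the following toy z-score detector, run on invented beat-to-beat (RR) intervals, shows the basic anomaly-detection framing without any of that machinery:

```python
import statistics

# Invented RR intervals in milliseconds; two beats are clearly irregular.
rr_ms = [810, 805, 820, 815, 1400, 808, 812, 400, 818]

mean = statistics.mean(rr_ms)
sd = statistics.stdev(rr_ms)

# Flag beats whose interval deviates strongly from the patient's norm.
anomalies = [i for i, v in enumerate(rr_ms) if abs(v - mean) > 1.5 * sd]
print(anomalies)   # prints: [4, 7]
```

A clinical system would replace the threshold rule with a trained classifier, but the input/output contract, signal in, flagged beats out, is the same.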

Conceptual Framework

A number of challenges prevent AI from being fully integrated into healthcare, especially in emergency rooms. The literature review focuses on these topics, though the study's scope is limited to the technological and legal aspects. The fundamental objective of the framework described below is to organise the findings; it is founded on the research question, the definition of the problem and the literature review. According to the literature, artificial intelligence (AI) has a variety of potential benefits in the field of healthcare, including diagnostics, clinical decision-making and bio-surveillance; for instance, AI can serve as a doctor's assistant [9]. The conceptual framework is shown in Fig. 4.4. AI can also save time that medical personnel would otherwise spend on administrative activities. According to studies, AI is highly effective at forecasting health-related occurrences. The literature does, however, also highlight several drawbacks, such as the potential for equipment failure.


Fig. 4.4 Conceptual framework

Methodology

To answer the research question, this chapter uses both qualitative and quantitative methodology [10]. Qualitative research is good for providing “how” answers as opposed to “how many” answers, since it offers an informative portrayal of words, discourse and texts. Additionally, because a qualitative method enables a better understanding of the research, it is advantageous for contextualising complicated events; the use of a variety of data-collection methods is another typical feature of qualitative research. This study also includes a quantitative component: a quantitative approach collects numerical data to look into the relationships between variables. The objective of this study, to evaluate how diagnostic accuracy impacts healthcare and how artificial intelligence in healthcare connects to ethical and legal concerns, is in keeping with the qualitative and quantitative methods employed here. Although some would claim that integrating multiple methodologies requires more time, it has been demonstrated that the benefits typically exceed the disadvantages in terms of obtaining a more thorough understanding of the phenomenon being examined [11].

The problem at hand is approached inductively in this work. An inductive method is one in which significant patterns, themes and linkages are found from the information and characteristics of the material; when employing an inductive approach, literature is first read to help the researcher grasp the topic better. The research was conducted in a non-linear, iterative manner and was primarily broken down into five segments [12]; the research procedure is depicted in Fig. 4.5.

Fig. 4.5 Conceptual framework

The research topic resulted from this procedure, which led to a problem statement and ultimately an iteratively developed research question. The literature review, which involved critically compiling information from the body of existing knowledge in the research field, was conducted after the research topic was set. A conceptual framework was inspired by the literature review in relation to the formulation of the research question and problem. Following the review of the literature and the creation of the framework, interviews and surveys were used to collect empirical data. The literature review, interviews and survey were among the primary and secondary sources used to collect the data for this chapter. The literature improved general understanding of the topic and made it easier to comprehend the conclusions drawn from the interviews. Saunders, Lewis and Thornhill [13] contend that it is best to gather data from independent sources; since the data in this report were gathered using various techniques, this broadens our understanding of the phenomenon under study.


Data Analysis

Data were collected by conducting interviews with experts, doctors and patients, alongside the survey and the literature review. As noted in the Methodology, combining these primary and secondary sources, in line with the preference for independent sources [13], adds further perspective on the researched phenomena.

AI-Based Approach for COVID-19 Treatment

Even without complete knowledge, artificial intelligence (AI) is capable of actively learning from and analysing various types of data [14]: thorough data mining can uncover uncommon anomalies, and categorisation can be supported by facts or suspicions. AI technology, which has effectively proved its tremendous potential in the medical and healthcare industries, is transforming how health services are delivered; large datasets and a wide range of computing techniques helped AI evolve. On 31 January 2020, Wu et al. published a susceptible-exposed-infectious-recovered metapopulation model predicting a global COVID-19 epidemic; the next step was to adjust the dynamic mitigation and suppression treatments using a multivariate prediction model. The application of AI-based techniques can be broadly divided into four categories: prognostic prediction, characterisation and severity assessment, screening and diagnosis, and patient tracking and monitoring.
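The susceptible-exposed-infectious-recovered model mentioned above can be sketched as a simple compartmental simulation. The parameter values and population below are illustrative and are not those of Wu et al.:

```python
def seir(n, e0, beta=0.6, sigma=1 / 5.2, gamma=1 / 2.9, days=300, dt=0.1):
    """Euler-integrate the SEIR equations: beta is the contact rate,
    1/sigma the incubation period, 1/gamma the infectious period."""
    s, e, i, r = n - e0, float(e0), 0.0, 0.0
    for _ in range(int(days / dt)):
        new_exp = beta * s * i / n        # S -> E
        inf = sigma * e                   # E -> I
        rec = gamma * i                   # I -> R
        s, e, i, r = (s - new_exp * dt,
                      e + (new_exp - inf) * dt,
                      i + (inf - rec) * dt,
                      r + rec * dt)
    return s, e, i, r

s, e, i, r = seir(n=1_000_000, e0=100)
print(round(r))   # cumulative recovered at the end of the run
```

Fitting beta, sigma and gamma to observed case data, and layering machine-learned corrections on top, is what turns such a skeleton into a forecasting tool.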

Tracing and Monitoring

Pandemic control requires breaking the chain of transmission and promptly isolating affected individuals. AI-based smartphone apps have made it simpler to deploy real-time tracing solutions: smartphones were used to gather real-time spatio-temporal trajectory data [15], correctly tracking and monitoring each person using the Global Positioning System (GPS), social media and geospatial AI. Given COVID-19's strong infectivity, this type of endeavour is crucial for keeping it under control. Despite privacy and confidentiality problems, these applications were well liked and supported, and a number of comparable smartphone apps were utilised in other countries. Because automated contact tracing is a mobile-phone-app-based intervention, there are a few things to remember. Once individuals with confirmed illnesses have been notified, it is crucial to obtain virtually complete compliance without anybody feeling afraid of isolation or quarantine. Additionally, numerous models and applications were developed and utilised in different countries; for the purpose of globally limiting a pandemic, a universal network or platform for exchanging contact-tracing data could be more beneficial. Despite ample evidence of transmission from asymptomatic and presymptomatic carriers, contact tracing using existing applications is still challenging.
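The GPS-based proximity check underlying such tracing apps can be sketched as follows; the coordinates and the 10 m contact threshold are invented for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# A confirmed case's fix and other users' most recent fixes (invented).
case = (22.7196, 75.8577)
fixes = {"u1": (22.71961, 75.85771), "u2": (22.7300, 75.8600)}

contacts = [u for u, (la, lo) in fixes.items()
            if haversine_m(case[0], case[1], la, lo) <= 10]
print(contacts)   # prints: ['u1']
```

Production apps typically prefer Bluetooth proximity over raw GPS for exactly the privacy reasons the text raises, but the flag-nearby-users logic is the same.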

Screening and Diagnosis

Early discovery and diagnosis are crucial since COVID-19 is so contagious and infectious. Real-time reverse transcriptase polymerase chain reaction (RT-PCR) testing is the gold standard for confirming a COVID-19 diagnosis. Testing, however, is time- and resource-intensive [16]. The poor sensitivity of the RT-PCR test, shown to range from 37 to 71% in early investigations, is another important barrier to the early identification of COVID-19. For the diagnosis of COVID-19, thoracic imaging is crucial in addition to RT-PCR testing; however, radiologists perceived the sensitivity and specificity of COVID-19 detection in chest imaging differently. According to studies, many AI-based radiological diagnostic systems based on radiographs or CT scans offer high sensitivity, specificity or accuracy, equivalent to the diagnosis provided by skilled radiologists. Various AI-based instruments are required in the initial stage of COVID-19 identification to prevent spreading to other persons; an oximeter was used for measuring the oxygen level and a digital meter for measuring blood pressure and pulse rate. The flowchart in Fig. 4.6 describes a detailed process for identifying the disease and confirming a positive or negative result with the help of these characteristics. Various types of methods are used, but the AI-based method is effective and produces accurate results with correct predictions. Other typical flaws in early research include (1) small sample sizes, which raise the possibility of overfitting and may affect the robustness of models; (2) using only a single or a small number of datasets for training and testing; and (3) studies that were mostly retrospectively constructed, with just a few models' clinical efficacy being prospectively evaluated. Numerous studies have emphasised the ability of quantitative CT parameters to predict deterioration in health and unfavourable outcomes. On chest CT [17], early-stage quantitative disease burden suggested grave consequences. In an evaluation of poor outcomes using a 3D U-Net-based model, older age and a greater consolidation burden in the upper lobes at admission were found to be associated with increased odds. In a different multivariable regression model, the volumes of consolidation and ground-glass opacities were found to be independent predictors of condition worsening or mortality; when these variables were incorporated into a model based only on clinical data, they likewise demonstrated incremental predictive effectiveness. A basic COVID-19 detection method is given below [18].
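The sensitivity and specificity figures quoted for diagnostic tests combine true/false positives and negatives as follows. The counts below are invented, chosen only so that the sensitivity falls in the 37–71% range reported for early RT-PCR studies:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = share of diseased cases the test catches;
    specificity = share of healthy cases it correctly clears."""
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    return sensitivity, specificity

# A test that misses many early infections but rarely false-alarms.
sens, spec = sens_spec(tp=45, fn=55, tn=97, fp=3)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```

A low-sensitivity test like this one clears many truly infected patients, which is why the text treats imaging and AI systems as necessary complements to RT-PCR.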


These findings were in line with those of previous studies. A prediction system based on the Light Gradient Boosting Machine and Cox proportional-hazards regression achieved an AUC of 0.8479 for predicting serious progression from quantitative pulmonary lesion features of 556 patients. Performance increased dramatically when clinical factors were added, attaining an AUC of 0.9039 with a sensitivity of 86.71% and a specificity of 80.00%. In addition, some authors evaluated the relationship between the progression of the pulmonary lesion pattern and the CT characteristics observed on admission (day 0) and on day 4. The predictive value of CT characteristics on day 4 was improved by an AI-driven approach, although it remained inferior to that of changes in CT features. Such analysis is very useful for detecting the disease and preventing its spread; the flowchart describes the identification process in detail, and different precautions are required to limit spreading and to reduce deaths and serious cases (Fig. 4.6).

The COVID-19 virus has spread widely and had a significant impact; the World Health Organization (WHO) designated it a pandemic in March 2020. As the pandemic spreads, it is essential to concentrate on creating prediction models that help policymakers and health management allocate healthcare resources and avoid or restrict outbreaks. Nine studies attempted to predict the COVID-19 pandemic trend. Six of them predicted the incidence, confirmed cases, fatalities and recoveries, development trend and likely stopping time of COVID-19 using long short-term memory (LSTM) models alone or in combination with other models [19]. Other studies used the Susceptible-Exposed-Infectious-Recovered (SEIR) model with machine learning to predict the development of the epidemic or to estimate the number of unreported infections. To break the COVID-19 chain and stop the sickness from spreading, many individuals were held in enforced lockdowns, and the weakening of the global economy reflects the disruption of daily life under COVID-19 [20]. Digital health societies, symposiums and contributors who employ technical interventions to stop the COVID-19 wave are at the forefront of the global scene [21]; digital devices play a very important role in a pandemic in controlling the growth of spread within a limited time. Managing case detection, examining the socio-economic effects of the virus's propagation and adopting public health mandates to limit the epidemic in the afflicted areas are a few examples of these interventions. Researchers are currently evaluating the interrelated combinations made available by digital technology in response to COVID-19. Table 4.1 summarises the data collection for this study, and Table 4.2 lists some typical artificial intelligence terminology.

Geographic information systems (GIS), sometimes referred to as mobile computer-based platforms, collect, preserve and categorise geographical data relevant to the earth's surface. The embedded data cohorts may contain street signs, geographic locations, population census data and distinguishing vegetation within the system's coverage region. Geographic information systems can support technical interventions for COVID-19 diagnoses through data reports that include the location of detected cases, hospital occupancy and patient compliance with public health standards. The system then conducts a complex geographical analysis of COVID-19-related data, assisting the authorities in closely monitoring the spread of the virus throughout the affected country and the rest of the world [22].

Fig. 4.6 Flowchart for COVID identification

Table 4.1 Summary of the data collection

Interviewer   Profession   Topic
Expert 1      Professor    AI
Expert 2      Professor    ML
Expert 3      Professor    Law & regulations
Expert 4      Professor    AI
Doctor 1      Doctor       803.7
Doctor 2      Doctor       642.96
Doctor 3      Doctor       750.12
Patient 1     Business     Smart tools
Patient 2     Teacher      AI machine and smart gadgets

Fig. 4.7 shows the relationship between machine learning (ML), deep learning (DL), artificial intelligence (AI) and neural networks (NN). All of these technologies are interdependent and very popular nowadays and, as noted above, they can actively learn from and analyse various types of data even without complete knowledge [23]. During the COVID-19 pandemic, various smart devices based on artificial intelligence technology have been used. Such devices have been in use for the last 7 years, and the data from 2015 to 2020 were analysed; Table 4.3 shows the data of various

devices purchased quarterly [24]. Digital devices [25] play a very important role in a pandemic in controlling the growth of spread within a limited time. The amounts are shown in dollars per quarter: Table 4.3 gives the cost of the digital gadgets, in dollars, used in the pandemic and consumed by different patients as per need, and Fig. 4.8 plots the growth rate of digital devices in the pandemic situation (Fig. 4.8 and Table 4.3).

Fig. 4.7 Relation between AI, ML, DL and NN

Table 4.2 Glossary of common artificial intelligence terminology

Terminology: Definition
Deep learning: The classification of random data in order to compute human-independent critical thinking, interpretation and resolution
Machine learning: Identifying and organising data into different formats
Convolutional neural network (CNN): A deep learning approach for artificial intelligence systems intended for separating multimedia content (i.e. video, audio and images)
Supervised learning: Utilising catalogued (labelled) data to understand data
Unsupervised learning: Utilising uncatalogued (unlabelled) data to understand data
Natural language processing (NLP): The recording of accumulated information received through conversation and natural linguistics
Nature-inspired computing (NIC): The interdisciplinary integration of mathematics, computer science, theoretical physics, etc. to create new hardware and algorithms

Fig. 4.8 Growth rate of digital gadgets

Table 4.3 Static data table of digital gadgets

Year   Quarter   Amount in $M
2015   Q1        553.66
2015   Q2        446.5
2015   Q3        446.5
2015   Q4        714.4
2016   Q1        803.7
2016   Q2        642.96
2016   Q3        750.12
2016   Q4        875.14
2017   Q1        910.86
2017   Q2        875.14
2017   Q3        1535.96
2017   Q4        1035.88
2018   Q1        1285.92
2018   Q2        1750.28
2018   Q3        1268.06
2018   Q4        1160.9
2019   Q1        1607.4
2019   Q2        1607.4
2019   Q3        2143.2
2019   Q4        1482.38
2020   Q1        1482.38
2020   Q2        1500.24
2020   Q3        2178.92
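The quarter-over-quarter growth rate plotted in Fig. 4.8 can be recomputed directly from Table 4.3; only the first five figures (2015 Q1 to 2016 Q1) are reproduced here:

```python
# Quarterly purchase amounts in $M, taken from Table 4.3.
amounts = [553.66, 446.5, 446.5, 714.4, 803.7]

# Percentage change from each quarter to the next.
growth = [round((b - a) / a * 100, 1) for a, b in zip(amounts, amounts[1:])]
print(growth)   # prints: [-19.4, 0.0, 60.0, 12.5]
```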

Challenges in AI-Based System

As discussed in the introduction, artificial intelligence is transforming how healthcare is delivered, and digital devices play a very important role in a pandemic in controlling the growth of spread within a limited time [26]. Hospitals and health-related institutions have amassed significant datasets for record-keeping, and AI helps to avoid the side effects of trial-and-error treatment while supporting drug research, diagnosis and patient-monitoring systems through digital health records. The most frequent causes for individuals to visit the emergency room, according to the doctors who were interviewed, are heart problems, blood-pressure problems, fractures, stomach discomfort, chest pain, breathing problems and dizziness. The academics, professors, physicians and politicians surveyed for this study had a favourable opinion of AI in healthcare. The following are some of the respondents' opinions on its use: “I think it's absolutely positive, it's something that is going to happen. And it's going to happen relatively fast” (Professor Azizpour); “To be able to get to the correct diagnosis faster, I think that AI can have a very large positive impact.”

AI, Ethics and Laws and Regulations in Healthcare

Release Burden

Many AI-based diagnostic instruments operating on radiographs or CT scans achieve very high sensitivity, specificity and accuracy, equivalent to the diagnoses provided by skilled radiologists, as reported in studies and book chapters by researchers [27]. Various AI-based digital devices are required in the initial stage of COVID-19 to identify infection and prevent its spread to other persons [28]. We have used an oximeter to measure oxygen level and a digital meter to measure blood pressure and pulse rate. The flowchart referred to describes in detail the process of identifying the disease and confirming a positive or negative result from its characteristics. Various methods are used, but the AI-based method is effective and produces accurate results with correct predictions. One example from the literature review is an AI that, after being trained on 14,894 scans, focussed on 53 important diagnostic qualities and reached an accuracy of 94.2% in diagnosing eye illnesses, surpassing a range of professionals in the field. Practitioners in the emergency room may not have the time to concentrate on that many characteristics; additionally, given the range of circumstances that arise in emergencies, it is not assured that they have the finest expertise in every field.
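Sensitivity, specificity and accuracy, the metrics cited for these diagnostic tools, all derive directly from confusion-matrix counts. A small illustration (the counts below are invented for the example, not taken from the study):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard screening metrics from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)           # true-positive rate
    specificity = tn / (tn + fp)           # true-negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical evaluation of a diagnostic model on 200 scans.
sens, spec, acc = diagnostic_metrics(tp=88, fp=10, tn=90, fn=12)
```

With these invented counts the model would score 0.88 sensitivity, 0.90 specificity and 0.89 accuracy, which is the kind of head-to-head comparison the studies above make against radiologists.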

Decision Support

The experts' responses show that artificial intelligence will not make decisions on its own [29]; instead, it will serve as a decision-support tool for the doctor. Doctors will not be replaced, another respondent claims, because healthcare depends on both hard data and soft data, and AI can currently handle only hard data. According to Professor Funk, keeping up with all of the new medical data is beyond human capacity, since medical knowledge now grows substantially in less than a year. Professor Funk expands on another respondent's discussion of uncommon disorders, which doctors often encounter only once or twice in their careers, and explains how AI may help by managing thousands of medical records: doctors could look for comparable instances in a case library built from prior cases [6]. Although AI technology can be considered sufficiently developed, ethical and legal concerns remain. A supervising medical professional will still be required wherever the system's output can raise doubt. Moreover, situations in which the caregiver's and the AI's suggested diagnoses differ might be troublesome, as they may lead caregivers to choose diagnoses that are biased towards the AI's hypothesis. Legal restrictions such as the Patient Data Act and the General Data Protection Regulation prevent Sweden from connecting separate patient journals to one another. As a result of these rules, AI today cannot perform as it ought to, for example as a case library for caregivers.

Acceptance of AI

General acceptance is necessary for AI to be deployed in healthcare [14], even though not everyone needs to grasp the technology. Acceptance exists at several levels, for instance among patients, physicians and management. Transparency, according to Professor Azizpour, is essential for winning people's trust and approval [30]; he adds that the model should provide justifications for its decision rather than just displaying the outcome. According to researcher Karlsson, for knowledge to spread, the technical aspects must first reach people through media such as newspapers and films. Once the technology reaches the point of acceptance, as in Karlsson's example of traffic lights, no one needs to be aware of the underlying details any longer, because widespread acceptance has developed. However, Karlsson highlighted a potential situation in which a technology is advanced, tested and legally allowed onto the market, yet medical experts refuse to use it in the emergency room because it is novel and strange. This suggests that even when there is evidence that a technology works, acceptance from the various stakeholders is still necessary. Because of the current limitations of AI, Doctor 2 says, acceptance has not yet fully developed [31]: AI is not capable of seeing, hearing or feeling. Some patients prefer conventional face-to-face interactions, and the lack of these three faculties may make it difficult for patients and machines to communicate with one another. Doctor 3 explains how interacting with people makes it possible to perceive subtleties connected to human instinct that an AI system would miss. Despite these difficulties, there are indicators that patients embrace remote care. One respondent mentions a geriatrician whose use of technology patients and staff have found beneficial and appreciated. Additionally, the fact that Kry, a digital healthcare provider, is currently used by 200,000 people (a figure that is constantly increasing) demonstrates that many patients accept digital treatment, which can be taken as a sign that the transformation of healthcare is moving towards AI acceptance.

The Issue of Data Quality

The literature review revealed remarkable potential applications for AI, including earlier and more precise diagnoses in cancer, eye disease, radiology and viral-pandemic forecasting, among others [32]. This demonstrates that AI systems [33] are operational, and it is further backed by the opinions of several interviewees who believe the technology is advanced enough but that the problem lies with the data needed to make the diagnosis. Improving accuracy is tied to problems of data quality: with too little data, models may not be representative of all patients and may give a false image of reality. The raw material is plainly available, since healthcare information can be obtained from a variety of sources, including social media, public data, wearable technology and electronic health records (EHRs). Numerous interview respondents, as well as authors, cite EHRs as a significant data source. The issue with medical records is that they frequently include inaccuracies and lack a uniform format. Professor Funk emphasised the need to follow standardised protocols to guarantee that all patient cases have a consistent structure; patient follow-ups are also required to close out these cases.

One way to turn data into useful information is to integrate data from diverse systems spread across different sites, but when the sites employ different data-collection techniques, connecting them becomes challenging. A lot of data is also restricted or safeguarded for security and legal reasons. According to the poll done for this study, people are willing to share patient data to assist other patients, and technological options such as blockchain and federated data are viewed as adequate ways to secure patient data. Staff members in the emergency room frequently have to wait for patients, which makes for a stressful work atmosphere. To construct complete cases, Professor Funk argues, doctors must record their diagnosis, the course of action and the outcomes, and the outcomes must then be reviewed. This can be a difficult assignment for a doctor operating in the emergency room. There is also a lack of motivation to do so, which compounds the shortage of time, because medical personnel have little incentive to enter data into the systems beyond complying with legal duties. Since the accuracy of AI for diagnostic purposes depends on data quality, potential advancements in the emergency department are arguably tied to it.
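The lack of a uniform format across data sources can be illustrated with a toy normalisation step that maps records from two hypothetical hospital systems onto one shared schema (all field names here are invented for the example, not drawn from any real EHR standard):

```python
# Two sites record the same facts under different field names.
SITE_A_MAP = {"pat_age": "age", "sex": "gender", "dx": "diagnosis"}
SITE_B_MAP = {"AgeYears": "age", "Gender": "gender", "PrimaryDiagnosis": "diagnosis"}

def normalise(record, field_map):
    """Rename one site's fields to the shared schema, dropping unmapped fields."""
    return {field_map[k]: v for k, v in record.items() if k in field_map}

a = normalise({"pat_age": 63, "sex": "F", "dx": "MI"}, SITE_A_MAP)
b = normalise({"AgeYears": 71, "Gender": "M", "PrimaryDiagnosis": "stroke"}, SITE_B_MAP)
# Both records now share the keys {"age", "gender", "diagnosis"} and can be pooled.
```

Real integration must additionally reconcile units, coding systems and missing follow-up data, which is exactly the standardisation burden the interviewees describe.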


If the data is more complete in a few healthcare categories but does not cover all the elements needed for the varied emergency scenarios, it must be considered whether an AI is acceptable in the emergency room.

Conclusions

This chapter presents the main research findings by responding to the research questions. The consequences for humans are then briefly discussed in relation to the study, and finally recommendations for potential future applications, together with the study's limitations, are presented. The professors, researchers, physicians and politicians who were questioned concurred that AI should function as an aid for medical professionals in healthcare. AI's involvement can help caregivers make better-informed judgments and diagnoses, because medical knowledge is growing quickly and humans are unable to retain it all. AI technology is considered advanced enough to provide more precise diagnoses; the concern today is data quality, owing to its scarcity, lack of universal organisation and incompleteness. Data from patient follow-ups must be added to complete the records. The emergency room has several different units, and the patients there present a range of diseases and degrees of severity. The volume of patients in emergency rooms can be extremely high because illness knows no bounds; as a result, working there as a doctor involves a great deal of stress and is very demanding. By reducing the administrative workload for workers, AI has the power to lighten the load, which means the patient can be given as much attention as possible. The significance of patient data for diagnostic purposes has become obvious from this study. The management of patient data is currently governed by numerous rules and regulations, which limits its application; the strict requirements are mostly justified by security concerns around the handling and sharing of medical data. The study concludes that there are adequate methods for safeguarding the patient data needed to train AI models, and the poll's findings show that individuals are open to disclosing their patient data if it would help others. The laws as they currently stand do not support the promise of AI as a technology. As a result, AI in healthcare is now experiencing problems. Because healthcare development is a high-risk domain, it must be carried out with the utmost care; the EU has released a regulation that addresses AI's use in high-risk sectors and requires that such use be safeguarded as it is developed.

References

1. Donev, D., Kovacic, L., Laaser, U.: The Role and Organization of Health Care Systems (2013)
2. Durach, C.F., Kembro, J., Wieland, A.: A new paradigm for systematic literature reviews in supply chain management. J. Supply Chain Manag. 53(4), 67–85 (2017)
3. Barh, D.: Artificial Intelligence in Precision Health: From Concept to Applications, 1st edn. Academic Press (2020)
4. Belliger, A., Krieger, D.J.: The digital transformation of healthcare. In: Progress in IS, pp. 311–326 (2018). https://doi.org/10.1007/978-3-319-73546-7_19
5. Blomkvist, P., Hallin, A.: Metod för teknologer: Examensarbete enligt 4-fasmodellen, 1st edn. Studentlitteratur AB, Lund (2015); Bohr, A., Memarzadeh, K.: Artificial Intelligence in Healthcare. Elsevier Science & Technology, San Diego (2020)
6. Young, D.W., Ballarin, E.: Strategic decision-making in healthcare organizations: it is time to get serious. Int. J. Health Plann. Manage. 21(3), 173–191 (2006). PMID: 17044545
7. BHF (n.d.): Third of Heart Attack Patients are Misdiagnosed, Research Claims. Retrieved 11 March 2021, from https://www.bhf.org.uk/informationsupport/heart-matters-magazine/news/behind-the-headlines/misdiagnosis
8. Davenport, T., Kalakota, R.: The potential for artificial intelligence in healthcare. Futur. Healthc. J. 6(2), 94–98 (2019). https://doi.org/10.7861/futurehosp.6-2-94
9. McCradden, M., Joshi, S., Anderson, J., Mazwi, M., Goldenberg, A., Zlotnik Shaul, R. (2020)
10. Creswell, J.W.: Research Design—Qualitative, Quantitative, and Mixed Methods Approaches, 3rd edn. SAGE Publications, Thousand Oaks, California (2009)
11. Danske Bank: KRY - en vårdande revolution (2021). https://danskebank.se/foretag/senastenytt/artiklar/kry-en-vardande-revolution
12. Lalmi, F., Adala, L.: The fourth industrial revolution: implementation of artificial intelligence for growing business success. Stud. Comput. Intell. (2021). https://doi.org/10.1007/978-3-03062796-663
13. Saunders, M., Lewis, P., Thornhill, A.: Research Methods for Business Students, 7th edn. Pearson Education Limited, Harlow (2015)
14. Copeland, B.J.: Artificial Intelligence (2020). Encyclopedia Britannica. https://www.britannica.com/technology/artificial-intelligence
15. Grote, T., Berens, P.: On the ethics of algorithmic decision-making in healthcare. J. Med. Ethics 46(3), 205–211 (2019). https://doi.org/10.1136/medethics-2019-105586
16. Faris, H., Habib, M., Faris, M., Alomari, A., Castillo, P., Alomari, M.: Classification of Arabic healthcare questions based on word embeddings learned from massive consultations: a deep learning approach. J. Ambient. Intell. Hum. Comput. (2021)
17. Densen, P.: Challenges and opportunities facing medical education. Trans. Am. Clin. Climatol. Assoc. 122, 48 (2011)
18. Li, H., Boulanger, P.: A survey of heart anomaly detection using ambulatory electrocardiogram (ECG). Sensors 20(5), 1461 (2020). https://doi.org/10.3390/s20051461
19. Hazarika, I.: Artificial intelligence: opportunities and implications for the health workforce. Int. Health 12(4), 241–245 (2020). https://doi.org/10.1093/inthealth/ihaa007
20. Haenssle, H., Fink, C., Schneiderbauer, R., Toberer, F., Buhl, T., Blum, A., et al.: Man against machine: diagnostic performance of a deep learning convolutional neural network for dermoscopic melanoma recognition in comparison to 58 dermatologists. Ann. Oncol. 29(8), 1836–1842 (2018). https://doi.org/10.1093/annonc/mdy166
21. Kuhlman, T., Farrington, J.: What is sustainability? Sustainability 2(11), 3436–3448 (2010). https://doi.org/10.3390/su2113436
22. Kustrin, S., Beresford, R.: Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research. J. Pharm. Biomed. Anal. 22(5), 717–727 (2000)
23. Hamid, S.: The opportunities and risks of artificial intelligence in medicine and healthcare. CUSPE Commun. (2016). https://doi.org/10.17863/CAM.25624
24. Cheng, I.S.: Emergency Department Crowding and Hospital Patient Flow: Influential Factors and Evidence-Informed Solutions. Karolinska Institutet (2016)
25. Gibbert, M., Ruigrok, W., Wicki, B.: What passes as a rigorous case study? Strateg. Manag. J. 29(13), 1465–1474 (2008)
26. Arnold, M.: Teasing out artificial intelligence in medicine: an ethical critique of artificial intelligence and machine learning in medicine. J. Bioethical Inq. (2021)
27. Ericsson, A.: Patientlagen och Patientdatalagen. Några lagar som styr vårdadministratörens arbete (2015). https://docplayer.se/14651577-Patientlagen-och-patientdatalagen.html
28. Gerke, S., Minssen, T., Cohen, G.: Ethical and legal challenges of artificial intelligence-driven healthcare. Artif. Intell. Healthc., 295–336 (2020)
29. Shahid, N., Rappon, T., Berta, W.: Applications of artificial neural networks in health care organizational decision-making: a scoping review. PLOS ONE 14(2) (2019)
30. Holm, S., Stanton, C., Bartlett, B.: A new argument for no-fault compensation in health care: the introduction of artificial intelligence systems. Health Care Anal. (2021). https://doi.org/10.1007/s10728-021-00430-4
31. Honavar, V.G.: Artificial intelligence: an overview (2014). https://faculty.ist.psu.edu/vhonavar/Courses/ai/handout1.pdf
32. Frick, N., Mirbabaie, M., Stieglitz, S., Salomon, J.: Maneuvering through the stormy seas of digital transformation: the impact of empowering leadership on the AI readiness of enterprises. J. Decis. Syst., 1–24 (2021)
33. Funk, P.: Why Hybrid Case-Based Reasoning will Change the Future of Health Science and Healthcare (2015)

Chapter 5

Brain Tumor Segmentation of MR Images Using SVM and Fuzzy Classifier in Machine Learning Ashish Mishra, Meena Tiwari, Jyoti Mishra, and Bui Thanh Hung

Introduction

Medical image processing, a cutting-edge technological development, is the scope of this review; it enables non-invasive diagnostic decisions through scientific visualisation. Medical imaging uses non-invasive modalities, including PET, CT [1, 2], ultrasound, SPECT, X-ray, and magnetic resonance imaging. In clinical diagnosis, nuclear magnetic resonance (NMR) gives better results than computed tomography because it improves the contrast between the individual soft tissues of the human body [3]. The most immediate role of digital images in medicine is image enhancement: images from various scientific instruments (X-ray, MRI, X-ray computed tomography) are often blurry and noisy, and this noise is filtered using low-level image-processing algorithms (denoising, smoothing, edge detection) with parameter values tuned to the particular disturbance, increasing the amount of useful detail available to the expert who interprets the image (Maintz and Viergever 1998). MRI scanners use strong magnetic fields and radio-frequency pulses to obtain detailed pictures of internal organs, soft tissues, bones, and other structures [4, 5]. Image capture, pre-processing, image processing (including classification and segmentation), and display make up the image-processing pipeline [20]. Image processing is a technique for converting an image into digital form and performing operations such as noise reduction on it, in order to obtain a better image or to extract meaningful information. It is a form of signal processing in which the input is an image, such as a video frame or a photograph, and the output is an image or a set of characteristics associated with the image. In most image-processing systems, images are treated as two-dimensional signals to which signal-processing algorithms are applied [6, 7]. The three stages that make up image processing are as follows:

I. Image import via an optical scanner or digital photography.
II. Analysis and manipulation of the image, including compression, enlargement and noise-reduction algorithms, to identify patterns that are not visible to the naked eye.
III. The final stage, in which the result, an enhanced image or a report based on the image analysis, is produced.

A. Mishra (B), Department of Computer Science and Engineering, Gyan Ganga Institute of Technology and Sciences, Jabalpur, M.P., India; e-mail: [email protected]
M. Tiwari, Department of Computer Science and Engineering, Shri Ram Institute of Science and Technology, Jabalpur, M.P., India
J. Mishra, Department of Mathematics, Gyan Ganga Institute of Technology and Sciences, Jabalpur, M.P., India; e-mail: [email protected]
B. T. Hung, Data Science Laboratory, Faculty of Information Technology, Industrial University of Ho Chi Minh City, Ho Chi Minh City, Vietnam
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023. A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change. https://doi.org/10.1007/978-981-99-1949-9_5
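The low-level denoising step mentioned in this introduction can be sketched as a simple mean (smoothing) filter. Real pipelines use more sophisticated filters, but the principle is the same; the array sizes and values below are our own illustration (NumPy assumed):

```python
import numpy as np

def mean_filter(img, k=3):
    """Replace each pixel by the mean of its k-by-k neighbourhood (edge-padded)."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

# A single noisy spike in a flat region is strongly attenuated by smoothing.
noisy = np.zeros((5, 5))
noisy[2, 2] = 90.0
smoothed = mean_filter(noisy)
```

The spike of 90 is spread over its 3-by-3 neighbourhood, dropping its centre value to 10, which is the noise-reduction behaviour the text describes.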

Purpose of Image Processing

Image processing serves five broad purposes: visualisation (observing objects that are not directly visible), image sharpening and restoration (creating a better image), image retrieval (seeking out the image of interest), measurement of pattern (determining the size of objects in an image), and image recognition (distinguishing the objects in an image).

Methods of Image Processing

Two methods are used in image processing: analog and digital.

Analog Image Processing

Analog image processing alters an image by electrical means. The television picture is the most common example. A television signal is a voltage level that varies in amplitude to represent brightness. Changing the electrical signal changes the appearance of the displayed image: the brightness and contrast controls on a television set alter the amplitude and reference of the video signal, causing the displayed picture to brighten, darken, or change in contrast.


Digital Image Processing

Digital image processing manipulates digital images by means of computer algorithms. The image is first converted into an array of numbers, after which operations such as brightness and contrast adjustment, filtering, and enhancement are applied directly to the pixel values. Compared with analog processing, it allows a far wider range of algorithms to be applied and avoids problems such as the build-up of noise and signal distortion during processing.
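In the digital setting, the brightness/contrast adjustment that the analog section describes in terms of voltage becomes direct arithmetic on pixel values. A common linear form is out = alpha * pixel + beta, clipped to the valid range; a minimal sketch with invented values (NumPy assumed):

```python
import numpy as np

def adjust(img, alpha=1.0, beta=0.0):
    """Linear contrast (alpha) and brightness (beta) adjustment for 8-bit images."""
    out = alpha * img.astype(float) + beta
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.array([[0, 100, 200]], dtype=np.uint8)
brighter = adjust(img, alpha=1.0, beta=40)   # every pixel shifted up by 40
contrasty = adjust(img, alpha=1.5, beta=0)   # spread around 0, clipped at 255
```

Here alpha scales the signal amplitude (contrast) and beta shifts its level (brightness), the digital analogue of turning the television's two knobs.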

Brain Cancers

Medical imaging is the area of technical expertise that studies the internal workings of the human body; to study disease in the human body, specialists use various imaging techniques, each with its own capabilities and limitations with respect to human anatomy [26]. Brain tumors are among the most dangerous of the complex diseases [11]. "Tumor" is the term used in medicine for the abnormal proliferation of cells within the human body. Tumors are classified as benign or malignant, primarily on the basis of their growth rate and other clinical characteristics [14]. According to Vidyarthi and Mittal, malignant tumors are the most threatening because they can most readily lead to death: this type of tumor spreads quickly throughout the human body, affecting vital organs. A benign tumor, by contrast, is not the same as cancer: it does not invade adjoining tissue or spread to other parts of the body as cancer does, and the prognosis for benign tumors is usually good. A brain tumor is a growth of abnormal cells in brain tissue. Brain tumors can be benign (non-cancerous) or malignant (rapidly growing cancer cells). Some are primary brain tumors, meaning they start in the brain; other cancers are secondary or metastatic, meaning they start elsewhere in the body and spread to the brain. Although the causes of most cancers are difficult to establish, the known materials associated with many types of cancer inform prevention. Tiredness, seizures, confusion, and behavioural abnormalities are all potential symptoms of brain cancer.

Primary Brain Tumors

Primary brain tumors arise in the brain or in nearby tissues such as the meninges, the cranial nerves, the pituitary gland or the pineal gland. Primary brain cancer develops when DNA errors (mutations) accumulate in normal cells.


Secondary Brain Tumors

Secondary (metastatic) brain tumors develop when cancer spreads (metastasizes) from another part of the body to the brain. By this definition, secondary tumors are malignant, since they arise from an existing cancer elsewhere.

MRI Scans

MRI is the most reliable imaging technique for diagnosing most types of brain cancer. These scans give complete pictures using magnetic fields and radio waves rather than X-rays. MRI produces visible "slices" of the brain that can be assembled into a three-dimensional representation of the tumor (Perlo, Mulder, Danieli, Hopmann, Blümich, and Casanova) [13]. In an MRI scan, a strong magnetic field is combined with radio-frequency pulses to give complete images of organs, soft tissues, bones and other structures of the human body. MRI is a valuable technique that offers widely accepted facts about the architecture of the brain tumor under study, supporting accurate prediction, treatment and follow-up of the disease. Its methods have been refined to offer additional measurements of features inside and around primary and metastatic brain tumors, such as edema, volumetric deformation, and anatomical characteristics within the tumor [20]. Automatic characterisation of tissue features can be obtained in any tissue plane using MRI, and MRI can also be used to complete brain-mass detection. For clinical image analysis, image-processing and image-enhancement technology is used to maximise image quality; thresholding and contrast-adjustment techniques are applied to emphasise the features of MRI images.
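The thresholding step used to emphasise MRI features can be sketched as a global binary threshold; clinical tools usually choose the threshold adaptively (for example with Otsu's method), which is omitted here. The tiny intensity array is invented for illustration (NumPy assumed):

```python
import numpy as np

def threshold(img, t):
    """Binary mask: 1 where intensity exceeds t, else 0."""
    return (img > t).astype(np.uint8)

# A toy intensity "slice": bright pixels stand in for candidate tumor tissue.
slice_ = np.array([[10, 200, 30],
                   [220, 15, 210],
                   [25, 240, 20]])
mask = threshold(slice_, t=128)  # bright pixels become 1, background becomes 0
```

The resulting mask isolates the four bright pixels, which is the feature-emphasis behaviour the paragraph attributes to thresholding.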

Image Classification Techniques

SVM

SVM is a family of supervised classification rules. One-against-the-rest classifiers are combined to build a multi-class SVM, each one separating the data with the largest margin. Features extracted from each input image are sent to the multi-class SVM trainer, which builds the SVM model. The test data are then given to the SVM classifier, together with the training data, to produce the final classification result [26]. Error-correcting output codes are designed to combine many binary classifiers, which makes them useful for multi-class learning, and SVM is one such binary classifier. First of all, therefore, the classifier must be trained using the available training data. The trained classifier, built from the training data, is then used within this scheme to determine the class of a new input sample. In this way, an error-correcting output-code classifier reduces a multi-level classification task to a set of simple binary classifiers.
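A minimal linear SVM can be trained by sub-gradient descent on the hinge-loss objective (a Pegasos-style sketch, not the chapter's actual implementation). The bias term is omitted for brevity, so features are assumed roughly centred, and the toy 2-D data below is invented (NumPy assumed):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=100, seed=0):
    """Pegasos-style SVM: minimise lam/2*||w||^2 + mean hinge loss. y in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)              # decreasing Pegasos step size
            if y[i] * (X[i] @ w) < 1:          # margin violated: hinge sub-gradient
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                              # margin satisfied: only regulariser
                w = (1 - eta * lam) * w
    return w

def predict(X, w):
    return np.where(X @ w >= 0, 1, -1)

# Toy 2-D feature vectors standing in for "tumor" (+1) / "non-tumor" (-1) classes.
X = np.array([[2.0, 2.2], [2.5, 1.8], [3.0, 2.4],
              [-2.0, -2.1], [-2.4, -1.7], [-3.1, -2.2]])
y = np.array([1, 1, 1, -1, -1, -1])
w = train_linear_svm(X, y)
```

A one-against-the-rest scheme, as the section describes, would train one such weight vector per class and assign a new sample to the class with the largest score.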

Literature Survey

An approach to medical imaging for the detection of many types of cancer is presented in [14]. That study examines in detail how hybrid recognition techniques have been applied to analyse different cancer types and treatment classes. Three main cancer problems, breast, liver, and brain tumors, and the datasets used for them are discussed at length. A 3D field-programmable gate array (FPGA) ultrasound computed-tomography system with visual reconstruction algorithms was employed to analyse malignant disease, and an overview of the various stages of cancer detection and diagnostic advances is given. The brain [22] is a mass of billions of cells forming the major part of the human central nervous system (CNS). A brain tumor is an abnormal growth that occurs when cells multiply uncontrollably, and it can cause death if not found early. The most common types of brain tumors include meningioma, glioma, and pituitary tumors. Experts face difficulties in recognising and classifying brain tumors, and accuracy depends on their expertise. Information technology should be used to overcome these obstacles. As clinical standards improve, artificial intelligence is becoming very powerful at detecting and classifying brain tissue, and researchers are very interested in it; work is under way using various machine-learning models and deep learning [28]. Various machine-learning algorithms have been used to detect tumors and classify them from MRI images, including SVM, KNN, and CNN. These algorithms, however, do not give excellent results.
Thus, a DNN-based VGG-16 network was proposed to improve the accuracy on MRI images across multiple categories of brain tumor; after 20 epochs of training, the proposed model gives good results with 98% accuracy. To recognise brain cancers with MRI, [23] suggested a hybrid paradigm combining a Convolutional Neural Network (CNN) with Neural Autoregressive Distribution Estimation (NADE). The features were recovered automatically and the records propagated through the model. The method examined 3064 CE-MRI images of 233 individuals, including 1426 glioma images, 708 meningioma images, and 930 pituitary-tumor images, and the proposed method was shown to achieve a classification accuracy of 95.3%; the results deserve further discussion. Brain diseases, including brain tumors and multiple sclerosis, can be recognised using MRI [24] as a definitive modality for their detection.


In the brain, a neoplastic disease occurs when abnormal brain cells grow uncontrollably; multiple sclerosis (MS) is a prolonged disease that disturbs the nervous system. Nuclear magnetic resonance (MRI) is the best technique to perceive and diagnose both multiple sclerosis (MS) and malignancy. Since cancer and multiple sclerosis appear so similar, a clinical error in the diagnosis can cause harm or even death to the patient. The authors therefore employed a convolutional neural network for the simultaneous diagnosis of brain tumors and multiple sclerosis, and were able to diagnose cancer and multiple sclerosis with 96% accuracy. Márton Tóth, László Ruskó and Balázs Csébfalvi [15] proposed a method for the detection of anatomical regions in 3D medical images. Using this method, each axial slice of the image is labelled with the anatomy to which it corresponds; the regions found were head (with neck), chest, abdomen, pelvis, and legs. Slice-wise classification, popular image-processing techniques, and machine-learning tools were used. To avoid misclassifications, the preprocessing step of the method incorporates the expected order and size of the anatomical regions. The proposed approach was tested on whole-body (torso) and part-body MRI examinations and found to be effective. High-end dynamic systems have been created for medical image processing, as determined with the help of [17]. Medical visualisation is the main diagnostic tool to find the presence of specific problems. Because of the imaging environment, the limitations of the body-imaging system, and limiting factors such as noise and blur, obtaining a good image can prove difficult. Super-resolution approaches can be used to improve such images and avoid this. Consequently, increasing the image resolution should increase the fundamental capacity to diagnose and recommend treatment; moreover, improved resolution may greatly advance automatic detection.

The Proposed Methodology

Image processing has been a critical issue in many areas, such as biomedical research as well as scientific practice. Medical image processing has made great strides in detecting and segmenting brain tumors after detection by computed tomography (CT). Segmentation of brain tumors on a CT scan is challenging because of the variety of shape, region, and intensity in the image. This section investigates the analysis of brain-tumor images using feature extraction and binary segmentation strategies with cutting-edge optimization methods.

5 Brain Tumor Segmentation of MR Images Using SVM and Fuzzy …


Brain Tumor Analysis Methodology

A brain tumor is a very dangerous disease: an intracranial mass formed by an abnormal growth of tissue within the brain or within a region of it. An efficient technique is used that labels images as non-tumor brain or brain tumor and then segments the tumor part from CT images of the brain. This method includes four steps: 1. pre-processing, 2. feature extraction, 3. classification, and 4. segmentation.

First of all, the preprocessing step is applied to the CT brain images to improve image quality. Then features are extracted from the preprocessed images, and the corresponding features are used to train the classifier, which labels normal and abnormal CT brain images on the basis of a threshold value [26]. After the classification step, the brain-tumor component undergoes segmentation: a clustering technique is applied together with an optimization procedure. The optimization methods are used to enhance segmentation accuracy, and the optimal centroid is found to extract the tumor-specific part of the brain image.

Pre-Processing

Noise-reduction and image-sharpening techniques play an essential role when image quality matters. Adaptive median filtering improves on the plain filter by allowing the filter window to change its size: the window is grown where the density of neighboring noise is estimated to be high. The modified median filter is based on a gradient comparator in which the weighted leading edge can be controlled as a local weight operator [19]. Median filtering is one of the well-known techniques applied to reduce impulsive noise levels in digital images (Fig. 5.1). The plain median filter is not always able to separate clean pixels from noisy ones: if the image is noisy and the values of the neighboring pixels cluster around 0 and 255, the filter corrupts good pixels, a failure that the adaptive median filter avoids.
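The window-growing behavior described above can be sketched as follows. This is a minimal NumPy illustration of an adaptive median filter (window capped at 7 × 7, edge padding), not the chapter's implementation; the demo image is invented.

```python
import numpy as np

def adaptive_median_filter(img, max_window=7):
    """Remove impulse (salt-and-pepper) noise; the window grows while the
    local median itself looks like an impulse."""
    pad = max_window // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            cy, cx = y + pad, x + pad
            for k in range(1, pad + 1):
                win = padded[cy - k:cy + k + 1, cx - k:cx + k + 1]
                zmin, zmed, zmax = win.min(), np.median(win), win.max()
                if zmin < zmed < zmax:                 # median is not an impulse
                    if not (zmin < img[y, x] < zmax):  # center pixel is an impulse
                        out[y, x] = zmed
                    break                              # otherwise keep the pixel
                # else: enlarge the window and try again; if the largest window
                # still fails, the original pixel is kept
    return out

# Tiny demo: a gradient image corrupted with a single salt impulse
img = np.arange(81, dtype=float).reshape(9, 9)
img[4, 4] = 255.0
clean = adaptive_median_filter(img)
print(clean[4, 4])   # 41.0 (the impulse replaced by the local median)
```

Clean pixels such as `img[2, 2]` pass the impulse test at the smallest window and are left untouched, which is the key advantage over a plain median filter.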


Fig. 5.1 Block diagram of the segmentation process

Feature Extraction

The feature-extraction step is used to extract distinctive features from the brain computed-tomography snapshots after the image-preprocessing techniques have been applied. In pattern recognition and image processing, feature extraction is a special form of dimensionality reduction: when the input data are too large to be processed and are suspected of being redundant, the input data are transformed into a reduced set of representations (a feature vector). The brain CT scans are validated as part of the feature-extraction approach. Feature extraction converts the input image values into a set of features; the features deliberately used in this study are described below.

Gray-Level Run-Length Matrix (GLRLM)

The gray-level run-length method is an extraction technique that meets the growing demand for quantitative texture information. The number of gray levels G in the image is usually reduced by pre-quantization before matrix accumulation [28]. The texture matrix from which features can be extracted for texture processing is called the GLRLM. In texture terms, a run is a set of consecutive pixels having the same gray level in a specific direction, and the number of such pixels is called the run length. The gray-level run-length matrix is a two-dimensional array in which each element z(u, v | θ) is the number of runs of gray level u and length v in direction θ.

The GLRLM obtains more information by using the distribution of the gray-level values [28]. These statistics give insight into the textural aspect of the image. They are the following:

• Short-run emphasis (SRE).
• Long-run emphasis (LRE).
• Gray-level non-uniformity (GLN).
• Run-length non-uniformity (RLN).
• Run percentage (RP).
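As an illustration of how the five run-length statistics (SRE, LRE, GLN, RLN, RP) follow from the matrix, the sketch below builds a horizontal (0°) GLRLM with NumPy and evaluates one common reading of the formulas in [28]; it is a simplified sketch, not the chapter's code.

```python
import numpy as np

def glrlm_horizontal(img, levels):
    """GLRLM for 0-degree runs: M[g, r-1] counts runs of gray level g, length r."""
    M = np.zeros((levels, img.shape[1]), dtype=int)
    for row in img:
        val, length = row[0], 1
        for v in row[1:]:
            if v == val:
                length += 1
            else:
                M[val, length - 1] += 1
                val, length = v, 1
        M[val, length - 1] += 1          # close the last run of the row
    return M

def glrlm_features(M):
    _, r = np.indices(M.shape)
    r = r + 1                            # run lengths are 1-based
    n_runs = M.sum()                     # total number of runs
    n_pixels = (M * r).sum()             # total number of pixels covered
    return {
        "SRE": (M / r**2).sum() / n_runs,           # short-run emphasis
        "LRE": (M * r**2).sum() / n_runs,           # long-run emphasis
        "GLN": (M.sum(axis=1)**2).sum() / n_runs,   # gray-level non-uniformity
        "RLN": (M.sum(axis=0)**2).sum() / n_runs,   # run-length non-uniformity
        "RP":  n_runs / n_pixels,                   # run percentage
    }

img = np.array([[0, 0, 1],
                [1, 1, 1]])              # runs: (0, len 2), (1, len 1), (1, len 3)
M = glrlm_horizontal(img, levels=2)
print(glrlm_features(M)["RP"])           # 3 runs over 6 pixels -> 0.5
```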

Classification

After the feature-extraction step described above, a classification technique is used to identify the tumor part. Image classification separates the analyzed images into predefined classes using available methods that distinguish patterns in the images from examples of the targets. After feature extraction, the classification step is concerned with distinguishing normal and abnormal images. Note that, because of the different classification and segmentation procedures, two frameworks for tumor analysis are considered here: A. assessment of brain tumors based on the fuzzy method; B. analysis of brain tumors based entirely on SVM with the fuzzy method.

The Assessment of a Brain Tumor Based Mainly on a Fuzzy Methodology

The block diagram of the proposed method is shown in Fig. 5.2 (see also Fig. 5.1). The pipeline starts with the preprocessing stage, where median filtering is performed on the brain CT images to remove errors and improve the clarity of the brain images. Feature extraction is then carried out, and the features extracted from the images are used to train the ANFIS classifier, which classifies normal and abnormal images [22].

Classification Using the Adaptive Neuro-Fuzzy Inference System (ANFIS) Classifier

The Adaptive Neuro-Fuzzy Inference System (ANFIS) is a well-recognized approach that combines the benefits of two techniques, named below: • Artificial neural network.


Fig. 5.2 Brain-tumor analysis based on the fuzzy method

• Fuzzy inference system.

Artificial Neural Network (ANN)

An ANN is a computational structure inspired by the biological neurons observable in the brain. Neural networks are usually arranged in three layers in which many interconnected nodes contain an "activation function" [18]. Each neuron applies its activation function to its net input to determine its output signal. An artificial neural network has three kinds of layer, namely an input layer, hidden layers, and an output layer. Input patterns enter the system through the input layer and pass to one or more hidden layers, where the actual processing is done through a system of weighted connections; the hidden layers then link to the output layer, where the required output is produced. An ANN can recognize and learn correlated patterns between countless inputs and estimate the target values. The ANN mimics the learning style of the human brain and can solve problems involving imprecise and complex data, even when the data are inaccurate and noisy.
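The three-layer arrangement (input layer, hidden layer with weighted connections, output layer, each neuron applying an activation to its net input) can be sketched as a single forward pass. The layer sizes, random weights, and sigmoid activation below are illustrative choices, not the chapter's network.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """Input layer -> hidden layer -> output layer."""
    h = sigmoid(W1 @ x + b1)   # hidden layer: weighted sum plus activation
    y = sigmoid(W2 @ h + b2)   # output layer
    return y

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 5, 4, 2                        # illustrative sizes
W1, b1 = rng.normal(size=(n_hidden, n_in)), np.zeros(n_hidden)
W2, b2 = rng.normal(size=(n_out, n_hidden)), np.zeros(n_out)

y = forward(rng.normal(size=n_in), W1, b1, W2, b2)
print(y.shape)   # (2,): one activation per output neuron, each in (0, 1)
```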


Fuzzy Inference System

A fuzzy or neuro-fuzzy neural-network system is a machine-learning device that overcomes the limitations of fuzzy logic (fuzzy sets, fuzzy rules) by exploiting function approximation from neural networks. Fuzzy inference is the detailed mapping from an input fuzzy set to an output fuzzy set produced using fuzzy rules. Gradient-descent and backpropagation algorithms are used to fit the membership-function parameters (fuzzy units) together with the defuzzification weights (neural networks) of fuzzy neural networks [19]. A basic form of the fuzzy inference structure is shown in the accompanying figure.

The structure maps a crisp input to a linguistic variable using membership functions stored in the fuzzy knowledge base. It comprises three stages that carry the inputs of the structure through to its outputs: 1. fuzzification, 2. membership and rule generation, and 3. defuzzification.

Database: A database defining the membership functions of the fuzzy sets used in the fuzzy rules.

Fuzzification: The method of converting crisp input values into linguistic values is called fuzzification and comprises two steps. First, the incoming values are converted into linguistic concepts represented by fuzzy sets. Linguistic variables are input or output variables whose values are words taken from natural language rather than numbers. Here, the membership functions are applied to the measured values, and the degree of truth of each premise is determined [21].

Membership generation and rules: Membership functions are used as part of the FIS fuzzification and defuzzification steps to map crisp input values to fuzzy linguistic terms and vice versa; the membership function is used to evaluate the linguistic terms. The most popular membership-function shapes are triangular and trapezoidal, along with Gaussian shapes. To generate rules in the FIS, a rule base is assembled for controlling the output variable. A fuzzy rule is a simple if-then rule with a condition and a conclusion; evaluation of the rule antecedents and implication of the rule consequents are operations performed with fuzzy sets.

Defuzzification: If a crisp output is required, the resulting fuzzy output must be defuzzified. This step is performed at the end of the FLS by the defuzzifier component. Defuzzification is carried out through the membership of the output variable and can be done with special techniques, for example, centroid of gravity, bisector, mean of maximum, smallest of maximum, and largest of maximum.
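A minimal Mamdani-style pass through the three stages (fuzzification with a triangular membership function, one clipped rule, centroid defuzzification) might look like this. The variable names, the single rule, and the output universe are invented for illustration only.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Universe of discourse for a hypothetical output variable
u = np.linspace(0.0, 14.0, 1401)

# Fuzzification: degree to which a crisp input belongs to the set "high"
crisp_in = 7.0
mu_high = tri(crisp_in, 5.0, 10.0, 15.0)            # membership = 0.4

# Rule: IF input is high THEN output is "large" (min / clipping implication)
out_large = np.minimum(tri(u, 4.0, 8.0, 12.0), mu_high)

# Defuzzification by centroid of gravity over the output universe
crisp_out = (u * out_large).sum() / out_large.sum()
print(round(crisp_out, 2))   # 8.0 (centroid of the symmetric clipped set)
```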

The Adaptive Neuro-Fuzzy Inference System (ANFIS) Classifier

ANFIS is an adaptive network: a network of nodes connected by directed links. It is called adaptive because some or all of the nodes have parameters that affect the output of the node. These networks learn a relationship between inputs and outputs. The ANFIS method derives the membership requirements and functions from the data. Classification is done in two basic steps, namely a learning phase and a validation phase; in the training phase, the classifier is trained with the extracted training features.

Support Vector Machine (SVM) with the Fuzzy Method

This is the support-vector-machine-based assessment of brain tumors combined with the fuzzy method. The block diagram of this analysis is shown in Fig. 5.3. For better classification results, we turn to the SVM classifier with the fuzzy method: after the feature-extraction step is completed, the collected features are used for training by the SVM classifier.

SVM

The SVM is a supervised machine-learning approach, drawn from statistical learning theory, for classification and regression. The SVM approach is to find the hyperplane that maximizes the margin and bounds the classification error when the classes are linearly separable; to handle the nonlinear case, kernel functions are introduced within the SVM, making it a useful machine for nonlinear classification as well. There are two key stages in the SVM process: a training phase and a testing phase. Given a collection of training examples, each marked as belonging to one of two classes, the SVM training algorithm builds a model that assigns new examples to one class or the other.
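The two-phase process (training, then testing on new examples) can be illustrated with a bare-bones linear SVM trained by subgradient descent on the hinge loss. This stands in for a library SVM and is not the chapter's implementation; the two synthetic feature clusters are invented.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=100, seed=0):
    """Training phase: minimize hinge loss + L2 penalty; y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            if y[i] * (X[i] @ w + b) < 1:        # point violates the margin
                w = (1 - lr * lam) * w + lr * y[i] * X[i]
                b += lr * y[i]
            else:                                # only shrink (regularize) w
                w = (1 - lr * lam) * w
    return w, b

def predict(X, w, b):
    """Testing phase: the side of the hyperplane decides the class."""
    return np.sign(X @ w + b)

# Two linearly separable synthetic "non-tumor" / "tumor" feature clusters
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w, b = train_linear_svm(X, y)
print((predict(X, w, b) == y).mean())   # training accuracy on separable data
```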


Fig. 5.3 Brain-tumor analysis based on SVM with the fuzzy method

Image Segmentation Using Fuzzy C-Means Clustering

Image segmentation is an important tool for reducing information in image processing. The purpose of image segmentation is to separate significant structures from irrelevant data in order to ease interpretation. In general, image-segmentation techniques can be divided into four classes: thresholding, clustering, edge detection, and region-based selection. Here the clustering is based on fuzzy C-means [27]. FCM can be considered for image segmentation together with the segmentation process it goes through.


Fuzzy C-Means (FCM) Clustering

In image segmentation, the FCM tool is an unsupervised approach. FCM clustering is widely used, and the FCM result depends on the initial cluster centers or the initial membership values assigned to the pixel values of the tumor image [14]. The FCM algorithm starts with a set of initialized cluster centers (or arbitrary membership values) within a given image. To refine the center of the tumor cluster, an optimization procedure is applied to a portion of the image [21]. In the segmentation step, the classified tumor image serves as the input to the segmentation framework, which delimits the tumor part of the image. The main objective of the iterative clustering of the fuzzy C-means algorithm is to minimize the within-cluster sum of squared error [13].
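The iterative objective just described, alternately updating memberships and centers to minimize the weighted within-cluster squared error, can be sketched as follows. The fuzzifier m = 2 and the 1-D toy intensities are illustrative choices, not the chapter's data.

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Fuzzy C-means: returns cluster centers and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)               # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]   # membership-weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))            # closer center -> larger weight
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two obvious intensity clusters, e.g. background vs tumor gray levels
X = np.array([[10.0], [11.0], [12.0], [200.0], [201.0], [202.0]])
centers, U = fcm(X, c=2)
print(np.sort(centers.ravel()))   # roughly [11, 201]
```

Thresholding `U` at its per-point maximum turns this soft partition into the hard segmentation used later in the chapter.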

Optimization Algorithm

Optimization guides the process of choosing a good candidate from the set of feasible solutions so as to maximize or minimize the cost of the problem. This technique takes a segmented image and optimizes the centroid of the tumor image with maximum accuracy (Fig. 5.4). Two optimizers are considered: 1. gray wolf optimization; 2. the social spider optimization (SSO) technique with a genetic algorithm (GA).

Gray Wolf Optimization

Gray wolf optimization (GWO) is a recently introduced evolutionary algorithm, built on the observation that gray wolves thrive by breeding and hunting in a pack. The gray wolf begins the migration operation by picking one pack.

Fig. 5.4 Segmentation process


Fig. 5.5 Flowchart of GWO

Fig. 5.6 Sample CT brain image database

The alpha wolf has the best fitness value, known as the best CP. The other packs move toward a better position. In all packs (except the best one) the number of migrating wolves need not be the same. Gray wolves whose fitness score has improved are promoted at every step, and the valuation of a pack can reach that of the best pack. After a migration operation, a pack may have gained wolves, so a selection operation is performed to reduce the size of the pack. In the selection operation, the fitness score of each wolf is ranked and processed in ascending order; the selected individuals are placed in a fitness-evaluation pool, where mutation operators can randomly generate new individuals to replace the wolves chosen for resettlement. Figure 5.5 shows a graphical view of the GWO (see Figs. 5.6 and 5.7).
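For concreteness, the canonical GWO position update (each wolf encircling the three best wolves, with the exploration coefficient shrinking over time) can be sketched as below. This follows the standard formulation rather than the migration-and-selection variant described above, and the sphere objective is a toy stand-in for the centroid-accuracy objective.

```python
import numpy as np

def gwo(f, dim, n_wolves=25, iters=300, lb=-5.0, ub=5.0, seed=0):
    """Canonical grey wolf optimizer minimizing f over [lb, ub]^dim."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, dim))
    for t in range(iters):
        order = np.argsort([f(x) for x in X])
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
        a = 2.0 * (1.0 - t / iters)                  # decreases linearly 2 -> 0
        for i in range(n_wolves):
            pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                A = a * (2.0 * rng.random(dim) - 1.0)   # exploration coefficient
                C = 2.0 * rng.random(dim)
                pos += leader - A * np.abs(C * leader - X[i])
            X[i] = np.clip(pos / 3.0, lb, ub)        # average of the three pulls
    return min(X, key=f)

best = gwo(lambda x: float((x ** 2).sum()), dim=2)
print(float((best ** 2).sum()))   # small value: the pack collapses near the optimum
```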


Fig. 5.7 Comparative analysis for Validation 1

Limitations of the First Framework (ANFIS Classifier, FCM with GWO Optimization) • The main drawback of the ANFIS learning system is the time lost when choosing the type of network structure. • SVMs have a unique solution because the optimization problem is convex. • This is an advantage compared with neural networks, which have many solutions associated with local minima and may therefore not be robust over some samples. • The widely used gray wolf optimization (GWO) algorithm has several limitations: low precision, slow convergence, and a weak local search capability. • To overcome these negative aspects of GWO, a social spider optimization (SSO) technique with a genetic algorithm (GA) is proposed, which solves constrained optimization problems and likewise achieves the best accuracy for the segmented tumor part of the image.


Optimal Centroid for FCM 1. In the segmentation step, the distinct values obtained from repeated FCM runs turn out to give the most productive FCM centroids. 2. Finally, the use of the optimal centroid leads to the extraction of the tumor component from the CT brain image. 3. The most useful FCM centroid guarantees the highest accuracy of the proposed work.

Result and Discussion

This section presents the classification results of the process carried out on CT images of the brain. On the CT scans, tumor and non-tumor parts are classified using the adaptive neuro-fuzzy inference system (ANFIS) and support vector machine (SVM) classifiers [25]. The proposed approach performs tumor and non-tumor classification; once classification is finished, the malignant brain scans are further segmented to extract the site of the tumor from these CT images of the brain. The performance of each classifier is measured in terms of sensitivity, specificity, and accuracy. Sensitivity is the measure that determines the probability of truly positive results for the patients who have a tumor. Specificity is the measure that determines the probability of truly negative results for the patients who do not have a tumor. Accuracy is the measure that determines the proportion of results that are correctly labeled (Fig. 5.8).

Database description

This data set covers various slice thicknesses, noise levels, and intensity-inhomogeneity ranges; weighted modality images with 1 mm slice thickness, 3% noise, and 20% intensity non-uniformity were used in our study. The most recent data set, assembled by expert radiologists, included images of all modalities. This data set provided ground-truth images against which the results of our procedure were compared with manual assessment by radiologists. The images used for tumor detection came from open sources, including scans from M/s Aarthi, Tirunelveli, India. This series contains CT images comprising tumor images (150 in number) as well as non-tumor images (60 in number), as shown in Fig. 4.1. The set of brain images is divided into two parts, a training data set and a test data set, and all test images are produced on 512 × 512 grids with 256 individual gray levels.


Fig. 5.8 Comparative analysis for Validation 2

Classification Validation Results

The adaptive neuro-fuzzy inference system (ANFIS) and support vector machine (SVM) techniques are used to assess the CT brain-tumor classification method. The whole set of data is split into subgroups for this purpose: one subset is held out in each round, and the classifier is trained using the remainder; to validate the analysis, the classifier is then applied to the held-out subgroup. This procedure is repeated until every subgroup has been used. The results are divided into True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN) classes. TP: tumor region correctly marked as tumor; TN: normal region correctly marked as normal; FP: normal region wrongly marked as tumor; FN: tumor region wrongly marked as normal (Table 5.1). The SVM algorithm's principle is to use a function, the SVM kernel function, to convert a nonlinear separation problem into a linear one: the nonlinear samples are transformed by the kernel function into a high-dimensional feature space, where the division of the nonlinear samples or data becomes possible, resulting in a supporting classification (Fig. 5.9).
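The three measures follow directly from the confusion counts. The helper below reproduces, to two decimal places, the first row of Table 5.1 (TP = 10, TN = 3, FP = 7, FN = 7); the function name is ours.

```python
def classification_metrics(tp, tn, fp, fn):
    """Sensitivity, specificity, and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)                 # true-positive rate
    specificity = tn / (tn + fp)                 # true-negative rate
    accuracy = (tp + tn) / (tp + tn + fp + fn)   # fraction correctly labeled
    return sensitivity, specificity, accuracy

# First row of Table 5.1: TP=10, TN=3, FP=7, FN=7
sens, spec, acc = classification_metrics(10, 3, 7, 7)
print(round(sens, 2), round(spec, 2), round(acc, 2))   # 0.59 0.3 0.48
```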


Table 5.1 Classification results for SVM algorithm

Image | TP | TN | FP | FN | Sensitivity | Specificity | Accuracy
1     | 10 | 3  | 7  | 7  | 0.59        | 0.3         | 0.48
2     | 5  | 3  | 9  | 1  | 0.333333    | 0.25        | 0.3
3     | 9  | 4  | 8  | 6  | 0.6         | 0.33        | 0.48
4     | 15 | 6  | 4  | 2  | 0.882353    | 0.6         | 0.78
5     | 7  | 5  | 7  | 8  | 0.466667    | 0.42        | 0.44
6     | 4  | 8  | 7  | 8  | 0.333333    | 0.53        | 0.44
7     | 7  | 5  | 7  | 8  | 0.466667    | 0.42        | 0.44
8     | 19 | 3  | 3  | 2  | 0.9         | 0.5         | 0.81
9     | 8  | 7  | 9  | 3  | 0.73        | 0.44        | 0.56
10    | 7  | 3  | 15 | 2  | 0.78        | 0.17        | 0.37
11    | 5  | 10 | 12 | 0  | 1           | 0.45        | 0.56
12    | 6  | 6  | 11 | 1  | 0.86        | 0.35        | 0.5
13    | 9  | 5  | 8  | 5  | 0.64        | 0.38        | 0.52
14    | 9  | 5  | 5  | 8  | 0.53        | 0.5         | 0.52
15    | 10 | 3  | 11 | 3  | 0.77        | 0.21        | 0.48

Fig. 5.9 Comparative analysis for Validation 3


Table 5.2 Classification results for the fuzzy classifier algorithm

Image | TP     | TN      | FP   | FN   | Sensitivity | Specificity | Accuracy
1     | 1985   | 62,237  | 1094 | 220  | 0.9         | 0.98        | 0.98
2     | 0      | 57,472  | 1441 | 6623 | 0           | 0.98        | 0.88
3     | 0      | 52,622  | 6262 | 6652 | 0           | 0.89        | 0.8
4     | 0      | 61,495  | 1186 | 2855 | 0           | 0.98        | 0.94
5     | 128    | 61,495  | 1252 | 2661 | 0.045895    | 0.98        | 0.94
6     | 1852   | 59,643  | 1186 | 2855 | 0.393457    | 0.98        | 0.94
7     | 1244   | 60,253  | 1185 | 2854 | 0.303563    | 0.98        | 0.94
8     | 0      | 61,495  | 2458 | 1583 | 0           | 0.96        | 0.94
9     | 0      | 61,595  | 1288 | 2753 | 0           | 0.98        | 0.94
10    | 9643   | 249,699 | 2589 | 213  | 0.98        | 0.99        | 0.99
11    | 17,364 | 241,943 | 2585 | 252  | 0.99        | 0.99        | 0.99
12    | 6745   | 253,632 | 1542 | 225  | 0.97        | 0.99        | 0.99
13    | 6152   | 254,002 | 1676 | 314  | 0.95        | 0.99        | 0.99
14    | 6521   | 253,535 | 1776 | 312  | 0.95        | 0.99        | 0.99
15    | 5841   | 254,213 | 1876 | 214  | 0.96        | 0.99        | 0.99

SVM is the default choice for brain-tumor classification after feature selection with kernel-based class separability. The accuracy, sensitivity, and specificity of the SVM algorithm can all be measured. The table above shows the confusion-matrix terms TP, TN, FP, and FN obtained from the classifier output and the ground truth, from which accuracy, sensitivity, and specificity are computed (Table 5.2). The performance data for FCM without optimization are shown above. Fuzzy C-means gives a soft segmentation that may be converted into a hard segmentation by enrolling pixels in the clusters with the highest membership coefficients. The goal of clustering is to form decision boundaries based on unlabeled training data. In a multidimensional feature space, clustering is the process of recognizing natural groupings; since clusters of varied shapes and sizes can occur in a complex feature space, this is challenging.

Validation 1

In a classification strategy, generally speaking, every classifier is tried using a set of first-order features, and the overall performance of each classifier is measured for this set of features. In this chart, the performance of each classifier is calculated, and the basic attributes of the first-order histogram set are used for classification. Figure 5.7 reports the Validation 1 procedure and gives the classified images and the performance measures, sensitivity, specificity, and accuracy, for various systems, namely ANFIS, support vector machine (SVM), backpropagation neural network (BNN), and radial basis function network (RBFN). Here the accuracy of SVM exceeds the other methods by 7.2 to 10%, and the other techniques also differ in quality by 5.26%. These basic checks classify three images as tumors and two live scans as tumor-free images. The BNN classifier is more efficient with respect to the accuracy rate, which is noticeably better than state-of-the-art classifiers such as RBFN and ANFIS. Accuracy increases as the range of possibilities grows; this method is achieved at a high cost for accuracy in comparison with the other available generic approaches. The sensitivity increases with the growth of the range of values, and this process shows higher sensitivity than the assessments using the other kinds of technology. In the basic control framework, 100% accuracy is achieved, as opposed to the individual techniques.

Validations 2 and 3

Classification models are in many ways like discriminant functions and show only a slight difference when evaluated with quantitative assessment techniques. Also, and above all, the models are not mutually trivial and genuinely match one another. Figures 5.8 and 5.9 show the performance analysis of the metrics for Validations 2 and 3. These two tests correctly classify brain scans as tumor or non-tumor with an excellent accuracy of 95.4%. Alongside the ANFIS process, a version of the classifier is also built with the corresponding structure for verification and assessment. The test procedure, data set, and method are applied to test the accuracy and efficiency of the network trained for classifying brain tumors. Incoming brain CT scans are classified by the backpropagation neural network, and the support vector machines are trained with a linear kernel function. Overall, the ranking accuracy on the validation data set is 92.3 to nearly 100%. A side effect of the extensive structure of the framework is that the classifier tends to be overly sensitive to the training data and may exhibit little ability to generalize to unseen samples. The outcome is further determined by a smoothing parameter, which also plays a large part in the ANFIS classifier; the appropriate smoothing parameter often depends on the actual data. The test performance of the SVM classifier is determined by computing quantitative measures, such as sensitivity, specificity, and accuracy, for the various classifiers. Validation 3 achieves better results than Validation 2: maximum accuracy reached 95.26%, and sensitivity and specificity reached 96.23% in the comparative assessment. Moreover, higher accuracy and sensitivity scores together with lower specificity scores mark the best performance.

Validations 4 and 5

Figure 5.10 presents a comparative analysis for Validations 4 and 5. The classifier of the advanced classification framework, a multiclass SVM, is used to classify the images. To test our feature-selection algorithm and SVM classifier, we perform the classification with three classifiers, ANFIS, BNN, and RBFN, after applying the selection-function algorithm. Most validation records reach their highest accuracy with the linear kernel using the SVM tool. Five total validation records can be considered, and at each validation we use five images. In the validation procedure the linear SVM gives 100% accuracy, and compared with ANFIS and BNN, the accuracy of SVM is 80% higher than that of the other strategies. If the overall polynomial-kernel performance of RBFN and the proposed SVM is contrasted with the overall performance of ANFIS, then the deviation in accuracy is 80% (Table 5.3).

Fig. 5.10 Comparative analysis for validation 5

Table 5.3 Classification performance analysis

Original images | Classified image | ANFIS | SVM
Tumor  | Tumor  | 85.62 | 90
Tumor  | Normal | 86.25 | 80
Tumor  | Normal | 95.45 | 95.68
Tumor  | Tumor  | 98.2  | 97.2
Tumor  | Tumor  | 94.5  | 94
Normal | Normal | 96.2  | 96
Normal | Normal | 94.25 | 95.2
Normal | Normal | 75.25 | 78.25
Normal | Normal | 92.2  | 93.2
Normal | Normal | 88.2  | 90.25

The outcomes confirm the importance of whole-body computed tomography in the examination and treatment of patients with secondary brain tumors. However, in patients having a primary brain tumor, our review suggests that an additional CT scan may contribute little. Sometimes, adaptive segmentation and image-classification techniques are ultimately very expensive, troublesome, or even challenging to choose appropriately, as is labeling the training data with true gradings. Learning is a central premise of many SVM- and ANFIS-based algorithms, in which the classifiers must be tuned before they can be connected with questions of segmentation and classification; for different varieties of data, multi-image searches of different kinds, and different applications, all the effort of picking a training series of records must be redone. Table 5.1 shows the classification-accuracy comparison between the original image and the classified images for the two processes. Notably, when some of the features are not added to the classifier, the accuracy is around 94% for the adaptive neuro-fuzzy inference system (ANFIS) and 96.2% for the SVM with feature extraction. The table shows the overall classification accuracy of the classifiers in brain-CT tumor prediction, with the features highlighted for tumor and normal images.

Segmentation Outcome Analysis

In the segmentation step, the tumor component is segmented using the proposed techniques, namely fuzzy C-means with centroid optimization using GWO and SSO with GA. The segmentation output comprises the original image, the classified tumor image, and the segmented image, along with general performance metrics such as sensitivity, specificity, accuracy, Rand index, global consistency error (GCE), and variation of information (VI). For the test procedure, sample images are considered, and the predefined parameters are analyzed for these images.


Table 5.2 shows the segmentation study for FCM with specific metrics, e.g., accuracy, Rand Index (RI), Global Consistency Error (GCE), and Variation of Information (VI). For most brain images the intensity is non-uniform, and converting this non-uniform intensity into a uniform one is extremely difficult; without image segmentation it is hardly possible, and visualizing the structure of such an image is difficult for neurologists or radiologists. The FCM segmentation approach is therefore used to isolate the tumor region. Table 5.3 discusses the segmentation test for FCM with GWO. Scoring drives the quality of automatic segmentation of brain tumors. Conventionally, the anatomy of the brain can be seen using an MRI or CT scan. In this study the CT image is taken as the input image. From the input image a ground-truth image is derived, which refers to information obtained by direct observation. The tumor region is then segmented by fuzzy C-means clustering and further refined through Grey Wolf Optimization (Tables 5.4, 5.5, and 5.6). Table 5.4 shows a hybrid approach to segmentation analysis for FCM with the SSO and GA optimization strategies. The table shows the resulting values of the examined metrics, i.e., accuracy, Rand Index (RI), Global Consistency Error (GCE), and Variation of Information (VI), validated through FCM segmentation with Social Spider Optimization and GA-based centroid optimization. The highest accuracy is obtained over the five brain images. Across all the metrics considered, FCM-SSO with GA yields better results than FCM, FCM with SSO, and FCM with GA. Further, the GCE values show the fewest errors in the proposed framework. The Rand Index for image 1 in the proposed technique is 0.96, whereas for FCM it is 0.60, and a similar trend holds for all the metrics. Likewise, the Variation of Information for image 1 in the proposed hybrid approach is 0.39, whereas for FCM the VI estimate is much higher, reducing the accuracy of the segmented image.
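The fuzzy C-means clustering at the core of these experiments alternates between a centroid update and a membership update; a minimal sketch on 1-D pixel intensities (a hypothetical helper, not the authors' implementation) is:

```python
import numpy as np

def fcm_segment(pixels, n_clusters=2, m=2.0, n_iter=50, seed=0):
    """Minimal fuzzy C-means on a 1-D array of pixel intensities.

    Returns (centroids, membership matrix). Sketch only, not the
    chapter's exact centroid-optimized implementation.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(pixels, dtype=float).reshape(-1, 1)
    # random initial membership matrix U (rows sum to 1)
    u = rng.random((x.shape[0], n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        um = u ** m
        # centroid update: fuzzily weighted mean of the pixels
        c = (um.T @ x) / um.sum(axis=0)[:, None]
        # distance of every pixel to every centroid
        d = np.abs(x - c.T) + 1e-12
        # membership update: inverse-distance weighting
        u = 1.0 / (d ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)
    return c.ravel(), u

# toy image: dark background and a bright "tumor" region
img = np.array([10, 12, 11, 200, 210, 205, 9, 198])
centers, u = fcm_segment(img)
labels = u.argmax(axis=1)  # hard segmentation from fuzzy memberships
```

In a real pipeline the centroid initialization is what GWO or SSO with GA would optimize, replacing the random start used here.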

Comparative Evaluation of Different Segmentation Techniques
Figure 5.11 shows the comparative analysis of various metrics, e.g., sensitivity, specificity, and accuracy, for the different optimization approaches, i.e., FCM, FCM-GWO, FCM-SSO, FCM-GA, and FCM-SSO with GA. Sensitivity, specificity, and accuracy are low for plain FCM when compared with FCM combined with the optimization techniques. FCM with an individual optimization technique gives better results than FCM alone, but the best results are achieved by FCM with the hybrid optimization. Figure 5.12 presents the comparative analysis of metrics such as Rand Index, GCE, and VI for the optimization frameworks FCM, FCM-GWO, FCM-SSO, FCM-GA, and FCM-SSO with GA. In the FCM approach, the Rand Index value is 0.46, GCE is 0.01, and VI is 3.39. The VI estimates are

5 Brain Tumor Segmentation of MR Images Using SVM and Fuzzy …

Table 5.4 Segmentation analysis for FCM

| S.No | Input Image | Ground Truth | Segmented Part | Accuracy | RI | GCE | VI |
|---|---|---|---|---|---|---|---|
| 1 | (image) | (image) | (image) | 0.59 | 0.36 | 0.01 | 3.65 |
| 2 | (image) | (image) | (image) | 0.55 | 0.33 | 0.02 | 3.99 |
| 3 | (image) | (image) | (image) | 0.56 | 0.35 | 0.02 | 4.06 |
| 4 | (image) | (image) | (image) | 0.57 | 0.36 | 0.02 | 3.98 |
| 5 | (image) | (image) | (image) | 0.57 | 0.35 | 0.02 | 3.88 |
| 6 | (image) | (image) | (image) | 0.59 | 0.32 | 0.01 | 5.2 |
| 7 | (image) | (image) | (image) | 0.6 | 0.33 | 0.01 | 4.23 |
| 8 | (image) | (image) | (image) | 0.62 | 0.33 | 0.02 | 4.06 |
| 9 | (image) | (image) | (image) | 0.57 | 0.36 | 0.01 | 3.45 |
| 10 | (image) | (image) | (image) | 0.54 | 0.31 | 0.011 | 3.88 |

too high for FCM compared with the alternative methods, which results in low accuracy of the segmented image. The best results are achieved by FCM with the hybrid optimization, where VI is 0.96, GCE is 0.03, and RI is 0.98.
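The Rand Index and Variation of Information used throughout these tables can be computed from two flat label maps; a small sketch (assuming NumPy arrays of per-pixel labels) is:

```python
import numpy as np
from math import log

def rand_index(a, b):
    """Rand Index between two flat label arrays (pair-counting form):
    fraction of pixel pairs on which the two segmentations agree."""
    a, b = np.asarray(a), np.asarray(b)
    same_a = a[:, None] == a[None, :]
    same_b = b[:, None] == b[None, :]
    agree = same_a == same_b
    iu = np.triu_indices(len(a), k=1)  # pairs i < j only
    return agree[iu].mean()

def variation_of_information(a, b):
    """VI = H(A) + H(B) - 2 I(A;B); 0 means identical segmentations
    (up to a relabeling of the clusters)."""
    a, b = np.asarray(a), np.asarray(b)
    vi = 0.0
    for x in np.unique(a):
        for y in np.unique(b):
            pxy = np.mean((a == x) & (b == y))
            if pxy > 0:
                px, py = np.mean(a == x), np.mean(b == y)
                vi += pxy * (log(px / pxy) + log(py / pxy))
    return vi
```

Note that both metrics are invariant to swapping cluster labels, which is why they suit unsupervised segmentation comparisons.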

Convergence Graph Analysis
Figure 5.13 shows the convergence graph evaluation for the optimization techniques GA, SSO, GWO, and SSO with GA; the graph is drawn between


Table 5.5 Segmentation analysis for FCM with GWO

| S.No | Input image | Ground Truth Image | Segmented part | Accuracy | RI | GCE | VI |
|---|---|---|---|---|---|---|---|
| 1 | (image) | (image) | (image) | 0.98 | 0.96 | 0.04 | 0.4 |
| 2 | (image) | (image) | (image) | 0.88 | 0.78 | 0.04 | 1.09 |
| 3 | (image) | (image) | (image) | 0.81 | 0.68 | 0.17 | 1.59 |
| 4 | (image) | (image) | (image) | 0.94 | 0.88 | 0.04 | 0.59 |
| 5 | (image) | (image) | (image) | 0.91 | 0.98 | 0.02 | 0.35 |
| 6 | (image) | (image) | (image) | 0.9 | 0.98 | 0.02 | 0.3 |
| 7 | (image) | (image) | (image) | 0.97 | 0.99 | 0.01 | 0.24 |
| 8 | (image) | (image) | (image) | 0.99 | 0.92 | 0.03 | 0.24 |
| 9 | (image) | (image) | (image) | 0.99 | 0.88 | 0.02 | 0.28 |
| 10 | (image) | (image) | (image) | 0.99 | 0.87 | 0.01 | 0.27 |

the fitness function (accuracy in the first figure and sensitivity in the second) and the number of iterations. The fitness function ranges from 40 to 100 and the iterations from 0 to 100. The GWO technique gives the lowest accuracy over every range of iterations. The hybrid SSO with GA offers the highest accuracy compared with the individual optimization techniques. As far as sensitivity is concerned, the best results are likewise achieved by the hybrid method and the lowest sensitivity by GA; for example, at the 100th iteration the sensitivity value is 78. Figure 5.14 depicts the sensitivity analysis of six different sets of images with the optimization techniques GWO, GA, SSO, and SSO with GA. The sensitivity values range from 0 to 100. For image 1 the sensitivity of GWO is 93.8%, GA is 84%, SSO is 80%, and SSO with GA is 96%. Sensitivity is tested similarly for the remaining images, and the graph shows that the hybrid method achieves the highest sensitivity for all images. Figure 5.15 shows the specificity evaluation of various brain


Table 5.6 Segmentation analysis for FCM with SSO with GA

| S.No | Input image | Ground Truth Image | Segmented part | Accuracy | RI | GCE | VI |
|---|---|---|---|---|---|---|---|
| 1 | (image) | (image) | (image) | 0.98 | 0.96 | 0.04 | 0.39 |
| 2 | (image) | (image) | (image) | 0.88 | 0.78 | 0.04 | 1.07 |
| 3 | (image) | (image) | (image) | 0.81 | 0.68 | 0.17 | 1.57 |
| 4 | (image) | (image) | (image) | 0.94 | 0.88 | 0.04 | 0.59 |
| 5 | (image) | (image) | (image) | 0.76 | 0.59 | 0.06 | 2.29 |
| 6 | (image) | (image) | (image) | 0.99 | 0.98 | 0.02 | 0.25 |
| 7 | (image) | (image) | (image) | 0.99 | 0.97 | 0.03 | 0.31 |
| 8 | (image) | (image) | (image) | 0.99 | 0.98 | 0.02 | 0.35 |
| 9 | (image) | (image) | (image) | 0.99 | 0.97 | 0.02 | 0.3 |
| 10 | (image) | (image) | (image) | 0.99 | 0.98 | 0.02 | 0.24 |

images with the four optimization techniques GWO, GA, SSO, and SSO with GA. The graph is drawn between specificity, which ranges from 0 to 100, and the six sets of images. The highest specificity value is obtained for image 3 compared with the other images, and in particular with the SSO with GA optimization technique. Figure 5.16 depicts the accuracy of the segmented brain tumor part of the image with the optimization techniques GWO, GA, SSO, and SSO with GA. The GWO value is compared with the hybrid approach value, with a difference of about 0.3% for all six images. Considering the six images, the highest accuracy is achieved by the hybrid technique for the fifth image compared with the individual techniques.
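The optimizers compared in these figures all follow the same loop of fitness evaluation and position update; as one concrete example, a bare-bones Grey Wolf Optimizer (a sketch of the standard alpha/beta/delta update, not the authors' centroid-optimization code) is:

```python
import numpy as np

def gwo_minimize(f, dim, n_wolves=20, n_iter=100, lb=-5.0, ub=5.0, seed=1):
    """Bare-bones Grey Wolf Optimizer for a scalar objective f."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, dim))
    for t in range(n_iter):
        fitness = np.apply_along_axis(f, 1, X)
        order = np.argsort(fitness)
        # the three best wolves lead the pack
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
        a = 2.0 - 2.0 * t / n_iter  # exploration factor decays 2 -> 0
        new = np.zeros_like(X)
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(X.shape), rng.random(X.shape)
            A = 2.0 * a * r1 - a
            C = 2.0 * r2
            D = np.abs(C * leader - X)   # distance to the leader
            new += leader - A * D        # pull toward the leader
        X = np.clip(new / 3.0, lb, ub)   # average of the three pulls
    return X[np.argmin(np.apply_along_axis(f, 1, X))]

# minimize the sphere function as a smoke test
best = gwo_minimize(lambda v: float(np.sum(v ** 2)), dim=2)
```

In the chapter's setting, `f` would score FCM cluster centroids (e.g., by segmentation error) rather than the toy sphere objective used here.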


Fig. 5.11 Comparative analysis of sensitivity, specificity, and accuracy

Fig. 5.12 Comparative analysis for RI, GCE, and VI


Fig. 5.13 Convergence graph analysis

Fig. 5.14 Sensitivity analysis


Fig. 5.15 Specificity analysis

Fig. 5.16 Accuracy analysis

Conclusion
The segmentation of the brain tumor image from the input CT image, based on feature extraction, classification, and segmentation refinement, was tested along with the optimization approaches. Here, two types of classifiers, namely ANFIS and SVM, followed by FCM segmentation and two optimization schemes, GWO and SSO with GA, have been examined to identify the tumor region of the image.


References

1. Khan, H., Masoom Shah, P., Ali Shah, M., ul Islam, S., Rodrigues, J.: Cascading handcrafted features and convolutional neural network for IoT-enabled brain tumor segmentation. Comput. Commun. 153, 196–207 (2020)
2. Ali, F., Riazul Islam, S.M., Kwak, D., Khan, P., Ullah, N., Yoo, S., Kwak, K.S.: Type-2 fuzzy ontology-aided recommendation systems for IoT-based healthcare. Comput. Commun. 119, 138–155 (2018)
3. Mano, L.Y., Faiçal, B.S., Nakamura, L.H.V., Gomes, P.H., Libralon, G.L., Meneguete, I.R., Filho, G.P.R., et al.: Exploiting IoT technologies for enhancing Health Smart Homes through patient identification and emotion recognition. Comput. Commun. 89, 178–190 (2016)
4. Ud Din, I., Guizani, M., Rodrigues, J.J.P.C., Hassan, S., Korotaev, V.V.: Machine learning in the Internet of Things: designed techniques for smart cities. Future Generat. Comput. Syst. 100, 826–843 (2019)
5. Younus, M.U., Islam, S.U., Ali, I., Khan, S., Khan, M.K.: A survey on software defined networking enabled smart buildings: architecture, challenges and use cases. J. Netw. Comput. Appl. 137, 62–77 (2019)
6. Bhatti, F., Shah, M.A., Maple, C., Islam, S.U.: A novel internet of things-enabled accident detection and reporting system for smart city environments. Sensors 19(9), 2071 (2019)
7. Vilela, P.H., Rodrigues, J.J.P.C., Solic, P., Saleem, K., Furtado, V.: Performance evaluation of a fog-assisted IoT solution for e-health applications. Future Generat. Comput. Syst. 97, 379–386 (2019)
8. Limaye, A., Adegbija, T.: HERMIT: a benchmark suite for the internet of medical things. IEEE Internet Things J. 5(5), 4212–4222 (2018)
9. Rajan, P.G., Sundar, C.: Brain tumor detection and segmentation by intensity adjustment. J. Med. Syst. 43(8), 282 (2019)
10. Bhanumathi, V., Sangeetha, R.: CNN based training and classification of MRI brain images. In: 2019 5th International Conference on Advanced Computing & Communication Systems (ICACCS), pp. 129–133. IEEE (2019)
11. Urban, G., Bendszus, M., Hamprecht, F., Kleesiek, J.: Multi-modal brain tumor segmentation using deep convolutional neural networks. In: MICCAI BraTS (Brain Tumor Segmentation) Challenge, Proceedings of Winning Contributions, pp. 31–35 (2014)
12. Yi, D., Zhou, M., Zhao, C., Gevaert, O.: 3-D convolutional neural networks for glioblastoma segmentation. arXiv preprint arXiv:1611.04534 (2016)
13. Kayalibay, B., Jensen, G., van der Smagt, P.: CNN-based segmentation of medical imaging data. arXiv preprint arXiv:1701.03056 (2017)
14. Velusamy, D.P., Karandharaj, P.: Medical image processing schemes for cancer detection: a survey, pp. 1–40 (2014)
15. Toth, M., Rusko, L., Csébfalvi, B.: Automatic recognition of anatomical regions in three-dimensional medical images 35(5), 1240–1251 (2016)
16. Havaei, M., Davy, A., Warde-Farley, D., Antoine, B., Courville, A., Bengio, Y., Pal, C., Jodoin, P.-M., Larochelle, H.: Brain tumor segmentation with deep neural networks. Med. Image Anal. 35, 18–31 (2017)
17. Jithin Saji, I., Kulkarni, R.: Scaling up of low resolution images using super resolution techniques & performing intensity correction for medical imaging. J. Biomed. Eng. Med. Imag. 2(6), 1–40 (2015)
18. Khayati, R., Vafadust, M., Towhidkhah, F., Nabavi, S.M.: Fully automatic segmentation of multiple sclerosis lesions in brain MR FLAIR images using adaptive mixtures method and Markov random field model. Comput. Biol. Med. 38, 379–390 (2008)
19. Layeb, A., Lahouesna, N., Kireche, B.: A multi-objective binary cuckoo search for bi-criteria knapsack problem. I. J. Inf. Eng. Electron. Bus. 4(2), 8–15 (2013)


20. Ramaswamy Reddy, A., Prasad, E.V., Reddy, L.S.S.: Abnormality detection of brain MRI images using a new spatial FCM algorithm. Int. J. Eng. Sci. Adv. Technol. 2(1), 1–7 (2012)
21. Mekhmoukh, A., Mokrani, K., Cheriet, M.: A modified kernelized fuzzy C-means algorithm for noisy images segmentation: application to MRI images. IJCSI Int. J. Comput. Sci. Iss. 9(1), 172–176 (2020)
22. Srikanth, B., Venkata Suryanarayana, S.: Multi-class classification of brain tumor images using data augmentation with deep neural network. Mater. Today Proc. (2021)
23. Hashemzehi, R., Mahdavi, S.J.S., Kheirabadi, M., Kamel, S.R.: Detection of brain tumors from MRI images based on deep learning using hybrid models CNN and NADE. Biocybern. Biomed. Eng. 40(3), 1225–1232 (2020)
24. Halimeh, S., Teshnehlab, M.D.: Diagnosing and classification of tumor and MS in simultaneous magnetic resonance images using convolution neural network. In: CFIS (2019)
25. Mishra, A., Bhatt, N.: A review of predicting heart disease using machine learning model. Ann. For. Res. 65(1), 7516–7520 (2022)
26. Mishra, A.: Cloud Virtual Image Security for Medical Data Processing, p. 317. Taylor and Francis CRC Press, USA (2020). ISBN 9781003038399
27. Mishra, A.: Medical Data Security Using Blockchain and Machine Learning in Cloud Computing, p. 347. Taylor and Francis CRC Press, USA (2020). ISBN 9781003038399
28. Tarannum, A.T., Mishra, J., Bharadwaj, R.: Fixed point result with soft complete metric space. J. Adv. Res. Dyn. Control Syst. 10 (2018)

Chapter 6

Artificial Intelligent Model for Riot and Violence Detection that Largely Affect Societal Health and Local Healthcare System Mahaveer Jain, Praveen Bhanodia, and Kamal K. Sethi

Introduction
Artificial intelligence is the technology of the future; it definitely has the potential to change human life at large. AI essentially provides a brain to a technological system so that it can handle different challenging situations with ease. AI has plenty of uses in the fields of security and healthcare. CCTV surveillance is a common way of securing any place that is under threat. It is very difficult for human beings to monitor these feeds 24/7 to detect anomalies or unwanted activities, and with the increase in surveillance cameras this challenge grows day by day. Human supervision is error-prone; it also opens the possibility of manipulation and requires a specifically well-trained person in the first place. A CCTV system involves a storage mechanism, a remotely situated camera, and an operator. A CCTV camera captures the video and sends it to the monitor of the base station, where the operator views it to find anomalous or suspicious activity or to record evidence; but the identification of an anomaly is proportional to the attention the operator pays to every video feed on the screen. Because of the low operator-to-screen ratio, the many video feeds running on the same screen, and the working environment of the control room, it is not practical for a CCTV operator to monitor every action in the video feeds all the time with absolute focus and caution. Therefore, there is always a chance that some anomalous activity won't be picked up; the operator may effectively go blind from constant viewing of the video feed. There

M. Jain (B) · P. Bhanodia (B) · K. K. Sethi
Department of Computer Science & Engineering, Rabindranath Tagore University, Raisen, MP Bhopal, India
e-mail: [email protected]
P. Bhanodia e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023 A. Mishra and J. C.-W.
Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_6


M. Jain et al.

is a possibility of temporary blindness after 30 to 40 min of vigorous monitoring. Therefore, automating the detection of anomalous behavior in surveillance video is necessary to maximize the effectiveness of CCTV surveillance while minimizing operator overload [1]. Public places like colleges, schools, restaurants, and roads are the most vulnerable from a safety point of view. Normally these places are secured with surveillance cameras, but how to detect anomalies and unwanted behavior from these surveillance videos is a challenge, and it motivates me to find an automated system that could analyze the video for any violence-like activity. Violence detection from surveillance video is a part of activity recognition, which is also used for detecting activities like dancing, painting, singing, and many more. Here I am restricting my research work to detecting only disruptive or violent activities. Violence in any form has an adverse effect on health and the local healthcare system. Violence may be domestic or at the community level; in both forms it may cause many problems: mental, emotional, and physical harm, depression, anxiety, etc. AI may help us detect violence through surveillance cameras so that timely action can be taken to stop it, and if violence has happened, the footage can be used as evidence against the violator. A variety of tasks benefit from the ability to spot violence and abrupt patterns in images and video sequences. For CCTV video surveillance and monitoring, spotting suspicious activity or strange patterns is essential. A video's behavioral saliency can be used to catch viewers' attention [2]. Artificial intelligence is showing its capability to identify and stop common employee theft, scams, insider trading, health problems, and business dangers. AI has been embraced by many large businesses and organizations to identify and stop financial fraud and scams. Social media platforms have successfully employed machine learning to control content like child pornography and fake news. Businesses have been utilizing AI to reduce risk and detect fraud. The goals of this study are as follows:
• To find the scope of AI in the field of security, such as CCTV surveillance.
• To find the scope of AI in the field of healthcare systems.
• To find different methods for violence detection in the field of deep learning and machine learning.
• To study the effect of violence on human health and the local healthcare system.
To make this study qualitative and inclusive, various online resources and research publications have been accessed, and their references are provided at the end of this study.

6 Artificial Intelligent Model for Riot and Violence Detection that Largely …


Table 6.1 Basic concepts of artificial intelligence for violence detection

| S. no. | Feature | Description |
|---|---|---|
| 1 | Computer vision | This feature helps us to extract information from images, videos, and other visual inputs |
| 2 | Centroid | The all-point average position of an object shape or space dimension of an object is called the centroid [7] |
| 3 | Direction | Any object lies along a line from the point where the object is directed |
| 4 | Dimension | Property of a space that measures length, width, and object thickness toward a given direction |
| 5 | Acceleration of images | Change of velocity or speed over the time unit [7] |
| 6 | Spatio-temporal | The feature related to time and space of objects |
| 7 | Violence | Activities that are violent in nature, like fighting, rioting, beating, etc., are called violence |
| 8 | Riot | A violent disturbance of the peace by a crowd |
| 9 | Movement | An action in which objects change their position in videos [7] |
| 10 | Speed | Movement speed of an object from a specific place to another place [7] |

Artificial Intelligence for Riot and Violence Detection
Basic Concepts of AI for Riot and Violence
Some primary features related to anomaly, riot, violence, video, and image are described in Table 6.1. Nowadays computer vision is a topic of great interest for researchers, because many applications deal with video and image analysis. Identification of the actions taken by an object is part of the analysis of photos and videos. The entire process of activity recognition begins with the capture of photos or the recording of videos. Figure 6.1 depicts the fundamental architecture of activity recognition. Researchers have put forth a number of methods for activity recognition that use various feature-representation algorithms [3–6].
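Several of the concepts in Table 6.1 (centroid, speed, direction) reduce to simple geometry on tracked blob coordinates; a minimal sketch with hypothetical helper names is:

```python
import math

def blob_centroid(points):
    """Centroid of a set of (x, y) pixel coordinates (Table 6.1, row 2)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def speed_and_direction(c_prev, c_curr, dt=1.0):
    """Speed (pixels per frame) and direction (radians) of a moving blob,
    from its centroids in two consecutive frames (rows 3, 9, 10)."""
    dx = c_curr[0] - c_prev[0]
    dy = c_curr[1] - c_prev[1]
    return math.hypot(dx, dy) / dt, math.atan2(dy, dx)

# a square blob that moves between two frames
c1 = blob_centroid([(0, 0), (2, 0), (0, 2), (2, 2)])
c2 = blob_centroid([(3, 4), (5, 4), (3, 6), (5, 6)])
v, theta = speed_and_direction(c1, c2)
```

Acceleration (row 5) would then be the change in `v` between successive frame pairs.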

Classification of Riot and Violence Detection Techniques
Riot and violence are serious offenses in any state or country and may lead to unrest across society. In the discipline of action detection and recognition, the identification and recognition of such disruptive activities in surveillance recordings with the use of computer vision has become a highly active topic [8, 9]. Numerous scholars have provided various methodologies and techniques for identifying riots,


Fig. 6.1 A violence detection system’s basic steps

violent events, and other unusual occurrences in order to detect crimes with greater accuracy. The strategies for detecting violence that have been proposed recently are listed below. The methods used to detect acts of violence and riots can be divided into three primary groups: machine learning, SVM, and deep learning.

Detection of Violence and Riots Using Machine Learning Techniques Machine Learning has several algorithms like K Nearest Neighbors (KNN), Adaboost, and decision tree that can be used to detect violence in surveillance video (Table 6.2).

Violence and Riot Detection Using Support Vector Machine (SVM) Support vector machine (SVM) is the top classification algorithm among popular algorithms. It is one of the most widely used and effective statistical methods for regression and classification. SVM has the capacity to resolve both linear and nonlinear classification issues. The SVM deals with different types of classification problems including those that are linearly separable. When attempting to classify data using small sample, nonlinear, and high dimensional feature spaces, SVM works quite well. The main goal is to find a separation hyperplane that separates the data. Considering that the nonlinear problem frequently arises in practice, it is impossible to entirely isolate the data set [12]. Support vector machines (SVMs) are based on the structural risk reduction principle as opposed to conventional learning techniques like neural networks, which aim to minimize the training error. A sufficiently limited hypothesis space must constrain


Table 6.2 Illustrates the list of different detection methods used as classification

| Method used | Object detection method | Classification method | Population density |
|---|---|---|---|
| Detecting fight with motion blobs [10] | Binarization of images | Spatio-temporal method to extract blobs | Dense |
| Kinetic framework by analyzing the posture [11] | Posture recognition using logistic regression | Joint angle for acquiring posture | Less dense |
| Step detecting faces and violence in videos using the normalization method and ViF descriptor [8] | CUDA approach with KLT face detector for face recognition | Histogram method | Dense |
| Framework to detect violent set [8] | Demographic analysis approach | Ethnicity framework | Less dense |
| A different technique in which animal fighting is detected [8] | Motion recognition and optical flow method | ViF, OViF and ifV methods | Less dense |

the search for the optimal model or approximation function for a finite set of training data, which is fundamentally comparable to regularization. Finding functions that perfectly match the training data is achievable if the search space is large enough, but such functions will generalize very poorly to unseen data. Violence-related actions cannot be detected using manually chosen features alone, so a two-channel approach to violence detection is suggested: an SVM is used to categorize frames as violent or not violent, whereas a CNN is used to extract the features from each frame. The model includes feature extraction, label fusion, and SVM-based classification. The CNN is used as the first channel to extract two features: a video frame is used as the first feature to capture appearance, and the absolute difference between subsequent frames is used as the second feature to capture motion. These features serve as the SVM's input [8] (Fig. 6.2).
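The two-channel idea (appearance from the current frame, motion from the absolute frame difference, then a classifier) can be sketched with a pooled-feature stand-in for the CNN and a nearest-centroid stand-in for the SVM, to keep the example dependency-free; the pipeline shape, not the exact model, is the point:

```python
import numpy as np

def pooled_features(frame, grid=4):
    """Stand-in for CNN features: average-pool the frame over a grid."""
    h, w = frame.shape
    gh, gw = h // grid, w // grid
    return np.array([frame[i*gh:(i+1)*gh, j*gw:(j+1)*gw].mean()
                     for i in range(grid) for j in range(grid)])

def two_channel_features(prev_frame, frame):
    """Appearance channel (current frame) + motion channel (|difference|)."""
    appearance = pooled_features(frame)
    motion = pooled_features(np.abs(frame - prev_frame))
    return np.concatenate([appearance, motion])

# toy data: "violent" clips have large inter-frame motion
rng = np.random.default_rng(0)
def clip(motion_scale):
    f0 = rng.random((16, 16))
    f1 = f0 + motion_scale * rng.random((16, 16))
    return two_channel_features(f0, f1)

X = np.stack([clip(0.05) for _ in range(10)] + [clip(2.0) for _ in range(10)])
y = np.array([0] * 10 + [1] * 10)
# nearest-centroid classifier standing in for the SVM of the cited work
c0, c1 = X[y == 0].mean(0), X[y == 1].mean(0)
pred = (np.linalg.norm(X - c1, axis=1) < np.linalg.norm(X - c0, axis=1)).astype(int)
```

In a faithful implementation, the pooled features would be CNN activations and the final classifier a trained SVM.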

Violence and Riot Detection Using Deep Learning
Deep neural networks have excellent success stories in action recognition and violence recognition. Since convolutional neural networks were not designed to handle time-related information, they can only learn spatial types of information. Neural network-based approaches therefore incorporate extra components that also extract the temporal information. The neural network uses the difference between two successive frames as input to capture the temporal information, forcing it to pick up motion signals. A network with such a simple architecture cannot learn long temporal dependencies, so a convolutional Long Short-Term Memory (ConvLSTM) unit is added at the end of each CNN to address this issue. The spatial stream is employed to capture spatial correlations between abnormal actions and scenes. The optical flow image is the input for the temporal stream, which


Fig. 6.2 Block diagram of violence detection using SVM

extracts information on short-term activity, and the acceleration flow images are the input for the acceleration stream. Each stream operates independently and employs LSTM units to obtain temporal information. Each stream outputs a SoftMax layer and a degree of confidence in the range [0, 1], and the outputs of the streams are combined to obtain the final score. The fundamental issue with this work is that the system is designed for person-to-person aggression, which makes it less effective in crowded environments [13].
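The motion and acceleration streams described above are first and second temporal differences of the frame stack; a sketch (assuming a (T, H, W) NumPy array of grayscale frames) is:

```python
import numpy as np

def motion_and_acceleration(frames):
    """Build the motion stream (frame differences) and acceleration stream
    (differences of differences). `frames` has shape (T, H, W); the
    returned arrays have shapes (T-1, H, W) and (T-2, H, W)."""
    frames = np.asarray(frames, dtype=float)
    motion = np.abs(np.diff(frames, axis=0))
    acceleration = np.abs(np.diff(np.diff(frames, axis=0), axis=0))
    return motion, acceleration

# synthetic stack whose brightness grows as t^2: constant "acceleration"
T, H, W = 6, 8, 8
frames = np.stack([np.full((H, W), t * t, dtype=float) for t in range(T)])
motion, accel = motion_and_acceleration(frames)
```

A real system would feed `motion` and `accel` (or optical-flow equivalents) into separate CNN+ConvLSTM streams before fusing the scores.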


Table 6.3 below describes different methods used for violence/activity detection and recognition using machine learning, support vector machines, and deep learning. Datasets: It is obvious that to train a deep learning model we need datasets, and finding the right dataset is of great importance for any deep learning model. Three benchmark datasets are available. Hockey Fight dataset: this dataset includes around 1000 clips of ice hockey games featuring 500 fighting and 500 non-fighting behaviors. This dataset is used as a benchmark by most of the methods; the collection contains clips with uniform backgrounds and subjects. Crowd Violence dataset: there are 246 videos of violent scenes in crowds in this collection (123 fighting and 123 non-fighting videos); the majority of the samples show football supporters at games. Modern algorithms typically do not reach the same precision on the Crowd Violence dataset as they do on the Hockey Fight dataset, and vice versa. Movie Violence dataset: this dataset includes approximately 100 fight and non-fight videos taken from various movies [13].
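When working with such benchmarks, a stratified train/test split keeps the fight/non-fight balance intact; a small sketch (a hypothetical helper, any clip identifiers work) is:

```python
import random

def stratified_split(items, labels, test_frac=0.2, seed=42):
    """Split clip identifiers into train/test while preserving the
    per-class balance, e.g. the 500/500 split of Hockey Fight."""
    rng = random.Random(seed)
    by_label = {}
    for item, lab in zip(items, labels):
        by_label.setdefault(lab, []).append(item)
    train, test = [], []
    for group in by_label.values():
        rng.shuffle(group)
        k = int(len(group) * test_frac)  # per-class test size
        test.extend(group[:k])
        train.extend(group[k:])
    return train, test

# clips 0-499 are "fight", 500-999 are "nofight" in this toy indexing
clips = [f"clip_{i:03d}" for i in range(1000)]
labels = ["fight"] * 500 + ["nofight"] * 500
train, test = stratified_split(clips, labels)
```

Shuffling within each class before slicing avoids ordering bias (e.g., all clips from one match landing in the test set).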

Impact of Artificial Intelligence on Healthcare System
Recently, the majority of healthcare service providers have started storing patient data digitally in electronic health records (EHRs). Medical experts analyze these historical datasets in order to raise the quality of their care and the client experience. By studying this data with machine learning techniques, new insights can be developed into the pathogenesis of disease, clinical experts' decisions during the care process, diagnosis, and improving the efficiency of a healthcare facility. Using well-known statistical methods, the structured parts of the patient record, such as diagnoses, medications, and lab measurements, are reasonably simple to evaluate. In practice, however, a lot of information in the EHR is captured in a more complicated, unstructured free-text format. Despite this challenge, clinical text is increasingly being used for research purposes in a variety of fields, including adverse event identification, phenotyping, and predictive analysis. These approaches categorize text using established methods such as Naive Bayes and support vector machine models for text classification, and bag-of-words and n-grams for text representation [40]. Keeping medical data in structured form makes analysis and comprehension easier and leads to better insight. The availability and precision of the underlying data play a major role in determining how well decisions are made. Smart data inclusion can contribute considerably to raising decision-making quality in the healthcare industry, where clinical decision-makers face numerous challenges and issues along the patient journey. Complex decisions in healthcare may go wrong due to

Table 6.3 Violence/activity recognition and detection methods

| # | Method | Object detection method | Feature extraction method | Classification method / activation function | Scene type | Accuracy % | Frame size |
|---|---|---|---|---|---|---|---|
| 1 | Violence detection using pre-trained ResNet-50 CNN along with ConvLSTM [14] | Pre-trained ResNet-50 along with ConvLSTM | CNN | SOFTMAX | Less crowded | 89.90 | 256 × 256 |
| 2 | Violence detection fast sliding window-based approach [15] | Sliding window | Nil | Linear support vector machine | Both | 96.53 | Nil |
| 3 | ViF descriptors with SVM for real-time crowd behavior detection [16] | ViF descriptors | Bag-of-features | Linear support vector machine | Crowded | Nil | Nil |
| 4 | Action recognition with trajectory-pooled deep-convolutional descriptors [17] | CNN | Trajectory-pooled deep-convolutional descriptor | Nil | Less crowded | 78.70 | 224 × 224 |
| 5 | Crowd counting using CSRNet [18] | CSRNet (CNN) | CSRNet | Nil | Crowded | Nil | 300 × 450 |
| 6 | Protest activity detection using 50-layer ResNet [19] | ResNet (CNN) | ResNet | Nil | Both | Nil | Nil |
| 7 | Violence detection system using CNN and LSTM [20] | CNN along with LSTM | CNN | SVM and K-nearest neighbor | Less crowded | 97.43 | 200 × 200 |
| 8 | Riot detection using CNN and LSTM [21] | CNN along with LSTM | CNN | SOFTMAX | Crowded | 82.75 | 224 × 224 |
| 9 | Understanding the highly congested scenes using CNN (CSRNet) [22] | CSRNet (CNN) | CNN | Nil | Crowded | Nil | Nil |
| 10 | Violence detection using deep learning [23] | MobileNet and SqueezeNet | CNN | SOFTMAX | Less crowded | Nil | Nil |
| 11 | Violence detection using ConvNets [24] | CNN | CNN | Nil | Both | Nil | 256 × 256 |
| 12 | Video anomaly detection using KNN [25] | 3D Markov Random Field (MRF) | Feature vector | KNN | Both | 92.70 | Nil |
| 13 | Visual abnormal event detection using optical flow histograms [26] | Nil | Histograms of optical flow | Single-class SVM | Both | 97.00 | 120 × 160 |
| 14 | Using SVMs and 3D convolutional neural networks together to detect violence in videos [13] | C3D (3D CNN) | CNN | SVM | Both | 98.60 | Nil |
| 15 | Real-time anomaly detection using Gaussian classifier [27] | Nil | Feature vector | Gaussian classifier | Both | 99.60 | Nil |
| 16 | Action recognition with improved trajectories [28] | Nil | Bag-of-features and Fisher vector | SVM | Both | 91.20 | Nil |
| 17 | Video anomaly detection using predictive convolutional long short-term memory networks [29] | CNN along with LSTM | CNN | Nil | Both | Nil | 224 × 224 |
| 18 | Maximum entropy discrimination for visual violence recognition using multi-view learning and deep features [30] | Multi-view maximum entropy discriminant (MVMED+) | DSIFT, HOG, LBP | Maximum entropy discrimination | Both | 90.60 | 256 × 256 |
| 19 | Framework for real-time violence detection in football ground using bidirectional LSTM and big data analysis [31] | Network with bidirectional long short-term memory (BDLSTM) | HOG | SOFTMAX | Both | Nil | Nil |
| 20 | Bidirectional LSTM-based big data analysis and deep learning framework for real-time violence detection in football stadium [32] | Top-ACLM | Top-ACLM | SVM | Both | 100.0 | 360 × 280 |
| 21 | The identification of violence using pre-trained models [33] | ResNet50 + NN, VGG16 + NN | VGG16, ResNet50 | ReLU | Both | 96.00 | 360 × 280 |
| 22 | Pre-trained modules with various deep learning methods for detecting violence [34] | CNN along with LSTM | VGG16, VGG19 and ResNet50 | SOFTMAX | Both | 97.06 | 28 × 28 |
| 23 | Deep learning techniques for recognizing violence in videos [35] | CNN and RNN | VGG-16 and LSTM for spatial and temporal information | SOFTMAX | Both | 88.20 | 224 × 224 |
| 24 | 3D convolutional neural network-based violence detection using spatiotemporal features [36] | CNN along with LSTM | 3D CNN | SOFTMAX | Both | 96.00 | 224 × 224 |
| 25 | Real-time anomaly detection using neural networks in CCTV [37] | MobileNet CNN | InceptionV3 created by Google (A-CNN) | SOFTMAX | Both | Nil | 299 × 299 |
| 26 | Abnormal activity recognition using CNN and LSTM [38] | Lightweight CNN and LSTM | YOLO-V4 | SOFTMAX | Less crowded | 89.50 | Nil |
| 27 | Detecting crime scenes using ML [39] | CNN | YOLO | SOFTMAX | Less crowded | Nil | Nil |


data being unavailable or too massive to review, information being missed, or suggestions being disregarded, leading to ineffective and expensive procedures and poor clinical outcomes. AI uses patient data to support clinical judgment, hospital data to support operational decision-making, and patient and hospital data to support consumer decision-making [41]. We can make wiser choices about our health with the help of artificial intelligence. Many people all over the world already use this technology to gather standard data like heart rate and sleep habits. By assessing this data with the help of machine learning, it may be possible to warn those who are at risk of developing certain diseases before the diseases become serious. Mobile apps that offer detailed patient profile data may help patients with certain chronic diseases manage their conditions and lead healthy lifestyles. All of this could lead to a healthier society and lower overall expenditures [41].
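The bag-of-words with Naive Bayes pipeline mentioned above for clinical text can be sketched in a few lines (the toy corpus and labels are invented for illustration):

```python
import math
from collections import Counter

def train_nb(docs, labels):
    """Tiny bag-of-words multinomial Naive Bayes with add-one smoothing,
    in the spirit of the clinical-text classifiers cited above."""
    vocab = {w for d in docs for w in d.split()}
    counts = {c: Counter() for c in set(labels)}
    priors = Counter(labels)
    for d, c in zip(docs, labels):
        counts[c].update(d.split())
    def predict(doc):
        scores = {}
        for c in counts:
            total = sum(counts[c].values())
            s = math.log(priors[c] / len(docs))  # log prior
            for w in doc.split():
                # smoothed per-class word likelihood
                s += math.log((counts[c][w] + 1) / (total + len(vocab)))
            scores[c] = s
        return max(scores, key=scores.get)
    return predict

predict = train_nb(
    ["patient reports chest pain", "severe chest pain and nausea",
     "routine follow up visit", "annual wellness visit no complaints"],
    ["adverse", "adverse", "routine", "routine"])
```

Replacing single words with n-grams in `split()`-level tokenization would give the n-gram representation mentioned in the text.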

Impact of Riot and Violence on Health and Healthcare System

To understand the impact of riots and violence on health and the healthcare system, we have taken data on riot incidents that occurred during 2014–2020. Surprisingly, more than 50,000 incidents occur every year (Graph 6.1).

Graph 6.1 Number of cases of rioting registered in India (2014–2020). Data source: Crime in India, NCRB


Impact of Riot and Violence on Health

Violence in any form may harm the health of people and affect the communities where both parties reside. Violence takes several forms, including fighting, verbal abuse, and the sexual and physical exploitation of children, and every form has a deleterious impact on health. The biological impacts of violence, on the brain, the neuroendocrine system, and the immunological response among others, have come to be better understood. Consequences include significantly higher rates of anxiety, depression, mental illness, stress disorders, and suicide, as well as an increased risk of heart disease and early death. Violence has different health impacts depending on the victim's age, gender, and the type of violence used [42]. The sufferer may encounter the following effects as consequences of violence.

Deviation in Mental Health

A victim of violence may encounter depression and other mental issues such as:
• persistent melancholy and depression.
• anxiety.
• a low sense of self and doubts about oneself.
• suicidal thoughts or attempts.
• alcohol and drug abuse.

Deviation in Emotional Health

Violence leaves a long impression on the emotional health of a victim. It may also lead to issues such as:
• despondency.
• feeling undeserving.
• feeling concerned and dejected about the future.
• a lack of trust.
• wondering about and questioning one's spiritual faith.
• a loss of motivation.

Have Less Empathy and Caring for Others

Victims of violence may become self-centered and selfish, caring little for society and others.


Develop Phobias and Insomnia

Sleep is a physiological need essential for optimal body function and an indicator of overall health (National Sleep Association, 2018). Sufficient sleep is important in maintaining human cognitive, psychological, and physical health (Fernandez-Mendoza & Vgontzas, 2013; Pun, 2016). According to the National Sleep Association (2018), adults require seven to nine hours of sleep per day. Riots and violence can foster phobias and a sense of unrest in victims, which may result in insomnia and other mental problems.

Physical Symptoms, Such as Headaches and Stomach Aches

Violence can affect mood and emotions as well as physical health, and everybody has a different set of symptoms. The National Center for Complementary and Integrative Health states that persistent stress can result in:
• digestive disorders.
• headaches.
• sleep disorders.

Anxiety and Depression

Adults who behave violently or abusively toward their partners are engaging in violence. The majority of individuals perceived physical abuse between spouses as hitting, slapping, and beating.

Effects on Physical Health

The effects of violence may span from short-term to long-term severe health effects, or even be deadly. Research repeatedly shows that the impact of abuse on women's physical and mental health increases with the severity of the violence. Detrimental health effects may persist even after the abuse has stopped. When women are involved, the effects of violence are typically more severe.

Effects on Sexual and Reproductive Health

These effects may include:
• unintended/unwanted pregnancy.
• unsafe abortion.
• miscarriage and pregnancy complications.


• infections or vaginal bleeding.
• chronic pelvic infection.

Since poor physical and mental health are among the primary effects of violence, facilitating victims' access to medical treatment should be part of victim management.

Impact of Riot and Violence on Local Healthcare System

To understand the impact of riots and violence on the local healthcare system, consider the Delhi riots of 24–26 February 2020. The riots, which lasted for three days, broke out after a series of altercations between proponents of the citizenship law and demonstrators in northeast Delhi on February 24. Conflicts also erupted in Jaffrabad, Maujpur, Chandbagh, Khureji Khas, Bhajanpura, Dayalpur, Gokalpuri, and other localities as northeast Delhi became the focal point of the unrest [43]. In total, 52 people were killed and thousands were wounded. All this happened in broad daylight, and the role of the police is questionable. Thousands of injured people reached the local hospitals, and the challenge before those hospitals was how to handle the sudden increase in demand for hospital and healthcare services. Absorbing such a spike in demand is a major challenge for local healthcare. The image below illustrates incidents that took place during the riots (Image 6.1).

Conclusion

The world knows that artificial intelligence is creating a huge shift in technology, and it has the ability to improve the lives of citizens across the world. Artificial intelligence is in the early phase of its development; gradually it will enhance the capabilities of existing technologies. CCTV surveillance, for example, only captures real-time footage, but combined with AI it becomes more useful by extracting only those features that are of interest. In this way AI saves time and money and enables prompt action. Surveillance video can be classified into violence and non-violence with the help of AI, which further helps in taking timely action. AI has great potential for detecting violence and riots well before they become serious; it may also capture video and image evidence of these abnormal activities, and this footage can be used by law enforcement agencies for compliance purposes. Artificial intelligence has had a remarkable journey in the field of healthcare and has changed all aspects of it, particularly in terms of predictive machine learning models for the healthcare industry. Given enough training data, deep learning exhibits notable performance and is crucial for making predictions.


Image 6.1 Delhi Riots Feb 24, 2020 (Taken from Google Images)

AI and machine learning technologies are guiding patient care, diagnostic accuracy, and outcome prediction by analyzing the enormous amounts of data generated by the experiences of numerous patients. AI proved its strength during the COVID-19 pandemic, contributing to diagnosis, prognosis assessment, epidemic prediction, and drug discovery, and it greatly increased the efficacy and efficiency of the current medical and healthcare system. Artificial intelligence has become increasingly prevalent in many areas of COVID-19, including diagnosis, public health, clinical decision-making, social balance, vaccine research, surveillance, combination with big data, core clinical services, and patient care, as the pandemic must be stopped from spreading as quickly as possible [19].

References

1. Jalgaonkar, D., Gund, J., Patil, N., Phadke, M.: Detecting crime scenes using ML. Int. Res. J. Eng. Technol. (IRJET) 7(5) (2020)
2. Boiman, O., Irani, M.: Detecting irregularities in images and in video. In: Tenth IEEE International Conference on Computer Vision (ICCV'05), vol. 1 (2005). https://doi.org/10.1109/iccv.2005.70
3. Chaudhary, S., Khan, M.A., Bhatnagar, C.: Multiple anomalous activity detection in videos. Procedia Comput. Sci. 125, 336–345 (2018)


4. Li, Q., Li, W.: A novel framework for anomaly detection in video surveillance using multi-feature extraction. In: Proc. 9th Int. Symp. Comput. Intell. Design (ISCID), vol. 1, pp. 455–459 (2016)
5. Basavaraj, G.M., Kusagur, A.: Vision based surveillance system for detection of human fall. In: Proc. 2nd IEEE Int. Conf. Recent Trends Electron., Inf. Commun. Technol. (RTEICT), May 2017, pp. 1516–1520
6. Dhulekar, P.A., Gandhe, S.T., Sawale, N., Shinde, V., Khute, S.: Surveillance system for detection of suspicious human activities at war field. In: Proc. Int. Conf. Adv. Commun. Comput. Technol. (ICACCT), Feb. 2018, pp. 357–360
7. Coletto, M., Lucchese, C., Orlando, S.: Do violent people smile: social media analysis of their profile pictures. In: Proc. Companion Proc. Web Conf., Apr. 2018, pp. 1465–1468
8. Ramzan, M., Abid, A., Ullah Khan, H., Mahmood Awan, S., Ismial, A., Ahmed, M., Ilyas, M., Mahmood, A.: A review on state-of-the-art violence detection techniques. IEEE Access, August 19, 2019. https://doi.org/10.1109/ACCESS.2019.2932114
9. Olmos, R., Tabik, S., Herrera, F.: Automatic handgun detection alarm in videos using deep learning. Neurocomputing 275, 66–72 (2018)
10. Fu, E.Y., Va Leong, H., Ngai, G., Chan, S.: Automatic fight detection in surveillance videos. In: Proc. 14th Int. Conf. Adv. Mobile Comput. MultiMedia, Nov. 2016, pp. 225–234
11. Nar, R., Singal, A., Kumar, P.: Abnormal activity detection for bank ATM surveillance. In: Proc. Int. Conf. Adv. Comput., Commun. Inform. (ICACCI), Sep. 2016, pp. 2042–2046
12. Karisma, Imah, E.M., Wintarti, A.: Violence classification using support vector machine and deep transfer learning feature extraction. In: 2021 International Seminar on Intelligent Technology and Its Applications (ISITIA) (2021). https://doi.org/10.1109/isitia52817.2021.9502253
13. 
Accattoli, S., Sernani, P., Falcionelli, N., Mekuria, D.N., Dragoni, A.F.: Violence detection in videos by combining 3D convolutional neural networks and support vector machines. Appl. Artif. Intell., 1–16 (2020). https://doi.org/10.1080/08839514.2020.1723876
14. Sharma, M., Baghel, R.: Video surveillance for violence detection using deep learning. In: Borah, S., Emilia Balas, V., Polkowski, Z. (eds.) Advances in Data Science and Management. Lecture Notes on Data Engineering and Communications Technologies, pp. 411–420 (2020). https://doi.org/10.1007/978-981-15-0978-0
15. Bilinski, P., Bremond, F.: Human violence recognition and detection in surveillance videos. In: 2016 13th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS). https://doi.org/10.1109/AVSS.2016.7738019
16. Hassner, T., Itcher, Y., Kliper-Gross, O.: Violent flows: real-time detection of violent crowd behavior. In: 2012 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops. https://doi.org/10.1109/CVPRW.2012.6239348
17. Wang, L., Qiao, Y., Tang, X.: Action recognition with trajectory-pooled deep-convolutional descriptors. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 4305–4314 (2015)
18. Bhangale, U., Patil, S., Vishwanath, V., Thakker, P., Bansode, A., Navandhar, D.: Near real-time crowd counting using deep learning approach. Procedia Comput. Sci. 171, 770–779 (2020). https://doi.org/10.1016/j.procs.2020.04.084
19. Won, D., Steinert-Threlkeld, Z.C., Joo, J.: Protest activity detection and perceived violence estimation from social media images. In: MM '17: Proceedings of the 25th ACM International Conference on Multimedia, October 2017, pp. 786–794. https://doi.org/10.1145/3123266.3123282
20. 
Sharma, S., Sudharsan, B., Naraharisetti, S., Trehan, V., Jayavel, K.: A fully integrated violence detection system using CNN and LSTM. Int. J. Electr. Comput. Eng. (IJECE), p-ISSN 2088-8708, e-ISSN 2722-2578. https://doi.org/10.11591/ijece.v11i4.pp3374-3380
21. Jadhav, M.K., Chakkarwar, V.A.: Automatic detection of riots using deep learning. In: RTIP2R 2020: Recent Trends in Image Processing and Pattern Recognition, pp. 308–317. Springer. https://doi.org/10.1007/978-981-16-0507-9_27


22. Li, Y., Zhang, X., Chen, D.: CSRNet: dilated convolutional neural networks for understanding the highly congested scenes. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1091–1100 (2018). https://openaccess.thecvf.com/content_cvpr_2018/html/Li_CSRNet_Dilated_Convolutional_CVPR_2018_paper.html
23. Baba, M., Gui, V., Cernazanu, C., Pescaru, D.: A sensor network approach for violence detection in smart cities using deep learning. Sensors 19, 1676 (2019). https://doi.org/10.3390/s19071676
24. Jain, A., Vishwakarma, D.K.: State-of-the-arts violence detection using ConvNets. In: IEEE 2020 International Conference on Communication and Signal Processing (ICCSP). https://doi.org/10.1109/iccsp48568.2020.9182433
25. Saligrama, V., Chen, Z.: Video anomaly detection based on local statistical aggregates. In: 2012 IEEE Conference on Computer Vision and Pattern Recognition (2012). https://doi.org/10.1109/cvpr.2012.6247917
26. Wang, T., Snoussi, H.: Histograms of optical flow orientation for visual abnormal events detection. In: 2012 IEEE Ninth International Conference on Advanced Video and Signal-Based Surveillance (2012). https://doi.org/10.1109/avss.2012.39
27. Sabokrou, M., Fathy, M., Hoseini, M., Klette, R.: Real-time anomaly detection and localization in crowded scenes. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, pp. 56–62 (2015). https://www.cv-foundation.org/openaccess/content_cvpr_workshops_2015/W04/html/Sabokrou_Real-Time_Anomaly_Detection_2015_CVPR_paper.html
28. Wang, H., Schmid, C.: Action recognition with improved trajectories. In: Proceedings of the IEEE International Conference on Computer Vision (ICCV), pp. 3551–3558 (2013). https://openaccess.thecvf.com/content_iccv_2013/html/Wang_Action_Recognition_with_2013_ICCV_paper.html
29. Medel, J.R., Savakis, A.: Anomaly detection in video using predictive convolutional long short-term memory networks. Computer Vision and Pattern Recognition (cs.CV). https://doi.org/10.48550/arXiv.1612.00390
30. Sun, S., Liu, Y., Mao, L.: Multi-view learning for visual violence recognition with maximum entropy discrimination and deep features. Inf. Fusion (2018). https://doi.org/10.1016/j.inffus.2018.10.004
31. Samuel, R.D.J., Fenil, E., Manogaran, G., Vivekananda, G.N., Thanjaivadivel, M., Jeeva, S., Ahilan, A.: Real time violence detection framework for football stadium comprising of big data analysis and deep learning through bidirectional LSTM. Comput. Netw. (2019). https://doi.org/10.1016/j.comnet.2019.01.028
32. Hua, X., Fan, Z., Jiang, L.: TOP-ALCM: a novel video analysis method for violence detection in crowded scenes. Inf. Sci. 606, 313–327 (2022). https://doi.org/10.1016/j.ins.2022.05.045
33. Honarjoo, N., Abdari, A., Mansouri, A.: Violence detection using pre-trained models. In: 2021 International Conference on Pattern Recognition and Image Analysis (IPRIA). https://doi.org/10.1109/IPRIA53572.2021.9483558
34. Sumon, S.A., Goni, R., Hashem, N.B., Shahria, T., Rahman, R.M.: Violence detection by pretrained modules with different deep learning approaches. Vietnam J. Comput. Sci., 1–22 (2019). https://doi.org/10.1142/s2196888820500013
35. Soliman, M.M., Kamal, M.H., El-Massih Nashed, M.A., Mostafa, Y.M., Chawky, B.S., Khattab, D.: Violence recognition from videos using deep learning techniques. In: 2019 Ninth International Conference on Intelligent Computing and Information Systems (ICICIS) (2019). https://doi.org/10.1109/icicis46948.2019.9014714
36. Ullah, F.U.M., Ullah, A., Muhammad, K., Haq, I.U., Baik, S.W.: Violence detection using spatiotemporal features with 3D convolutional neural network. Sensors 19(11), 2472 (2019). https://doi.org/10.3390/s19112472
37. Singh, V., Singh, S., Gupta, P.: Real-time anomaly recognition through CCTV using neural networks. Procedia Comput. Sci. 173, 254–263 (2020). https://doi.org/10.1016/j.procs.2020.06.030


38. Habib, S., Hussain, A.: Abnormal activity recognition from surveillance videos using convolutional neural network. Sensors 21(24), 8291 (2021). https://doi.org/10.3390/s21248291
39. Jalgaonkar, D., Gund, J., Patil, N., Phadke, M.: Detecting crime scenes using ML. Int. Res. J. Eng. Technol. (IRJET), e-ISSN 2395-0056, 7(5) (2020). https://www.academia.edu/44237633/IRJET_Detecting_Crime_Scenes_using_ML?from=cover_page
40. Menger, V., Scheepers, F., Spruit, M.: Comparing deep learning and classical machine learning approaches for predicting inpatient violence incidents from clinical text. MDPI (2018). https://doi.org/10.3390/app8060981
41. Shaheen, M.Y.: AI in healthcare: medical and socio-economic benefits and challenges. ScienceOpen (Sep 25, 2021). https://doi.org/10.14293/S2199-1006.1.SOR-.PPRQNI1.v1
42. Rivara, F., Adhia, A., Lyons, V., Massey, A., Mills, B., Morgan, E., Rowhani-Rahbar, A.: The effects of violence on health. Health Aff. 38(10), 1622–1629 (2019). https://doi.org/10.1377/hlthaff.2019.00480
43. https://www.indiatvnews.com/news/india/delhi-riots-2020-a-look-back-at-communal-violence-described-as-worst-since-partition-by-court-anti-caa-protests-2022-04-16-770201, accessed 29/07/2022, 1:53 PM

Chapter 7

How Artificial Intelligence is Transforming Medicine: The Future of Pharmaceutical Research

Pankaj Sharma, Vinay Jain, and Mukul Tailang

Introduction

The pharmaceutical industry has dramatically increased its data digitization during the last few years. As a result of this digitization, it is difficult to acquire, examine, and apply this knowledge to answer challenging clinical problems [1]. Artificial intelligence (AI) is used to address this because of its ability to manage massive amounts of data with improved automation [2]. AI is a technology-based platform that uses a variety of cutting-edge technologies and networks to simulate human intellect; at the same time, it does not totally threaten to take the place of human physical interaction [3, 4]. Another obstacle in generating new therapeutic agents has been the expense of investigation and the time it takes [5]. To reduce these difficulties and obstacles, investigators from all around the world turned to computational strategies, also referred to as conventional strategies, including virtual screening (VS) and molecular docking. These methods do come with drawbacks, however, including unreliability and inefficiency [6]. Thus, there is an increase in the use of techniques that may be used independently to overcome the difficulties associated with conventional computational methodologies. Deep learning (DL), artificial intelligence (AI), and machine learning (ML) algorithms in particular have come to be recognized as potential solutions to issues and challenges in the pharmaceutical development and discovery process [7].

P. Sharma (B), Department of Pharmaceutics, ShriRam College of Pharmacy, Morena, MP, India. e-mail: [email protected]
V. Jain, Department of Pharmacognosy, ShriRam College of Pharmacy, Morena, MP, India
M. Tailang, School of Studies in Pharmaceutical Sciences, Jiwaji University, Gwalior, MP, India
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023. A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_7

Long and complicated processes including target validation, medicinal


molecule refinement, lead molecule screening, pre-clinical and clinical trial innovation, and manufacturing procedures are all part of the drug research and design process. All of these pose significant obstacles to finding a disease-specific medicine that works. Consequently, controlling the pricing and efficiency of the operation is the main issue facing pharmaceutical businesses [8]. AI has provided straightforward, scientific answers to each of these queries, decreasing the process's time and money requirements [9]. AI is not a solitary, all-encompassing technique; rather, it refers to a number of subspecialties (such as machine learning and deep learning) that, separately or together, make applications more intelligent. Machine learning (ML) is the study of techniques that enable computer programmes to improve automatically through experience [10]. ML may be divided into three categories: supervised, unsupervised, and reinforcement learning (RL). Research is also active in other sub-fields, such as semi-supervised, self-supervised, and multi-instance ML. Supervised learning utilizes labelled data (annotated information); for instance, a model trained on labelled X-ray images of existing tumours may identify tumours in new images [11]. In unsupervised learning, structure is derived from information without labels, for example by grouping patients with similar complaints to find underlying causes [12]. In RL, computational agents pick up new skills through demonstration or trial and error: the algorithm learns by devising a plan to maximize its rewards. It is noteworthy that significant advances in AI over the past few years have relied on RL. Deep learning (DL) is a family of algorithms that learns by exposing a sizable, multi-layered collection of linked processors to a huge number of samples. Nowadays, DL has become the AI technique of choice, advancing fields like voice and picture recognition [13, 14]. Here, we briefly go over how big data has revolutionized the drug development process and how AI has progressed from machine learning to deep learning. Afterwards, we give an overview of how AI is being used to augment the traditional drug development process and how it is being combined with conventional chemistry to enhance drug development. Following that, we go over the different ways AI is used in the drug development and discovery mechanisms, including primary and secondary screening, toxicity, drug release and surveillance, appropriate dosage, performance and efficacy, drug repurposing, polypharmacology, and drug reactions.
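The supervised/unsupervised distinction above can be shown with a dependency-free toy sketch: a nearest-centroid classifier stands in for supervised learning on annotated data, and a two-cluster split stands in for unsupervised grouping of unlabelled patients. The data, labels, and routines are illustrative assumptions, not the methods of any cited study.

```python
# Toy contrast between supervised and unsupervised learning on 1-D data.

def nearest_centroid_fit(xs, labels):
    """Supervised: learn one centroid per label from annotated examples."""
    groups = {}
    for x, y in zip(xs, labels):
        groups.setdefault(y, []).append(x)
    return {y: sum(v) / len(v) for y, v in groups.items()}

def nearest_centroid_predict(centroids, x):
    """Assign a new point to the label whose centroid is closest."""
    return min(centroids, key=lambda y: abs(centroids[y] - x))

def two_means(xs, iters=10):
    """Unsupervised: split unlabelled values into two clusters (1-D k-means)."""
    a, b = min(xs), max(xs)
    for _ in range(iters):
        ga = [x for x in xs if abs(x - a) <= abs(x - b)]
        gb = [x for x in xs if abs(x - a) > abs(x - b)]
        a, b = sum(ga) / len(ga), sum(gb) / len(gb)
    return sorted((a, b))

# Supervised: lesion sizes labelled by clinicians.
cents = nearest_centroid_fit([1.0, 1.2, 3.8, 4.1],
                             ["benign", "benign", "malignant", "malignant"])
print(nearest_centroid_predict(cents, 3.5))  # malignant

# Unsupervised: ungrouped symptom scores fall into two clusters.
print([round(c, 3) for c in two_means([1.0, 1.1, 0.9, 5.0, 5.2, 4.8])])
```

Reinforcement learning differs from both: instead of a fixed dataset, an agent improves a policy from the rewards its own actions produce.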

General and Specific Pharmaceutical Screenings Using Artificial Intelligence

The identification of new and possible lead molecules is a key step in drug research. According to several investigations, including clinical and pre-clinical research and in vivo experiments such as microarray studies, around 1.2 billion molecular structures are available in the chemical realm. These chemical


structures are screened using machine learning algorithms, including reinforcement algorithms, regression models, and generative models, based on adaptive sites, architecture, and target-binding capability. The whole AI-assisted drug research cycle would take between 13 and 17 years, relatively shorter than the conventional drug development procedure. In the initial stage of drug development, lead finding, genomics research, reverse docking, and quantitative biochemistry are used to identify disease-modifying target proteins. Genetic alteration design and virtual screening are two methods for doing this. Lead refinement and optimization are the following steps in the drug development process, which involve drug-like characterization, drug-target repeatability, and bioinformatics. After that, molecules are subjected to secondary screening, followed by pre-clinical studies [15]. Assessing therapeutic potential via cell-culture studies, animal model experiments, and human analysis is the last stage of the drug development process (Fig. 7.1).
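The screening step above can be sketched as a two-stage filter-and-rank routine: discard candidates that fail rough drug-likeness bounds, then rank the survivors with a scoring function. The compound names, descriptor values, and linear weights below are all invented stand-ins for a trained model; only the Lipinski-style bounds echo a widely used rule of thumb.

```python
# Hedged sketch of virtual screening: filter a hypothetical candidate library
# by drug-likeness, then rank survivors with a stand-in affinity model.

CANDIDATES = {
    # name: (molecular_weight, logP, h_bond_donors) -- illustrative values
    "cmpd_a": (320.0, 2.1, 2),
    "cmpd_b": (510.0, 5.6, 6),   # violates typical drug-likeness bounds
    "cmpd_c": (275.0, 1.4, 1),
}

def drug_like(mw, logp, donors):
    """Rough Lipinski-style filter (MW < 500, logP < 5, donors <= 5)."""
    return mw < 500 and logp < 5 and donors <= 5

def predicted_affinity(mw, logp, donors):
    """Stand-in for a trained regression model; weights are invented."""
    return 1.0 - abs(logp - 2.0) * 0.1 - donors * 0.04

def screen(library):
    """Keep drug-like candidates, ranked by predicted affinity (best first)."""
    hits = [(name, predicted_affinity(*d)) for name, d in library.items()
            if drug_like(*d)]
    return sorted(hits, key=lambda t: -t[1])

print(screen(CANDIDATES))  # cmpd_b is filtered out; cmpd_a ranks first
```

Real pipelines replace the linear stand-in with learned models (regression, generative, or reinforcement-based), but the filter-then-rank shape is the same.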

How Can We Create Trustworthy and Efficient AI-Assisted Healthcare Systems?

After more than a decade of intense concentration, the usage and implementation of AI in clinical practice remains restricted, with several AI solutions for healthcare still in the design phase [19–22]. Although there are several approaches to developing AI systems for the medical industry, far too frequently efforts are made to fit square pegs into round holes, i.e., to identify healthcare issues and implement AI solutions without taking into account the contextual factors (including healthcare processes, user requirements, trustworthiness, safety, and ethical considerations). We believe that instead of just replacing human intelligence, AI augments and enhances it. Therefore, when developing AI systems for the healthcare industry, it is crucial not to eliminate the critical components of human contact but rather to concentrate on them and increase the efficacy and efficiency of that relationship. Additionally, advancements in AI in healthcare will result from a thorough grasp of the intricacy of individual experiences and treatment pathways from a human-centred perspective [16–18].

Create and Expand

The initial stage is designing and developing AI solutions for the relevant challenges, using a human-centred AI and experimental methodology while including the necessary stakeholders, particularly healthcare consumers themselves.

Fig. 7.1 General and specific pharmaceutical screenings using artificial intelligence (a repository of chemical compounds/big data feeds AI models used to determine medication molecules)


Participant Involvement and Co-Creation

Create a multi-disciplinary team with machine and human-factors researchers, operational and research governance, clinical decision-makers (physicians, caretakers, patients), experienced professionals (for example, biomedical researchers), and people who can authorize, inspire, finance, convene, communicate, enforce, and lead [19]. A multi-stakeholder team brings the technical, strategic, and operational knowledge needed to establish issues, objectives, success indicators, and interim milestones.

Human-Centred AI

A human-centred AI strategy integrates AI with an anthropological understanding of healthcare systems. Through consumer studies, first identify the fundamental issues, such as the requirements, restrictions, and process flows in medical institutions, as well as the enablers of and obstacles to the introduction of AI within medical settings (we recommend using a qualitative study design to understand what the difficulty is, why it is an issue, to whom it matters, why it has not been addressed before, and why it is not gaining attention). The next stage after identifying critical challenges is to determine which issues AI can address effectively and whether relevant datasets are available to create and later assess the AI. By situating algorithms in existing business processes, AI technologies can function within current norms and practices to guarantee acceptance, offering suitable answers to existing challenges for the end consumer.

Fresh Experiments

The development of new progressive trials should be the central objective, with strong feedback mechanisms from participants enabling quick experiential learning and gradual modifications. Such studies enable the concurrent testing of several novel concepts in order to determine which one succeeds and to discover what performs and why. The expected end consumers, and the possible harms and moral considerations of the AI system for them, should be clarified through testing and user input (taking data privacy, integrity, equality, and security as examples) [20].


Assess and Confirm

The forecasts generated by the AI tool must be repeatedly evaluated and validated to see how it is doing. This is crucial. Assessment is composed of three criteria: statistical accuracy, clinical relevance, and economic utility. Statistical (empirical) validity means assessing the effectiveness of the AI on parameters such as precision, dependability, resilience, consistency, and calibration. High model performance in retrospective, in silico contexts is insufficient to show clinical utility or effect. Evaluate the algorithm's therapeutic significance by testing it in real time on held-out and prospective test data (such as temporally and geographically external data) to show its applicability and generalizability in the healthcare setting. Economic utility measures the net value compared to the cost of the AI system expenditure [18].
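The statistical part of this assessment can be illustrated with two simple metrics over held-out predictions: accuracy for discrimination and the Brier score as a basic calibration check. The labels and probabilities below are invented for the sketch; real validation uses clinical outcome data.

```python
# Two basic statistical-validity metrics over held-out predictions.

def accuracy(y_true, y_pred):
    """Fraction of thresholded predictions that match the outcomes."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def brier(y_true, prob):
    """Mean squared gap between predicted probability and outcome.

    Lower is better; a well-calibrated model has a low Brier score.
    """
    return sum((p - t) ** 2 for t, p in zip(y_true, prob)) / len(y_true)

y_true = [1, 0, 1, 1, 0]                      # invented held-out outcomes
prob   = [0.9, 0.2, 0.7, 0.6, 0.1]            # model's predicted probabilities
y_pred = [1 if p >= 0.5 else 0 for p in prob]  # thresholded at 0.5

print(accuracy(y_true, y_pred))       # 1.0
print(round(brier(y_true, prob), 3))  # 0.062
```

A model can score perfectly on accuracy while still being poorly calibrated, which is why both checks (and prospective, external data) belong in the assessment.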

Size and Ubiquity

Several AI systems are originally created to address a challenge inside a single medical system, taking into account the patient population unique to that setting and environment. When AI systems are scaled up, the distribution methods, model upgrades, legal framework, system diversity, and reimbursement environment demand particular care.

Watch Over and Keep Up

An AI system should be continuously inspected and managed, using efficient post-market monitoring, to check for dangers and unfavourable outcomes even after it has been deployed in clinical settings. Collecting and analysing the pertinent statistics on AI effectiveness, clinical and safety concerns, and adverse outcomes requires collaboration between healthcare providers, government regulators, and AI researchers (Table 7.1) [21].

Table 7.1 A list of businesses utilizing artificial intelligence to advance healthcare and/or drugs [22]

Organization | Principal research area
Verily | Smart clothing
Google DeepMind | Search for medical records
Deep Genomics | Genetics
Arterys | Imaging
IBM Watson | Search for medical records
Atomwise | Medication creation
Careskore | Standard of care
Zephyr Health | Choosing treatments
Enlitic | Imaging
Sentrian | AI platform for remote patients
3Scan | Imaging
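The post-market monitoring loop described in "Watch Over and Keep Up" can be sketched as a sliding-window accuracy check that raises a flag when performance drifts below an agreed threshold. The window size and threshold below are illustrative choices, not regulatory values.

```python
# Sketch of post-deployment monitoring: track recent prediction correctness
# in a sliding window and flag when windowed accuracy drops below threshold.
from collections import deque

class PerformanceMonitor:
    def __init__(self, window=100, threshold=0.9):
        self.window = deque(maxlen=window)  # keeps only the most recent outcomes
        self.threshold = threshold

    def record(self, correct: bool) -> bool:
        """Log one prediction outcome; return True if an alert should fire."""
        self.window.append(correct)
        acc = sum(self.window) / len(self.window)
        # Only alert once the window is full, so early noise does not trigger it.
        return len(self.window) == self.window.maxlen and acc < self.threshold

# Tiny demo: five correct-ish predictions, then accuracy drifts downwards.
mon = PerformanceMonitor(window=5, threshold=0.8)
alerts = [mon.record(ok) for ok in [True, True, True, True, False, False]]
print(alerts)  # alert fires only on the last outcome
```

In practice the flag would feed the multi-party review process the section describes (providers, regulators, and AI researchers), rather than act automatically.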

Applications of Artificial Intelligence (AI) in Several Pharmaceutical Business Sectors

Pinpoint Diagnostics

Medical imaging
The most popular AI usage at the moment is the automatic categorization of medical pictures. Upwards of 50% of the AI/ML-based pharmaceutical products authorized in the USA and Europe between 2015 and 2020 (129 (58%) devices in the USA and 126 (53%) devices in Europe) were determined to be authorized or certified for radiographic usage [23]. Research findings have shown that AI can perform as well as or better than human analysts in image-based prognosis across a variety of health specialties, such as pathology (in one study, a convolutional neural network trained on labelled frontal chest X-ray images surpassed radiologists in detecting pneumonia), dermatology (a convolutional neural network trained on clinical images was found to categorize skin lesions precisely), and cardiology (a deep learning model detected heart attacks as accurately as cardiologists) [24–27].

Artificial intelligence in drug testing
A medication's discovery and development can require over 10 years and cost the equivalent of US$2.8 billion. Even then, nine out of 10 medicinal compounds fail to pass Phase II clinical studies and regulatory clearance [28, 29]. Deep neural networks (DNNs) and nearest-neighbour classifiers are two examples of technologies utilized for virtual screening based on synthesis capability, and they may also forecast in vivo behaviour and hazard [28, 30]. In order to provide a framework for the development of treatments for conditions including

P. Sharma et al.

immuno-oncology as well as cardiovascular illnesses, a number of biopharmaceutical corporations, including Roche, Bayer, and Pfizer, have partnered with IT firms.

Detecting diabetic retinopathy

To reduce avoidable, diabetes-related eye problems globally, patients should be screened for diabetic retinopathy and treated as soon as it is discovered. Nevertheless, because of the high prevalence of diabetes and the global shortage of eye-care personnel, screening is expensive [31]. Previous research on supervised AI algorithms for diabetic retinopathy screening in the USA, Thailand, Singapore, and India has shown reliable detection efficiency and cost-effectiveness [32].

AI to detect prostate cancer

When attempting to integrate pathology and radiography in 1991, Schnall et al. [33] had little success in matching pathologic and radiographic features. To speed up registration between in vivo MRI and histopathology, Ward et al. [34] employed an image-directed slicing method based on strand-shaped fiducial markers in 2012. Since 2014, Litjens et al. [35] have used a variety of frameworks to segment and characterize prostate MRI images; their work serves as the basis for many methods discovered and applied since. Machine intelligence and CNN methodologies show promise for prostate cancer diagnosis and prognosis prediction, but because knowledge is still so limited, there is a great need for further research. Each of the following areas (diagnostic imaging, genetics, histopathology, and therapy) has the potential to improve disease characterization and, consequently, clinical care in the hospital.
Learning algorithms in prostate surgical intervention will soon improve planning and operative results, both by improving patient outcomes and by supporting the instruction and evaluation of surgical skills.
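Segmentation methods such as those benchmarked in the PROMISE12 challenge mentioned above are commonly scored with overlap metrics such as the Dice coefficient. A minimal sketch of that metric follows; the masks below are illustrative toy values, not real prostate contours.

```python
def dice(pred, truth):
    """Dice overlap between two binary masks given as flat 0/1 lists.

    Returns 1.0 for two empty masks (perfect trivial agreement)."""
    inter = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2.0 * inter / total if total else 1.0

# Toy flattened masks: 1 marks a pixel inside the segmented organ.
pred  = [0, 1, 1, 1, 0, 0, 1, 0]
truth = [0, 1, 1, 0, 0, 1, 1, 0]
print(round(dice(pred, truth), 3))  # 0.75
```

A Dice score of 1.0 means the predicted and reference masks coincide exactly; challenge leaderboards typically average this score over all test cases.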

Individualized Medicine

A drug delivery system combines engineering technology with one or more conventional medicine administration systems [36]. If we are to advance towards targeted therapies, we must greatly deepen our understanding of illness. Scientists throughout the world are investigating the cellular and molecular foundations of illness in order to develop technological and biological indicators of detection, severity, and progression; to do this, they are gathering a variety of heterogeneous datasets. Synthetic biology/immunomics and drug development are two significant potential uses of AI (Fig. 7.2) [37].

7 How Artificial Intelligence is Transforming Medicine: The Future …

Fig. 7.2 Applications of artificial intelligence (AI) in several pharmaceutical business sectors. (The figure, whose labels are reproduced here in condensed form, links the use of AI to drug research; drug formulation (assistance in selecting appropriate additives); drug evaluation; the creation of pharmaceuticals; the production of drugs (computerized and customized production, recognizing crucial process variables, ensuring conformity with the in-process specification, instructing future production processes, and correlating manufacturing mistakes with the given parameters); quality management (governing in-line quality and establishing quality control); the management and planning of clinical trials (enrolling or choosing subjects, trial observation, and patient ejection); and the management of the medicinal product (product cost, tracking and changing the development, market predictions, and market position).)

Genetic engineering and immunology

In the coming years, by applying AI technologies to multimodal data sources, we may come to understand the cellular foundations of illness and how illnesses cluster, and to stratify patient groups, in order to develop more precisely targeted prevention programmes. For instance, we could use immunomics to detect and predict diagnostic and therapeutic possibilities more exactly. This will change several


standards of treatment and have a significant influence in the areas of cancer, neurological disorders, and rare illnesses, customizing the care experience for each patient [37].

AI for drug discovery

To find new therapeutic targets, biomarkers, and drugs, researchers investigate the connections between various chemicals, genomes, and proteins to determine which combinations have the most potential; real-world data (RWD) applications can support some of these goals. The recent resurgence of interest in deep learning in pharmaceutical research has given rise to an unmatched explosion of cutting-edge simulation methods and applications. Artificial intelligence (AI) can discriminate between hit and lead compounds, enabling quicker confirmation of therapeutic targets and optimization of structural design [38]. Clinical study design will improve significantly thanks to AI, as will the optimization of the processes used to make pharmaceuticals. In fact, AI has the potential to substitute for almost any multiobjective optimization technique used in medicine. Following DeepMind's recent AlphaFold releases, we have already seen the beginnings of this: accurate protein structure prediction paves the way for a better understanding of pathological conditions, the prediction of protein complexes, and the creation of more precise therapies.

AI in medication development

The development of a new therapeutic agent is followed by the incorporation of the novel medicinal component into an appropriate dosage form with acceptable delivery characteristics. Here, AI can replace the old trial-and-error approach. The FDA describes the development of new drugs as consisting of four steps (Fig. 7.3). Clinical trials include various stages of human testing to evaluate the safety and efficacy of the proposed medicine, as well as post-marketing research such as pharmacovigilance and bioequivalence investigations [39].
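The multiobjective optimization role described above can be illustrated with a Pareto-front filter over candidate compounds scored on two competing objectives, potency (to maximize) and toxicity (to minimize). The compound names and scores below are invented for the example.

```python
def pareto_front(candidates):
    """Return the names of candidates not dominated by any other candidate.

    Each candidate is (name, potency, toxicity); higher potency and lower
    toxicity are better. A candidate is dominated if another is at least as
    good on both objectives and strictly better on at least one."""
    front = []
    for name, pot, tox in candidates:
        dominated = any(
            (p2 >= pot and t2 <= tox) and (p2 > pot or t2 < tox)
            for _, p2, t2 in candidates
        )
        if not dominated:
            front.append(name)
    return front

# Hypothetical screening results (name, potency, toxicity).
candidates = [
    ("cmpd_A", 0.90, 0.4),
    ("cmpd_B", 0.70, 0.1),
    ("cmpd_C", 0.60, 0.5),  # dominated by both A and B
    ("cmpd_D", 0.95, 0.6),
]
print(pareto_front(candidates))  # ['cmpd_A', 'cmpd_B', 'cmpd_D']
```

The surviving set is the trade-off frontier a medicinal chemist (or a learned model) would then choose from; real pipelines add more objectives such as solubility and synthetic feasibility.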

Artificial Intelligence Used in Conventional Computational Drug Design

For several years, computational methods have significantly influenced drug design and development, revolutionizing the entire procedure. On the other hand, conventional analytical methodologies continue to raise a number of issues, such as time expense, computational burden, and reliability [40]. De novo drug design has also benefited from AI in recent times: for instance, MolAICal provides a system for the three-dimensional design of drugs for protein targets (https://molaical.github.io/) [41].
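Virtual screening with nearest-neighbour classifiers, mentioned earlier in this section, can be sketched with Tanimoto similarity over binary molecular fingerprints. The fingerprints, labels, and library below are toy values, not real compounds.

```python
from collections import Counter

def tanimoto(fp_a, fp_b):
    """Tanimoto similarity between two binary fingerprints (sets of on-bits)."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def knn_screen(query_fp, library, k=3):
    """Label a query compound by majority vote of its k most similar
    library compounds (a toy nearest-neighbour virtual screen)."""
    ranked = sorted(library, key=lambda item: tanimoto(query_fp, item[0]),
                    reverse=True)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical fingerprint library: (set of on-bits, activity label).
library = [
    ({1, 2, 3, 4}, "active"),
    ({1, 2, 3, 9}, "active"),
    ({7, 8, 9, 10}, "inactive"),
    ({6, 7, 8, 11}, "inactive"),
    ({1, 2, 4, 5}, "active"),
]

print(knn_screen({1, 2, 3, 5}, library))  # prints "active"
```

Production screens use the same idea with much richer fingerprints (e.g. Morgan/ECFP bit vectors from a cheminformatics toolkit) and far larger libraries.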


Fig. 7.3 AI in medication development. (The figure lists the stages at which AI contributes: drug research, including identifying the best lead for a drug-like chemical and finding the chemical target or comprehending the process; testing drugs, including analysing substance polypharmacology and anticipating its mechanism of action; drug recycling; choosing the subjects for clinical trials; and drug authorization.)

New Therapeutic Methods

Synthetic biology has made advances during the last 10 years, including customized cancer treatments and CRISPR genetic manipulation, but the process of creating such cutting-edge treatments is still remarkably wasteful and costly. Thanks to AI and improved access to information (genomic, metabolomic, proteomic, glycomic, and bioinformatic), we will be able to manage far more systemic complexity in the future, which will change how we understand, explore, and influence biology. This will enhance the capacity of the drug discovery process by making it easier to anticipate early on which agents are more likely to be successful, and to foresee the adverse drug effects that frequently halt the advancement of otherwise powerful medications at an expensive late stage of product development. It will also equalize access to innovative, more expensive advanced medicines.


AI for Market Research and Forecasting

The ongoing expansion and growth of a company's performance are vital to its success. Pharmaceutical R&D productivity is declining even where significant funding is available, because businesses are not using new marketing tools [42]. The advances in digital innovation known as the "Fourth Industrial Revolution" are assisting innovative digitized advertising through multiobjective decision-making strategies that gather and analyse mathematical and statistical data and incorporate human assumptions into AI-based judgement designs to discover different marketing methodologies [43].

AI for Prediction of Product Cost

The corporation chooses the final product depending on market research and the expenses incurred during manufacture of the pharmaceuticals. The key idea in using AI to calculate this cost is to exploit its capacity to simulate the reasoning of a human expert in order to evaluate the variables that influence a product's price once it has been manufactured [44]. Factors that affect drug prices for both generic and branded medications include the cost of drug research and development, stringent price-control regulations in the relevant country, the duration of the contract term, the novel drug's market share in the year before patent expiration, the price of the optimized formulation, and anti-price-fixing laws [45].
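The expert-style reasoning described above can be caricatured as a weighted scoring model over pricing factors like those listed. The factor names, weights, and base cost below are purely illustrative assumptions, not a validated pricing model.

```python
def estimate_price(base_cost, factors, weights):
    """Toy expert-system price estimate: base manufacturing cost scaled by a
    weighted sum of normalized pricing factors (each factor in [0, 1])."""
    adjustment = sum(weights[k] * factors[k] for k in weights)
    return base_cost * (1.0 + adjustment)

# Hypothetical weights encoding an expert's judgement: heavy R&D burden and a
# long patent horizon push the price up; strict price regulation pushes it down.
weights = {"rnd_burden": 0.5, "price_regulation": -0.3, "patent_horizon": 0.4}
factors = {"rnd_burden": 0.8, "price_regulation": 0.5, "patent_horizon": 0.6}

print(round(estimate_price(10.0, factors, weights), 2))  # 14.9
```

A real AI approach would learn the weights from historical pricing data (e.g. with a regression model) rather than hand-coding them.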

AI Role in Development of Nanomedicine

Nanomedicines combine drugs and nanotechnology to diagnose, cure, and control complicated illnesses including HIV, cancer, malaria, asthma, and numerous inflammatory conditions. Because of their improved treatment effectiveness, nanomaterial-based drug development has recently grown in significance in the therapeutics and diagnostics fields [46]. Numerous dosage-form issues might be resolved by combining AI and nanotechnology [47, 48].

AI's Role in Emergency Notifications

This system sends email and SMS alerts to its target users (health professionals/doctors) if a sensor output surpasses a predefined limit or threshold value, so that the recipient is aware of the patient's condition and may take necessary action. A practitioner will get an alert notification in an urgent situation, such as when sensor data


exceeds the threshold levels; from this information, he or she can assess the patient's health status and take further action, issuing a warning and diet recommendations to the patient if the situation is acute.
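The alerting behaviour described in this section reduces to a threshold check followed by a notification. The sensor names, limits, and callback below are hypothetical; a real deployment would push e-mail/SMS through a messaging gateway instead of appending to a list.

```python
# Illustrative (low, high) limits per sensor; not clinical reference ranges.
THRESHOLDS = {"heart_rate": (40, 120), "spo2": (92, 100)}

def check_reading(sensor, value, notify):
    """Call notify(message) when a reading leaves its allowed range.

    Returns True if an alert was raised, False otherwise."""
    low, high = THRESHOLDS[sensor]
    if not (low <= value <= high):
        notify(f"ALERT: {sensor}={value} outside [{low}, {high}]")
        return True
    return False

alerts = []
check_reading("heart_rate", 135, alerts.append)  # out of range -> alert raised
check_reading("spo2", 97, alerts.append)         # normal -> no alert
print(alerts)
```

Swapping `alerts.append` for a function that calls an SMS or e-mail API turns the same logic into the notification system the text describes.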

Limitations and Hindrances

The healthcare industry has recently experienced noteworthy technological advancements and their application to healthcare problems; the availability of high-quality healthcare at the touch of a button has significantly improved matters. Through the use of cloud computing, intelligent sensors, and contemporary communications, AI has successfully altered the healthcare industry. Yet AI, like other breakthroughs, has its own set of challenges and problems that may be investigated further in the future. We go through some of these challenges in the next section (Fig. 7.4).

Charges for Maintenance and Service

Rapid technological advancements in recent years have made it necessary for health-related AI gadgets to undergo regular updates. Every AI-based system makes use of a sizable number of interconnected medical devices and sensors, which implies high maintenance, support, and upgrade costs that might affect the finances of both the business and its end customers. As a result, sensors that require less maintenance are needed.

Fig. 7.4 Limitations of health-related AI services. (The figure groups five limitations: charges for maintenance and service, the use of energy, security and confidentiality of data, the management of data, and discovery of novel illnesses.)


The Use of Energy

The vast bulk of AI equipment is battery-powered, and once a sensor is installed its battery is difficult to replace, so such systems are usually powered by high-capacity batteries. Inventors throughout the globe are currently working to develop medical devices that can generate their own power; one reasonable option is to integrate an IoT gadget with renewable energy sources. Such techniques can also help, to some extent, in addressing the world's energy crisis.

Security and Confidentiality of Data

The integration of cloud computing has replaced on-premises tracking, but it has also made healthcare systems more vulnerable to cyberattacks. Critical patient information may be misused as a result, which might affect the treatment strategy. Certain precautionary measures must therefore be taken into account when designing a system to defend an AI system from such damaging attacks. Medical and sensory equipment that is part of an AI network must implement identity validation, secure booting, authorization management, fault tolerance, whitelisting, password protection, and secure pairing protocols. Network technologies such as Bluetooth, Wi-Fi, and Zigbee likewise need secure routing and message authentication.
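One of the safeguards listed, message authentication, can be sketched with the standard library's HMAC support. The shared key below stands in for a per-device secret that would be provisioned during secure pairing.

```python
import hmac
import hashlib

SECRET = b"per-device-shared-key"  # hypothetical secret provisioned at pairing

def sign(payload: bytes) -> str:
    """Compute an HMAC-SHA256 tag for a sensor message."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, tag: str) -> bool:
    """Constant-time check that a message matches its tag."""
    return hmac.compare_digest(sign(payload), tag)

msg = b'{"sensor": "spo2", "value": 97}'
tag = sign(msg)
print(verify(msg, tag))               # True: untampered message
print(verify(b'{"value": 10}', tag))  # False: payload was altered in transit
```

`hmac.compare_digest` avoids timing side channels; in practice each device would hold its own key, and replay protection (e.g. a message counter) would be layered on top.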

The Management of Data

Data management is central to artificial intelligence: connected things generate and share enormous volumes of data that require privileged access and careful handling. It would help if providers offered enough storage to accommodate all AI data.

Discovery of Novel Illnesses

New healthcare applications are released regularly thanks to the rapid growth of mobile innovation. Although many mobile healthcare applications are available, the conditions they cover are still limited. There is therefore a need to address ailments that were overlooked or insufficiently treated in the past, which will in turn widen the range of AI applications.


Future Scope

Since AI algorithms learn from data, the quality of their results depends on the training datasets available to them. The numerous challenges that drug design poses in terms of data acquisition, data modelling, categorization, forecasting, and optimization encourage the development and implementation of dedicated AI systems. Artificial intelligence (AI) is being used to identify relationships between sequences of genetic variants, transcriptional profiles, and clinical and other phenotypes in order to generate predictive patterns of illness states, progression, and treatment outcomes. Before spending money on an actual clinical trial, clinical trial simulation (CTS) investigations apply computational modelling to chosen populations to assess various trial strategies [49, 50]. In one survey, the majority of participants (59%) stated that within the next two years their business intended to recruit more personnel to assist with the installation or use of AI [51, 52].
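The CTS idea can be illustrated with a Monte Carlo sketch that simulates many two-arm trials under assumed response rates and counts how often a deliberately crude decision rule detects the effect. All rates, arm sizes, and the decision margin are assumptions for illustration, not a substitute for a proper power analysis.

```python
import random

def simulated_power(p_control, p_treat, n_per_arm,
                    trials=2000, margin=0.1, seed=1):
    """Fraction of simulated trials in which the observed response-rate
    difference exceeds a decision margin (a crude stand-in for a test)."""
    rng = random.Random(seed)  # fixed seed keeps the estimate reproducible
    hits = 0
    for _ in range(trials):
        ctrl = sum(rng.random() < p_control for _ in range(n_per_arm))
        trt = sum(rng.random() < p_treat for _ in range(n_per_arm))
        if (trt - ctrl) / n_per_arm > margin:
            hits += 1
    return hits / trials

# Larger arms should detect the assumed 0.2 effect more often.
print(simulated_power(0.3, 0.5, 50))
print(simulated_power(0.3, 0.5, 200))
```

Running the sketch shows the expected qualitative behaviour: the detection rate rises with sample size, which is exactly the kind of question CTS is used to answer before committing to a real trial design.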

Conclusion and Significant Recommendations

Numerous facets of healthcare might be transformed by AI advances, opening the door to a future that is more individualized, accurate, predictive, and portable. The influence of these tools, and the technological revolution they offer, requires healthcare systems to consider how best to adjust to a dynamic world; it is unknown whether adoption of the new technology will be gradual or dramatic. For the NHS, such innovations have the potential to free up time for caregiving, allowing medical professionals to concentrate on what makes a difference to their patients and, in the long term, to draw on a worldwide, newly democratized set of data assets comprising the "greatest levels of human understanding" to "collaborate at the boundaries of scientific knowledge", producing a uniformly high quality of care anywhere, every time, and for everyone. Worldwide, AI has the potential to play a significant role in advancing health equity. While the last 10 years have largely been devoted to digitizing health records for efficiency's sake (and, in some healthcare systems, for invoicing), the next 10 years will be devoted to the understanding and value that can be derived from such digital content, to how it can be translated into improved quality of care with the help of AI, and to the subsequent development of novel information assets and techniques. We are clearly at a tipping point in the fusion of medical practice and technology, and while there are many opportunities, there are also enormous hurdles to be faced concerning real-world deployment and the scope of innovative implementation. Increased integrative research on AI technology in healthcare will be essential to realizing this objective. Along with this, we need to invest in the skill enhancement of healthcare workers and future leaders so that they are digitally capable and recognize the possibilities of an AI-augmented health service instead of fearing it.


When preparing to use AI for health, healthcare administrators should take at least the following factors into account. Healthcare data is extremely sensitive, inconstant, fragmented, and not streamlined for machine learning development, assessment, application, and adoption. Procedures are needed for honest and ethical access to data; for direct exposure to domain expertise and prior knowledge, in order to make sense of the data sources and to create the rules that must be applied to them (to generate the necessary understanding); and for access to enough computing power to produce real-time decisions, an area undergoing accelerating transformation with the introduction of cloud computing. Moving from research into implementation, we must also take into account, investigate, and study the problems that emerge when an algorithm is used in the real world, creating "trusted" AI algorithms.

References

1. Ramesh, A., et al.: Artificial intelligence in medicine. Ann. R. Coll. Surg. Engl. 86, 334–338 (2004)
2. Miles, J., Walker, A.: The potential application of artificial intelligence in transport. IEE Proc. Intell. Transport Syst. 153, 183–198 (2006)
3. Yang, Y., Siau, K.: A qualitative research on marketing and sales in the artificial intelligence age. In: MWAIS 2018 Proceedings, vol. 41 (2018)
4. Wirtz, B.W., et al.: Artificial intelligence and the public sector—applications and challenges. Int. J. Public Adm. 42, 596–615 (2019)
5. Hamet, P., Tremblay, J.: Artificial intelligence in medicine. Metabolism 69, S36–S40 (2017)
6. Hassanzadeh, P., Atyabi, F., Dinarvand, R.: The significance of artificial intelligence in drug delivery system design. Adv. Drug Deliv. Rev. 151, 169–190 (2019)
7. Duch, W., Swaminathan, K., Meller, J.: Artificial intelligence approaches for rational drug design and discovery. Curr. Pharm. Des. 13(14), 1497–1508 (2007)
8. Zhang, L., Tan, J., Han, D., Zhu, H.: From machine learning to deep learning: progress in machine intelligence for rational drug discovery. Drug Discov. Today 22(11), 1680–1685 (2017)
9. Jordan, A.M.: Artificial intelligence in drug design—the storm before the calm? ACS Med. Chem. Lett. 9(12), 1150–1152 (2018)
10. Mitchell, T.M.: Machine Learning. McGraw-Hill, New York (1997)
11. Reardon, S.: Rise of robot radiologists. Nature 576, S54–S58 (2019)
12. Lasko, T.A., Denny, J.C., Levy, M.A.: Computational phenotype discovery using unsupervised feature learning over noisy, sparse, and irregular clinical data. PLoS ONE 8, e66341 (2013)
13. Baker, R.E., Pena, J.M., Jayamohan, J., Jérusalem, A.: Mechanistic models versus machine learning, a fight worth fighting for the biological community? Biol. Lett. 14(5), 20170660 (2018)
14. LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521, 436–444 (2015)
15. Sharma, P.: Applications of statistical tools for optimization and development of smart drug delivery system. In: Ahmad, U., Haider, M.F., Akhtar, J. (eds.) Smart Drug Delivery. IntechOpen, London. https://doi.org/10.5772/intechopen.99632 (2021)
16. Wiens, J., Saria, S., Sendak, M., et al.: Do no harm: a roadmap for responsible machine learning for health care. Nat. Med. 25, 1337–1340 (2019)
17. Dodaro, G.L.: Fiscal year 2020 budget request: US Government Accountability Office. United States Government Accountability Office (2019)


18. Sendak, M.P., D'Arcy, J., Kashyap, S., et al.: A path for translation of machine learning products into healthcare delivery. EMJ Innov. 10, 19–00172 (2020)
19. Andrews, M., McConnell, J., Wescott, A.: Development as Leadership-Led Change: A Report for the Global Leadership Initiative. World Bank Publications (2010)
20. Andrews, M.: Who really leads development? In: WIDER Working Paper 2013/092. UNU-WIDER (2013)
21. Davahli, M.R., Karwowski, W., Fiok, K., Wan, T., Parsaei, H.R.: Controlling safety of artificial intelligence-based systems in healthcare. Symmetry 13, 102 (2021)
22. Mesko, B.: The role of artificial intelligence in precision medicine. Expert Rev. Precis. Med. Drug Develop. 2(5), 239–241 (2017)
23. Muehlematter, U.J., Daniore, P., Vokinger, K.N.: Approval of artificial intelligence and machine learning-based medical devices in the USA and Europe (2015–20): a comparative analysis. Lancet Digit. Health 3, e195–e203 (2021)
24. Wang, X., Peng, Y., Lu, L., et al.: Hospital-scale chest X-ray database and benchmarks on weakly-supervised classification and localization of common thorax diseases. In: IEEE CVPR, pp. 2097–2106 (2017)
25. Esteva, A., Robicquet, A., Ramsundar, B., et al.: A guide to deep learning in healthcare. Nat. Med. 25, 24–29 (2019)
26. Bejnordi, B.E., Veta, M., Van Diest, P.J., et al.: Diagnostic assessment of deep learning algorithms for detection of lymph node metastases in women with breast cancer. JAMA 318, 2199–2210 (2017)
27. Strodthoff, N., Strodthoff, C.: Detecting and interpreting myocardial infarction using fully convolutional neural networks. Physiol. Meas. 40, 015001 (2019)
28. Álvarez-Machancoses, Ó., Fernández-Martínez, J.L.: Using artificial intelligence methods to speed up drug discovery. Expert Opin. Drug Discov. 14, 769–777 (2019)
29. Fleming, N.: How artificial intelligence is changing drug discovery. Nature 557, S55–S57 (2018)
30. Dana, D., et al.: Deep learning in drug discovery and medicine; scratching the surface. Molecules 23, 2384 (2018)
31. Bellemo, V., Lim, Z.W., Lim, G., et al.: Artificial intelligence using deep learning to screen for referable and vision-threatening diabetic retinopathy in Africa: a clinical validation study. Lancet Digit. Health 1, e35–e44 (2019)
32. Gulshan, V., Peng, L., Coram, M., et al.: Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. JAMA 316, 2402–2410 (2016)
33. Schnall, M.D., Imai, Y., Tomaszewski, J., Pollack, H.M., Lenkinski, R.E., Kressel, H.Y.: Prostate cancer: local staging with endorectal surface coil MR imaging. Radiology 178, 797–802 (1991)
34. Ward, A.D., Crukley, C., McKenzie, C.A., Montreuil, J., Gibson, E., Romagnoli, C., Gomez, J.A., Moussa, M., Chin, J., Bauman, G.: Prostate: registration of digital histopathologic images to in vivo MR images acquired by using endorectal receive coil. Radiology 263, 856–864 (2012)
35. Litjens, G., Toth, R., van de Ven, W., et al.: Evaluation of prostate segmentation algorithms for MRI: the PROMISE12 challenge. Med. Image Anal. 18, 359–373 (2014)
36. Sharma, P., Jain, V., Tailang, M.: Selection and role of polymers for designing of a drug carrier. In: Villarreal-Gómez, L.J. (ed.) Drug Carriers. IntechOpen, London. https://doi.org/10.5772/intechopen.103125 (2022)
37. Bajwa, J., Munir, U., Nori, A., Williams, B.: Artificial intelligence in healthcare: transforming the practice of medicine. Future Healthc. J. 8(2), e188 (2021)
38. Mak, K.K., Pichika, M.R.: Artificial intelligence in drug development: present status and future prospects. Drug Discov. Today 24(3), 773–780 (2019)
39. Hu, L., Zhang, C., Zeng, G., Chen, G., Wan, J., Guo, Z., Liu, J.: Metal-based quantum dots: synthesis, surface modification, transport and fate in aquatic environments and toxicity to microorganisms. RSC Adv. 6(82), 78595–78610 (2016)


40. Stefania, C., et al.: Antiproliferative effect of Aurora kinase targeting in mesothelioma. Lung Cancer 70(3), 271–279 (2010)
41. Bai, Q., Tan, S., Xu, T., Liu, H., Huang, J., Yao, X.: MolAICal: a soft tool for 3D drug design of protein targets by artificial intelligence and classical algorithm. Brief. Bioinf. 22(3), bbaa161 (2021)
42. Toker, D., et al.: A decision model for pharmaceutical marketing and a case study in Turkey. Ekonomska Istraživanja 26, 101–114 (2013)
43. Singh, J., et al.: Sales profession and professionals in the age of digitization and artificial intelligence technologies: concepts, priorities, and questions. J. Pers. Selling Sales Manage. 39, 2–22 (2019)
44. Duran, O., et al.: Neural networks for cost estimation of shell and tube heat exchangers. Expert Syst. Appl. 36, 7435–7440 (2009)
45. Park, Y., et al.: A literature review of factors affecting price and competition in the global pharmaceutical market. Value Health 19, A265 (2016)
46. Wilson, B., Geetha, K.M.: Artificial intelligence and related technologies enabled nanomedicine for advanced cancer treatment. Future Med. 15, 433–435 (2020)
47. Ho, D., et al.: Artificial intelligence in nanomedicine. Nanoscale Horiz. 4, 365–377 (2019)
48. Sacha, G.M., Varona, P.: Artificial intelligence in nanotechnology. Nanotechnology 24, 452002 (2013)
49. Gupta, R., Srivastava, D., Sahu, M., Tiwari, S., Ambasta, R.K., Kumar, P.: Artificial intelligence to deep learning: machine intelligence approach for drug discovery. Mol. Diversity 25(3), 1315–1360 (2021)
50. Sharma, P.: Modification of human behavior due to coronavirus outbreak: a brief study on current scenario. Asian J. Pharm. 15(3), 1 (2021)
51. Sharma, P., Tailang, M.: Design, optimization, and evaluation of hydrogel of primaquine loaded nanoemulsion for malaria therapy. Futur. J. Pharm. Sci. 6, 26 (2020)
52. Department of Health and Social Care: NHS Constitution for England. DHSC. www.gov.uk/government/publications/the-nhs-constitution-for-england (2012)

Chapter 8

Impression of Big Data Analytics and Artificial Intelligence for Healthcare—A Study Sonali Vyas, Dinesh Bhatia, and Sunil Gupta

S. Vyas · S. Gupta (B), UPES University, Dehradun, India; e-mail: [email protected]
D. Bhatia, North Eastern Hill University, Shillong, Meghalaya, India
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023. A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_8

Introduction

Big Data analysis along with AI can enhance outcomes derived from patient data, ensuring better and more effective treatment for the patient. Data-oriented results are used to monitor the patient's healthcare condition efficiently, and the causes of side effects can be minimized based on patterns in existing datasets; such practices advance the quality of treatment. Data has consistently been vital for the dynamic delivery of medical services, and with expanded digitization in medical care, a gigantic amount of data is produced by the healthcare industry [1]. A huge volume of data is accessible nowadays, which can support a wide range of medical and medical-care tasks. The development of analysis techniques, together with AI and machine learning (ML) methods, offers opportunities for turning this data into significant and actionable insights to support decision-making, give top-notch patient care, react to ongoing circumstances, upgrade the utilization of assets, improve processes, and reduce expenses on the operational and financial side of the healthcare industry [2, 3]. By adopting data analytical techniques in the healthcare system, medical-care partners can harness the force of data not just for analysis of recorded data (descriptive analytics), but also for anticipating future results (predictive analytics) and for deciding the best action in the current circumstance (prescriptive analytics) [4, 5]. Traditionally, clinical specialists depended on the restricted amount of data available to them and on previous experience for patient diagnosis. Data is of the utmost significance in medical care, where a single choice can mean the difference between life


and decease [2]. Accessibility of information from various resources proposes the freedom to have an encompassing comprehension of patient health. Making utilization of cutting-edge innovations related to this data likewise empowers accessibility to the accurate information at perfect opportunity and ideal spot to convey precise care [6]. Healthcare information—that incorporates constant data obtained from records of patients, analytical reports, and dynamic data from monitoring screens or remote monitoring of patient—is generally amorphous in behavior. It drives away more former the capacity of conventional logical tools to deal with such complicated and dynamic information [6]. Through AI and Big data analytics, the information can be handled to get significant experiences that could be applied as a crucial part in redeeming lives. Besides the effect of skills and large persistent consideration on the health of the patient, AI and Big Data analytics likewise discover their implementation in life science disciplines and clinical analysis. Sub-atomic analysis of data unlocks various measurements for the disclosure of novel treatment choices. Prescient logical models can be utilized over data to distinguish hereditary infection markers, plan and foster new medications, and evaluate their adequacy [7]. Advanced data analytics and machine learning strategies in this manner empower scientists to design imminent clinical preliminaries based on theories created from the investigation of data [8]. Due to its tremendous potential for working on the nature of healthcare, these emerging methods have drawn in expanding consideration in healthcare analysis and practice. A few analyses have added to the utilization of AI and Big Data analytics in medical care; however, the writing generally stays scattered. 
In request to possess a careful comprehension of the possibilities of utilizing such innovations in medical, an efficient planning learning was accomplished to get result-oriented output. Efficient planning study is considered proper in view of the accessibility of generous and different work in this area [9]. Throughout the last decade, AI has been applied to various regions, for example, web indexes, machine interpretation frameworks, and clever individual associates. Simulated intelligence has likewise discovered many uses in the clinical field, alongside the boundless utilization of EHR (electronic health record) and the fast advancement of life sciences, including neuroscience [9]. Big Data analytics indicates huge capacities of data which are used for executing complex programs. The organization of this outrageous data volumes, that might be organized and complicated, is the utmost troublesome undertaking. Executing AI calculations and methods could be processed well by a huge heap of data. For sure, utilizing ML methods, for example, artificial neural networks (ANN) and AI calculations to accomplish automatic dynamic can be an immense achievement for healthcare. ML is the investigation of software programs which can train via deduction and examples instead of being unequivocally customized with calculations and measurable representations. The significance machine learning is prominent for implementation of the smart healthcare practices. Data, the foundation of each model, are the main parts for implementing machine learning in the healthcare and the additional pertinent data, the extra gauges [10]. Step by step, healthcare information is growing. Statistics dramatically fill in quantity utilizing well-being data frameworks and managing patients’ records. Immense measures of well-being data are made each day. It unbolts discussions regarding importance from such dramatically developing

8 Impression of Big Data Analytics and Artificial Intelligence …


volumes of data. Analyzing these data is essential for extracting information, acquiring knowledge, and uncovering hidden patterns [11]. Drawing on current investigations and systems, this chapter presents a set of challenges in healthcare data analytics, spanning the technologies for collection, storage, aggregation, analysis, sharing, and visualization of health data [12]. Patients and the general public routinely experience life-changing consequences from healthcare decisions. When decision-makers can quickly gather and analyze complete, dependable data, they are better able to choose a course of action for important health events, forecast them, and create long-term plans. Data analytics can be the basis for wise, significant decisions, whether you are a doctor working closely with patients or a healthcare administrator overseeing the business end of the sector [56]. Four main categories of data analysis exist:
• Descriptive analysis: an examination and description of something that has already occurred.
• Diagnostic analysis: a method that looks for the underlying cause of an incident.
• Predictive analysis: examines past data, patterns, and theories to provide answers about what is likely to happen.
• Prescriptive analysis: specifies particular steps a person or organization might take to achieve future results or goals.
The following examples show how data analytics can have a big impact on the healthcare sector:
• Assessing and Educating Professionals: Data gathered from patients on their encounters with medical providers can be analyzed to spot areas that need improvement. One example is the study on doctors' empathy by Dr. Helen Riess.
After noticing that patients' perceptions of their symptoms were frequently given less weight during sessions than objective, scientific data, she conducted a study in which participants rated how empathetic they felt their doctors were during visits. After the data was acquired, half of the clinicians received empathy training while the other half did not. When the doctors were reviewed again, the trained group scored better than the control group on empathy [56].
• Locating Inconsistencies in Scan Results: Machine learning algorithms are another way to employ data analytics in the healthcare sector. Used appropriately, algorithms can evaluate data more quickly and efficiently than people. An algorithm developed by Massachusetts Institute of Technology researchers can register differences between 3D medical images such as MRI scans. Because it learns from previous data, the machine can detect scan discrepancies faster than a human specialist. Professor Dustin Tingley, in the Harvard Online course Data Science Principles, cautions against depending too much on algorithms: none of these ML applications would be workable without human supervision [51].


S. Vyas et al.

• How to Spot Emergencies: By recognizing patterns in the transmission of disease, data analytics also enables medical facilities, hospitals, educational institutions, and even individuals to take appropriate precautions. In Data Science Principles, Tingley describes how the Centers for Disease Control and Prevention (CDC) predicts the next flu outbreak using data. According to Tingley, the CDC has been gathering information on reported flu cases for more than 10 years: "You may estimate the severity of upcoming flu seasons using that data to make decisions." He also discusses how the CDC collects flu data from affiliated hospitals and medical facilities, and how that data shapes local decisions made by decision-makers such as school administrators [52].
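The outbreak-spotting idea above can be sketched in a few lines. The snippet below is a toy illustration, with made-up weekly case counts and a deliberately naive method (a trailing moving average projected forward), not the CDC's actual forecasting models:

```python
from statistics import mean

def moving_average(cases, window=3):
    """Smooth weekly case counts with a simple trailing moving average."""
    return [mean(cases[max(0, i - window + 1): i + 1]) for i in range(len(cases))]

def estimate_next_week(cases, window=3):
    """Naive forecast: project the latest smoothed value forward one week."""
    return moving_average(cases, window)[-1]

# Hypothetical weekly reported flu cases during a season's ramp-up
weekly_cases = [120, 135, 150, 180, 240, 310]
forecast = estimate_next_week(weekly_cases)
print(round(forecast, 1))  # ≈ 243.3
```

A real surveillance system would of course incorporate seasonality, reporting lags, and regional breakdowns; the point here is only that even simple descriptive smoothing of historical counts yields a usable first signal for planning.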

Big Data and Healthcare
Big Data analytics in healthcare and medicine involves heterogeneous, complex data that cannot be processed by conventional software [13]. A great deal of research has been carried out on healthcare Big Data; this section gives a short overview of the existing related studies. Past work falls into areas such as, first, the business side of healthcare data, which demonstrates the benefits of using Big Data in healthcare systems, and second, healthcare Big Data platforms and architectures [14]. Integrating healthcare with Big Data analysis offers many advantages. Big Data can assist ordinary people and address significant public problems, including healthcare. Big Data in healthcare serves numerous objectives, such as lowering costs; enabling better, more effective care for patients; giving providers real-time data access and straightforward tools; improving instrument quality for scientists; helping clinical organizations coordinate data gathered from clinics and home devices for safety monitoring and better services; and reducing waste. The impact of Big Data analysis in healthcare goes beyond its raw capacity, encompassing the diversity of data it can illuminate and the speed at which it must be handled (as shown in Fig. 8.1) [15]. By discovering associations and recognizing patterns and trends in the data, Big Data analytics can potentially improve healthcare outcomes and reduce expenses.
The potential for Big Data analytics in healthcare to deliver improved outcomes arises in many contexts, for example: analyzing patient characteristics and the cost and outcomes of care to identify the most clinically and financially effective therapies, thereby influencing provider behavior [16]; applying advanced analytics to patient profiles (e.g., predictive modeling) to identify individuals who would benefit from preventive care or lifestyle changes; broad-scale disease reporting to detect predictive patterns and support prevention initiatives; assembling and sharing data on processes, thereby helping patients choose the care arrangements that provide the best value; recognizing, predicting, and restraining fraud by deploying advanced technical systems for fraud detection and checking the accuracy and reliability of claims; and moving much


Fig. 8.1 Big Data analytics framework

closer to real-time claims authorization; and constructing new revenue streams by aggregating and synthesizing patient medical records and claims datasets to provide data and services to third parties, for example licensing data to help drug organizations identify patients for inclusion in clinical trials [17]. The adaptive pre-processor divides data into blocks and then extracts the relevant data for the predictor component to build a predictive model for each segment. Figure 8.2 shows the layered architecture for smart healthcare Big Data analytics [18].
Big Data Characteristics
Although Big Data systems are designed to meet specific healthcare requirements, they fit well with standard engineering principles for activities such as data gathering, pre-processing, analysis, interpretation, and visualization [19]. The main features of Big Data include the different kinds of analytics that support the various needs of the community.
• Descriptive Analytics: Describing current situations and reporting on them. Relatively simple techniques operate at this analytical level; for example, descriptive tools such as charts and histograms are among the methods used.


Fig. 8.2 Big Data layered architecture for smart healthcare

• Diagnostic Analytics: Aims to explain why various events occurred and the factors involved. For example, diagnostic analysis tries to understand the reasons behind the repeated re-admission of certain patients, using techniques such as clustering and decision tree algorithms.
• Predictive Analytics: Concerns the capacity to anticipate upcoming events; it also helps identify trends and estimate the probability of uncertain outcomes, for instance predicting whether a patient will be re-admitted. Predictive models are often built using machine learning procedures.
• Prescriptive Analytics: The major goal of prescriptive analytics is to recommend rational actions supporting optimal decision-making strategies. For example, prescriptive analysis might recommend abandoning a given treatment when a side effect has a high probability of occurring.
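The predictive-analytics bullet above (anticipating re-admission) can be illustrated with a deliberately tiny sketch. The snippet below uses a plain nearest-neighbour majority vote over hypothetical historical records rather than the clustering or decision-tree methods a production system would use; every feature, record, and threshold here is invented for illustration only:

```python
from math import dist

def predict_readmission(patient, history, k=3):
    """Predict re-admission risk by majority vote among the k most similar
    past patients (toy features: age, prior admissions, length of stay)."""
    neighbours = sorted(history, key=lambda rec: dist(patient, rec[0]))[:k]
    votes = sum(label for _, label in neighbours)
    return votes * 2 > k  # True if a majority of neighbours were readmitted

# Hypothetical historical records: ((age, prior_admissions, stay_days), readmitted)
history = [
    ((70, 4, 12), 1),
    ((68, 3, 10), 1),
    ((35, 0, 2), 0),
    ((42, 1, 3), 0),
    ((75, 5, 14), 1),
]
print(predict_readmission((69, 4, 11), history))  # → True (near the high-risk cluster)
```

Real models would be trained on thousands of records with validated clinical features; the sketch only shows the shape of the "learn from similar past cases" reasoning that the bullet describes.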

Big Data for Healthcare
Based on Big Data technologies, several data-handling frameworks for the healthcare domain have been designed to manage the significant volumes of data streams produced by clinical devices [9]. Sensors are devices worn on the human body, or other kinds of medical devices, capable of supporting remote diagnosis. The data assembled by the Essential Clinical Devices (ECD) is forwarded to a component named the Insightful Building (IB), which begins by accumulating the input stream in its collection unit. Together, these units allow the Big Data platform to capture information for analyzing patterns and deriving healthcare solutions. The pre-processed data is used to build models predicting a patient's health; this pipeline runs periodically in an offline mode. In the online streaming scenario, data is generated from various sources, for example clinical sensors attached to the patient, measuring clinical parameters such as blood pressure. At that moment,


Fig. 8.3 Global healthcare Big Data market size forecast for 2025

the gathered data is ordered by time and its missing values are handled. Healthcare Big Data analytics is compelling as a direct consequence of both its scale and its diversity. By understanding the trends and patterns within the data, healthcare frameworks can improve the quality of patient care and control healthcare costs. Pharmaceutical and biotechnology organizations are harnessing the power of Big Data for product cross-selling, financial risk management, regulatory compliance management, and more. Big Data analytics in drug manufacturing will also help to better forecast production demand, understand plant performance, and offer faster support services to clients (Fig. 8.3). The expanding supply of health data from diverse sources has the potential to transform the healthcare delivery system, lower costs, improve patient outcomes, and support value-based care. The volume of healthcare data exceeded 700 EB in 2017, up from 153 EB in 2013, and was projected to grow to 2,314 EB by 2020. Organizations are applying analytical tools and artificial intelligence and machine learning techniques to this growing pool of data to derive data-driven insights that reduce healthcare costs, improve revenue streams, enable personalized medicine, and support proactive patient care [20]. The Big Data analytics healthcare market can be segmented by component, deployment, analytics type, application, end-user, and region. By component, the market divides into software and services. On-premise deployment is the dominant segment; however, the cloud segment is expected to grow at the highest rate over the forecast period owing to a host of advantages offered by cloud deployment, such as efficient resource usage, low maintenance, and no capital expense.
Health data has been growing at phenomenal rates, driven by falling storage costs, the rise of cloud storage, growing regulatory mandates, and expanding government initiatives to promote the adoption of healthcare data systems. The increasing adoption of wearable devices, at-home testing services, and mobile-health applications that empower patients to proactively


manage their health is further adding to the pool of individual data. The availability of huge volumes of health data has paved the way for major advances in clinical research, the development of precision medicine and clinical decision-support tools, faster drug discovery, and a more detailed view of population health, which has opened new avenues for managing chronic illnesses.

Artificial Intelligence
Over the last decade, Artificial Intelligence has been applied in many areas, for example web search engines, machine translation systems, and intelligent personal assistants. AI has also found many uses in the clinical field, along with the widespread adoption of EHRs and the rapid development of the life sciences, including neuroscience. AI has begun transforming several areas of medicine, from the design of evidence-based treatment plans to the deployment of real-time analytical advances, and it is seen either as an augmentation of, or a substitute for, healthcare experts in some tasks. The use of AI has been proposed to satisfy informatization requirements in tertiary medical centers [21]. AI has also been applied in various fields of predictive health services, such as robotic surgery, cardiology, cancer treatment, and neurology [22]. Features: AI is steadily advancing clinical practice. With recent progress in digitalized data acquisition, ML, and computing infrastructure, AI applications are expanding into areas that were previously assumed to be solely the domain of human experts [23]. Interpretability is perhaps the most intensely debated topic regarding the use of AI in healthcare. Although AI-driven systems have been shown to beat humans at certain well-defined tasks, their lack of interpretability continues to draw criticism. Moreover, making sense of the huge volume of collected data is not merely a technical issue; it raises a large cluster of medical, legal, ethical, and social questions that need comprehensive study [24].

AI for Healthcare
Every system that employs AI gains the additional benefit of completing its task within a short time. In healthcare, experts have been researching drug discovery for a long time, and health frameworks that use ML algorithms are now being applied to shorten drug discovery times [25]. Accordingly, using AI to replace parts of the discovery cycle of a medicine can make it less costly, faster, and safer. It may not be feasible, however, to use AI in every step of the drug discovery process. Rather, it helps with stages such as the most common


way of finding new compounds that can be potential medications. Additionally, AI can be used to identify new uses for compounds stored in the research facility that were previously tested [26]. Hence, the future of AI in drug development lies in combining in-memory computing technology with AI platforms to deliver faster drug discovery and development [27]. AI techniques are applied to tasks such as diagnosis, treatment-protocol improvement, patient monitoring, drug development, personalized medicine, and outbreak prediction in global health. ML allows machines to learn and improve instead of being explicitly programmed. ML algorithms can also analyze the huge datasets, also known as Big Data, held in Electronic Health Records (EHRs) for disease prediction and diagnosis. Wearable clinical devices are used to continuously monitor an individual's health status and store it in the cloud [28].
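The continuous wearable monitoring mentioned above can be sketched as a simple rule-based check. The reference ranges and field names below are illustrative assumptions, not clinical guidance; a real system would use validated thresholds, per-patient baselines, and trend analysis rather than single-reading rules:

```python
# Hypothetical reference ranges for a few vital signs (illustrative only)
REFERENCE_RANGES = {
    "heart_rate": (50, 110),   # beats per minute
    "spo2": (92, 100),         # blood-oxygen saturation, %
    "systolic_bp": (90, 140),  # mmHg
}

def check_vitals(reading):
    """Return a list of (vital, value) pairs that fall outside their range."""
    alerts = []
    for vital, value in reading.items():
        low, high = REFERENCE_RANGES[vital]
        if not low <= value <= high:
            alerts.append((vital, value))
    return alerts

# One incoming reading from a (hypothetical) wearable device
reading = {"heart_rate": 128, "spo2": 96, "systolic_bp": 135}
print(check_vitals(reading))  # → [('heart_rate', 128)]
```

Even this minimal form shows the pipeline the chapter describes: device readings stream to the cloud, rules or models evaluate them, and only out-of-range events are escalated to clinicians.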

Big Data in Healthcare
Technological advancement has produced an ever-growing amount of data, to the point where it outstrips what current tools can handle. This has encouraged the coining of the term "Big Data" to describe large and unstructured data. To meet current and upcoming social needs such as healthcare systems, new strategies are needed to integrate these data and establish their significance [29]. Big Data has been an intriguing subject for more than two decades because of the remarkable potential hidden in it. Diverse private and public companies generate, accumulate, and analyze Big Data with the purpose of improving the services they offer. In the healthcare system, the main sources of Big Data include patients' medical records, clinical examination results, and smart devices. Biomedical research also produces a significant share of Big Data relevant to public healthcare. This data requires proper management and scrutiny in order to extract the important information. However, trying to find meaning by quickly scrutinizing Big Data is like searching for a needle in a haystack: numerous complications attach to each step of managing Big Data, which calls for high-end computing solutions for its analysis. It is for this reason that, in order to provide critical answers to general health problems, healthcare providers need to be fully equipped, on the right infrastructure, to deliberately produce and investigate large amounts of data.

Role of Big Data Analytics in Healthcare
Healthcare systems are being transformed by the application of Big Data analytics. As the world aggregates inconceivable volumes of data and healthcare innovation becomes ever more central to the progress of medicine, legislators and regulators


Fig. 8.4 Probable sources of Big Data in the healthcare systems

are confronted with severe challenges related to data confidentiality and security [30]. The use of Big Data analytics in the medical business is growing throughout the world, and its promise and advantages are obvious. Predictive analytics, Big Data analytics, AI, ML, and deep learning can harness enormous datasets, which can then be used to improve diagnosis, inform preventive medicine practices, and lessen the adverse effects of drugs and other treatments [31]. Reconciling data from diverse sources makes it hard to store the data in a precisely defined framework (as shown in Fig. 8.4). Because of the varied formats and the sheer volume of medical data, it is difficult to inspect it using conventional medical tools [32].

Big Data Architecture in Healthcare
Healthcare organizations need exact and more refined data on their true costs in order to change effectively from volume to value. It is therefore natural that several researchers have been drawn to the field of Big Data analytics in healthcare. IoT and Big Data must be intertwined with computational intelligence before being converted into any significant action. Big Data in healthcare has distinctive characteristics: it is not easy to access, only partially structured, and subject to legal constraints and privacy issues. The architecture aims to support clinical


Fig. 8.5 Architecture for Big Data healthcare

advances in disease analysis via Big Data technology by gathering information from varied sources with differing formats, and by using ML strategies to develop significant predictions. The information is sent simultaneously to the batch layer as well as the streaming layer within the analytics layer. After pre-processing, the data is analyzed through successive stages, including feature selection and mining. In the final stage, the prepared data is used to build models that are important for predicting a patient's health state in the near future. In the streaming case, data is taken from various sources; for example, clinical sensors attached to the patient's body deliver continuous real-time data on the patient's condition, such as heart rate and blood pressure. The adaptive pre-processor separates the data into blocks and then extracts the necessary relevant data, which is forwarded for building predictive models [33] (Fig. 8.5).
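The adaptive pre-processor described above (split the stream into blocks, order each block by time, handle missing values) can be sketched as follows. The record layout and the mean-imputation strategy are assumptions made for illustration; the chapter does not specify how missing values are actually handled:

```python
from statistics import mean

def preprocess_block(block):
    """Order one block of sensor readings by timestamp and impute missing
    values (None) with the mean of the observed values in the block."""
    block = sorted(block, key=lambda r: r["t"])
    observed = [r["bp"] for r in block if r["bp"] is not None]
    fill = mean(observed) if observed else None
    return [{**r, "bp": r["bp"] if r["bp"] is not None else fill} for r in block]

def blocks(stream, size):
    """Split an incoming stream of readings into fixed-size blocks."""
    for i in range(0, len(stream), size):
        yield stream[i:i + size]

# Hypothetical out-of-order blood-pressure stream with one missing reading
stream = [
    {"t": 2, "bp": 118}, {"t": 1, "bp": None}, {"t": 3, "bp": 122},
    {"t": 5, "bp": 130}, {"t": 4, "bp": 126},
]
processed = [preprocess_block(b) for b in blocks(stream, 3)]
```

After this stage, each block is time-ordered and gap-free, which is the form the downstream predictive models in the architecture expect.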

Big Data Challenges and Opportunities in Healthcare
Applying descriptive, predictive, and prescriptive analytical methods through Big Data analytics offers the freedom to advance the quality of several parts of healthcare [34].
• Medical analysis: A data-driven analysis may detect illnesses at an early stage and lessen complications during treatment.
• Community healthcare: Specialists may take preventive steps against the anticipated risks of chronic illness in a population.
• Hospital monitoring: Continuous monitoring of hospitals can help government agencies guarantee adequate service quality.


• Patient care: Personalized patient care supported by BDA can provide faster relief and reduce the cost of hospital re-admissions.
Big Data also presents various challenges while enhancing the quality of the many facets of healthcare.
• Initial investment: Assembling the infrastructure needed to exploit the benefits of Big Data imposes large up-front costs on healthcare organizations.
• Data quality: Lack of standardization and resistance to organizational routine changes may affect the quality of the data compiled by an organization.
• Quality of practices: The low quality of some biomedical data carries the predictable shortcomings of providing scarce information and fostering misconceptions [41].
• Security and privacy: Researchers warn about the security and privacy of patients' data with regard to unauthorized access while it is exchanged between systems [42].

AI and Healthcare
AI refers to computational technologies that emulate capacities associated with human intelligence, such as reasoning, deep learning, adaptation, engagement, and sensory understanding. Certain devices can perform tasks that would ordinarily involve human interpretation and dynamic multitasking [35]. Awareness of, and advances in, medical AI applications have surged lately because of the greatly improved processing power of modern computers and the huge amount of digitized data available for collection and use [36]. AI is steadily changing clinical practice. So far, a limited number of AI techniques in healthcare can be used across diverse clinical fields, for example diagnostic, reconstructive, surgical, and analytical activities [37]. Another critical area of medicine where AI is making an impact is clinical decision-making and disease diagnosis. AI technologies can ingest, analyze, and report on large data volumes across various modalities to detect disease and suggest medical advice. AI applications can manage the immense amount of data generated in medicine and discover new information that would otherwise stay concealed in the mass of clinical Big Data [38].

AI Role in Renovating Healthcare
The domain of healthcare is advancing at an accelerating speed, accompanied by a critical expansion in the amount of data and by difficulties in cost and patient outcomes; AI applications have been used to lessen these difficulties. AI is extremely valuable in addressing thorny healthcare problems and offers various benefits over customary data analytics and clinical decision-making methods [39].


• AI in Medical Diagnosis: AI can transform clinical diagnostics. Unnecessary routine lab testing raises needless financial costs, so AI applications have been used to narrow the set of lab analyses a patient may require. AI can detect the presence of early illness at the earliest opportunity because it can automate a large part of the manual work and accelerate the diagnostic cycle.
• AI for Clinical Workflow: AI is currently used to manage workflow efficiently and to analyze imaging. AI can improve clinical workflow, support better clinical insights, reduce clinical variability, help set study priorities, and limit physician burnout. AI can take over the tedious task of data entry so that clinicians can focus on improving their time use, increasing day-to-day productivity, and giving patients the best possible care.
• AI for Anticipating Hospital-Acquired Infections: AI can standardize the analysis of infections against Infection Prevention and Control (IPC) recommendations and facilitate the spread of IPC expertise. AI offers opportunities to improve diagnosis through objective pattern recognition. Using AI-driven models, clinicians can screen high-risk patients, predict which patients are likely to develop central-line infections, and intervene to reduce risk.
• AI and the Next Generation of Radiology Tools: AI can help develop the coming generation of imaging devices, which will provide information precise and detailed enough to replace the need for tissue samples in some cases. The next generation of AI is expected to be more effective in the healthcare framework, with further improvements in performance. These advances promise to increase precision and reduce the number of routine tasks that consume time and effort.

A Case Study: Public Health System and AI
The vitality of AI-enabled innovation in the medical setting means that its users are forced to adapt to new workflows that integrate its capabilities, either positively impacting healthcare outcomes or, on the contrary, having no positive impact and instead distorting the treatment pathway. Therefore, even if a technology poses no demonstrated danger to the patient under given conditions, it should be tested for how it fits with user workflows. According to the case study discussed, public healthcare conditions can be improved with the use of AI. Beyond clinically assessing whether a given clinical device produces the medical result it is expected to, there is value in undertaking human-factors validation testing, bearing in mind the environment where the device will be used. Clinical effectiveness for a particular device can be fundamentally affected when the device's testing environment (a controlled laboratory ecosystem) differs from its application environment (a primary health center


with restricted internet availability). For frontline health workers with minimal digital proficiency, complex interface features in advanced healthcare applications would compromise the number of people they can respond to in a limited timeframe, hence compromising health outcomes for the local community. Regulation for clinical devices therefore needs to state comparable conditions that should be tested for, and to articulate a device's specific usability in a public health setting [40].

Converging AI with Big Data Analytics for Healthcare System
The convergence of AI and Big Data for future healthcare systems depends on leaders in healthcare who see enormous potential in AI and analytics to deliver on the promise of better care at a lower cost by empowering their executives, business leaders, clinicians, and nurses with the force of predictive analytics. Numerous healthcare organizations are looking to harness the immense capability of AI to transform their medical and business procedures. They seek to apply these cutting-edge technologies to make sense of an ever-expanding "tidal wave" of structured and unstructured data, and to automate routine tasks that previously needed manual handling [41]. Figure 8.6 shows the data science steps for performing data analytics in healthcare systems. Healthcare produces a large amount of diverse data. As healthcare systems focus on object-based systems, intelligent and collaborative health data analysis is gaining importance in the management of the health system, particularly in resource development while improving healthcare quality and outcomes. Data analysis in healthcare is influenced by new ideas and innovative techniques originating from AI and Big Data analytics [42].
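As a rough sketch of the kind of step-by-step pipeline Fig. 8.6 depicts (the figure's exact steps are not reproduced here; the stage names, field names, and toy values below are all assumptions for illustration), the snippet chains hypothetical collect, clean, and analyse stages over a handful of records:

```python
def collect(sources):
    """Gather raw records from several (hypothetical) sources into one pool."""
    return [rec for src in sources for rec in src]

def clean(records):
    """Drop incomplete records before analysis."""
    return [r for r in records if r.get("glucose") is not None]

def analyse(records):
    """Descriptive step: summary statistics over the cleaned data."""
    values = [r["glucose"] for r in records]
    return {"n": len(values), "mean": sum(values) / len(values)}

# Toy data standing in for EHR extracts and wearable feeds
ehr = [{"glucose": 95}, {"glucose": None}]
wearables = [{"glucose": 140}, {"glucose": 110}]
summary = analyse(clean(collect([ehr, wearables])))
```

In a real deployment each stage would be far richer (feature engineering, modelling, interpretation), but the composition pattern, each stage consuming the previous stage's output, is the essence of the data-science procedure the section describes.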

AI Will Change the Future
By 2030, AI will have access to a variety of data sources, which will enable it to better understand disease patterns and advance care and treatment. Healthcare systems will be able to predict a person's susceptibility to specific diseases and offer preventive actions. AI will contribute to improved hospital and healthcare-system efficiency and a decrease in patient wait times. AI develops practices that adapt to the expert and the patient by learning from every patient, every diagnosis, and every procedure. As a result, the system achieves financial sustainability while also enhancing health outcomes, lowering burnout, and addressing clinician shortages. The connected-care technology that powers this community-wide networked system joins people, hardware, software, locations, and services to build real networks of care that augment long-term health and welfare [43–45].


Fig. 8.6 Healthcare data analytics step-by-step procedure

• Return to reality
As of 2022, we still have a long way to go before attaining this goal. In the clinical settings where they are deployed to aid in illness diagnosis, treatment, monitoring, and eventually prevention and cure, extremely complicated technology, IT, and data systems can reduce staff productivity and jeopardize continuity of care. All three of these concepts, however, appear to have a good chance of becoming reality in the future. Intelligent machines are already able to supplement human abilities and carry out expert tasks. Examples include AI that can analyze and quantify doctors' notes, detect malignant spots on an image, and improve patient flow in emergency rooms. Predictive analytics powered by AI are already being used inside hospitals to help save lives in intensive care units, and they are assisting in identifying specific at-risk groups outside hospitals so that preventive primary or community care can minimize the necessity for hospital admissions. As healthcare continues to go global, international standards that protect how personal data is handled by AI will become a primary responsibility. Perhaps most importantly, we need to remember that the most effective application of AI is to supplement, not replace, human abilities.
• Leading Big Data and AI Healthcare Organizations
Innovation in healthcare is largely driven by artificial intelligence (AI). In the past 10 years, it has significantly impacted the healthcare industry, directing investment into many of its segments. AI-based healthcare includes dose-error reduction,


robotic surgery, and virtual nursing assistants. By 2026, these applications are anticipated to save the healthcare sector $150 billion annually.
• Remedy Health
Through phone screening interviews, the non-physician employees at Remedy Health can identify latent chronic conditions thanks to an AI-assisted platform. Early diagnosis makes it possible to find the ideal point of intervention to improve health outcomes and cut costs. Finding undiagnosed patients will also significantly raise a health institution's RAF scores and profitability.
• Subtle Medical
A set of deep learning software solutions from Subtle Medical improves images during the acquisition stage of the radiology workflow, refining workflow efficiency and the patient experience. The FDA-cleared and CE-marked products SubtleMR and SubtlePET use deep learning techniques to interact effortlessly with any scanner and PACS system without altering the workflow of imaging specialists, bringing the most recent imaging-enhancement technology to the clinic.
• BioSymetrics
The raw data formats used in biomedical applications are incompatible with traditional ML techniques, and there are few guidelines for data standardization, normalization, and harmonization. BioSymetrics overcomes this problem with Augusta, its flagship product, a pre-processing and analytics stage that can analyze vast quantities of data for predictive analytics. This facilitates the extraction of meaningful insights from many types of biomedical data (EEG, MRI, and others) as well as the exabytes of data created by the 25B IoT devices. The specialized and flexible instrument can serve researchers, physicians, hospitals, and biopharmaceutical companies alike.
• Sensely
Sensely is an empathy-driven, avatar-based platform that connects insurance plan subscribers with information and services through simple user interfaces.
By utilizing the scalable platform technology architecture of Sensely, an insurance company may interact with its members in a whole new way. This architecture blends human empathy with the efficiency and scalability of technology. • InformAI The start-up for artificial intelligence (AI) InformAI specializes on developing healthcare technologies that speed up point-of-care medical diagnostics and increase radiologist productivity. The largest medical center complex in the world collaborated with InformAI, along with general physician clusters and a top medical imaging firm, to develop AI-enabled image classifiers and patient result forecasters. The way healthcare is provided is being transformed by InformAI and its partners.

8 Impression of Big Data Analytics and Artificial Intelligence …

167

• SaliencyAI: For pharma businesses, SaliencyAI offers a range of solutions that quicken each stage of the data science pipeline:
– Data labeling: a single, comprehensible interface that all of your data partners can use to label and submit data, ensuring you receive all of your data uniformly.
– Data unification: combine many existing datasets to easily conduct analysis; many heterogeneous data sources can be retrieved with a single line of code to build a single, homogeneous dataset.
– Deploying machine learning models: create and train models quickly with just a few lines of high-level code, select the best model by automatically comparing a large selection of state-of-the-art options, and utilize models that have already learned from millions of data points.

• DeepMind/Google Health: DeepMind's health division has been combined with Google Health to create solutions that help medical teams and improve patient outcomes. Google Health is utilizing artificial intelligence (AI) to better diagnose cancer, forecast patient outcomes, prevent blindness, and more, and Google has demonstrated that these are not just empty words: Google Health recently developed an AI-based breast cancer detection system in collaboration with the DeepMind division.

• IBM Watson Health: To extend AI's helping hand to stakeholders in the healthcare industry, from payers to providers, IBM established its specialized health unit, Watson Health. Watson Health has helped a number of illustrious organizations apply cognitive computing, such as the Mayo Clinic with its breast cancer clinical trial and Biorasi, which got medications to market more swiftly while lowering costs by over 50%.

• Oncora Health: The Philadelphia start-up intends to advance cancer treatment, particularly radiation therapy, and research. One of its co-founders, David Lindsay, saw that radiation oncologists needed a unified digital database that gathered and organized e-medical records.

• CloudMedX Health: The Silicon Valley start-up's main goal is to employ predictive analytics to improve patient and financial outcomes. To improve patient outcomes, CloudMedX extracts relevant data from e-medical records using Deep Learning and Natural
Language Processing (NLP), producing medical insights for healthcare professionals. Doctors and patients can then use CloudMedx's AI Assistant to support data-driven decisions.

• Babylon Health: The UK-based Babylon Health has expanded its patient-centered remote consultation service to Rwanda and British locations, with aspirations to do the same in China, the USA, and the Middle East. The dynamic AI in the smartphone app will initially ask users about their concerns before pairing them with a suitable doctor, 24/7, through video.

• Corti: The AI-powered Corti serves as the "co-pilot" for staff members in emergency treatment. By listening to patient consultations and background noises, evaluating the caller's speech, and using historical data and artificial neural networks to draw conclusions, the AI can comprehend the framework and patterns in crucial conversations. It can aid emergency medical staff by notifying them if, for example, it detects a heart attack in progress, enabling actions that could save a person's life.

Conclusion

The study discussed in this chapter provides a clear overview of prevailing research in the field of Big Data and AI in healthcare. The chapter has also resulted in improved knowledge of the development and validation of the models and strategies used for data analytics in healthcare. Current research trends in the use of emerging technologies for various areas of medical care were analyzed, and the findings highlighted the healthcare specialties with the greatest research interest. Furthermore, the most broadly applied machine learning algorithms and artificial intelligence methods were also identified. Careful assessment of the primary studies led to the identification of cutting-edge research and the discovery of significant gaps in the analysis. Suggestions for future research include a need for more substantial contextual studies on the use of AI with Big Data analytics in healthcare practice. This becomes relevant when healthcare partners and professionals apply these technologies in real-world healthcare settings, further permitting the discovery of the potential of Big Data and AI for the advancement of quality healthcare systems.



Chapter 9

Artificial Intelligence Based Querying of Healthcare Data Processing

UmaPavan Kumar Kethavarapu, Praveen Kumar Mannepalli, Bhavanam Lakshma Reddy, Pusapati Siva Prasad, Ashish Mishra, and Sateesh Nagavarapu

U. Kumar Kethavarapu (B), Senior Technical Trainer on Data Services, Indium Software, Bangalore, India. e-mail: [email protected]
P. Kumar Mannepalli, CSE (Cyber Security), G.H. Raisoni Institute of Engineering and Technology, Nagpur, India
B. Lakshma Reddy, SJES College of Mgmt. Studies, Bangalore, India
P. Siva Prasad, VFSTR Deemed to Be University, Guntur, A.P., India
A. Mishra, CSE, Gyan Ganga Institute of Technology and Sciences, Jabalpur, M.P., India. e-mail: [email protected]
S. Nagavarapu, CSE, MRIT, Hyderabad, India
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023. A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_9

Introduction to Healthcare Case Study

The healthcare industry has been providing various innovative services to its customers since the COVID-19 pandemic. The earlier approach consisted of online and app-based mechanisms to support patients, but nowadays the approach is fine-tuned and more customer-centric [1, 2]. Consider a case study of an app for diabetic patients operated by an approved medical agency. The agency maintains a community of people, such as an initial medical practitioner, who explain the process and guide members to register for plans through which the patient can receive the agency's services for a stipulated period. Once the initial call is done and the patient decides to register for the services, the agency allocates resources to that patient: a nutritionist, a health coach, a fitness trainer, and doctor video consultations are assigned. The patient needs to perform the regular activities suggested by the
nutritionist: at the start of the day he/she must drink water, take the suggested food, check weight on every alternate day, and check sugar levels before and after meals, sometimes on an empty stomach and at random times as well. As for the fitness training, based on the age and other health parameters of the patient, the trainer suggests workouts that improve the patient's activity, which in turn helps improve health by decreasing sugar levels. Once the registered member is familiar with the flow of observing his/her sugar levels, weight, and workouts, at a later point in time he/she will be assigned an authenticated doctor through video mode. Prior to the call with the patient, the assigned doctor has all the health records, such as complete body check-up details, sugar levels, and other organ functionality, along with any prescription the patient is currently following. Based on the parameters, the status of the body tests, and the existing prescriptions, the doctor may suggest continuing the same medicine or some other prescription. The above activities specify the process flow and the various stakeholders involved in the entire lifecycle; meanwhile, the patient needs to upload data such as meal information, exercise information, sugar levels, and weight information so that progress in reducing the sugar levels can be monitored. Based on these observations, the stakeholders can guide the patient in a better way.
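The upload-and-monitor loop described above can be sketched as a small data model. All names here (PatientLog, is_improving, the comparison rule) are illustrative sketches, not taken from the agency's actual app:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class PatientLog:
    """Daily self-reported data a registered diabetic patient uploads."""
    patient_id: str
    fasting_sugar: list = field(default_factory=list)   # mg/dL, one reading per day
    weight_kg: list = field(default_factory=list)       # checked on alternate days

    def add_day(self, sugar, weight=None):
        self.fasting_sugar.append(sugar)
        if weight is not None:
            self.weight_kg.append(weight)

    def is_improving(self, window=3):
        """Compare the mean of the last `window` sugar readings with the first `window`."""
        if len(self.fasting_sugar) < 2 * window:
            return None  # not enough data for stakeholders to judge progress
        return mean(self.fasting_sugar[-window:]) < mean(self.fasting_sugar[:window])

log = PatientLog("P001")
for sugar, weight in [(160, 82), (155, None), (150, 81), (144, None), (138, 80), (132, None)]:
    log.add_day(sugar, weight)
print(log.is_improving())  # True: the fasting-sugar trend is falling
```

A real deployment would persist these uploads to distributed storage (as the next section discusses) rather than keep them in memory, but the stakeholder-facing signal is the same.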

Process Flow of Case Study

To achieve all the above-mentioned activities in a systematic and efficient way, the data needs to be stored in a distributed manner and made available to all the stakeholders from time to time. The patient should also periodically upload the mentioned data so that the agency can suggest a better approach to keep the patient healthy (Fig. 9.1).

Hadoop and Spark Architecture for Huge Data Storage and Effective Analytics

Industry 4.0 demands huge data storage and heavy data processing, through which we can draw insights from the data that help improve strategies for better outcomes. We will start with huge data storage: structured data (relational model), semi-structured data (XML, JSON), and unstructured data (log files, audio and video files, etc.). Over the period from 2010 to 2022, the usage of unstructured data recorded exponential growth; the reason is simple: individual users, mid-range


Fig. 9.1 High level process flow

companies, MNCs, e-commerce, and social media all generate huge amounts of data [3]. The Hadoop ecosystem provides a common platform for users to perform various activities; some of its provisions are as follows [4]:

• Storing huge amounts of data in a distributed file system (HDFS).
• Processing the data in parallel mode (MR).
• Support for job submission with Java/Python-based source files.
• Running the jobs on an MR/YARN-based cluster.
• Import or export of data between the local file system and HDFS.
• Import or export of data between the relational model and HDFS (Sqoop).
• Import of data from unstructured sources to HDFS (Flume).
• Access to data from streaming sources into HDFS (Kafka).
• Performing ETL operations (Hive).
• Simple analytics (Pig).
• Support for NoSQL aspects (HBase).
• Monitoring the jobs (ZooKeeper).
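The "processing in parallel mode (MR)" provision above follows the map, shuffle, reduce pattern. A minimal pure-Python sketch of that pattern using the canonical word-count example (this illustrates the model only; it is not Hadoop API code):

```python
from collections import defaultdict

def map_phase(record):
    # Emit (key, 1) pairs, as a Hadoop mapper would for each input line.
    return [(word.lower(), 1) for word in record.split()]

def shuffle(pairs):
    # Group values by key; Hadoop performs this step between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Aggregate all values for one key, as a Hadoop reducer would.
    return key, sum(values)

lines = ["sugar level high", "sugar level normal"]
pairs = [kv for line in lines for kv in map_phase(line)]
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(result)  # {'sugar': 2, 'level': 2, 'high': 1, 'normal': 1}
```

In a real cluster the map and reduce calls run on many nodes against HDFS blocks; the logic per record is exactly this simple, which is what makes the model scale.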


Apart from the above-mentioned provisions, Hadoop also provides support for machine learning libraries to perform analytics and come up with the best strategies to achieve better outcomes. The Hadoop Distributed File System is based on block storage, where files get replicated to provide reliable data to the users; the block size may vary from 64 to 256 MB based on the requirements of the user. Distributed storage is a mechanism where more machines are used, so the time required to read and process the data is greatly reduced. The processing layer of Hadoop is versatile: parallel jobs run on top of the distributed storage, moving the algorithm to the data and executing the programs in a reliable way, and the Hadoop ecosystem additionally offers speculative execution. Hadoop runs based on a set of services: the NameNode is the master node that stores the metadata; the DataNode is the slave node that actually stores the data; the Secondary NameNode keeps a backup copy of the NameNode; the JobTracker acts as the resource manager, a master node that takes care of job scheduling, resource allocation, and job monitoring; and the TaskTracker (NodeManager) is a slave node that runs the code and executes the jobs (Figs. 9.2 and 9.3). There are some issues with using the Hadoop ecosystem:

• Hadoop is batch processing.
• Hadoop writes the data back to disk every time.
• MapReduce development is complex.
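The block-and-replication scheme has a direct storage cost: every block is stored multiple times across DataNodes. A small illustrative calculation (the 128 MB block size and replication factor of 3 are assumed default-style values, tunable per cluster):

```python
import math

def hdfs_footprint(file_mb, block_mb=128, replication=3):
    """Return (number of HDFS blocks, total raw storage in MB) for one file."""
    blocks = math.ceil(file_mb / block_mb)      # last block may be partially filled
    raw_mb = file_mb * replication              # every byte is stored `replication` times
    return blocks, raw_mb

blocks, raw_mb = hdfs_footprint(1000)           # a 1 GB patient-records file
print(blocks, raw_mb)                           # 8 3000
```

So a 1 GB file occupies 8 blocks and about 3 GB of raw cluster storage: the price paid for the reliability that replication buys.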

Fig. 9.2 Hadoop eco system


Fig. 9.3 Hadoop services and data context

• Hadoop is not suited for analytics.

The advent of Spark provides the solution for all the above-mentioned issues [4–6] (Fig. 9.4):

• Spark provides stream processing and interactive running of algorithms.
• Spark works on top of in-memory computing, which speeds up access to and processing of the data.
• Spark provides simple scripting on top of Scala, Python, R, etc., with many customizations through which the programming is simple and powerful.
• Spark provides a separate API for Data Engineering (DE) activities such as missing value imputation, feature scaling, and encoding techniques.

Fig. 9.4 Spark high level architecture


Fig. 9.5 Hadoop and Sqoop usage to store and analyze the patients' data

• Spark provides a special API for Machine Learning (ML) algorithms such as classification, regression, recommender systems, and clustering, along with evaluation metrics.
• Spark provides a faster and more effective framework for running DE and ML algorithms on top of Mesos/YARN.
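The in-memory advantage listed above matters most for iterative algorithms, where Hadoop would write every intermediate result back to disk while Spark keeps it cached. A toy illustration of the idea in plain Python (simple memoization, standing in for Spark's actual `RDD.cache()` machinery):

```python
computed = 0

def expensive_transform(data):
    # Stand-in for a costly stage (a disk read plus a MapReduce pass, in Hadoop terms).
    global computed
    computed += 1
    return [x * 2 for x in data]

cache = {}

def cached_transform(data):
    key = tuple(data)
    if key not in cache:            # like caching an RDD: compute once, reuse from memory
        cache[key] = expensive_transform(data)
    return cache[key]

readings = [110, 140, 95]
for _ in range(5):                  # five passes of an iterative algorithm
    cached_transform(readings)
print(computed)  # 1: computed once, served from memory four times
```

Without the cache, `computed` would be 5, which is roughly the disk-bound behavior that makes plain MapReduce slow for iterative ML workloads.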

Hadoop and Spark Integration

Hadoop and Spark can be integrated into a common architecture to enjoy the benefits of huge data storage and faster data processing. Coming back to our case study of healthcare data management, we have seen data collected from various sources: blood test data, nutrition recommendations, doctor suggestions, and fitness coach advice. This data does not relate to a single aspect or to a single user. Various dimensions of data will be collected for each patient, and in the same way for lakhs of patients, so the data must be stored in a distributed way; Hadoop is the suitable framework for this, since it can consume and store structured, semi-structured, and unstructured data. Once the data is stored, we must process it to get meaningful insights and perform analytics to take strategic decisions for better outcomes [7]. Spark provides a simple in-memory programming model to perform the data processing and analytics, through which the analyst can draw facts and the stakeholders can accordingly take decisions (Fig. 9.5).
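The kind of insight-drawing described above, collapsing many per-patient observations into a signal a stakeholder can act on, can be sketched in plain Python. In production the same aggregation would be a Spark groupBy over HDFS-resident records; the names and the 140 mg/dL cutoff below are illustrative only:

```python
from collections import defaultdict
from statistics import mean

# (patient_id, fasting sugar in mg/dL) observations gathered from many uploads
readings = [
    ("P001", 150), ("P001", 158), ("P002", 104),
    ("P002", 99),  ("P003", 181), ("P003", 175),
]

# Group the observations by patient (the "shuffle" a distributed groupBy performs)
by_patient = defaultdict(list)
for pid, sugar in readings:
    by_patient[pid].append(sugar)

# Flag patients whose average fasting sugar exceeds the illustrative cutoff
flagged = sorted(pid for pid, vals in by_patient.items() if mean(vals) > 140)
print(flagged)  # ['P001', 'P003']
```

The flagged list is exactly the sort of output the nutritionist, coach, and doctor in the case study would use to prioritize follow-ups.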

Spark Architecture

Spark plays a vital role in the field of data engineering and data analytics, supporting various programming languages such as Scala, Python, Java, and R to
implement and perform various tasks: data loading, pre-processing, applying algorithms, and measuring the performance of algorithms with the help of metrics are basic strengths of Spark. Data can be loaded from formats such as .JSON, .PARQUET, .AVRO, and .CSV in a cloud-based environment, from the Hadoop Distributed File System, or from the Databricks platform, as per the requirements and demands of the client. Data pre-processing is a key aspect of every data engineering, data science, and machine learning project; to name a few pre-processing techniques, missing value imputation and feature engineering such as normalization and standardization transform the data onto a common scale. The problem is that if we have features like employee ID, age, income, experience, and number of skills for the employees in a company, and we apply ML algorithms directly, there is a possibility that the algorithm gives priority to the features with high values, which leads to abnormal outcomes in the learning process [8, 9]. To handle this issue effectively, the best strategy is to use feature engineering techniques that scale all the features to a common value range. The other commonly used pre-processing technique is encoding of the data: in certain cases there may be categorical data such as gender, designation (DA, DE, DS roles), and qualifications (B.Tech, M.Tech, MS, and PhD); rather than processing such data as strings, it is better to assign numbers so that the algorithm can easily learn the categories and accordingly estimate the predictions [10]. There are various encoding techniques supported by Spark, such as one-hot encoding and dummy encoding, to effectively transform the categorical data into equivalent numeric values.
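Both pre-processing steps discussed above come down to a few lines of arithmetic. In Spark one would use the scaler and encoder transformers from its ML feature API, but the underlying transformations are sketched here in plain Python:

```python
def min_max_scale(values):
    # Rescale a numeric feature to [0, 1] so no feature dominates by raw magnitude.
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def one_hot(values):
    # Turn a categorical feature into 0/1 indicator vectors, one slot per category.
    categories = sorted(set(values))
    return [[1 if v == c else 0 for c in categories] for v in values]

incomes = [30_000, 60_000, 90_000]
print(min_max_scale(incomes))   # [0.0, 0.5, 1.0]

roles = ["DA", "DS", "DA"]      # designation categories from the example above
print(one_hot(roles))           # [[1, 0], [0, 1], [1, 0]]
```

After scaling, income no longer dwarfs a feature like "number of skills", and after encoding, the model sees the designations as positions in a vector rather than as strings.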
The metrics or measures in Spark MLlib help to evaluate the performance of a model in the context of classification and regression; we mention these two categories because most practitioners implement algorithms and models from them [11]. In the case of classification, the main expectation is an outcome such as yes/no or true/false; algorithms such as K-Nearest Neighbors, Naïve Bayes, Support Vector Machines, Logistic Regression, Decision Trees, and Random Forests are common and frequently used, each with unique features suited to the requirements of the datasets and problem statements. Each algorithm has a certain statistical background, which really helps developers understand and select the suitable algorithm according to the requirements of clients; the details of these concepts are beyond the scope of this chapter. The commonly used evaluation metrics in the implementation of classification models are the confusion matrix, the AUC-ROC curve, accuracy, F1-score, precision, and recall; apart from these, data scientists can make use of other measures like correlation, mean, mode, and median to get a better understanding of the datasets and the model's performance. In the case of regression, the algorithms most often selected by practitioners are linear regression, multiple regression, and polynomial regression. In some cases, it is enough to estimate the relationship between two variables.
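The classification metrics named above all derive from the four confusion-matrix counts. A self-contained sketch for the binary case, with toy labels invented for illustration (Spark MLlib exposes the same metrics through its evaluator classes):

```python
def classification_metrics(y_true, y_pred):
    # Confusion-matrix counts for the binary case (1 = positive class).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy  = (tp + tn) / len(y_true)
    precision = tp / (tp + fp)           # of the flagged cases, how many were real
    recall    = tp / (tp + fn)           # of the real cases, how many were flagged
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Toy labels: 1 = diabetic, 0 = not diabetic
y_true = [1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 0, 1, 1]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
print(round(acc, 3), round(prec, 3), round(rec, 3))  # each is about 0.667 here
```

In a clinical setting the precision/recall trade-off is the decision that matters: a screening model for diabetics usually prefers high recall (miss few true cases) even at the cost of some precision.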

180

U. Kumar Kethavarapu et al.

In the ML context there are an independent variable (X) and a dependent variable (y); to model their relationship one can go with a linear regression model, for example, the number of hours of preparation of a candidate versus the JEE Mains/Advanced rank expected for the candidate. Regression yields results in terms of numbers or values, such as house prices in a particular area. Sometimes we must estimate the relationship of multiple independent variables with a dependent variable; in that case we must opt for multiple linear regression. The interesting factor is that both implementations are backed by the linear regression model; the only distinction is single-variable versus multiple-variable estimation. There may be a special case, such as finding out the salary demanded by a candidate who has cleared all rounds of an interview. The scenario is that the candidate claims far more salary than the company offers for candidates with the same skill and experience range, but the company wants to make an offer because his skillsets were excellent according to the feedback; the company then needs a strategy that weighs its current offerings and whether he has an equivalent offer backing his claim. To embed all these cases and come up with the eligible salary for the candidate, the best possible algorithm is polynomial regression. In the case of regression we have Mean Absolute Error (MAE), Mean Squared Error (MSE) or Root Mean Squared Error (RMSE), R2, and adjusted R2; according to the requirements, data scientists can select among these evaluation metrics. To understand the above-mentioned aspects clearly, it is necessary to study the Spark architecture and the components it consists of. We can understand how a job runs in Spark by observing the run-time components of the architecture; as per the requirements, the job may be submitted in client mode or in cluster mode (Fig. 9.6).

Fig. 9.6 Spark components in cluster deploy mode (Source: Databricks)
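The single-variable case above (hours of preparation X versus outcome y) reduces to ordinary least squares, and the listed error metrics follow directly from the residuals. A standalone sketch with made-up numbers (the hours/scores data is invented for illustration):

```python
from math import sqrt

def fit_line(xs, ys):
    # Ordinary least squares for y = a + b*x (simple linear regression).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b               # intercept a, slope b

def regression_metrics(ys, preds):
    n = len(ys)
    mae = sum(abs(y - p) for y, p in zip(ys, preds)) / n
    mse = sum((y - p) ** 2 for y, p in zip(ys, preds)) / n
    my = sum(ys) / n
    r2 = 1 - sum((y - p) ** 2 for y, p in zip(ys, preds)) / sum((y - my) ** 2 for y in ys)
    return mae, sqrt(mse), r2           # MAE, RMSE, R^2

hours = [1, 2, 3, 4]                    # hypothetical hours of preparation
score = [52, 55, 61, 64]                # hypothetical exam scores
a, b = fit_line(hours, score)
preds = [a + b * x for x in hours]
mae, rmse, r2 = regression_metrics(score, preds)
print(round(b, 2), round(r2, 3))        # slope 4.2, R^2 0.98: a strong linear fit
```

Multiple and polynomial regression generalize the same least-squares idea to more columns (or powers of x), and the same MAE/RMSE/R² metrics apply unchanged.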


In cluster mode, the cluster manager is located on a node other than the client machine; from there it starts and ends executor processes on the cluster nodes as required by the Spark application running on the Spark driver, so the cluster manager allocates resources to Spark applications and maintains the executor processes. The Spark run-time components consist of two parts: the client and the cluster. The client process starts the driver program; the submission can be from the command line, such as spark-submit or spark-shell. The client process is responsible for passing all the configuration aspects along with the application arguments. The driver component plans, coordinates, and monitors the execution of a Spark application; note that there will be only one driver per Spark application, and the driver can be seen as a wrapper around the application. The driver has subcomponents such as the Spark context and the scheduler, which are responsible for requesting memory and CPU resources from the cluster manager, splitting the application logic into stages and tasks, sending tasks to executors, and collecting back the results. Early releases of Spark used the Spark context as the entry point for the driver program; in later releases the designers of Spark wanted to bring the Hive context and SQL context together with the Spark context, so the entry point became the Spark session (Figs. 9.7 and 9.8). The main observation here is that in client deploy mode the driver runs inside the client's JVM process and communicates with the executors managed by the cluster, whereas in cluster mode the driver runs as a separate JVM process inside the cluster, and the cluster manages its resources (mostly JVM heap memory).
The executors, which are JVM processes, accept tasks from the driver, execute those tasks, and return the results to the driver; each executor has several task slots for running tasks in parallel. To understand more, let us see the various Spark cluster types: Spark can run on standalone, YARN, and Mesos clusters.

Fig. 9.7 Spark session usage


Fig. 9.8 Spark components in client deploy mode (Source: Databricks)

The Spark standalone cluster is a Spark-specific cluster that does not require a Hadoop installation; its benefit is faster job startup compared with running on YARN. Yet Another Resource Negotiator (YARN) is Hadoop's resource manager and processing framework. Hadoop MapReduce 1 (MRv1) was a tightly coupled system, so the designers of Hadoop extended its functionality to support various other tools in MapReduce 2 (MRv2), which is also known as YARN. Running Spark jobs on top of YARN benefits companies that already have such a cluster in place, in terms of both capabilities and infrastructure. Another advantage is that YARN can run Java and Python applications alongside Spark jobs, and it is possible to run secured jobs on a Kerberos-secured Hadoop system while distributing applications across the cluster. The Mesos cluster is native to Spark: Spark was designed and developed in Scala, and its original cluster mode was Mesos, a scalable and fault-tolerant distributed system written in C++. Mesos supports scheduling of resources such as CPU, disk space, and ports along with memory; in particular, Mesos provides a fine-grained job scheduling model that the other cluster modes do not have. To run a Spark standalone cluster on a single machine there are two options, Spark local mode and Spark local cluster mode. However, these modes are not preferred in production environments: workload distribution is not possible, there are resource restrictions, and performance is suboptimal. High availability is also not truly supported by the standalone modes of running Spark jobs. Throughout the architecture we can observe that tasks are the smallest element in the execution hierarchy (Figs. 9.9 and 9.10).
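For illustration, typical spark-submit invocations for the cluster types above might look as follows. The hostnames, ports, resource sizes, and the application file `app.py` are placeholders, not values from this chapter:

```shell
# Standalone cluster, client deploy mode (driver runs in the client's JVM):
spark-submit --master spark://master-host:7077 --deploy-mode client app.py

# YARN, cluster deploy mode (driver runs as a JVM process inside the cluster):
spark-submit --master yarn --deploy-mode cluster \
  --num-executors 4 --executor-memory 2g app.py

# Mesos cluster:
spark-submit --master mesos://mesos-master:5050 app.py
```

The `--deploy-mode` flag is what selects between the client and cluster driver placements discussed above.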

9 Artificial Intelligence Based Querying of Healthcare Data Processing


Fig. 9.9 Spark with multi-node (YARN) cluster

Fig. 9.10 Spark UI history server

Conclusion In this chapter we have taken the case study of healthcare data management for patients: the data is populated by various stakeholders, and it must be stored in a distributed way. After storing the data, we must opt for parallel processing in order to process it as fast as possible.


To achieve these two provisions, we make use of the Hadoop framework, which provides block-based, distributed storage. To load the data from various sources we can use tools such as HDFS, Sqoop, and Flume, and for initial processing of the data we can use Hive and Pig Latin. After getting the data into HDFS, we can use Spark to access it from the Hadoop framework. Spark can process the data in parallel on top of YARN, which is a cluster resource manager. Spark additionally provides APIs for data curation and algorithms to analyze the data with suitable measures or metrics. So, with the combined use of Hadoop and Spark for the storage and processing of data in the healthcare management use case, the stakeholders can suggest better and more suitable precautions to patients so that they lead healthy lives. The chapter described the Spark architecture in its various modes of operation, such as standalone and cluster mode (YARN and Mesos), along with runtime components such as the driver and Spark context, which depend on the cluster manager to run jobs; a job is split into stages, which in turn are split into tasks, the lowest-level components of job execution in the Spark architecture.

References
1. Dhar, V.: Big Data and Predictive Analytics in Health Care. KDD (2022)
2. Kuo, T.-T.: ModelChain: Decentralized Privacy-Preserving Healthcare Predictive Modeling Framework on Private Blockchain Networks. Office of the National Coordinator for Health Information Technology (Feb 2019)
3. https://databricks.com/glossary/hadoop-ecosystem
4. https://www.sciencedirect.com/topics/computer-science/hadoop-ecosystem
5. https://spark.apache.org/docs/latest/cluster-overview.html
6. Shah, A., Gor, M., et al.: A stock market trading framework based on deep learning architectures. Multimed. Tools Appl. 81, 14153–14171. Springer (2022)
7. Prerana, C., et al.: Stock market prediction using ML and DL techniques. Int. Res. J. Eng. Technol. 7(4) (2020)
8. Uma Pavan Kumar, K., et al.: Various computing models in Hadoop eco system along with the perspective of analytics using R and machine learning. Int. J. Comput. Sci. Inf. Secur. 14, 17–23 (2020)
9. Uma Pavan Kumar, K., et al.: Usage of HIVE tool in Hadoop eco system with loading data and user defined functions. Int. J. Psychosoc. Rehabil. 24(4) (2020)
10. Uma Pavan Kumar, K.: Performance analysis of naïve Bayes correlation models in machine learning. 25(4), 1153–1157 (2020)
11. Uma Pavan Kumar, K., et al.: Sqoop usage in Hadoop distributed file system and observations to handle common errors. Int. J. Recent Technol. Eng. (IJRTE) 9(4) (2020)

Chapter 10

A Disaster Management System Powered by AI and Built for Industry 4.0 Raj Kumar Singh, Ishan Srivastava, and Vandana Dubey

Introduction What's Disaster? Disasters are significant setbacks to a community's ability to function that go beyond its capacity to cope using its own resources. Disasters may be caused by a combination of natural, man-made, and technological hazards, as well as a wide range of other factors that affect how exposed and vulnerable a population is. The United Nations International Strategy for Disaster Reduction (UN/ISDR, 2002) identifies natural and technological disasters as the two primary sources of hazards. Natural disasters comprise three specific groups [1]: (1) Hydro-meteorological disasters, such as landslides and avalanches, storms, droughts, and associated disasters (high temperatures and forest/scrub fires). (2) Geological disasters: volcanic eruptions, earthquakes, and tsunamis. (3) Biological catastrophes, which include diseases and insect invasions. Three groups likewise make up the technological disasters: industrial accidents, such as chemical spills, commercial infrastructure failures, explosions, fires, gas leaks, poisoning, and radiation; transport-related mishaps, whether they occur by air, rail, road, or sea; and miscellaneous accidents, such as explosions, fires, and domestic/non-industrial building collapses. R. K. Singh · I. Srivastava · V. Dubey (B) IPS Academy, Institute of Engineering & Science, Indore, India e-mail: [email protected] R. K. Singh e-mail: [email protected] I. Srivastava e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023 A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_10


Natural disasters, whether caused by geological forces such as earthquakes and volcanoes or by meteorological forces such as hurricanes, floods, tornadoes, and droughts, are known for their harmful effects on human lives, the economy, and the environment. Developing countries are more vulnerable to the destruction that such disasters bring because of their unstable landforms, tropical climates, and high population densities, as well as their financial situation, illiteracy, and lack of infrastructural development. According to the UN/ISDR (2002), between 1994 and 2003 natural and technological disasters in developing and least-developed nations had the highest level of effect in terms of the number of people killed and economic losses. It is generally acknowledged that there is no way to fully eliminate the harmful effects of disasters. However, steps may be taken to reduce their effects. In this sense, successful governance depends heavily on competent DM.

Vulnerability By definition, vulnerability refers to "the degree to which a community, structure, service, and/or geographic area is likely to be harmed or disrupted by the effect of a particular hazard, due to its nature, construction, and proximity to hazardous terrain or a disaster-prone location."


Risk Risk refers to the expected losses resulting from a hazard event of a defined magnitude occurring in a particular location over a certain period. It is a measurement of the anticipated losses that would result. Risk is based on the likelihood that specific events will occur and the damages each would bring about. The following elements affect risk: (I) the nature of the threat; (II) the vulnerability of the elements at risk; (III) the economic significance of those elements.
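The three elements above combine naturally into an expected-loss calculation. The following sketch is illustrative only (the function name, scenarios, and numbers are made up, not from the chapter): risk is modeled as the sum over hazard scenarios of event probability times exposed economic value times vulnerability.

```python
# Hypothetical expected-loss model: each scenario is
# (annual probability of event, economic value exposed, vulnerability fraction).
def expected_loss(scenarios):
    return sum(p * value * vulnerability for p, value, vulnerability in scenarios)

scenarios = [
    (0.10, 1_000_000, 0.05),  # frequent moderate flood: 10%/yr, 5% of value lost
    (0.01, 1_000_000, 0.60),  # rare severe earthquake: 1%/yr, 60% of value lost
]
print(round(expected_loss(scenarios), 2))  # 11000.0
```

Note that the rare, severe event contributes more expected loss than the frequent, mild one here, which is why mitigation planning cannot rely on event frequency alone.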

Hazards Hazards are defined as "phenomena that represent a threat to persons, structures, or economic assets and which may produce a disaster." They might be created by humans or arise naturally in our surroundings. The effect, intensity, and characteristics of the event, and how it affects people, the environment, and infrastructure, are the factors that determine the level of damage in a disaster (Table 10.1).

What is DM? [3] A disaster is a broad term that encompasses a wide range of distressing individual and societal circumstances. These include heat and cold; rats and locusts; fires and drowning; earthquakes and tornadoes; epidemics and famine. Disasters can be classified as either exogenous or endogenous. The first describes a process in which one section of the community goes through biological, financial, and psychosocial pain while another achieves success financially and socially. The latter describes an occurrence that puts a community or society at grave risk, causes harm and devastation, or disrupts the social order and primary functions of society. Five general phases make up disaster management, namely: (1) prediction, (2) warning, (3) immediate assistance, (4) treatment, and (5) reconstruction.

Table 10.1 Types of disaster [2]

1. Geophysical — Earthquake/mass movement of earth materials; Volcano; Tsunami.
Brief synopsis/secondary disasters:
• Earthquake: surface displacement of earthen materials as a result of earthquake-induced ground trembling; massive earth material flow, typically down slopes; landslides that occur after an earthquake; earthquake-caused urban fires; liquefaction, which is the change of soil that has been partially saturated with water from a solid state to a liquid state.
• Volcano: a type of geological occurrence involving eruptions of lava, ash, hot vapor, gas, and pyroclastic debris close to an opening or vent in the Earth's surface; ash fall; lahar, a hot or cold mixture of earthy material streaming down a volcano's slope during or between eruptions; lava flow; pyroclastic flow, in which extremely hot gases, ash, and other materials travelling at speeds of more than 700 km/h flow down the face of a volcano during an eruption.
• Tsunami: a sequence of long-wavelength waves that travel over the deep ocean, caused by the movement of large volumes of water due to underwater earthquakes, volcanic eruptions, or landslides; tsunami waves move over the ocean at incredibly high speeds, but as they approach shallow water they slow down and steepen.

2. Hydrological — Flood; Landslides; Wave action.
Brief synopsis/secondary disasters:
• Avalanche: a significant accumulation of loosely packed soil, snow, or ice that swiftly slides, flows, or descends a mountainside due to gravity.
• Coastal erosion: the temporary or permanent loss of sediments or landmass along the coast caused by the action of waves, winds, tides, or human activity.
• Debris flow, mud flow, rock fall: types of landslides that happen when heavy rain or rapid snow/ice melt sends copious amounts of vegetation, mud, or rock downslope under gravity.
• Coastal flood: higher-than-normal water levels along the coast caused by tidal changes or storms, resulting in flooding that can last from days to weeks.
• Flash flood: excessive or heavy rain that falls in a short period of time and runs off immediately, causing flooding conditions to develop within minutes or hours.
• Flood: a general term for ponding of water at or close to where the rain fell, overflow of water from a stream channel onto normally dry land in the floodplain (riverine flooding), and higher-than-normal levels along the coast and in lakes or reservoirs.
• Wave action: surface waves produced by the wind on any open body of water, including lakes, rivers, and oceans; wave size is determined by wind speed and the distance travelled (fetch).

3. Meteorological — Risk brought on by extreme weather and atmospheric conditions that are transient, micro- to meso-scale, and can last from minutes to days:
• Wind, convective storm, tornado, storm surge, extratropical storm.
• Derecho, cold wave.
• High temperatures, fog, freezing temperatures, hail, and heat waves.
• Lightning, torrential rain, sandstorms, and dust storms.
• Blizzard, winter storm, snow, and ice.

4. Climatological — "Unusual, extreme weather conditions related to long-lived, meso- to macro-scale atmospheric processes ranging from intra-seasonal to multi-decadal (long-term) climate variability" ("TYPES OF DISASTERS—Rishi UPSC").

5. Biological — Exposure to pathogens and harmful materials:
• Insect infestations.
• Animal stampedes.
• Viral, bacterial, parasitic, fungal, or prion diseases.


Program for DM Risk in Brief The Government of India (GOI), Ministry of Home Affairs (MHA), and United Nations Development Programme (UNDP) signed an agreement in August 2002 to reduce the vulnerability of communities to natural disasters in areas identified as multi-hazard disaster prone. Goal: "Sustainable Reduction in Natural Disaster Risk" in some of the riskiest areas of a few Indian states. These are the program's four primary goals:
1. Assistance in building national capability for the Ministry of Home Affairs.
2. Environmental improvement, education, awareness campaigns, and strengthening of capability at all levels for sustainable recovery and risk management of natural disasters.
3. Multi-hazard readiness, response, and mitigation plans at the state, district, block, and village/ward levels in some program states and districts.
4. Sharing information on efficient techniques, strategies, and instruments for managing the risk of natural disasters, as well as creating and advancing policy frameworks.


DM in India After the earthquake in Bhuj, the Government of India reviewed the DM system. It was observed that developing comprehensive DM capabilities was necessary to prepare for both natural and man-made calamities. With the exception of epidemics, droughts, and other emergencies/disasters that are specifically assigned to other ministries, it was decided to transfer DM from the Ministry of Agriculture to the Ministry of Home Affairs. India has a history of being extremely vulnerable to natural disasters. In addition to the 9,885 fatalities from the super cyclone in Odisha, the Bhuj earthquake claimed 13,805 lives. The government believes that these casualties could have been drastically reduced if suitable mitigation measures had been employed. Disasters often result in the loss of thousands of crops each year, which affects social and community resources. It is obvious that development cannot be sustained without incorporating mitigation into the planning process. Keeping these issues in mind, the Indian government has changed its policy in a way that stresses mitigation, prevention, and readiness. The following pages contain a strategic roadmap that has been developed to lessen the nation's susceptibility to disasters. According to the roadmap, steps must be taken to lessen our susceptibility to calamities, and the roadmap will be evaluated every two years to see whether any course corrections are required. India's DM strategy entails the following:
1. Framework for institutions and policies.
2. A system of early warning.
3. Disaster mitigation and prevention.
4. Preparedness.

The Institutional and Legal Environment Since independence, institutional and policy structures for response, relief, and rehabilitation have been well established. These processes have proven strong and efficient in terms of response, relief, and recovery. At the federal level, the Ministry of Home Affairs serves as the focal ministry for all DM matters.
1. National Crisis Management Committee (NCMC).
2. The Crisis Management Group.
3. Command Post (Emergency Operation Room).
4. Emergency Action Plan.
5. The financial sources for State Relief Guidebooks.

Warning Systems [3]


Cyclone The responsibility for monitoring tropical cyclones (TCs) and issuing warnings lies with the India Meteorological Department (IMD). The monitoring process has changed as a result of remote sensing equipment. A storm surge and TC strength forecasting approach has been devised using techniques for interpreting satellite pictures. The meteorological satellite has been useful for cyclone analysis. In addition, INSAT data have been used to examine the structure of numerous tropical cyclones in the Bay of Bengal. IMD also publishes Cloud Motion Vectors (CMVs). The Very High Resolution Radiometer (VHRR) payload onboard INSAT-2E now offers water vapor channel data in addition to visible (VIS) and infrared (IR) data. A Charge-Coupled Device (CCD), a separate payload, has also been mounted on this satellite.

Flood There are currently 166 flood forecasting stations on various rivers across the country, split into 32 inflow forecasting stations and 134 level forecasting stations. The four primary tasks involved in flood forecasting are as follows:
1. Observation and gathering of hydrological and hydro-meteorological data.
2. Transmission of data to forecasting facilities.
3. Data analysis and forecast creation.
4. Forecast dissemination.

Early warning systems tailored to different natural calamities are also being developed.
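The analysis-and-forecast tasks in the flood forecasting workflow above can be sketched in miniature. This is a deliberately simplified illustration, not an operational method: the threshold values, station readings, and function name are made up, and real level forecasting uses hydrological models rather than a linear trend.

```python
# Hypothetical level-forecasting sketch: extrapolate the latest hourly gauge
# readings linearly and classify the predicted level against warning thresholds.
WARNING_LEVEL_M = 6.0   # illustrative warning threshold (metres)
DANGER_LEVEL_M = 7.5    # illustrative danger threshold (metres)

def forecast(levels_m, hours_ahead=6):
    """Linear trend from the last two hourly readings, in metres."""
    rise_per_hour = levels_m[-1] - levels_m[-2]
    predicted = levels_m[-1] + rise_per_hour * hours_ahead
    if predicted >= DANGER_LEVEL_M:
        return predicted, "DANGER"
    if predicted >= WARNING_LEVEL_M:
        return predicted, "WARNING"
    return predicted, "NORMAL"

level, status = forecast([5.0, 5.2, 5.5])  # river rising 0.3 m/h
print(round(level, 2), status)  # 7.3 WARNING
```

Even this toy version reflects the four-task split: observation produces `levels_m`, transmission delivers it to the forecasting facility, analysis produces `predicted`, and dissemination is the issued status.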

Disaster Mitigation and Prevention Mitigation and prevention have been incorporated into the Indian government's development strategy as crucial elements. A thorough chapter on DM is included in the Tenth Five Year Plan document. • The Indian government has released guidelines stating that, where there is a shelf of projects, initiatives dealing with mitigation will be given precedence. Steps to lessen flood damage were taken starting in 1950: the construction of embankments has protected a total of 15 million hectares of the 40 million hectares of land that are vulnerable to flooding. • The National Core Group for Earthquake Mitigation is composed of administrators and engineers with competence in earthquake engineering.


• A disaster risk management program has been implemented in 169 of the most dangerous areas across 17 states, including all 8 North Eastern states, with assistance from the UNDP, USAID, and the European Union. • This initiative has produced DM plans for almost 3,500 villages, 250 gram panchayats, 60 blocks, and 15 districts.

Preparedness To reduce susceptibility and respond to disasters quickly and professionally, mitigation and preparedness measures work together. The Central Government is currently training and equipping 96 specialized search and rescue teams, each consisting of 45 persons, including doctors, paramedics, structural engineers, etc. The Ministry of Health is establishing a fully equipped and staffed 200-bed mobile hospital in Delhi that will be attached to a leading government hospital. Emergency responders can acquire information about key factors for disaster-affected areas with the help of Geographical Information System (GIS) data sources.
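A minimal example of the kind of GIS-style query responders rely on is locating the nearest relief facility to an incident. The sketch below is illustrative only (the facility names and coordinates are invented); it uses the standard haversine great-circle distance rather than a real GIS stack.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical facility registry (name -> (lat, lon)); coordinates are made up.
facilities = {
    "Hospital A": (19.076, 72.877),
    "Shelter B": (19.218, 72.978),
}

def nearest(lat, lon):
    """Return the facility closest to the given incident location."""
    return min(facilities, key=lambda name: haversine_km(lat, lon, *facilities[name]))

print(nearest(19.08, 72.88))  # Hospital A
```

A production GIS layer would additionally carry road networks, capacity, and hazard overlays, but the proximity query above is the core primitive.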

History of Disaster and DM in India DM Plan for Surat [4] With a population of around 7,784,276 people, the city of Surat is located in the Indian state of Gujarat. Plague in Surat: the plague became a major worry on a global scale. 200 fatalities were associated with the Surat outbreak. The illness sparked widespread fear and prompted a large-scale evacuation from the city. Aside from the human tragedy, it dealt a serious blow to the economy of the whole country as well as Surat, which lost several million rupees every day. The epidemic affected a variety of fields, including export, tourism, and industrial production. The shipment of food grains out of Surat was prohibited, and international flights to India were temporarily halted. Constant rain for more than two months in Surat caused flooding and extensive waterlogging in low-lying regions, which served as a catalyst for the outbreak of plague; the inadequate drainage system was the main cause. Due to the flood and waterlogging, hundreds of cattle and other animals perished. In actuality, the problems associated with poor waste management systems were significantly exacerbated by the floods. Conclusion: the Municipal Authority, other relevant authorities, and the city's populace all learned a lesson from this disease. All the storm water and drainage systems were renovated after the epidemic. Systems for managing solid waste and maintaining cleanliness were created. The whole population learned about the problems


with cleaning. A hydrological contour map for the city was created, and a flood management system was introduced. Services for rescue and relief were organized.

Metropolitan City, India: Disaster Risk Management Profile Functional configurations. Mumbai's DM Plan states the goals of its mitigation strategy as:
• To significantly raise public knowledge of disaster risk, so that the public expects safer neighborhoods in which to live and work. This is consistent with the national approach.
• To drastically lower the risks of fatalities, injuries, monetary loss, and the destruction of cultural and natural resources resulting from disasters: inter-city connections, land use planning.
Vulnerability Issues
• The city's history includes incidents involving fire and industrial mishaps.
• Floods. The Central Railway's 10 sections are identified by the Mumbai DMP.
• Chemical, biological, and nuclear dangers (transport, handling).
• Earthquakes. Mumbai is located in Seismic Zone III according to the Bureau of Indian Standards (BIS).

Management of Earthquake The fact that 59% of India's geographical area could experience moderate to severe earthquakes demonstrates the country's significant earthquake risk and vulnerability. Between 2000 and 2010, major earthquakes killed more than 25,000 people in India and significantly damaged property and the nation's infrastructure. All these earthquakes showed that building collapse was the main source of significant mortality. They highlight the need for careful adherence to India's earthquake-resistant building codes and municipal planning byelaws. These recommendations were developed after a review of the major gaps that contributed to the risk. They stress the importance of conducting structural safety audits of existing lifelines and other key structures in earthquake-prone locations, as well as performing selective seismic strengthening and retrofitting. The following six pillars of seismic safety serve as the foundation of India's earthquake management standards, which are intended to increase their efficacy. The six pillars are as follows:


• The building of new structures that are earthquake resistant.
• Selective seismic retrofitting and strengthening of current lifeline and priority structures.
• Enforcement and regulation.
• Preparation and awareness.
• Developing capacity in education, training, research and development, and documentation.
• Emergency response.

Policy for DM in Gujarat (GSDMP) [5] According to the Gujarat State DM Policy, effective DM requires an understanding of hazards and disasters, their behavior, and the risks they pose to the community. The plan for executing the GSDMP, which emphasizes an integrated approach to disaster management, includes the pre-disaster, impact, and post-disaster phases as essential components of any DM program. For the prescribed activities stated in this policy, the GoG has designed a framework of operation for a collection of entities that are essential to disaster management. The GSDMP expects the following organizations to play significant roles in the DM framework:
• DM Authority of Gujarat.
• Commissioner of State Relief.
• Bureaus of the government.
• The District Collector-led district administration.
• Local authorities, such as Gram Panchayats, Districts, Talukas, and Municipal Corporations.
• Non-profit organizations, such as NGOs.
• Community; public and private sectors.
The foundation of the implementation framework is the idea that DM is not a distinct field or discipline but a method of problem-solving that makes it easier to handle disasters by utilizing the abilities and resources of all stakeholders. As a result, a crucial component of the policy framework is to use the resources and skills of existing entities while also developing new capacities as needed. While local governments and government employees continue to serve as the implementation agencies for most activities, the GSDMA at the state level offers the overarching direction and guidance that keeps the attention of various entities on disaster management.

The Bhopal Gas Tragedy In metropolitan centers, weak regulatory oversight and negligent industrial siting are the main causes of such disasters. One of the deadliest industrial mishaps in recent


memory was the Bhopal gas tragedy of December 2, 1984, when Union Carbide's plant leaked 43 tons of methyl isocyanate and other chemicals used in the production of pesticides. Of the 520,000 people who were exposed to the gas, 8,000 individuals died in the first week and another 8,000 perished later. Even now, the effects are still clearly visible in the survivors. Conclusion: to prevent such a tragedy, the Government of India and the relevant state governments have established strict regulations and monitoring systems for industries. Every industry is required to have a DM plan and safety protocols.

DM Plan for GIR Conservation principles: the most prominent concentration of top carnivores, including lions and leopards (over 600), and the single largest population of marsh crocodiles in the nation; the largest compact tract of dry deciduous forest in the semi-arid western parts of the country; rich biodiversity areas supporting large numbers of species, including several endangered species ("Gir National Park," Wikipedia).
• In Saurashtra, the mother of cultural and religious progress.
About 150 fire occurrences affecting small, medium, and large areas took place between 2001 and 2006, and in the same period more than 70 lions died. Center for National DM (Sasan):
• Management of information.
• Creation of plans, policies, and strategies.
• Vulnerability evaluation.
• Support and coordination in times of emergency and tragedy.
• Non-emergency circumstances.
• Carrying out audits.
• Education and public awareness.

India Tsunami [6] On December 26, 2004, a tsunami severely damaged 897 villages across five Indian states and two union territories. 4,259 people were injured, 5,555 went missing, and 10,749 perished in the tsunami. The primary areas impacted in each state were fisheries and boats, ports and jetties, roads and bridges, power and ICT, housing, water supply and sewerage, and social infrastructure.


The external agencies helped make the rescue and relief operations quick, efficient, and timely: removing debris and disposing of bodies; distributing relief supplies; and offering food, water, and medical aid. Using positive historical precedent:
• Earlier crisis management initiatives that had been effective were reviewed so that the lessons learnt could be applied.
• Encourage potential beneficiaries to take ownership of solutions to ensure sustainability.
• Encourage collaboration between the government, beneficiaries, NGOs, and women's groups in the community to achieve sustainable development.
• Show how a Project Management Unit with competent leadership and complete authority can ensure project implementation.
• Discuss the necessity of a long-term strategy for funding operations and maintenance (O&M).

Dispute and Problems The existing institutional framework functions quite well, but the issue is that there was no cohesive system in place during past crises. India's unique geo-climatic circumstances have made it prone to catastrophic disasters. Landslides, cyclones, earthquakes, floods, and droughts have all been frequent occurrences. Around 60% of the country's territory is susceptible to earthquakes of varying magnitude, about 40 million hectares to floods, about 8% of the area to cyclones, and 68% of the area to drought. The amount of individual, communal, and public assets lost has been enormous. The Bangalore Circus Tragedy in 1981, the Bhopal Gas Tragedy in 1984, the Gujarat cyclone in 1998, the Orissa super cyclone in 1999, the Gujarat earthquake in 2001, the yearly flooding in large portions of the nation during the monsoon, and the tsunami in 2004 are just a few of the recent disasters to strike India. As lessons from each disaster have been applied, the disaster response has steadily improved over time. Factors that hindered the response to disasters in the past include the lack of a national-level plan and policy; the absence of an institutional framework at the central, state, or district level; poor intersectoral coordination; the absence of an early warning system; slow response from relief agencies; the scarcity of trained and committed search and rescue teams; and poor community empowerment.


Challenges for Current DMS
• Command: lack of an integrated system for real-time situation information, which results in delays in assessment and command.
• Control: lack of effective control over the relief and rescue operation.
• Coordination: lack of coordination between on-ground teams, in-charge officers, and higher authorities.

Ideologies The vision is to form an integrated system with which, during any type of disaster, the officer in charge of the on-ground rescue and relief operation can take decisions based on real-time ground information. Such a system can enhance the current management system.

Working Working Process of the System Step-1 Updating the system (showing it on the map) with data given by on-ground rescuers and assessment team. Step-2 Help the officer to analyze the situation based on the data showing in the map.


R. K. Singh et al.

Step 3: The responsible officer can then take the necessary, efficient action based on the situation.
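The three-step workflow above can be sketched in code. This is a minimal illustration, not part of the proposed system; the class names, area identifiers, and the 1–5 severity scale are all invented for the example.

```python
# Sketch of the three steps: (1) update the map with ground reports,
# (2) summarise the situation, (3) rank areas so the officer can act.
from dataclasses import dataclass

@dataclass
class GroundReport:
    area: str        # e.g. a ward or grid cell shown on the map
    severity: int    # 1 (minor) .. 5 (critical) -- invented scale
    note: str

class SituationMap:
    def __init__(self):
        self.reports = {}

    def update(self, report):                       # Step 1
        self.reports.setdefault(report.area, []).append(report)

    def summary(self):                              # Step 2
        return {area: max(r.severity for r in rs)
                for area, rs in self.reports.items()}

    def priority_areas(self):                       # Step 3
        s = self.summary()
        return sorted(s, key=s.get, reverse=True)

m = SituationMap()
m.update(GroundReport("Ward-3", 5, "bridge collapsed"))
m.update(GroundReport("Ward-7", 2, "minor waterlogging"))
print(m.priority_areas())  # most severe area first -> ['Ward-3', 'Ward-7']
```

The ranking gives the officer a fact-based order in which to commit teams, which is the essence of the real-time command idea described above.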

Solution
Red-Colored Areas: Disasters frequently manifest through water. Climate change is increasing the frequency and severity of waterborne disease outbreaks as well as floods, landslides, tsunamis, storms, heat waves, and cold snaps. Unplanned urbanization and the deterioration of ecosystem services are two factors that amplify the effects and costs of these catastrophes. To ensure access in the face of climatically unpredictable futures, water and sanitation services must be made more resilient and less exposed to risk.

Mayor
The mayor is responsible at the administrative level for ensuring that the response effort runs smoothly. He or she convenes the representatives of the various public services who make up the DM team. In view of the mayor's administrative responsibility, the municipal council may call him or her to account for the overall management of the response effort.

Fire Service
The disaster response is anchored by the fire service. The operational management of the response effort is under the purview of the fire chief, who is in charge of everything that takes place in the disaster area. As a member of the DM team, the fire chief executes the decisions it makes and also organizes the efforts of the emergency services. The fire department's primary priority while responding to a disaster is to save lives; of course, firefighters also extinguish flames and run tests to see whether any dangerous compounds have been released.

Medical Teams
Accident and catastrophe medical teams operate in the affected area. A disaster will necessitate immediate medical attention for everyone injured. First aid is typically given by ambulance paramedics, who also stabilize the injured so they can be transported to a hospital.


Police Services The police will make sure that the ambulance and fire departments can carry out their duties. They will put up a safety zone around the catastrophe area, direct traffic, and cordon off the disaster area. If victims are hard to identify, the police will send out a disaster identification team, which consists of specialists gathered as needed. This group of experts works collaboratively to complete their tasks.

Armed Forces
The Ministry of Defence may send military personnel to a disaster area. The armed forces play a vital part in DM by leading rescue missions and organizing evacuations. Because they are frequently the first responders to any disaster, improved coordination is needed between the armed forces and other governmental and non-governmental groups.

Municipal Services
The municipality is accountable for the immediate welfare of the citizens. Municipal services will help in other practical ways, including giving food and temporary shelter; they may even provide mental health care. The local government will also register the victims and may offer assistance if they have uninsured damage. Which other services participate in the response will vary depending on the type of disaster:
• Water authorities: during a flood brought on by a dam breach or heavy rain.
• Coastguard: in the event of a calamity off the Dutch coast.
• Red Cross: to care for the injured.
• Rescue dog organizations: to search for victims buried under debris.
• Salvation Army: to offer soup and sandwiches to both victims and first responders.

Ground Status
• Work in progress.
• Resources that need to be called.
• Situation report.
Moderately affected areas do not require the involvement of all equipment or all members of the different teams. When teams in such areas need any kind of help, they update their issues in the system so that the nearest available team can guide or assist with any malfunction.
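The idea of routing a help request from a moderately affected area to the nearest available team can be sketched as follows; the team names, locations, and the one-dimensional distance model are purely illustrative assumptions.

```python
# Toy dispatcher: find the nearest team that is not busy.
teams = {                       # team -> (location along a road in km, busy?)
    "Team-A": (0.0, True),
    "Team-B": (4.0, False),
    "Medical-1": (9.0, False),
}

def nearest_free_team(request_location):
    free = [(abs(loc - request_location), name)
            for name, (loc, busy) in teams.items() if not busy]
    return min(free)[1] if free else None

# A team in a moderately affected area reports a pump malfunction at km 3:
print(nearest_free_team(3.0))  # Team-B is closest and free
```

A real system would use map coordinates and live availability updates, but the selection logic stays the same: filter out busy teams, then minimize distance.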


Advantages of Integrated Disaster Management
Using an integrated strategy for DM can be greatly beneficial. First, the proactive strategy enables disaster mitigation, readiness, and warning before disasters happen. Natural hazards are typically categorized by the timing of their onset. Some have a gradual onset, giving opportunity for preventative measures; droughts, floods, and volcanic eruptions are examples of risks with a slow beginning. Other occurrences, like flash floods, tsunamis, and cyclones, offer little to no lead time for preparation and mitigation, let alone proper warnings. An at-risk population's chances of saving lives, animals, property, and livelihoods are increased by giving it enough lead time, and a proactive approach turns this potential into an advantage. Things that change after implementing the proposed integrated system:
• Resources can be used in an integrated manner.
• The process can be commanded and controlled through the integrated system.
• Real-time ground information is updated in the system.
• The commanding officer can issue fact-based commands.
• Since the information is real time, action is fast and accurate.
• Information from the affected area is also provided to the head officers and to the Home Ministry, so this time can be used to steer the rescue operation in the right direction and make the best use of available resources, minimizing the loss of life and infrastructure during the disaster.

Essential Ingredients for Effective Disaster Management
The government is the main stakeholder in disaster management, which can be thought of as public project management, and applying the integrated approach in project management will be fraught with difficulties. Project success factors are the conditions, facts, or influences that are inputs into management systems and can directly or indirectly affect project outcomes. In handling disasters, the essential ingredients that must be carefully considered are:
(1) A successful institutional setup. Adopting the integrated strategy requires an efficient institutional setup. The absence of a competent governmental entity will result in muddled authority structures and delayed decision-making, particularly in the case of emergency assistance and rehabilitation. For DM on a national scale, the responsible unit must have full authorization and the primary accountable government department must be identified.
(2) Collaboration and coordination. Collaboration and excellent coordination are essential for managing disasters successfully. Coordination and collaboration between important stakeholders occur on five separate levels: international, national, regional, organizational, and project level. The lack of cooperation across these levels of organizations, including donors, governmental agencies, NGOs, and international NGOs, has been shown to be a prevalent issue.
(3) Supportive legislation and rules. Supportive laws and regulations improve the results of disaster management.
(4) An efficient system for managing information. A lack of vital information among important players is another frequent issue in disaster management. Planning, early warning, rehabilitation, and reconstruction all depend on this information; therefore, an efficient information management system and the sharing of crucial information among key stakeholders are required for DM to succeed.
(5) Managerial and team member competencies. The DM strategy is usually handled by individual project managers and members of the project team. Their technical, conceptual, and administrative abilities are crucial for successfully designing, implementing, and managing disaster programs. To expand these skills, firms can either choose individuals with high competence levels or give them effective development training.
(6) Effective consultation with the intended beneficiaries and important stakeholders. Without the assistance of the target beneficiaries or the vulnerable communities, disaster preparedness will be ineffective, so client or target-beneficiary participation is essential. To develop a workable project strategy and action plan, project planners should consult effectively with and among the project's main stakeholders, including donors, local authorities, implementing organizations, and target beneficiaries.
(7) Effective means of communication. Stakeholder cooperation and communication are crucial to a project's success. The primary success component is trust built through efficient communication between the task managers and the coordinator, with team cohesion a close second. In other words, such efficient communication amounts to a collaborative working partnership: both the project owner and the project manager should continue to see the project as a partnership, and important stakeholders must create efficient communication channels.
(8) Clearly defined goals and obligations of important stakeholders. It is typical for project team management and stakeholders to lack commitment in international development projects. Project success is highly and significantly correlated with having a precise, well-defined goal, with a purpose acknowledged by all parties involved, and with their perspectives being incorporated.
(9) Efficient planning and logistics control. Disasters must be managed effectively before, during, and after they strike. Most transportation issues are caused by traffic jams, a lack of coordination of relief efforts, and inadequate national transportation infrastructure (port, road, rail, and air). Disaster logistics involve people, knowledge, and technology; innovative technology, such as geographic information systems and remote sensing, can improve organizations' ability to coordinate logistics management.
(10) Adequate resource mobilization and allocation. Resource planning establishes the types and quantities of resources (people, machinery, and materials) required to complete project tasks. A shortage of resources is one of the issues projects encounter most frequently, and a lack of sufficient resources together with a poor or nonexistent examination of the main risk factors can cause a project to be terminated or suspended.

Conclusion
Disasters will occur. The truth is found in the statement that "everyone must endeavor to survive the current and upcoming crisis." Although humans cannot control nature, everyone can at least exercise caution and vigilance. Life-saving measures, including organized, well-thought-out preparation and a sensible crisis response, must be taken. The terms emergency management and DM are frequently used interchangeably: plans, structures, and agreements are put in place to engage the regular efforts of public, nonprofit, and private organizations in a thorough and coordinated manner to address the full range of emergency demands. When a calamity begins to unfold, these activities are carried out very quickly.


References
1. Shah, A.J.: An overview of DM in India
2. National DM Plan: A publication of the National DM Authority, Government of India, New Delhi (2016)
3. National DM Authority, Government of India. www.ndma.gov.in
4. Gujarat DM Authority: Gandhinagar. www.gsdma.org
5. Sasikumar, K., Bhargava, D.: DM Plan for GIR: IGNFA, Dehradun
6. Sidhu, K.S.: Tsunami Rehabilitation Program, Planning Commission, March 18, 2005, Manila

Chapter 11

A Vision for Industry 4.0 Utilising AI Techniques and Methods L. Bhagyalakshmi, Rajeev Srivastava, Himanshu Shekhar, and Sanjay Kumar Suman

Introduction
Modern industries have been transformed by the industrial revolution, and the worldwide market has compelled manufacturers to rethink their traditional fabrication methods. Present-day manufacturing demands new operations, strategies, and effective production-line management, with great attention paid to growing concerns [1]. RFID (Radio Frequency Identification), an IoT (Internet of Things) technology, has been used in advanced manufacturing to give producers the ability to track and identify components or objects and to retrieve the associated information. The method tags objects with RFID labels and uses RFID antennas installed at certain locations, connected to readers, to build up information about the objects. The concept of RFID Network Planning (RNP) optimization was developed to determine the required number of antennas for an RFID network. The hybrid artificial intelligence algorithm proposed in 2013 [2] has three stages: defining the working area over which the RFID network is to be built and optimized, defining the algorithm's parameters, and performing the optimization, analyzing and optimizing the nonlinear RNP problem with Artificial Intelligence (AI) techniques [3, 9]. The criterion determines how these antennas
L. Bhagyalakshmi
Department of ECE, Rajalakshmi Engineering College, Chennai, India
R. Srivastava (B)
Princeton Institute of Engineering and Technology for Women, Hyderabad, India
e-mail: [email protected]
H. Shekhar
Department of ECE, Hindustan Institute of Technology and Science, Padur, Chennai, India
S. K. Suman
Department of ECE, Dean R&D, St. Martin's Engineering College, Secunderabad, India
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023
A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_11


collide, the coverage range of the facility, and the transmitted power within the network, all expressed through a numerical objective function. Artificial intelligence (AI), the field of cognitive science focused on machine intelligence, has made significant advances in visual perception, natural language processing, robotics, machine learning, and related areas. To persuade business that machine learning and AI techniques will work consistently and reliably, persuasive evidence of a return on investment (ROI) is frequently required. At the same time, how machine learning algorithms are implemented depends entirely on a developer's skills and perspectives. AI-driven automation will undoubtedly have a significant quantitative impact on how well firms meet new challenges of growth, demand, and competition; they require the radical transformation described as "Industry 4.0". The Industrial Internet of Things (IIoT), Big Data analytics, cloud computing, and cyber-physical systems will all benefit from AI integration, improving company operations in a flexible, capable way. Given that industrial AI is still in its early stages of development, it is crucial to describe its framework, techniques, and difficulties in order to implement it in industry. An industrial AI reference framework that addresses the common elements in this area sets the stage for better comprehension and performance [15]. Industrial AI, as a discipline, can support the clear execution of machine learning algorithms for industrial applications. The orchestration of connected sensors, data, and artificial intelligence at all stages of an operation has, however, yet to be achieved at scale by manufacturing firms seeking to improve their processes.

Industry 4.0 manufacturing is, in a real sense, centred on the intersection of Big Data, the Internet of Things (IoT), and artificial intelligence (AI) in an industrial setting, advancing these technologies beyond their earlier state. With the autonomy of machines, the IoT, the connectivity of computers and devices, and machine learning algorithms, Industry 4.0 takes the industrial revolution one step further. More equipment can now be measured, observed, tracked, and managed from a centralized, remote location thanks to actuators, sensors, and coordination modules. Managers, engineers, and analysts use this transparency to improve the efficiency of the entire operation. With the growth of cloud computing and the steady decline in data-storage costs, a significant amount of data can be retained and used in machine learning algorithms to help automate specific processes within an organization [17–21].
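The RNP problem described in the introduction — choosing antenna placements that maximize tag coverage while limiting the number of antennas — can be illustrated with a toy objective function. The tag coordinates, read range, and penalty weight below are invented; a real AI approach would search over candidate placements (for example with a metaheuristic) to maximize this score.

```python
# Toy RNP objective: coverage reward minus a per-antenna penalty.
import math

tags = [(1, 1), (2, 5), (6, 2), (7, 7)]   # tag coordinates (invented)
READ_RANGE = 3.0                          # reader read range
PENALTY = 0.5                             # cost per deployed antenna

def covered(tag, antennas):
    return any(math.dist(tag, a) <= READ_RANGE for a in antennas)

def objective(antennas):
    coverage = sum(covered(t, antennas) for t in tags)
    return coverage - PENALTY * len(antennas)

print(objective([(1.5, 3)]))              # covers 2 tags, 1 antenna -> 1.5
print(objective([(1.5, 3), (6.5, 4.5)]))  # covers all 4 tags, 2 antennas -> 3.0
```

An optimizer would propose and score many antenna layouts with `objective`, keeping the best; interference and transmitted power, mentioned above, would enter as additional penalty terms.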

Innovative Manufacturing
Manufacturing systems have been thoroughly updated in Industry 4.0. To address a dynamic, global market, smart manufacturing uses advanced information and manufacturing technologies to realize flexible, agile, and adaptive fabrication processes. Using highly capable sensors, adaptive decision-making models, innovative


materials, smart devices, and data analytics, the entire product life cycle can be engaged [4]. This concept has been implemented in a number of ways, including the Intelligent Manufacturing System (IMS), which is considered the next-generation manufacturing system. It converts the conventional manufacturing framework into an intelligent one by exploiting up-to-date models, under-used processes, and cutting-edge procedures. In the age of Industry 4.0, a highly orchestrated human–machine manufacturing system is made possible by an IMS that uses a web-based Service-Oriented Architecture (SOA) to deliver collaborative, flexible, and reconfigurable services to end-users. AI is essential to an IMS since it offers capabilities such as learning, reasoning, and acting; by applying AI technologies, the amount of human intervention in an IMS can be reduced [27–31]. In the era of Industry 4.0, it is agreed that the following research fields need to be explored: a general framework for intelligent manufacturing, IMSs, collaborative human–machine systems, and data-driven models for intelligent manufacturing. Enablers such as networks and the IoT, virtualization and service migration, and the transformation of smart objects and assets need to be prioritized in order to fully realize intelligent manufacturing, because neglecting them will raise the cost of production. By effectively utilizing adaptive and reconfigurable production systems through well-coordinated production, logistics, and supply-chain management, network transformation can reduce costs. Multiplex network development, particularly for logistics and delivery, will offer a creative fix for the problem of highly personalized goods [6]. The service-enabled concepts for agile manufacturing of essential elements in Industry 4.0 are divided into five categories: smart design, smart monitoring, smart control, smart scheduling, and smart machines [32–40].

Intelligent Manufacturing (IM) Models Driven by Data
Huge amounts of information are produced as the manufacture of modern devices with RFID and/or smart sensors advances. Such data contain valuable knowledge that can be exploited in situations requiring decision-making [7]. In this way, the efficient use of data not only delivers more fundamental capability and closer integration with other parties, such as logistics and supply-chain management entities, but also advances manufacturing capability. A production system's flow will always have an impact on capability and quality. Data-driven models can fully exploit important or real-time data for system verification or prediction based on information integration, data mining, and data analytics [8]. It goes without saying that, over the long term, data-driven or model-based services will be adopted to an even greater extent for smart manufacturing. In order to provide enterprise services such as intelligent design and manufacturing, production modelling and simulation, logistics, and supply-chain management, it is important to integrate cloud services with information management. In order to unite humans, machines, materials, work, and manufacturing infrastructure, this integration will accumulate an enormous amount of real-time


data from varied manufacturing objects equipped with smart sensors or advanced devices. Self-learning models can be used by an intelligent workshop operation centre on the cloud to create more sophisticated models and algorithms for advanced decision-making in system development [41–52].
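As a minimal illustration of the data-driven monitoring models described above, the sketch below flags sensor readings that deviate from a rolling mean by more than k standard deviations. The window size, threshold, and vibration data are invented for the example; a production system would use richer models trained on historical data.

```python
# Rolling-statistics anomaly flagging over a sensor stream.
from collections import deque
from statistics import mean, stdev

def anomalies(stream, window=5, k=3.0):
    recent, flagged = deque(maxlen=window), []
    for i, x in enumerate(stream):
        if len(recent) == window:
            m, s = mean(recent), stdev(recent)
            if s > 0 and abs(x - m) > k * s:
                flagged.append(i)   # reading deviates too far from recent history
        recent.append(x)
    return flagged

vibration = [1.0, 1.1, 0.9, 1.0, 1.1, 1.0, 5.0, 1.0]
print(anomalies(vibration))  # the spike at index 6 is flagged -> [6]
```

Even this simple model shows the pattern the section describes: real-time data flows in, a model summarizes it, and deviations trigger decisions.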

Intelligent Manufacturing Systems (IMS)
The operation and growth of IMSs call for consistent cooperation across enterprises and industry. In order for IMSs to function properly and effectively, collaborative manufacturing models or components, such as a cloud-based manufacturing resources/objects management system, would centrally control the large collection of distributed objects (2015) [5]. Decentralized control, in which any intelligent component inside the system can make self-adaptive judgements, has been established as a major research field. To guarantee a synchronized production rhythm, for instance, intelligent components at each position of an assembly line can reliably keep pace with moving parts and other lines. Autonomous intelligent manufacturing units are extremely important for IMSs. They are built on increasingly complex embedded circuits and sensors that can recognize components, monitor the working environment, and move workpieces accordingly. Advanced unmanned devices such as automated guided vehicles (AGVs) will make deployments based on this approach more feasible. An important line of long-term inquiry may be the integration of IMS with technologies such as AR and VR for a safer production plant [10]. An open structure will benefit manufacturing firms, especially SMEs, since advanced fabrication processes and services can then be readily incorporated into IMSs.

Application of Intelligent Manufacturing (IM)
In Industry 4.0, where real-world enterprises may profit from cutting-edge technologies, intelligent manufacturing applications are crucial for all businesses. An agent-based framework for IMSs offers a solution to the problems of production management, workshop monitoring and control, and warehouse management. Agent-based applications can define workflows and follow up manufacturing processes, enabling efficient decision-making about these components. As an example of automation in manufacturing systems, an agent-based framework can be used to control robots deployed in distributed plants in parallel, which will make IM adoption easier [14]. Cloud-based services, which use SOA and cloud computing to share or distribute manufacturing resources, are another future application of IM. To fully utilize IMSs, a number of clustered cloud platforms will be built, allowing manufacturing resources and capabilities to provide on-demand services to end-users [53–61].
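The agent-based control idea can be sketched as a toy bidding scheme in which machine agents bid for jobs and the lowest-cost free agent wins. The agent names, speeds, and cost model are illustrative assumptions, not part of any cited framework.

```python
# Toy contract-net-style dispatch: each machine agent bids the time it
# would need to finish its queue plus the new job; lowest bid wins.
class MachineAgent:
    def __init__(self, name, speed):
        self.name, self.speed, self.queue = name, speed, []

    def bid(self, job_size):
        return (sum(self.queue) + job_size) / self.speed

def dispatch(job_size, agents):
    winner = min(agents, key=lambda a: a.bid(job_size))
    winner.queue.append(job_size)
    return winner.name

agents = [MachineAgent("CNC-1", speed=2.0), MachineAgent("CNC-2", speed=1.5)]
print(dispatch(10, agents))  # CNC-1 (bid 5.0 vs 6.67)
print(dispatch(10, agents))  # CNC-2 (CNC-1's queue makes its bid 10.0 now)
```

Decision-making stays decentralized in the sense described above: each agent computes its own bid from local state, and no central planner needs a global schedule.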


Collaboration Between Humans and Machines
Under the umbrella of Industry 4.0, people and robots will cooperate through the use of cognitive technologies in industrial settings. Through improved synchronization, smart machines equipped with models, computer vision, speech recognition, and machine learning will be able to help individuals complete the majority of their labour [11]. Advanced learning models for machines such as robots are crucial if people and machines are to build complementary abilities in all working environments. A "human-in-the-loop" machine learning method that enables people to interact appropriately and successfully with decision-making algorithms is one future research direction. In this manner, human domain expertise or knowledge can be used by data-enabled machine learning components to create pathways for effective collaboration. For instance, standard machine learning methods or algorithms might be combined with human knowledge in order to develop human–machine interaction and intuition (Bar et al. 2017). Since machines will provide aid with every job, every part, and every exhausting manufacturing task where hazardous circumstances are present, machine intelligence plays a crucial role in fostering human–machine collaboration [12]. As machines equipped with advanced control systems come to behave and perform like people in real manufacturing environments such as workshops, safety becomes a fundamental research concern. With self-learning and developmental techniques, such machines can communicate with workers effectively. For instance, ontology-based knowledge management, with an ascending cognitive spiral based on epistemology and local-to-global reasoning, can be realized for organizational automation in order to eventually deliver manufacturing intelligence in the longer term [13].
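The "human-in-the-loop" idea above can be sketched as confidence-based routing: low-confidence predictions are escalated to a person instead of being applied automatically. The toy scoring function, nominal weight, and thresholds are invented for the illustration.

```python
# Route a classifier's decision: act automatically only when confident.
CONFIDENCE_FLOOR = 0.8

def classify(part_weight):
    """Toy defect score: far from the nominal 100 g weight -> likely defective."""
    deviation = abs(part_weight - 100) / 100
    label = "defect" if deviation > 0.05 else "ok"
    # confidence grows as the reading moves away from the decision boundary
    confidence = min(1.0, 0.5 + abs(deviation - 0.05) * 10)
    return label, confidence

def decide(part_weight):
    label, conf = classify(part_weight)
    return label if conf >= CONFIDENCE_FLOOR else "escalate-to-human"

print(decide(100.2))  # clearly fine      -> "ok"
print(decide(120.0))  # clearly off       -> "defect"
print(decide(104.0))  # borderline case   -> "escalate-to-human"
```

The escalated cases are exactly where human domain expertise feeds back into the system, and the human's labels can later retrain the model.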

Primary Industry 4.0 Challenges with Artificial Intelligence
Decision-making will take place within the framework of the organization's work with smart products, both individually and together with the customer. The fusion of hardware and software into embedded systems known as Cyber-Physical Systems (CPS), which employ the Internet of Things (IoT) and Big Data so that each device carries its maker's identity and rudimentary computing ability in order to sense and/or act, has already been identified as the key pillar of this trend. As a result, many devices are needed not merely to perform tasks within the embedded system, such as exposing multiple interfaces or coordinating them, but to lay the groundwork for Industry 4.0: from the sensors, actuators, and control units that collect data from various parts of the organization, to a cyber-physical system that can manage these data and make decentralized choices, to the smart machines that will execute the self-deduced actions required, to the organization that must support massive


streams of information, and, indeed, to the simulation of processes and the management and testing of models based on smart and self-aware systems. Without a physical embodiment, the science of artificial intelligence rests on the same techniques (expert techniques, machine learning, numerical optimization, neural networks, probability, computational intelligence, etc.) as those that will help in the construction of a combined physical and digital universe. Collaborative robots that can work with people in a safe environment, as well as simulation and virtualization tools, can assist throughout the decision-making process or during material production.
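The sense–decide–act pipeline described above, with decisions made locally at the device, can be sketched as a minimal control loop; the temperature values and setpoint are invented for the example.

```python
# Minimal decentralized CPS loop: each device senses its own state,
# decides locally, and acts, without a central controller.
def sense(machine):
    return machine["temp"]

def decide(temp, setpoint=70.0):
    return "cool" if temp > setpoint else "idle"

def act(machine, action):
    if action == "cool":
        machine["temp"] -= 5.0   # toy actuator effect

machine = {"temp": 82.0}
for _ in range(4):               # four control ticks
    act(machine, decide(sense(machine)))
print(machine["temp"])           # cooled back past the 70.0 setpoint
```

In a real CPS the loop runs on the embedded device itself, which is what makes the decision-making decentralized.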

Embedded Systems with ICT Methodologies
The essential Industry 4.0 change-making tools were identified in the previous section. AI applications will benefit greatly from the precise embedded systems created by the fusion of hardware and software, as has been noted frequently. Yet these devices are often treated as a whole with blurred boundaries, because it is difficult to determine where one component ends and the next begins. For instance, during the crucial early years of the concept, IoT was presented as a network of uniquely identifiable, interoperable connected devices using radio-frequency identification (RFID) technology. Later, however, as the deployment of these frameworks became more fundamental and began to include new technologies and concepts, the term came to embrace all of the advances related to measuring, identifying, and tracking. According to Bi et al. [22], the Internet of Things (IoT) is a smart global network in which self-aware items connect with one another; it may be used to position, track, and monitor objects. In this scenario, where CPS can be regarded as the "brains" of Industry 4.0, one way to settle many unresolved issues is to think of IoT as the global system in which identification and sensing capabilities are combined with enabling technologies such as CPS. With ICT components promoting free communication between all of them, CPS forms the distinctive advance of IoT's present-day phase (Campbell et al. 2012) [23]. The configuration's essential elements, organized by IoT and CPS to form the embedded system, are the following: sensor identification, communication protocols, standardized languages for the collected data, big-data management, cloud computing and middleware, and rules for CPS development.
(I) Internet of Things Sensors: The Internet of Things (IoT) and, subsequently, Industry 4.0 have their roots in the merging of radio-frequency identification (RFID) tags, communication technologies, and sensors/actuators. Following this reasoning, sensors define how quickly moving objects can be recognized and how they communicate and relate to customers and/or other constituents of the network


[26]. Radio repeat recognisable affirmation systems are located inside the environment of recognising and communication devices. (II) Sensor Communication Protocols—Some complimentary devices for recognised certification alongside RFID advancement include sensor frameworks or further-reaching sensor systems (WSN) [25]. According to Xu et al. [23], sensor frameworks have a specific number (which can be incredibly tall) of detecting centre centres talking via a more distant multi-hop network [24]. (III) Cloud Computing, Big Data—Colossal Information can be a course of action for that by overhauling as well fundamental components like compactness, flexibility, and excited adequacy, giving a rapidly and spatially free get to them. The mind-boggling stream advances that facilitate the storage and retrieval of information in these embedded systems mechanisation [26]. The use of data mining supports analysis, modelling, simulation, combination and computing, and sound evaluation and decision-making. (IV) Artificial and Virtual Reality—Due to their ability to connect and manage virtual things, virtual reality (VR) and augmented reality (AR) are key components of industry 4.0. Nonetheless, despite the fact that VR and AR are two distinct ideas, they are commonly confused. A computer-generated environment with entirely simulated scenes and objects is where the past comes together. By superimposing virtual things onto the real-world scene that is seen on a screen, the extreme shown attempts to change the real environment. Virtual reality entirely immerses a client inside a created environment. Disconnected AR grants the client the right to observe the real world completely redesigned using data supplied by computers. In any case, diverse applications in a variety of ranges can be seen inside the composition. These alterations have major benefits and drawbacks. The examination led us to the conclusion that the majority of applications focus on instructing and coordinating. 
Scott and colleagues from Argentina presented research on 3D virtual learning environments. According to the authors, modern training must take into account the unique requirements and perspectives of each learner; the idea is to integrate personalised learning strategies in which the environment adapts dynamically to the learner. López et al. presented a comparison of 2D and 3D virtual reality applications for healthcare education in Mexico; the results showed that gamification helped students make progress in their learning. A study from Brazil [33] presented a review of instruments for assessing virtual reality in the physical therapy of stroke patients. The recovery of stroke patients' functional mobility, muscle strength, balance, and quality of life depends on physiotherapeutic rehabilitation, but standard procedures can be tedious, and VR provides a customisable way of improving muscle strength and balance. The results showed that the Berg Balance Scale, the Fugl-Meyer Assessment, and the Stroke Impact Scale are the three most commonly used assessment instruments. A collaborative project from Mexico and Spain presented a practical virtual reality tool for live-line maintenance training. Trained and untrained trainees were distinguished using a random forest classifier. Furthermore, by using the visual tool, a number of misconceptions in


L. Bhagyalakshmi et al.

response to the training material (misunderstandings and points of confusion) were identified. The work by the Brazilians de Souza et al. [48], on the other hand, presented a survey on the state and relevance of AR in industrial processes. It found that the majority of efforts in the sector focus on supporting assembly processes, with AR also used for maintenance and planning. The construction, transportation, automotive, equipment, and automation industries have benefited the most from augmented reality. Hincapié et al. [58] conducted a study in Colombia on how augmented reality can benefit education. The most effective teaching strategies identified by the study were guided instruction, followed by simulation, tracking, gamification, and human-computer interaction; it was also found that the social sciences have received less attention than the basic sciences over time. The Mexicans Hernández et al. [57] presented an AR application for teaching students the computation of simple interest. The factors of motivation, attention, display quality, and student achievement were examined with the participation of 103 students. The data showed the AR tool's beneficial effects on pupils' achievement and motivation, and students also reported being pleased with the application because of its high quality. To assist people with hearing impairments attending a performance in a theatre or other public space, Picallo et al. released an AR application in Spain. The AR application provides users with 3D multimedia content, such as subtitles and supplementary information about the show, using HoloLens smart glasses. In addition, the application includes a call option for requesting assistance from the theatre staff.
A deterministic 3D ray-launching computation was used to characterise the obstacles of the indoor radio channel. (V) Horizontal and Vertical Integration—The industry's internal processes are intricate and varied. In addition to controlling the machines, tracking the processes is essential; a suitable configuration for adapting the machines to the processes is therefore crucial to saving time. This implies that the information must be usable and that the transmitted data permits daily traceability. Businesses have adopted horizontal and vertical integration strategies to strengthen their position among rivals. When a business takes over activities that were previously outsourced to third parties, it engages in vertical integration; when one company acquires, merges with, or creates another company or companies engaged in the same activity, this is known as horizontal integration. In work by the Mexicans Pérez et al., a systematic tool for surveying the technological and operational readiness of parts companies was proposed and validated. The tool placed businesses at the appropriate level for the transition to the coming industrial change, taking both vertical and horizontal integration into consideration. The development of the tool took place in two stages: the first involved designing an instrument to gather information, after which the instrument's validity and reliability were tested. Cronbach's alpha established the instrument's

11 A Vision for Industry 4.0 Utilising AI Techniques and Methods


reliability. The privatisation of public enterprises in which firms were vertically integrated with their suppliers was examined by Bárcenas and Begoña of Spain [38]. They considered a mixed duopoly with a public, vertically integrated corporation. The results showed that a private company would vertically integrate with its supplier when the goods were close substitutes. Beyond the established finding that a mixed duopoly exhibits less vertical integration than a private duopoly, the novel result was that the public firm was privatised when the goods were close substitutes and the bargaining power of the private corporation was no longer attractive. The Brazilian researchers de Bragança et al. [44] used a vertically organised framework to examine a generator's (or gentailer's) decision to undertake a venture to increase its retail market share. The results showed that vertical integration can act as a hedge in an uncertain environment. In addition, the results showed that businesses are more inclined to build up their retail presence in markets that are concentrated, that encourage vertical integration, or that have well-developed supporting markets.
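The instrument-reliability check via Cronbach's alpha mentioned above reduces to a short computation: alpha = (k/(k-1)) * (1 - sum of item variances / variance of total scores), for k questionnaire items. A minimal sketch follows; the response data in the example is invented purely for illustration and is not taken from the study discussed.

```python
def cronbach_alpha(items):
    """Compute Cronbach's alpha.

    items: one list per questionnaire item, each holding the responses of
    the same respondents in the same order.
    """
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):              # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(item) for item in items)
    # total score per respondent across all items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))

# Three perfectly correlated items -> alpha of 1.0 (hypothetical data)
print(round(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]), 6))  # 1.0
```

A common rule of thumb treats alpha of at least 0.7 as acceptable internal consistency for such instruments.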

Key Elements of Industry 4.0 with AI: ABCDE
The acronym "ABCDE" describes the essential elements of Industrial AI. These elements are: Analytics Technology (A), Big Data Technology (B), Cloud or Cyber Technology (C), Domain Know-how (D), and Evidence (E). Analytics is the core of AI, but it can only deliver value when the other elements are present. Big data and the cloud are essential elements that provide the data source and the platform for Industrial AI. While these elements are vital, domain know-how and evidence are equally fundamental elements that are often initially disregarded. Domain know-how is the key to the following points: understanding the problem and focusing Industrial AI on solving it; understanding the system so that the right data can be collected; understanding the physical meaning of the parameters and how they relate to the physical characteristics of a system or process; and understanding how these parameters vary from machine to machine. Evidence is a key element in enabling Industrial AI models to incorporate learning capability: only by accumulating data about patterns together with the evidence (or labels) associated with those patterns can an AI model become more accurate, comprehensive, and robust over time [15]. (i) AI Ecosystem in Industry—The Industrial AI ecosystem, depicted in Fig. 11.1, exemplifies a systematic way of thinking about the needs, challenges, technologies, and methods for developing transformational AI systems for industry. Experts can use this diagram as a useful starting point for



formulating a strategy for Industrial AI development and deployment. Organised into industry-focused components, this framework defines common needs such as self-awareness, self-comparison, self-prediction, self-optimisation, and quality. The diagram also unites four enabling technologies: Data Technology (DT), Analytics Technology (AT), Platform Technology (PT), and Operations Technology (OT). Used in the context of the Cyber-Physical Systems (CPS) framework first proposed in [16], these four technologies (DT, AT, PT, and OT), as shown in Fig. 11.2, are necessary for realising the 5C functions of Connection, Conversion, Cyber, Cognition, and Configuration. This section provides a brief description of each technology [15]. (ii) Data Technology (DT)—Data technologies are those that enable the efficient acquisition of significant data with essential performance measurements across dimensions. By identifying the appropriate equipment and mechanism for obtaining useful data, DT becomes a co-enabler of the "Smart Connection" level of the 5C architecture depicted in Fig. 11.2. Data communication is the other component of data technology [15]. (iii) Analytics Technology (AT)—Analytics technology converts raw sensory data into useful information. Data-driven modelling reveals hidden patterns, correlations, and other actionable information in manufacturing systems. This information can be used for asset health prediction, machine prognostics, and health management, and analytics technologies combine it with other technologies for improved effectiveness and transparency [15]. (iv) Platform Technology (PT)—Platform technology refers to the hardware architecture for information storage, analysis, and feedback.
A stable platform configuration for analysing information is a major determining factor for realising smart manufacturing capabilities such as scalability, complex event processing, and so forth. Broadly, three main categories of platform architecture can be identified: stand-alone, embedded, and cloud-based.

Fig. 11.1 Industrial AI ecosystem



Fig. 11.2 Technologies that make it possible to implement CPS in manufacturing

Regarding computing, storage, and servitisation capabilities, cloud computing is a key advance in information and communication technologies. Rapid service deployment, high levels of customisation, knowledge integration, and effective visualisation with high scalability can all be provided by a cloud infrastructure [15]. (v) Operations Technology (OT)—Operations technology here refers to the series of decisions made and actions taken based on the information extracted from the data. This machine-to-machine collaboration can take place between two machines on the same shop floor or between machines in two distinct manufacturing facilities. Machines can collaborate to show how adjusting particular parameters can improve performance, and they can adjust their schedules in response to the availability of other machines. Operations technology is the final step leading to the adoption of the following four capabilities in an Industry 4.0 plant: (1) self-aware, (2) self-predict, (3) self-configure, and (4) self-compare [15].
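The self-compare capability described above can be sketched in a few lines: each machine benchmarks one of its KPIs against the fleet average. The machine IDs and KPI values below are hypothetical and not taken from [15]; this is an illustrative sketch, not a method described in the chapter.

```python
def self_compare(machine_id, kpis):
    """Benchmark one machine's KPI against the fleet average.

    kpis: dict mapping machine id -> a numeric KPI (e.g. cycle time).
    Returns (fleet_average, deviation); a positive deviation means the
    machine's KPI is above the fleet average.
    """
    fleet_avg = sum(kpis.values()) / len(kpis)
    deviation = kpis[machine_id] - fleet_avg
    return fleet_avg, deviation

# Hypothetical cycle times (seconds) for three machines on one shop floor
cycle_times = {"m1": 10.0, "m2": 12.0, "m3": 14.0}
print(self_compare("m3", cycle_times))  # (12.0, 2.0) -> m3 runs 2 s slower than average
```

In a real plant, the deviation would feed the self-configure step (e.g. triggering maintenance or parameter adjustment) rather than being printed.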



Conclusion Intelligence isn’t found in charm boxes and cautious robots. AI may be a science as well as a development-based collection of computations critical learning and machine learning (i.e. devices that can gather data and learn from it) are strengthened by AI, change their own computations) (a combination of calculations that are commonly related). This research analysed and passed on a rundown of the show up for planning the endeavours towards the realisation of Mechanical AI systems and Mechanical AI Ecosystem in the manufacturing chart of today. Internal elements AI information sleuths use the appropriate tools and quantitative techniques to extract and decipher data. The machines acquire the ability to identify plans within the information provided to them, and map these strategies for future implementation. To implement AI in future industry manufacture, there are three key measures to take, including creating an AI technique and facilitating and building up Future industries that will be able to use AI, fabrication frameworks.

References
1. Kamble, S.S., Gunasekaran, A., Gawankar, S.A.: Sustainable Industry 4.0 framework: a systematic literature review identifying the current trends and future perspectives. Process Saf. Environ. Prot. 117, 408–425 (2018)
2. Kuo, R., et al.: The application of an artificial immune system-based back-propagation neural network with feature selection to an RFID positioning system. Robot. Comput.-Integr. Manuf. 29(6), 431–438 (2013)
3. Azizi, A.: Introducing a novel hybrid artificial intelligence algorithm to optimize network of industrial applications in modern manufacturing. Complexity (2017). https://doi.org/10.1155/2017/8728209
4. Li, B., Hou, B., Yu, W., Lu, X., Yang, C.: Applications of artificial intelligence in intelligent manufacturing: a review. Front. Inform. Technol. Electron. Eng. 18(1), 86–96 (2017)
5. Zhong, R.Y., Xu, X., Klotz, E., Newman, S.T.: Intelligent manufacturing in the context of Industry 4.0: a review. Engineering 3, 616–630 (2017)
6. Simpson, T.W., Jiao, J.R., Siddique, Z., Hölttä-Otto, K. (eds.): Advances in Product Family and Product Platform Design: Methods & Applications. Springer-Verlag, New York (2014)
7. Zhong, R.Y., Newman, S.T., Huang, G.Q., Lan, S.: Big data for supply chain management in the service and manufacturing sectors: challenges, opportunities, and future perspectives. Comput. Ind. Eng. 101, 572–591 (2016)
8. Zou, J., Chang, Q., Arinez, J., Xiao, G., Lei, Y.: Dynamic production system diagnosis and prognosis using model-based data-driven method. Expert Syst. Appl. 80, 200–209 (2017)
9. Zhong, R.Y., Huang, G.Q., Lan, S., Dai, Q.Y., Chen, X., Zhang, T.: A big data approach for logistics trajectory discovery from RFID-enabled production data. Int. J. Prod. Econ. 165, 260–272 (2015)
10. Yew, A.W.W., Ong, S.K., Nee, A.Y.C.: Towards a griddable distributed manufacturing system with augmented reality interfaces. Robot. Comput.-Integr. Manuf. 39, 43–55 (2016)
11. Antrobus, V., Burnett, G., Krehl, C.: Driver-passenger collaboration as a basis for human-machine interface design for vehicle navigation systems. Ergonomics 60(3), 321–332 (2017)
12. Xu, X.: Machine Tool 4.0 for the new era of manufacturing. Int. J. Adv. Manuf. Technol. 92(5–8), 1893–1900 (2017)



13. Yin, Y.H., Nee, A.Y.C., Ong, S.K., Zhu, J.Y., Gu, P.H., Chen, L.J.: Automating design with intelligent human-machine integration. CIRP Ann. Manuf. Technol. 64(2), 655–677 (2015)
14. Priego, R., Iriondo, N., Gangoiti, U., Marcos, M.: Agent based middleware architecture for reconfigurable manufacturing systems. Int. J. Adv. Manuf. Technol. 92(5–8), 1579–1590 (2017)
15. Lee, J., Davari, H., Singh, J., Pandhare, V.: Industrial Artificial Intelligence for Industry 4.0-based manufacturing systems. Manuf. Lett. 18, 20–23 (2018)
16. Lee, J., Bagheri, B., Kao, H.A.: A cyber-physical systems architecture for Industry 4.0-based manufacturing systems. Manuf. Lett. 3, 18–23 (2015)
17. Tuptuk, N., Hailes, S.: Security of smart manufacturing systems. J. Manuf. Syst. 47, 93–106 (2018)
18. Lee, J., Bagheri, B., Kao, H.: A cyber-physical systems architecture for Industry 4.0-based manufacturing systems. Manuf. Lett. 3, 18–23 (2015)
19. Lee, J., Kao, H., Yang, S.: Service innovation and smart analytics for Industry 4.0 and big data environment. In: Product Services Systems and Value Creation: Proceedings of the 6th CIRP Conference on Industrial Product-Service Systems, vol. 16, pp. 3–8 (2014)
20. Lasi, H., Kemper, H.G., Fettke, P., Feld, T., Hoffmann, M.: Industry 4.0. Bus. Inf. Syst. Eng. 6, 239–242 (2014)
21. Dopico, M., Gomez, A., De la Fuente, D., García, N., Rosillo, R., Puche, J.: A vision of Industry 4.0 from an artificial intelligence point of view. In: Int'l Conf. Artificial Intelligence ICAI'16, pp. 407–413. University of Oviedo, Gijón, Asturias, Spain (2016)
22. Bi, Z., Da Xu, L., Wang, C.: Internet of Things for enterprise systems of modern manufacturing. IEEE Trans. Ind. Inform. 10(2), 1537–1546 (2014)
23. Da Xu, L., He, W., Li, S.: Internet of Things in industries: a survey. IEEE Trans. Ind. Inform. 10(4), 2233–2243 (2014)
24. Gubbi, J., Buyya, R., Marusic, S., Palaniswami, M.: Internet of Things (IoT): a vision, architectural elements, and future directions. Futur. Gener. Comput. Syst. 29(7), 1645–1660 (2013)
25. Abu-Elkheir, M., Hayajneh, M., Ali, N.: Data management for the Internet of Things: design primitives and solution. Sensors 13(11), 15582–15612 (2013)
26. Monostori, L.: Cyber-physical production systems: roots, expectations and R&D challenges. In: Variety Management in Manufacturing: Proceedings of the 47th CIRP Conference on Manufacturing Systems, vol. 17, pp. 9–13 (2014)
27. Aguilar, J., Balderrama, C., Puente, C., Ontiveros, A., García, J.: Genetic algorithm for the reduction printing time and dimensional precision improvement on 3D components printed by fused filament fabrication. Int. J. Adv. Manuf. Technol. 115, 3965–3981 (2021)
28. Aguilar, J., García, J., Hernández, J.: Geometric considerations for the 3D printing of components using fused filament fabrication. Int. J. Adv. Manuf. Technol. 109, 171–186 (2020)
29. Ahuett, H., Kurfess, T.: A brief discussion on the trends of habilitating technologies for Industry 4.0 and smart manufacturing. Manuf. Lett. 15, 60–63 (2018)
30. Albuquerque, R., Albanez, C., de Medeiros, A., Buarque, P., Lopes, D.: Machine learning applied in SARS-CoV-2 COVID-19 screening using clinical analysis parameters. IEEE Lat. Am. Trans. 19, 978–985 (2021)
31. Allal, O., Simón, V., Cuenca, A.: Intelligent purchasing: how artificial intelligence can redefine the purchasing function. J. Bus. Res. 124, 69–76 (2021)
32. Andrade, R., Guun, S., Tello, L., Ortiz, I.: A comprehensive study of the IoT cybersecurity in smart cities. IEEE Access 8, 228922–228941 (2020)
33. Araújo, F., Oliveira, F., Ramos, E., Lima, N., Almeida, P., Santos, A., Costa, D., Santos, P., Antunes, A.: Evaluation instruments for physical therapy using virtual reality in stroke patients: a systematic review. Physiotherapy 106, 194–210 (2020)
34. Arcos, D., Guemes, D.: Development of an additive manufacturing technology scenario for opportunity identification—the case of Mexico. Futures 90, 1–15 (2017)



35. Arcos, R.: Securing the Kingdom's Cyberspace: Cybersecurity and Cyber Intelligence in Spain, 1st edn. Routledge, London (2021)
36. Batista, C., de Castro, R., de Castro, M., Ferreira, E., Agoulmine, N.: State of the art and challenges of security SLA for cloud computing. Comput. Electr. Eng. 59, 1–12 (2017)
37. Bayón, C., Ramírez, O., Serrano, J., del Castillo, M., Pérez, A., Belda, J., Martínez, I., Lerna, S., Cifuentes, C., Frizera, A., Rocon, E.: Development and evaluation of a novel robotic platform for gait rehabilitation in patients with cerebral palsy: CPWalker. Robot. Auton. Syst. 91, 101–114 (2017)
38. Bárcenas, J., Begoña, M.: Privatisation and vertical integration under a mixed duopoly. Econ. Syst. 42, 514–522 (2018)
39. Catota, F., Granger, M., Sicker, D.: Cybersecurity education in a developing nation: the Ecuadorian environment. J. Cybersecur. 5, 1–19 (2019)
40. Corke, P.: Robotics, Vision and Control: Fundamental Algorithms in MATLAB, 1st edn. Springer, Germany (2011)
41. Cujabante, X., Bahamón, M., Prieto, J., Quiroga, J.: Cybersecurity and cyber defense in Colombia: a possible model for civil-military relations. Revista Científica General José María Córdova 18, 357–377 (2020)
42. Dalmarco, G., Ramalho, F., Barros, A., Soares, A.: Providing Industry 4.0 technologies: the case of a production technology cluster. J. High Technol. Manag. Res. 30, 1–9 (2019)
43. de Barcelos, A., Gomes, M., da Costa, C., da Rosa, R., Victoria, J., Pessin, G., de Doncker, G., Federizzi, G.: Intelligent personal assistants: a systematic literature review. Expert Syst. Appl. 147, 1–14 (2020)
44. de Bragança, G., Daglish, T.: Investing in vertical integration: electricity retail market participation. Energy Econ. 67, 355–365 (2017)
45. de Camargo, I., Erbereli, R., Fortulan, C.: Additive manufacturing of electrofused mullite slurry by digital light processing. J. Eur. Ceram. Soc. 41, 7182–7188 (2021)
46. de Costa, R., Moreira, J., Pintos, P., dos Santos, V., Lifschitz, S.: A survey on data-driven performance tuning for big data analytics platforms. Big Data Res. 25, 1–15 (2021)
47. de Sá, A., Pereira, A., Pappa, G.: A customized classification algorithm for credit card fraud detection. Eng. Appl. Artif. Intell. 72, 21–29 (2018)
48. de Souza, L., Martins, F., Roberto, E.: A survey of industrial augmented reality. Comput. Ind. Eng. 139, 1–14 (2020)
49. DOMO: Internet access. https://www.domo.com/learn/ (2021)
50. Flores, F., Paredes, R., Meza, F.: Procedures for mitigating cybersecurity risk in a Chilean government ministry. IEEE Lat. Am. Trans. 14, 2947–2950 (2016)
51. Gallegos, G.: Introducción a la Ciberseguridad y sus aplicaciones en México, 1st edn. Academia (2020)
52. García, A., Mauricio, A., Anaid, J., Vázquez, S.: Aplicando tecnologías big data para realizar búsquedas específicas de perfiles profesionales en redes sociales. Komputer Sapiens 3, 23–27 (2017)
53. Garrido, C., Olivares, T., Ramirez, F., Roda, L.: An end-to-end Internet of Things solution for reverse supply chain management in Industry 4.0. Comput. Ind. 112, 1–13 (2019)
54. González, L., de Fuentes, J.: Design recommendations for online cybersecurity courses. Comput. Secur. 80, 238–256 (2019)
55. Guerrero, G., Garrido, J., Balderas, S., Rodríguez, C.: A context-aware architecture supporting service availability in mobile cloud computing. IEEE Trans. Serv. Comput. 10, 956–968 (2017)
56. Guzmán, J., Villafaña, D., Peniche, L., Gomez, R., Molina, J., Rodríguez, M.: Internet of Things for irrigation system. In: Mata, M., Zagal, R., Barria, C. (eds.) Telematics and Computing, 1st edn, chapter 24, pp. 294–304. Springer (2019)
57. Hernández, L., López, J., Tovar, M., Vergara, O., Cruz, V.: Effects of using mobile augmented reality for simple interest computation in a financial mathematics course. PeerJ Comput. Sci. 7, 1–33 (2021)
58. Hincapié, M., Diaz, C., Valencia, A., Contero, M., Guemes, D.: Educational applications of augmented reality: a bibliometric study. Comput. Electr. Eng. 93, 1–14 (2021)



59. Huang, J.: Resource decision making for vertical and horizontal integration problems in an enterprise. J. Oper. Res. Soc. 67, 1–10 (2016)
60. Laguna, O., Lietor, P., Iglesias, F., Corpas, F.: A review on additive manufacturing and materials for catalytic applications: milestones, key concepts, advances and perspectives. Mater. Des. 208, 1–36 (2021)
61. León, A., Pastor, O.: Enhancing precision medicine: a big data-driven approach for the management of genomic data. Big Data Res., article in press, 1–11 (2021)

Chapter 12

Artificial Intelligence and Optimization Strategies in Industrial IoT Applications Yu-Chung Wang and Jerry Chun-Wei Lin

Introduction
Greater agility, productivity, flexibility, visibility, efficiency and safety are ambitious goals that are driving progress in industry. However, optimization in each of these areas is nearly impossible without adopting and implementing new technologies, each within its own domain-specific strengths. Studies conducted since the emergence of Industry 4.0 show that more than half of manufacturing and service organizations have increased their productivity through the use of Industry 4.0 technologies [1]. Industry is now more focused on developing the infrastructure for the adoption of data-driven models and IoT-enabled manufacturing processes [7, 11, 19–21]. For instance, "Industry 4.0" was introduced by the German Federal Government as a strategic technology plan for its industries.1 The United States introduced the concept of "smart manufacturing" as part of the "Smart Manufacturing Leadership Coalition" in 2011.2 In addition, Asian countries are actively promoting strategies and projects such as "Made in China 2025",3 which focuses on advanced manufacturing [21, 27]. Recently, dynamic changes in the market, especially the impact of COVID-19, have accelerated these plans, since the shortage of workers in factories and the

1 Industrie 4.0 Working Group, Recommendations for implementing the strategic initiative INDUSTRIE 4.0, 2013, https://www.din.de/blob/76902/e8cac883f42bf28536e7e8165993f1fd/recommendations-for-implementing-industry-4-0-data.pdf.
2 Implementing 21st Century Smart Manufacturing, SM Leadership Coalition, 2011, https://www.controlglobal.com/assets/11WPpdf/110621_SMLC-smart-manufacturing.pdf.
3 General Office of the State Council, PRC, Made in China 2025, 2015, http://www.dahe.com/standard/standard%20folder/made%20in%20China%202025.pdf.
Y.-C. Wang · J. C.-W. Lin (B)
Department of Computer Science, Electrical Engineering and Mathematical Sciences, Western Norway University of Applied Sciences, Bergen, Norway
e-mail: [email protected]
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023
A. Mishra and J. C.-W. Lin (eds.), Industry 4.0 and Healthcare, Advanced Technologies and Societal Change, https://doi.org/10.1007/978-981-99-1949-9_12



Y.-C. Wang and J. C.-W. Lin

trend of remote offices made the manufacturing industry aware of the importance of digitalization and intelligence for replacing repetitive and unnecessary work. Currently, manufacturing industries are equipped with sensors and network infrastructures to collect data from machines to servers [9]. This helps to reduce manpower requirements and operational time, and to improve the level of automation in the manufacturing industry [36]. In previous research, Jamwal et al. noted that the goal of Industry 4.0 is to achieve sustainability in manufacturing, i.e., producing products through profitable processes that minimize negative impacts on the environment while conserving energy and natural resources [20]. In 2018, Lee et al. [24] proposed an industrial AI (IAI) ecosystem that encompasses the vital elements and provides a guideline for better understanding and implementation. They argued that IAI can perform better than general AI/ML, expert systems, and expert knowledge alone, and described the predictive impact of solving visible problems to achieve Work Reduction, Waste Reduction, and Worry-Free Manufacturing (3W). The strategies propagated in manufacturing organizations revolve around Industry 4.0, IAI, digital transformation, etc., and the two main underlying technologies are Artificial Intelligence (AI) and the Internet of Things (IoT). The two relate to each other like the central nervous system and the peripheral nervous system. AI is a subfield of computer science that deals with problems normally requiring human intelligence, such as learning, problem solving, and pattern recognition. AI-based systems are now transforming manpower-intensive, semi-automated, and automated industries into highly efficient industries by increasing productivity and reducing costs: optimizing the production process, accelerating the product development life cycle, eliminating repetitive work and redundant manpower, etc. [21].
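As a deliberately minimal illustration of the kind of repetitive monitoring work AI-based systems can take over, the sketch below flags out-of-range sensor readings with a simple z-score rule. The readings and the threshold are made up for illustration; this is not a method from any of the works cited in this chapter.

```python
def zscore_anomalies(readings, threshold=3.0):
    """Return indices of readings that deviate from the mean by more than
    `threshold` standard deviations (population std over the window)."""
    n = len(readings)
    mean = sum(readings) / n
    std = (sum((x - mean) ** 2 for x in readings) / n) ** 0.5
    if std == 0:
        return []  # all readings identical: nothing stands out
    return [i for i, x in enumerate(readings) if abs(x - mean) / std > threshold]

# Twenty normal temperature readings plus one spike at the end (hypothetical)
print(zscore_anomalies([10.0] * 20 + [100.0]))  # [20]
```

In practice the window would slide over a live sensor stream, and a flagged index would raise an alert rather than replace an operator's judgement.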
AI also enables manufacturing systems to learn behavioral patterns from datasets of past cases in order to eventually determine what a coherent, intelligent, and correct action would be. The Internet of Things (IoT) is a system of interconnected computers, digital machines, objects, animals or people that are provided with unique identifiers and have the capability to exchange data over a network without human-computer interaction (HCI) [39]. An IoT object can be a machine with several temperature sensors, a heating, ventilation and air conditioning (HVAC) system with a pressure gauge, a manufacturing process with built-in sensors and data collected by a Programmable Logic Controller (PLC), or any other object that can be assigned an Internet Protocol (IP) address and can exchange data over a network. Ehie et al. [13] noted that IoT is one of the key enablers of digital manufacturing. IoT is also a natural extension of supervisory control and data acquisition (SCADA), a category of software applications for process control, i.e., the collection of data in real time from remote locations to control equipment and conditions. Following the same concept, industrial SCADA uses software and computers connected to sensors to collect, analyze and present data and deliver the results to the users who need them [18]. More and more organizations in a variety of industries, especially manufacturing, are using IoT solutions to collect more data, operate more efficiently, improve decision-making, and add value to the business. IoT is most prevalent in manufacturing and transportation, which rely heavily on sensors and other IoT devices, but it has also proven

12 Artificial Intelligence and Optimization Strategies in Industrial IoT …


useful in agriculture, healthcare, infrastructure and home automation, leading some organizations down the path of digital transformation. The Industrial IoT (IIoT), in turn, focuses on smart networks of machines and devices that interact with each other to improve the performance of industrial processes [34]. The use of advanced communications and smart mechanisms in industry is rapidly increasing, making cybersecurity a critical aspect. The current promotion of IIoT follows the emergence of cloud technology, which can store data, review historical trends, and perform data analysis on its own platform. Another milestone was the development of the Open Platform Communications Unified Architecture (OPC UA) protocol in 2006 [25], which enables remote communication between devices, programs and data sources, with secure end-to-end communication and no human intervention. Most of the IIoT devices in use today follow industrial communication protocols including AMQP 1.0, MQTT, XMPP, OPC UA, Modbus TCP, CoAP, etc. [5]. In industry, especially manufacturing, OPC is currently the most widely used communication interface because it can integrate various types of equipment information through the PLC. While AI focuses on providing computing devices with a process of correct understanding, learning and action, making them intelligent, IoT revolves around the interfaces between sensors and the objects being measured, providing services based on the collected data. Recently, the convergence of AI and IoT, also referred to as the Artificial Intelligence of Things (AIoT) [19], has emerged. A number of AIoT-related applications are continuously being initiated in industry to achieve the goals of agility, productivity, flexibility, visibility, efficiency, and safety. In addition to AI and IoT, industrial digitization capabilities play an important role in Industry 4.0 and smart manufacturing.
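Of the industrial protocols listed above, Modbus TCP has a byte layout simple enough to sketch directly. The function below builds a standard "Read Holding Registers" (function code 0x03) request frame, consisting of the MBAP header followed by the PDU; the parameter values in the example are arbitrary and not tied to any particular device.

```python
import struct

def modbus_read_holding_registers(transaction_id: int, unit_id: int,
                                  start_addr: int, quantity: int) -> bytes:
    """Build a Modbus TCP 'Read Holding Registers' (0x03) request frame.

    Frame = MBAP header (7 bytes) + PDU (5 bytes):
      transaction id (2B) | protocol id = 0 (2B) | length (2B) |
      unit id (1B) | function code (1B) | start address (2B) | quantity (2B)
    All multi-byte fields are big-endian, per the Modbus specification.
    """
    pdu = struct.pack(">BHH", 0x03, start_addr, quantity)
    # The MBAP length field counts the unit id plus the PDU bytes.
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, 1 + len(pdu), unit_id)
    return mbap + pdu

frame = modbus_read_holding_registers(1, 1, 0, 2)
print(frame.hex())  # 000100000006010300000002
```

Sending this 12-byte frame over a TCP connection to port 502 of a Modbus device would request the first two holding registers; the response parsing is omitted here for brevity.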
Cyber-physical systems (CPS) were proposed as a means to achieve Industry 4.0 by leveraging the knowledge [44] gained from the vertical integration of domains [4, 38]. CPS enables the exchange of data between different systems in all areas by integrating the real physical environment (OT: Operations Technology) and the cyber system (IT: Information Technology) [30], and all IT and OT components need to fit within the ISA-95 layers to achieve an ideal CPS structure [35]. The structure and components of CPS can thus be explained using the ISA-95 layers. ISA-95 is a standard for developing automated interfaces between enterprise systems, application systems, and industrial control systems, organized hierarchically [3]. Figure 12.1 shows the ISA-95 hierarchy architecture of a manufacturing environment. The concept of CPS shows that the results of AI can produce substantial benefits for industry through automation and communication between man-machine systems. As digital systems become more prevalent in the industrial environment, access to data-monitoring applications will become easier. Some literature in recent years has highlighted "smart manufacturing" as a popular topic in manufacturing systems, where machines within a system are interconnected through network solutions [42]. These machines are monitored by an intelligent system, and installed sensors continuously collect data and communicate with the data server. This data is used to train and validate AI-based computational algorithms such as machine learning (ML), deep learning (DL), and other cutting-edge modeling techniques. AI-based algorithms are able to process and analyze a large amount of data and provide

226

Y.-C. Wang and J. C.-W. Lin

Fig. 12.1 ISA-95 hierarchy architecture of a manufacturing enterprise [3]

manufacturers with insights or learning results from the same attribute data [40]. The valuable manufacturing information gathered from these datasets can be used to search for failure patterns, identify key process variables, extract research knowledge, predict precise maintenance timing, forecast demand, and monitor equipment condition. However, few AI-based algorithms can be used for generative design purposes, as the nature and pain points of the industry still need to be explored more deeply [41]. Despite the benefits of AI and IoT to industry, AIoT adoption and applications in industry remain limited due to a lack of knowledge, insufficient preparation for building basic IoT infrastructure, and an unclear understanding of the benefits of AI in manufacturing. The most important of these three barriers is the last one. When a company reviews the performance of a production facility, it always looks at several key performance indicators (KPIs) such as output, production capacity, OEE, quality, and, ultimately, how much money it makes. In the same way, when implementing an AIoT project, the most important thing is to evaluate the benefits and payback that AI brings rather than exaggerating its capabilities or falsely claiming that it is a smart factory. Table 12.1 shows some pain points collected from industry practitioners regarding the implementation of AIoT solutions. With the exception of how to transform AIoT project results into actual impacts on production, which needs to

12 Artificial Intelligence and Optimization Strategies in Industrial IoT …


Table 12.1 Pain points from different perspectives of industrial practitioners on AIoT solutions

Foundation and data sources
• How to measure and get data?
• Data quality & data sampling rate decision [24]
• External edge sensors standardization & integration
• Data accuracy & traceability completeness
• How to integrate and differentiate discrete & continuous process data character differences
• Cyber security [24]

Integration and transmission
• Hard to standardize the communication between different complex machines to the network (machine-to-machine interactions) [24]
• Data lake (centralized repository) or decentralized database portfolio decision
• Portfolio decision of local platform or cloud platform
• Lack of skills for different real needs

Analytics, modeling and application
• Need for a decentralized system and a one-stop total solution system
• Convergence of IT & OT [3]
• How to make results impact back to process/product?
• How to combine physical & digital models together?
• Black-box AI
• Data imbalance issues in the manufacturing industry

be combined with specific domain knowledge, the rest can be solved with current technologies and decision analysis. In industry, it is difficult to see the benefits of better productivity, outstanding accuracy, and much smarter machines from AIoT solutions because many AI practitioners, researchers, data scientists, and engineers simply do not know how to quantify these benefits and then convert them into financial justifications. Given the literature gaps related to AIoT and practical benefit transformation, this study discusses the results that AIoT solutions and applications in Industry 4.0 bring to manufacturing organizations in pursuit of sustainable manufacturing, and organizes some methods frequently used in practice to evaluate the financial benefits of AIoT projects. In addition, two case studies of AIoT applications in the manufacturing industry are presented. The rest of the paper has four sections. Section "Review of Related Works" introduces previous studies related to performance measurement in smart manufacturing and Industry 4.0. Section "Operational Optimization Strategy" discusses the operational optimization strategy for promoting AIoT solutions. Section "Case Study" presents case studies of real AIoT projects linked to the previous sections. Section "Conclusion" concludes the study.

Review of Related Works

With the development of Industry 4.0, AIoT, and smart manufacturing, performance measurement and benefit assessment are critical elements for the successful promotion of these technologies and concepts. Although it is well known that the development of Industry 4.0 and industrial AIoT depends on continuous cost investment to improve production efficiency, few research studies focus on this topic


due to a lack of connection to practical experience in the industry. In this section, we draw on and discuss recent work on Industry 4.0, AIoT, performance measurement in smart manufacturing, and benefit assessment. In fact, few discussions of performance measurement and benefit assessment of AI in manufacturing can be found in the main academic and industrial works on Industry 4.0, AI, ML, and AIoT [2]. Performance measurement is the process of evaluating the efficiency and effectiveness of an action [16, 32], and the assessment of manufacturing performance is demanding because of the complicated and changeable characteristics of manufacturing [22]. A survey of performance measurement in the aerospace industry is based on five aspects, namely cost, quality, flexibility, time, and productivity [17]. Some research has introduced performance measurement structures for Industry 4.0-based manufacturing systems [2, 6, 15, 22, 29, 37]. These frameworks provide a comprehensive architecture and KPIs, but the cases measured by KPIs and key performance results (KPRs) are based on the hardware upgrades of Industry 4.0 rather than overall AIoT solutions. One study shows that while Industry 4.0 has demonstrated significant potential for value creation and digital transformation of the supply chain, statements emphasizing the significance of Industry 4.0 to the manufacturing supply chain are still unsubstantiated, raising the question of sustainability [8]. These problems are more pronounced in small, medium, and micro enterprises (SMMEs), because SMMEs are more sensitive to expenditure and performance. It is therefore all the more important for SMMEs to introduce Industry 4.0 and smart manufacturing performance indicators. The introduction of smart manufacturing systems (SMS) in SMMEs will increase production capability, but it will also bring complicated operational issues and side effects that affect economic, environmental, social, and resource sustainability [31].
Ante et al. [2] delivered a performance management system for smart manufacturing in the automotive industry. Emmer et al. [15] proposed a comprehensive measurement data management (MDM) system focused on the technological requirements of Industry 4.0 for complex process chains. Kamble et al. [22] used a combination of exploratory and empirical research designs to identify and validate the performance measures relevant to the evaluation of SMS investments in automotive component manufacturing SMMEs in India, using different manufacturing performance indicators. Although the above studies have attempted to establish measurement systems for Industry 4.0, the measures they consider are limited and cannot evaluate a single AIoT or smart manufacturing project without a computational methodology. The opportunity for manufacturing performance measurement in the future is to see more practical cases related to industry and manufacturing and to generalize their decomposition. In addition, these studies have used conceptual structures and case studies within manufacturing organizations to evaluate the actual benefits and empirically validate the identified measures (Table 12.2).


Table 12.2 Comparison of performance measurement systems in Industry 4.0

Kamble et al. [22] (Object: auto-component SMMEs)
Description: Use a combination of exploratory and empirical research design to identify and validate the performance measures related to SMS investments; identify ten performance dimensions, namely cost, quality, flexibility, time, integration, optimized productivity, real-time diagnosis & prognosis, computing, and social and ecological sustainability
Advantages: The Smart Manufacturing Performance Measurement System (SMPMS) framework can guide SMMEs in evaluating SMS investments
Opportunities: Since the ten proposed indicators have some overlap in manufacturing benefits, it is difficult to estimate the correct financial justification

Ante et al. [2] (Object: automotive industry)
Description: A tree structure of KPIs is proposed to describe the PMS of lean and smart production systems; lists KPIs and their measurement elements
Advantages: The hierarchical structure of KPIs supports performance measurement from strategic, tactical, and operational perspectives
Opportunities: The Industry 4.0 case study mainly described the KPI benefits brought by improved digitization and automation

Shin et al. [37] (Object: general)
Description: Describe an existing performance measurement framework with quality aspects regarding the costs of quality items
Advantages: Provide a virtual tool to assess weaknesses in the manufacturing system
Opportunities: Hard to link financial justification directly

Operational Optimization Strategy

Continuous improvement is the driving force of industrial progress, and the emergence of new technologies such as AI, ML, and IoT gives practitioners superior capabilities and faster implementation of continuous improvement. Employee and customer expectations continue to evolve as technologies transform their experiences outside of work. AI, IoT, ML, and digital technologies make information more accessible, leading to better insights, faster response times, and on-demand


configuration of products by users. All stakeholders, including suppliers, manufacturers, and consumers, expect greater profits and lower costs from more advanced intelligent systems that help them do their jobs. Beyond the three direct goals of manufacturers (lower cost, higher quality, and faster time to market), there are many other criteria, such as reducing downtime, improving yield, faster product line changeovers, improving maintenance time, reducing material waste, reducing labor, improving processing time, reducing environmental impact, reducing energy consumption, and better change management [22]. To meet and improve these measurable indicators, many capabilities and skills need to be cultivated. Figure 12.2 presents the different professional capabilities and systems needed in industry and manufacturing. Not surprisingly, the foundations and applications of AI and IoT relate to almost all of these capabilities. When using AIoT solutions to address industrial problems, all sponsors and stakeholders pay attention to the benefits and payback. Even with the energetic development of AI optimization techniques and the advancement of technology, it is not easy to bring technology into the practical field for business benefit. To avoid giving stakeholders the impression that the application of AI in industrial practice is just empty talk, this section describes how the financial benefits of AI and IoT can be linked.

Linking Return with Investment to AIoT Solutions in Industry

To drive Industry 4.0 and AIoT applications in industry, it is essential for enterprises to continue to fund investments in equipment and technical capabilities so as to maintain a stable income and remain highly profitable with attractive payback periods. Generally, a large part of the capital expenditure goes to the purchase of specific equipment, hardware, system development, facilities, the installation of sensors, the deployment of the environment, project manpower, and subsequent maintenance and operation. Optimization strategies should continuously expand the framework of the industrial work model with new solutions to meet continuous and feasible goals. In addition, the development of an amended industrial work model and successful AIoT applications have strongly driven the development of AIoT. After several successful cases, enterprises and research institutes gain more confidence in AI and IoT capabilities to solve practical problems and become willing to invest in capital projects and talent in artificial intelligence, digital transformation, and operations. Hence, the main goals of retrofit strategies focus on how AIoT solutions and applications can be connected to financial improvement from diverse perspectives such as overall equipment utility, output, risk avoidance and reliability, and industry cost performance. Enterprises always aim to use their resources, especially financial resources, as efficiently as possible. Financial justification is the common procedure by which a business determines whether an investment is a good use of funds and how long it will take

Fig. 12.2 Several professional capabilities and systems required by the manufacturing industry



to recover the return from the investment. There are four main reasons why it is necessary to transform the benefits of AIoT solutions into financial benefits. First, converting projects into actual profit and return on investment (ROI) can help improve work and communication efficiency by making it easy to understand the voice of customers, align ongoing work with customers, prioritize different projects, and properly allocate time and resources across multiple projects. Second, it is not only about the project name or chasing the AI boom; evaluating the outstanding work in AIoT projects in dollars reflects the capability and performance of these projects. Third, the transformed performance can help managers and developers set new goals and identify new opportunities for improvement. Last but not least, ROI is the key indicator: only when investment and output reach equilibrium within a limited period of time is the investment profitable. ROI is calculated as the net return divided by the investment cost. For relatively long-term projects, such as AIoT projects that last longer than one year, it is recommended to use the internal rate of return (IRR); the IRR is essentially an annualized rate of return. This section introduces how to bridge the benefits of specific AIoT applications and solutions with financial impact evaluation. The following content presents the relevant values and how to derive them from different financial perspectives in industry, and real cases are introduced to provide better explanations. The methods are commonly used in industry but are not limited to the following content. In the financial justification of industrial IoT applications, financial interests are divided into direct impacts and indirect impacts. Both the savings of direct impacts and of indirect impacts (cost avoidance) are profit impacts.
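As a concrete sketch of the two indicators just mentioned, the following Python snippet computes ROI as net return over investment cost and approximates the IRR by bisection on the net present value. The cash-flow figures are hypothetical, invented for illustration only.

```python
def roi(net_return: float, investment_cost: float) -> float:
    """Return on investment: net return divided by the investment cost."""
    return net_return / investment_cost


def irr(cash_flows: list[float], lo: float = -0.99, hi: float = 10.0,
        tol: float = 1e-9) -> float:
    """Internal rate of return: the discount rate at which the net present
    value (NPV) of the cash flows is zero. cash_flows[0] is the (negative)
    year-0 investment; later entries are yearly returns. Solved by bisection,
    assuming NPV decreases in the rate (a single sign change in the flows)."""
    def npv(rate: float) -> float:
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid) > 0.0:
            lo = mid  # NPV still positive: the root lies above mid
        else:
            hi = mid
    return (lo + hi) / 2.0


# Hypothetical AIoT project: 100k invested, 60k net benefit per year for 2 years.
flows = [-100_000.0, 60_000.0, 60_000.0]
project_irr = irr(flows)
```

For these flows the IRR comes out at roughly 13%, the annualized rate at which the two 60k returns exactly repay the 100k outlay.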
Direct impacts are impacts that directly contribute to the reduction of unit product cost, and their contribution can be evaluated through quick calculations, such as cost reduction and increased production efficiency. Cost is the most important factor in keeping a product competitive [12]. In most factories, actual manufacturing cost (AMC) is used as a cost index. AMC is the actual cost incurred to acquire materials and convert them into real products. As for production efficiency, overall equipment effectiveness (OEE) is commonly used to evaluate production efficiency, quality, and availability. For each of the above elements, factories and enterprises have specific departments that carry out various improvement projects. For direct impact, the ratio of total AMC to total output is usually used as the unit AMC. For industry, the indirect impact is usually regarded as opportunity cost saved, such as preventing potential customer complaints, tightening product specifications to increase brand value, or avoiding the impact of downtime on the plant (Fig. 12.3). There are five steps for evaluating the financial benefits of AIoT solutions. The first step is to evaluate whether the project can have an impact: the improvement results should generate a positive effect on some metrics. If no clear results can be achieved in this evaluation step, the project manager should reassess whether the AIoT project has solved the specified pain points. The second step is to identify the performance metrics that will be used for baseline evaluation. In this step, the scenarios described in the following sections can be selected to measure the actual costs saved or benefits


Fig. 12.3 The structure of unit product cost connected with AIoT solutions

achieved. The third step is to estimate the benefits: consideration should be given to the benefits of the measure and the indicators that should be applied to track its impact. The next step is to calculate the financial impact using the benefits from the previous step. The aim is not only to come up with a number but to see whether the measure has a sustainable cost-benefit impact compared to the reference year, and how much the case affects the cost-efficiency of individual products, sites, plants, and the entire business. From the conclusion of the fourth step, we can also assess the migration and scaling capabilities of the AIoT solution. The last step is to confirm the impact on the cost-benefit: verify whether the cost reduction has already been partly covered by another measure, and ensure that the number appropriately distributes the cost-benefit among all related improvement measures. It is important to fairly allocate benefits to each measure and avoid double counting (Fig. 12.4).

Overall Equipment Effectiveness

Overall equipment effectiveness (OEE) is the most important indicator for measuring manufacturing productivity. Generally speaking, each piece of production equipment has its own on-paper production capacity. This is also the clear direct impact mentioned in the previous section. Before introducing OEE, we must first introduce a few proper


Fig. 12.4 The process of AIoT project financial benefits calculation

nouns as follows. Below are some efficacy indices commonly used in factories [23], including cycle time, lead time, uptime/downtime, flexibility, and yield. Cycle time (CT) is the actual time required to produce a product or provide a service; it is calculated by summing the total process time, inspection time, shipping time, waiting time, and any other time required during the production process. Lead time (LT) is the time from the start of a process until the product or service is delivered to the customer. Uptime (UT) indicates the time interval in which a system operates without failure; it is contrasted with downtime (DT), which refers to time with failure. Flexibility is the ability to switch rapidly from one type of product to another. Yield is the proportion of product that is not defective, often referred to as first pass yield (FPY). OEE consists of three key elements: availability, performance, and quality. Availability evaluates the efficiency loss caused by equipment shutdowns, including all events that stop scheduled production such as equipment failure, raw material shortages, and production recipe changeovers; it is calculated as the ratio of operating time to theoretical working time. Performance takes into account any event that prevents the process from running at maximum speed during manufacturing; it is calculated as the product of the ideal CT and the total number of products, divided by the run time. Quality deals with parts produced that do not meet quality standards and is often related to yield; it is generally calculated as the number of good products divided by the total number of products. Many AI and IoT applications can directly improve the operational efficiency of industry.
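The three OEE factors described above can be combined in a short Python sketch; the shift figures below are invented purely for illustration.

```python
def oee(planned_time: float, operating_time: float,
        ideal_cycle_time: float, total_count: float,
        good_count: float) -> dict:
    """Compute OEE = availability x performance x quality.
    availability: operating time over theoretical (planned) working time.
    performance: ideal cycle time x total units, over the run time.
    quality: good units over total units (first pass yield)."""
    availability = operating_time / planned_time
    performance = (ideal_cycle_time * total_count) / operating_time
    quality = good_count / total_count
    return {
        "availability": availability,
        "performance": performance,
        "quality": quality,
        "oee": availability * performance * quality,
    }


# Hypothetical 480-minute shift: 80 minutes of stops, 0.5 min ideal CT,
# 700 units produced, 665 of them good.
result = oee(planned_time=480, operating_time=400,
             ideal_cycle_time=0.5, total_count=700, good_count=665)
```

For this shift the OEE is roughly 69%, with availability (about 83%) the weakest of the three factors, pointing to shutdowns as the first improvement target.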
In terms of availability, AIoT solutions are used whenever there is a need to reduce troubleshooting time by identifying key process input variables (KPIVs) and key process output variables (KPOVs). Predicting downtime is another common application. For performance, cycle time


improvement and new equipment or process startup are selected as improvement targets. From the quality perspective, improving yield and increasing the sampling rate are continuously targeted as improvement goals. These are applications that can actually bring positive financial benefits. Improving the OEE index is the most direct way for an AIoT solution to reduce unit product costs by increasing total product output. Below are some regularly used methods for calculating direct benefits to the unit AMC, covering UT improvement, yield improvement, CT reduction, and LT reduction. In each case, the improvement is first converted into an increased unit volume, so the cost saving takes the same form:

UT improvement: CS = UAMC (cash) × increased UV (quantity) (12.1)

Yield improvement: CS = UAMC (cash) × increased UV (quantity) (12.2)

CT reduction: CS = UAMC (cash) × increased UV (quantity) (12.3)

LT reduction: CS = UAMC (cash) × increased UV (quantity) (12.4)
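Since Eqs. (12.1)-(12.4) share the same form, a single helper covers all four cases; the numbers below are hypothetical.

```python
def direct_cost_saving(unit_amc: float, increased_unit_volume: float) -> float:
    """Eqs. (12.1)-(12.4): a UT, yield, CT, or LT improvement is first
    converted into extra unit volume, then valued at the baseline unit AMC."""
    return unit_amc * increased_unit_volume


# Hypothetical example: a baseline unit AMC of 12.5 cash units, and an
# uptime improvement that yields 4,000 additional units in the period.
saving = direct_cost_saving(unit_amc=12.5, increased_unit_volume=4_000)
```

Here the direct saving is 50,000 cash units; the same call applies whether the extra volume came from uptime, yield, cycle-time, or lead-time improvement.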

Since the AMC can be confidential information for some enterprises or facilities in the industry, the following approaches can be used to estimate financial benefits without AMC information. Unlike the unit-AMC calculation, the following calculations are based on the estimated increase in sales revenue or on specific cost savings, rather than on the increased cash flow resulting from the reduction in unit costs mentioned above. For an increase in OEE or production capacity by a few percent x, the following formula can be used, assuming that the unit/line/equipment is a bottleneck process and the additional products can be sold; it can also be used to simulate an AIoT solution replacing existing machines or processes, or being migrated to other units/lines/equipment (Table 12.3). BS = Total sales in a time interval × avg margin profit × x%

(12.5)

For an increase in yield by a few percent x, the cost saving can be estimated by multiplying the in-process product cost by the product of the increased yield and the total number of products in a time interval. CS = In-process product cost × UV in a time interval × x%

(12.6)

Furthermore, excessive inventory is a burden on the enterprise, bringing several negative impacts such as capital


Table 12.3 Acronyms

UT: Uptime
CT: Cycle time
CS: Cost savings
BS: Boost profits
UAMC: Unit AMC for baseline
UV: Unit volume
UPB: Unit price before

occupation, management costs, potential price loss, and potential product expiration risk. Therefore, it is necessary to reduce inventory and increase production flexibility. Inventory control is also one of the key areas that AIoT solutions can address in industry, for example by using maintenance, repair, and operations procurement (MRO procurement) data to forecast demand and reduce the overall spare parts and inventory. The benefit of this inventory reduction is that the overall lead time can be shortened, reducing the overall AMC. Below is the benefit calculation when the WIP level is reduced.

(12.7)
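Eq. (12.7) values a WIP reduction as the financing cost freed by tying up less capital for fewer days. A minimal sketch with invented numbers:

```python
def wip_reduction_benefit(unit_cost: float, reduced_pcs: float,
                          reduced_days: float,
                          daily_interest_rate: float) -> float:
    """Eq. (12.7): BS = UC x reduced pcs x reduced days x daily interest rate.
    The freed working capital (unit cost x reduced pieces) no longer accrues
    financing cost over the shortened holding period."""
    return unit_cost * reduced_pcs * reduced_days * daily_interest_rate


# Hypothetical: 5,000 fewer WIP pieces at 20 cash units each, held 30 fewer
# days, with a daily interest rate of 0.01%.
benefit = wip_reduction_benefit(20.0, 5_000, 30, 0.0001)
```

With these assumed figures the benefit is roughly 300 cash units per cycle; small per cycle, but it recurs and compounds with the lead-time saving.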

Finally, flexibility is an indicator that is mentioned less often because it is difficult to quantify. Sometimes it can be expressed through uptime, downtime, and yield, depending on the actual case. For example, an AI solution can find the key tuning parameters of a critical process, which can greatly reduce product changeover time and significantly increase production efficiency.

Products or Service Application for Customer Value Creation

In industry, various engineering solutions embedding IoT devices and AI can make existing products or services more productive, reliable, and efficient. As an example from manufacturing, some advanced high-end machines or equipment modules are equipped with the perception capabilities of IoT sensors and AI applications to make timely predictions about the remaining useful life of the equipment and to automatically detect the consumption of consumables. Many new cameras in industry also have AI image processing and image recognition capabilities. For these benefits, which apply emerging technologies and significantly increase the value of products and services, we usually use the simplest method of multiplying the margin profit by the total revenue in a time interval to obtain the direct benefit.


Insight Acquisition for Knowledge Discovery

AI applications in the industrial domain can be used for knowledge discovery by identifying insights in engineering systems. With the development of IoT devices, industry is capturing more accurate and timely data in the control systems of manufacturing facilities. However, the measurement of the value of knowledge and the benefits it brings are rarely discussed, so AIoT implementations often fail to become a stable closed loop trusted by industrial practitioners. This section describes approaches to assessing the benefits of several commonly used AIoT applications; these conceptual approaches can be generalized to a variety of industrial fields.

Engineering Critical Asset Protection

The deterioration of large engineering infrastructure such as power systems, facility equipment, and expensive machinery represents millions of dollars in a facility. Nowadays, predictive and preventive maintenance through data-driven machine learning is a critical issue for cost reduction in industrial applications. Prognostics and health management (PHM) finds opportunities on the shop floor by modeling equipment health aging through data from newly equipped sensors, e.g., vibration sensors [33]. In facilities, the extended usage time of the machine and the reduced downtime are used to calculate the benefit; of course, the loss caused by downtime can also be used to calculate potential and indirect impacts. The cost saving from maintenance and asset protection that extends the remaining useful life is shown as follows. CS for maintenance = Annual unit equipment depreciation × (AUA − AUB) (12.8) For example, if the service life of a machine is extended by one year due to cognitive AIoT solutions, the depreciation of the device for the whole year is considered a generated benefit. Conversely, one can also calculate the loss caused by this machine if its life is not extended by one year.
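Eq. (12.8) can be read as: every extra year of useful life is worth one year of depreciation. The sketch below interprets AUA/AUB as asset usage years after/before the AIoT solution; this reading is an assumption, since the chapter does not define the two symbols explicitly, but it matches the one-year example in the text.

```python
def maintenance_saving(annual_depreciation: float,
                       usage_years_after: float,
                       usage_years_before: float) -> float:
    """Eq. (12.8): CS = annual unit equipment depreciation x (AUA - AUB).
    AUA/AUB are taken here as useful-life years after/before the AIoT
    solution (an interpretation; the text does not define them)."""
    return annual_depreciation * (usage_years_after - usage_years_before)


# Hypothetical machine: 250,000 annual depreciation, life extended 7 -> 8 years.
saving = maintenance_saving(250_000.0, 8, 7)
```

Here one extra year of life is credited as 250,000 of avoided depreciation, mirroring the worked example in the paragraph above.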

Key Process Input Variable Finding

A key process input variable (KPIV) is a process input that significantly influences the output variation of the key process output variable (KPOV) of a product. A key process output variable (KPOV) is the output resulting from a process, such as parts, assemblies, or entire systems. Thus, the KPIV determines whether the KPOV, or process output, achieves good quality. Keeping the KPIV constant yields a predictable and consistent output. Stable control of the KPIV to reduce quality variation improves the overall process capability by a

Fig. 12.5 Top N and Top M KPIV benefit evaluation process by simulation method

quality improvement program. In general, factories set out to find KPIVs in Six Sigma and process quality assurance (PQA) improvement programs and to control them well. However, the biggest problem for industry is that there are many parameters in the production process: a production line has hundreds or even thousands of settings and measurable parameters recorded in the shop floor control system, so finding the actual KPIVs is very difficult indeed. With the technical support of AI and IoT, it is possible to find KPIVs quickly and fix root causes through ML and optimization modeling. Returning to the focus of this section: how do we measure the direct and indirect financial benefits of finding a KPIV? The simple way is to identify and calculate the improved OEE, including cycle time, uptime, and yield, that helps increase production efficiency, and then convert these indicators into cash savings. If OEE cannot be used for the calculation, one can instead use the KPIV's effect on improving the process capability by several standard deviations, and then make a linear estimate against the original baseline cost. Finally, if neither approach works, a simulation method can be used. Generally, the actual improvement during a period should be used for benefit evaluation, but if the system is still in the design/development stage, simulation results displaying the expected outcome can be accepted during the startup period. As shown in Fig. 12.5, we can use historical data to change the range of a KPIV, using the concept of quartiles or design of experiments (DoE), to see the percentage improvement compared to the baseline model, and then use this percentage to estimate how much the yield improves (Table 12.4).


Table 12.4 Top N and Top M KPIV financial benefit evaluation process by simulation method

Target metric: specific defect rate of quality indicators. Basic goal: a % reduction (e.g.); stretch goal: f % reduction (e.g.).

          Defect rate   Basic goal                            Stretch goal
          baseline      Loss reduction  Output gain           Loss reduction  Output gain
Line 1    I %           I × a           I × a × Year output   I × f           I × f × Year output
Plant 1   J %           J × a           J × a × Year output   J × f           J × f × Year output
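The quartile-based simulation behind Fig. 12.5 can be sketched as follows: restrict one candidate KPIV to its interquartile range in historical records and compare the resulting defect rate against the unrestricted baseline. The record layout and field names below are hypothetical, not taken from the chapter.

```python
def kpiv_quartile_gain(records: list, kpiv: str,
                       defect_key: str = "defect"):
    """Simulate controlling one KPIV inside its interquartile range.
    records: historical rows (dicts) with the KPIV value and a 0/1 defect flag.
    Returns (baseline defect rate, controlled defect rate, reduction)."""
    values = sorted(r[kpiv] for r in records)
    q1 = values[len(values) // 4]          # first quartile of the KPIV
    q3 = values[(3 * len(values)) // 4]    # third quartile of the KPIV
    baseline = sum(r[defect_key] for r in records) / len(records)
    controlled_rows = [r for r in records if q1 <= r[kpiv] <= q3]
    controlled = sum(r[defect_key] for r in controlled_rows) / len(controlled_rows)
    return baseline, controlled, baseline - controlled


# Toy history: defects occur only at extreme temperatures (hypothetical data).
history = [{"temp": t, "defect": 1 if t < 10 or t >= 90 else 0}
           for t in range(100)]
base, ctrl, gain = kpiv_quartile_gain(history, "temp")
```

In this toy data the baseline defect rate of 20% drops to 0% when the simulated KPIV is held inside its interquartile range; the resulting yield improvement would then be converted to cash via Eq. (12.6).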

Actual Manufacturing Cost Reduction

Several AI and IoT solutions are designed to achieve cost reduction by cutting waste of resources, labor, and lead time, among others. In this section, the cost reduction approaches frequently used for financial improvement are described. For material cost reduction, three major savings are introduced: material change, usage reduction, and yield improvement (equivalent to scrap reduction), as follows.

Material change: CS1 = UU × Demand × (UPB − UPA) (12.9)

Usage reduction: CS2 = UU × Demand × (UB − UA) (12.10)

Scrap reduction: CS3 = UU × Produced volume × (SB − SA) (12.11)

Overall material cost reduction: CS for material reduction = CS1 + CS2 + CS3 (12.12)
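The three material savings can be combined in one helper. Note that the chapter does not spell out UU, UB/UA, or SB/SA, so they are interpreted here as unit usage, usage per unit before/after, and scrap rate before/after; this is an assumption, and all figures below are hypothetical.

```python
def material_cost_saving(unit_usage: float, demand: float,
                         price_before: float, price_after: float,
                         usage_before: float, usage_after: float,
                         produced_volume: float,
                         scrap_before: float, scrap_after: float) -> float:
    """Eqs. (12.9)-(12.12). Symbol names follow the chapter's equations;
    the exact meanings of UU, UB/UA, and SB/SA are interpretations."""
    cs1 = unit_usage * demand * (price_before - price_after)           # (12.9) material change
    cs2 = unit_usage * demand * (usage_before - usage_after)           # (12.10) usage reduction
    cs3 = unit_usage * produced_volume * (scrap_before - scrap_after)  # (12.11) scrap reduction
    return cs1 + cs2 + cs3                                             # (12.12)


# Hypothetical figures only: cheaper material, slightly lower usage,
# and a one-point scrap-rate reduction.
total = material_cost_saving(unit_usage=2.0, demand=10_000,
                             price_before=1.50, price_after=1.40,
                             usage_before=1.00, usage_after=0.95,
                             produced_volume=9_500,
                             scrap_before=0.04, scrap_after=0.03)
```

With these assumed inputs the three terms sum to roughly 3,190 cash units for the period.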

Utility bill reduction is one of the most important targets in industry. Energy consumption, especially in factories, is always considered one of the largest fixed expenditures. Therefore, there are countless improvement projects on energy issues and, of course, various AIoT projects that optimize energy consumption. With the emergence of sustainability issues and carbon tax policy planning, saved energy consumption has great benefits. The reduced utility cost is calculated from the savings as follows. Utility cost reduction:


Y.-C. Wang and J. C.-W. Lin

CS for utility = Reduced consumption × Unit energy price  (12.13)

Automation is one of the major aspects of industrial AI applications. With the help of AI, the capability of automation has been fundamentally enhanced; AI and ML technologies can boost the performance and expand the possibilities of the conventional automation industry. Therefore, the reduction of manpower must also be counted among the benefits of AI and IoT solutions. Human resource cost reduction is separated into two parts in industrial financial calculation: headcount saving and time saving. Headcount saving is calculated from direct labor (DL) and the cost of each DL. For time saving, the costs the manufacturing industry most wants to reduce are overtime pay, rework time, and repair time.

Headcount reduction:
CS4 = Reduced DL × DL rate  (12.14)

Time saving:
CS5 = Reduced time length × Time interval rate  (12.15)

Overall manpower cost reduction:
CS for manpower reduction = CS4 + CS5  (12.16)

Apart from material saving, utility saving, and human resource saving, facility cost saving needs to be considered in the industry. In manufacturing facilities, every space carries a huge cost of land and factory building planning. At present, AI and IoT applications for defect inspection rely on cameras and edge computing services, which save a considerable amount of space compared to traditional inspection machines with large masks; the reduced space can be used for the benefit calculation.

Facility cost saving:
CS for facility improvement = Rental rate × Reduced space  (12.17)
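The utility, manpower, and facility savings of Eqs. (12.13)–(12.17) can be combined in the same style; all figures below are illustrative assumptions, not values from the chapter.

```python
def operating_cost_saving(reduced_kwh, unit_energy_price,
                          reduced_dl, dl_rate,
                          reduced_hours, hourly_rate,
                          reduced_space, rental_rate):
    """Aggregate utility, manpower, and facility cost savings."""
    cs_utility = reduced_kwh * unit_energy_price   # Eq. (12.13)
    cs4 = reduced_dl * dl_rate                     # headcount, Eq. (12.14)
    cs5 = reduced_hours * hourly_rate              # time saving, Eq. (12.15)
    cs_manpower = cs4 + cs5                        # Eq. (12.16)
    cs_facility = rental_rate * reduced_space      # Eq. (12.17)
    return cs_utility + cs_manpower + cs_facility

# Illustrative example: 1,000 kWh saved, 2 direct laborers redeployed,
# 50 overtime hours removed, 10 m^2 of floor space freed by
# camera-based inspection.
total = operating_cost_saving(1000, 0.1, 2, 3000, 50, 20, 10, 30)
```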

Once an AI project is successfully deployed, the methods mentioned above can be used to evaluate the real financial value of AI/IoT projects (Table 12.5).

Potential Cost Avoidance for Customer Complaints

In addition to the benefits mentioned above, AIoT applications offer potential benefits by avoiding quality issues. For example, an industrial inspection station can use AI to re-evaluate product quality through image or video identification, avoiding customer complaints caused by human misjudgment.

Table 12.5 Acronyms

| Acronym | Meaning |
| CS | Cost saving |
| UPB | Unit price before |
| UPA | Unit price after |
| UU | Unit usage |
| UB | Usage before |
| UA | Usage after |
| SB | Scrap before |
| SA | Scrap after |
| HC | Headcount |
| DL | Direct labor |
| AUB | Annual usage before |
| AUA | Annual usage after |

Table 12.6 Acronyms

| Acronym | Meaning |
| CS | Cost saving |
| UCCC | Unit cost of customer complaints |
| NPR | Number of products recalled |

Cost of Poor Quality (CoPQ) is the sum of the costs incurred due to quality issues, including all waste and variation, the running costs of fixing the issue, and lost customers or reputation [28]. Competitive companies constantly try to reduce the cost of poor quality so that they can maintain competitiveness in the capital market; left unaddressed, CoPQ can be incredibly destructive to the bottom line of the business. The cost saved by AI in preventing customer complaints caused by CoPQ can be calculated as follows (Table 12.6).

Cost reduction for mitigating customer complaints:
CS for customer complaints avoidance = UCCC × NPR  (12.18)

Discussion

Any improvement, and especially successful AIoT development in the industry, will have benefits from some point of view. Figure 12.6 shows the three basic elements of manufacturing and the metrics for manufacturing profitability. Starting with the materials, processes, and equipment aspects, varied technologies


Fig. 12.6 Three basic elements of manufacturing, the related necessary technologies and disciplines, and metrics for manufacturing profitability

combined with AIoT in each aspect, such as modeling and advanced process control, can bring profits through cost reductions and efficiency improvements. Finally, when a financial benefit is calculated, it should be assigned to only one category at a time, to avoid double counting and maintain the integrity of the information.

Case Study

The voice of the customer (VOC) is a market research technique that captures needs and wants in a hierarchical order and then prioritizes them. AIoT solutions differ across customers, processes, stages, and roles, but the approach to realizing them can be very similar. This section describes two industrial AIoT use cases with similar frameworks in the manufacturing industry, where the voice of the customer led to fundamental needs. The voice of the customer came directly from leaders and engineers at the manufacturing site, and the groups required much thought and discussion to reach successful outcomes with business impact. Although the process details of the two cases differ, both essentially follow five steps: (1) Create, (2) Collect, (3) Consolidate, (4) Correlate, (5) Apply. Data runs through the entire process, and Fig. 12.7 shows the data flow through these steps [10].


Fig. 12.7 The data flow in the manufacturing industry through the five defined steps [10]

Create covers industrial automation in all aspects of the manufacturing system: equipment setup, process configuration and control, work-in-progress (WIP) and finished goods movement, tracking, and automatic data processing with respect to quality assurance and the control system. In this step, a network infrastructure for communication among all manufacturing systems and equipment is required. Collect addresses how to gather and store contextualized information about the related product and how to process manufacturing data, which typically involves both time-series and discrete sources. Consolidate unifies and relates data across manufacturing sources, including Manufacturing Execution System (MES) data, process information, and quality control (QC) data. Correlate is the process in which experts, i.e., data scientists, researchers, and engineers, find insight through analytics, advanced modeling, and ML or AI-based algorithms. Apply presents the information or insight that can continuously produce business value and benefits, such as prescribed actions, failure prediction, advanced process control, and automated responses; that is, specific facilities can be deployed on the designed platform and interact with users for real-time control or model inference by AI or ML methodologies. After these steps are achieved, we look at the results and impacts of the project.
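The five steps above can be sketched as a toy pipeline. Every function name, field, and record below is a hypothetical illustration of the data flow in Fig. 12.7, not an API from the chapter.

```python
def create(equipment_config):
    """Step 1 - Create: automated systems emit raw shop-floor events."""
    return [{"machine": "M1", "temp_c": 81.5, "wip_id": "W001"},
            {"machine": "M1", "temp_c": 79.8, "wip_id": "W002"}]

def collect(events):
    """Step 2 - Collect: store contextualized records, keyed by product ID."""
    return {e["wip_id"]: e for e in events}

def consolidate(process_data, qc_data):
    """Step 3 - Consolidate: join MES/process data with QC results by ID."""
    return {k: {**v, "qc_pass": qc_data.get(k)} for k, v in process_data.items()}

def correlate(records):
    """Step 4 - Correlate: analytics/ML finds insight (here: a toy rule)."""
    fails = [r for r in records.values() if r["qc_pass"] is False]
    return {"suspect_temp_c": [r["temp_c"] for r in fails]}

def apply_insight(insight):
    """Step 5 - Apply: turn insight into an action, e.g. a control alarm."""
    return {"alarm": bool(insight["suspect_temp_c"])}
```

A real deployment would replace each function with the corresponding infrastructure (PLC/OPC UA collection, a data lake, an ML model, an APC action), but the interfaces between the five steps stay the same.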


Case Study: Complex Process KPIV Exploration by AIoT

Machine learning approaches can compensate, through model training, for the prior system knowledge that physics-based approaches require. They are particularly suited to processes that are not fully understood, or where implementing physics-based approaches is computationally infeasible due to an excessive number of variables; machine learning approaches have therefore been adopted to mitigate these drawbacks [43]. A KPOV can be a specific quality indicator or an early indicator for quality evaluation; a KPIV is a factor that determines the process output quality of a KPOV. Maintaining a stable KPIV yields a predictable and controllable output. However, in complex industrial manufacturing, so many parameters and variables affect specific quality factors that it is difficult to narrow down and discover the specific KPIVs and KPOVs. With the advancement of industrial technology and the rise of AI-based approaches, finding the KPIVs related to specific quality issues has accordingly become a critical opportunity. Through model training with a large amount of data, parameters with relatively high correlation can be identified and then validated with process domain knowledge and basic physical principles. The know-how assisted by the AI-based model has the following three advantages.

First, it can help engineers and process experts find new and previously overlooked knowledge. Many causes affect product output and quality and industrial machine performance, and they are usually mixed problems. Existing knowledge can usually solve only a certain proportion of problems, and the remainder is attributed to performance, variation, or disturbance. However, each process expert has professional insights from a different perspective, and it is easy to have blind spots when solving process problems. The correlation found through the AI model is simply presented as a result of the process data; if the model evaluation presents reliable evidence, there is a chance to solve new problems with AI and IoT solutions. Second, when a quality event occurs, AI and IoT solutions can help the operator quickly find the potential root cause. When quality problems or customer complaints occur in the production line, operation engineers are usually required to identify the root cause in a short time and make improvements accordingly. This is critical across different processes and facilities, and the problem will recur if the real cause is not found. Generally, operation engineers take four steps for root cause finding: (1) Process engineers identify and scrutinize distinct defect types, such as product particles, scratches, breakage, and chips, then stratify the data by defect mode from the shop floor control system. Through this induction, engineers can check whether specific defects are distributed on specific sides and locations, and whether the frequency of occurrence follows a random pattern, (2) Process engineers use their domain knowledge and experience for root cause analysis, including real defect sample collection, physical analysis, and possible station searching; these analyses assist in exploring possible


causes, (3) Process engineers observe the changes in some process parameters from the results of the previous step, and (4) Process engineers draw conclusions and make decisions from the observations. Even with these steps and the experts' experience, the probability of solving the problem within a short time and reducing the loss is not high; in most cases, it remains difficult to find the root cause using the above procedure. Although process engineers can occasionally identify the true cause through this analysis, the overall process has three major drawbacks: time-consuming data collection from scratch, long response times due to back-and-forth discussion, and experience-dominated checking of process variables. Nevertheless, root cause analysis problems have been well addressed in the industry by collecting large amounts of data and applying AI-based models. Using AI models to find process KPIVs can not only help us troubleshoot but also reveal new technical opportunities, such as Advanced Process Control (APC), Virtual Metrology (VM), smart sampling, and PHM.

This case study comes from the chemical and back-end processing industries. The complex manufacturing process makes quality control a major challenge, and yield losses in semi-finished products increase the cost significantly. To overcome these drawbacks, a one-stop AI-based system and analysis platform was developed to maximize the efficiency of the analysis process. In this process, sensors are fully distributed along the production lines, and the data is collected by PLCs and synchronized with the application layer through the OPC UA interface. Starting from data collection, the system can stratify a variety of defect types and defect modes with IoT-enabled data, and monitor defect trends through a consolidated data lake via a user-friendly graphical interface. Users can select an arbitrary time interval to visualize the QC (inspection) data for trend observation. The inherent unique product ID enables automatic integration of the manufacturing data and the QC (inspection) data. When the root cause analysis process applies AI-based approaches to the integrated data and reaches preset criteria for model performance, such as training accuracy, testing accuracy, and recall rate, the major variables that highly impact the selected KPOV can be identified from thousands of process variables and monitored in real time. Overall, the one-stop analytic platform forms a promising and novel tool that significantly improves process efficiency. In addition, this one-stop AI-based system not only uses AI and ML models to quickly find relevant inputs and outputs but also adopts explainable AI methods to examine whether the correlation between KPIV and KPOV is linear. Here, the SHapley Additive exPlanations (SHAP) method [26] is used by the system. The system can present the impact of each data point on the trained ML models through data visualization, and users can use the results to rapidly adjust appropriate and logical parameters. Figure 12.8 shows the objectives, analytic pipeline, and one-stop system structure of the AI application for discovering KPIVs. However, there are two challenges with AI-based methods in manufacturing. First, manufacturing data often contains rare but interesting events, such as defect data: products with a 95% yield rate have only 5% defect data, and this skewed distribution will cause bad performance


Fig. 12.8 System architecture and analytic pipeline in the KPIV discovery case [14, 26]

of ML and AI-based models. The synthetic minority oversampling technique (SMOTE) [14] and under-sampling approaches are used to solve the data imbalance problem in this system. Second, it is necessary to remove highly correlated variables, identified from process experience and correlation analysis, since overlapping contributions between two variables degrade the performance of the AI models. The system automatically finds pairs of variables whose Pearson correlation coefficient is higher than 0.8, takes the process variable that occurs at the earlier stage as the representative, and removes the remaining variables. In addition, necessary variables can be selected through process knowledge in the system. Table 12.7 shows potential applications and benefits for the given KPIV.

Table 12.7 Potential applications and benefits for given KPIV

| Category | Was | Is | Benefits | Benefit metrics |
| Troubleshooting | Empirical and passive | Data-driven and proactive | Loss and downtime reduction | Yield enhancement; uptime improvement |
| Quality control | Postmortem analysis | Prevent in advance | CoPQ elimination | Number of quality issues prevented |
| Shop floor operations | Diverged monitor index | Converged index | Flexibility | Downtime reduction |

In practice, a defect problem caused by product burns resulted in the highest percentage of downtime in the year. With the above-mentioned analytic pipeline of Fig. 12.8, the developed AI-based system helps reduce hundreds of process


parameters that may affect burn defects by 89% in the feature selection stage. With the AI-based system in operation, the ML classification model combined with SMOTE achieves precision and recall above 98% on the remaining variables. The parameter with the highest feature importance in the ML model is a kind of grinding amount; empirical evidence and experiments confirm that this is indeed a key factor affecting burns, and it also coincides with the physical principle. Through better control of the compensation amount, burn defect cases can be significantly decreased. After completing this case study, subsequent automation improvements can be carried out in series. If an AIoT-oriented project is carried out only to this stage, the benefit and financial justification can be calculated with the KPIV simulation method in Sect. 12.3. After finding the KPIV, the Application stage has the following schemes: (1) automatic compensation (advanced process control), (2) process monitoring and smart sampling, and (3) optimizing the found parameters to prolong the service life of consumable materials. In this case, several direct impacts and their corresponding financial benefits were actually calculated for one line, including downtime reduction per year, consumable materials reduction per year, and yield improvement. If there are previous customer complaint costs, the potential CoPQ cost saving can also be calculated. The aggregated results can be used to estimate the remaining lines and convince sponsors and stakeholders to execute the scaling project.
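The correlation-based variable pruning described in this case (drop one of each pair with Pearson |r| > 0.8, keeping the earlier-stage variable) can be sketched as follows. The 0.8 threshold comes from the chapter; the data layout and function name are illustrative assumptions.

```python
import numpy as np

def prune_correlated(X, names, threshold=0.8):
    """Drop variables whose |Pearson r| with an earlier-stage variable
    exceeds the threshold; the earlier column is kept as representative.

    X: (samples, variables) array; names: per-column variable names,
    assumed ordered by process stage.
    """
    corr = np.corrcoef(X, rowvar=False)
    kept, dropped = [], set()
    for j in range(X.shape[1]):
        if j in dropped:
            continue
        kept.append(names[j])
        for k in range(j + 1, X.shape[1]):
            if abs(corr[j, k]) > threshold:
                dropped.add(k)  # redundant with earlier-stage variable j
    return kept
```

In practice, the kept variables would then feed the classification model (with SMOTE applied to the minority defect class) before feature importances or SHAP values are inspected.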

Case Study: Computer Vision

Computer vision (CV) is an interdisciplinary scientific field that deals with how computers can gain high-level understanding from digital images or videos. Combining high-speed, high-resolution cameras with increasingly mature AI technologies, CV has become a feasible research topic in manufacturing. Using CV and ML, image and video data are collected and the state of manufacturing machines is followed in real time. If something goes wrong in the production process, or any system failure happens, the AIoT solution can notify the shop floor control system immediately or solve the issue automatically through automation. This section describes a case study on detecting abnormal events in a highly standardized manufacturing environment without human supervision. Three questions are asked when detecting an abnormal event: when, where, and what happened. By answering these three questions, we can determine which AI model to use and how to structure the system. Several surveillance cameras and IoT devices are available. One case concerns the detection of anomalies in the packaging process from streaming video: abnormal packaging defects can be detected using image data and ML models. An AIoT prototype was used for this process and achieved good validation results in online tests. In this case, the system can detect nearly 3% of packaging problems that were not detected in the past, significantly reducing downtime and customer complaints. In terms of direct impact, we can calculate the financial benefits based on the reduced downtime and labor saved, and it is possible to use past customer


Fig. 12.9 A general workflow for an anomaly detection system and 3 question examples

complaint costs for a case to evaluate CoPQ savings. In addition, due to the limited capability of inspection equipment, there were many redundant steps for manually re-evaluating photos of defective products. Currently, many AIoT solutions in the industry can replace operators in photo re-evaluation with AI models. The benefits of these AIoT solutions consist of two parts. First, the difference in accuracy between operators and the AI can be used to calculate the difference in defect rate, and the total indirect impact can then be derived through the customer complaint cost. Second, the reduced labor cost can be calculated if the AI performs at least as well as, or better than, the human operator (Fig. 12.9).
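The two benefit parts just described can be combined in a small model. All rates and costs below are hypothetical; the structure simply mirrors the text: complaint-cost avoidance from the accuracy gap, plus labor saving when the AI matches or beats the operator.

```python
def inspection_ai_benefit(units_per_year,
                          human_escape_rate, ai_escape_rate,
                          unit_complaint_cost,
                          operators_replaced, annual_operator_cost):
    """Annual benefit of AI photo re-evaluation vs. manual inspection."""
    # Indirect impact: fewer escaped defects -> fewer customer complaints
    avoided = units_per_year * (human_escape_rate - ai_escape_rate)
    complaint_saving = avoided * unit_complaint_cost
    # Direct impact: labor saving, counted only if the AI is at least
    # as accurate as the human operator
    labor_saving = (operators_replaced * annual_operator_cost
                    if ai_escape_rate <= human_escape_rate else 0.0)
    return complaint_saving + labor_saving

# Illustrative: 100k units/yr, escape rate cut from 3% to 0.1%,
# $50 per complaint, three operators at $40k/yr redeployed.
benefit = inspection_ai_benefit(100_000, 0.03, 0.001, 50.0, 3, 40_000)
```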

Conclusion

As AI and IoT become increasingly popular technologies in the industry, new applications are constantly being applied to the production line, and there is an urgent need to see the real impact of AIoT solutions through KPI evaluation and financial justification. Most present studies focus on Industry 4.0 supply chain performance measurement frameworks. The strategies promoted here for evaluating the benefits of AIoT solutions allow performance to be supported and measured, and provide a helpful tool for manufacturers and practitioners to promote the value of AIoT. The ROI of AIoT projects can help obtain scaling funds for existing projects, initiate new projects, and achieve continuous improvement. Two case studies in the manufacturing industry describe AIoT applications linked to the funding strategies.


References

1. Akter, S., McCarthy, G., Sajib, S., Michael, K., Dwivedi, Y.K., D'Ambra, J., Shen, K.N.: Algorithmic bias in data-driven innovation in the age of AI. Int. J. Inf. Manag. 60, 102387 (2021)
2. Ante, G., Facchini, F., Mossa, G., Digiesi, S.: Developing a key performance indicators tree for lean and smart production systems. IFAC-PapersOnLine 51(11), 13–18 (2018)
3. Apilioğulları, L.: Digital transformation in project-based manufacturing: developing the ISA-95 model for vertical integration. Int. J. Prod. Econ. 245, 108413 (2022). https://doi.org/10.1016/j.ijpe.2022.108413
4. Arm, C., Zezulka, F., Bradac, Z., Kaczmarczyk, V., Benesi, T., Schroeder, T.: Implementing Industry 4.0 in discrete manufacturing: options and drawbacks. IFAC-PapersOnLine 51(6), 473–478 (2018)
5. Atutxa, A., Astorga, J., Barcelo, M., Urbieta, A., Jacob, E.: Improving efficiency and security of IIoT communications using in-network validation of server certificate. Comput. Ind. 144, 103802 (2023). https://doi.org/10.1016/j.compind.2022.103802
6. Bauters, K., Cottyn, J., Claeys, D., Slembrouck, M., Veelaert, P., Van Landeghem, H.: Automated work cycle classification and performance measurement for manual workstations. Robot. Comput. Integr. Manuf. 51, 139–157 (2018)
7. Boyes, H., Hallaq, B., Cunningham, J., Watson, T.: The industrial internet of things (IIoT): an analysis framework. Comput. Ind. 101, 1–12 (2018). https://doi.org/10.1016/j.compind.2018.04.015
8. Büyüközkan, G., Göçer, F.: Digital supply chain: literature review and a proposed framework for future research. Comput. Ind. 97, 157–177 (2018)
9. Chauhan, C., Singh, A., Luthra, S.: Barriers to industry 4.0 adoption and its performance implications: an empirical investigation of emerging economy. J. Clean. Prod. 285 (2021). https://doi.org/10.1016/j.jclepro.2020.124809
10. Daki, H., El Hannani, A., Aqqal, A., et al.: Big Data management in smart grid: concepts, requirements and implementation. J. Big Data 4, 13 (2017). https://doi.org/10.1186/s40537-017-0070-y
11. de Sousa Jabbour, A.B.L., Jabbour, C.J.C., Foropon, C., Filho, M.G.: When titans meet – can Industry 4.0 revolutionise the environmentally-sustainable manufacturing wave? The role of critical success factors. Technol. Forecast. Soc. Change 132, 18–25 (2018). https://doi.org/10.1016/j.techfore.2018.01.017
12. Dhavale, D.G.: A manufacturing cost model for computer-integrated manufacturing systems. Int. J. Oper. Prod. Manag. 10(8), 5–18 (1990)
13. Ehie, I.C., Chilton, M.A.: Understanding the influence of IT/OT convergence on the adoption of Internet of Things (IoT) in manufacturing organizations: an empirical investigation. Comput. Ind. 115, 103166 (2020). https://doi.org/10.1016/j.compind.2019.103166
14. Elreedy, D., Atiya, A.F.: A comprehensive analysis of synthetic minority oversampling technique (SMOTE) for handling class imbalance. Inf. Sci. 505, 32–64 (2019). https://doi.org/10.1016/j.ins.2019.07.070
15. Emmer, C., Glaesner, K.H., Pfouga, A., Stjepandić, J.: Advances in 3D measurement data management for Industry 4.0. Procedia Manuf. 11, 1335–1342 (2017). https://doi.org/10.1016/j.promfg.2017.07.262
16. Gosselin, M.: An empirical study of performance measurement in manufacturing firms. Int. J. Product. Perform. Manag. 54(5/6), 419–437 (2005). https://doi.org/10.1108/17410400510604566
17. Hon, K.K.B.: Performance and evaluation of manufacturing systems. CIRP Ann. Manuf. Technol. 54(2), 139–154 (2005)
18. Hunzinger, R.: SCADA fundamentals and applications in the IoT. In: Internet of Things and Data Analytics Handbook, pp. 283–293. Wiley (2017). https://doi.org/10.1002/9781119173601.ch17


19. Ishengoma, F., Shao, D., Alexopoulos, C., Saxena, S., Nikiforova, A.: Integration of Artificial Intelligence of Things (AIoT) in the public sector: drivers, barriers and future research agenda. Digit. Policy Regul. Gov. 24 (2022). https://doi.org/10.1108/DPRG-06-2022-0067
20. Jamwal, A., Agrawal, R., Sharma, M., Giallanza, A.: Industry 4.0 technologies for manufacturing sustainability: a systematic review and future research directions. Appl. Sci. 11(12), 5725 (2021)
21. Jamwal, A., Agrawal, R., Sharma, M.: Deep learning for manufacturing sustainability: models, applications in Industry 4.0 and implications. Int. J. Inf. Manag. Data Insights 2(2), 100107 (2022). https://doi.org/10.1016/j.jjimei.2022.100107
22. Kamble, S.S., Gunasekaran, A., Ghadge, A., Raut, R.: A performance measurement system for industry 4.0 enabled smart manufacturing system in SMMEs – a review and empirical investigation. Int. J. Prod. Econ. 229 (2020). https://doi.org/10.1016/j.ijpe.2020.107853
23. Khan, I.H., Javaid, M.: Role of Internet of Things (IoT) in adoption of Industry 4.0. J. Ind. Integr. Manag. (2021). Article 2150006
24. Lee, J., Davari, H., Singh, J., Pandhare, V.: Industrial artificial intelligence for industry 4.0-based manufacturing systems. Manuf. Lett. 18, 20–23 (2018). https://doi.org/10.1016/j.mfglet.2018.09.002
25. Leitner, S.H., Mahnke, W.: OPC UA: service-oriented architecture for industrial applications. Softwaretechnik-Trends 26 (2006)
26. Lundberg, S., Lee, S.I.: A unified approach to interpreting model predictions. In: NIPS'17: Proceedings of the 31st International Conference on Neural Information Processing Systems, pp. 4768–4777 (2017)
27. Machado, C.G., Winroth, M.P., Ribeiro da Silva, E.H.D.: Sustainable manufacturing in Industry 4.0: an emerging research agenda. Int. J. Prod. Res. 58(5), 1462–1484 (2020). https://doi.org/10.1080/00207543.2019.1652777
28. Mahmood, S., Ahmed, S., Panthi, K., Kureshi, N.: Determining the cost of poor quality and its impact on productivity and profitability. Built Environ. Proj. Asset Manag. 4, 296–311. https://doi.org/10.1108/BEPAM-09-2013-0034
29. Miragliotta, G., Sianesi, A., Convertini, E., Distante, R.: Data driven management in Industry 4.0: a method to measure data productivity. IFAC-PapersOnLine 51(11), 19–24 (2018)
30. Napoleone, A., Macchi, M., Mozzetti, A.: A review on the characteristics of cyber-physical systems for the future smart factories. J. Manuf. Syst. 54, 305–335 (2020)
31. Ndubisi, N.O., Zhai, X., Lai, K.H.: Small and medium manufacturing enterprises and Asia's sustainable economic development. Int. J. Prod. Econ. (2020)
32. Neely, A., Gregory, M., Platts, K.: Performance measurement system design: a literature review and research agenda. Int. J. Oper. Prod. Manag. 15(4), 80–116 (1995)
33. Ochella, S., Shafiee, M., Dinmohammadi, F.: Artificial intelligence in prognostics and health management of engineering systems. Eng. Appl. Artif. Intell. 108, 104552 (2022). https://doi.org/10.1016/j.engappai.2021.104552
34. Plageras, A.P., Psannis, K.E.: Digital twins and multi-access edge computing for IIoT. Virtual Real. Intell. Hardw. 4(6), 521–534 (2022). https://doi.org/10.1016/j.vrih.2022.07.005
35. Rossit, D.A., Tohmé, F., Frutos, M.: Production planning and scheduling in cyber-physical production systems: a review. Int. J. Comput. Integr. Manuf. 32(4–5), 385–395 (2019). https://doi.org/10.1080/0951192X.2019.1605199
36. Sharma, R., Jabbour, C.J.C., Lopes de Sousa Jabbour, A.B.: Sustainable manufacturing and industry 4.0: what we know and what we don't. J. Enterp. Inf. Manag. 34(1), 230–266 (2021). https://doi.org/10.1108/JEIM-01-2020-0024
37. Shin, W.S., Dahlgaard, J.J., Dahlgaard-Park, S.M., Kim, M.G.: A quality scorecard for the era of industry 4.0. Total Qual. Manag. Bus. Excel. 29(9–10), 959–976 (2018)
38. Tao, F., Qi, Q., Liu, A., Kusiak, A.: Data-driven smart manufacturing. J. Manuf. Syst. 48, 157–169 (2018)


39. Thouti, S., Venu, N., Rinku, D.R., Arora, A., Rajeswaran, N.: Investigation on identifying multiple issues in IoT devices using convolutional neural network. Meas. Sens. 24, 100509 (2022). https://doi.org/10.1016/j.measen.2022.100509
40. Verma, S., Sharma, R., Deb, S., Maitra, D.: Artificial intelligence in marketing: systematic review and future research direction. Int. J. Inf. Manag. Data Insights 1(1), 100002 (2021)
41. Wan, J., Li, X., Dai, H.-N., Kusiak, A., Martínez-García, M., Li, D.: Artificial-intelligence-driven customized manufacturing factory: key technologies, applications, and challenges. Proc. IEEE 109, 377–398 (2020)
42. Wang, J., Ma, Y., Zhang, L., Gao, R.X., Wu, D.: Deep learning for smart manufacturing: methods and applications. J. Manuf. Syst. 48, 144–156 (2018)
43. Xu, Y.W., Kohtz, S., Boakye, J., Gardoni, P., Wang, P.F.: Physics-informed machine learning for reliability and systems safety applications: state of the art and challenges. Reliab. Eng. Syst. Saf. 230, 108900 (2023). https://doi.org/10.1016/j.ress.2022.108900
44. Yusuf, Y., Gunasekaran, A.: Agile supply chain capabilities: determinants of competitive objectives. Int. J. Prod. Econ. 159, 379–392 (2004)