Robotic Vehicles: Systems and Technology [1 ed.] 9813366869, 9789813366862

This book introduces the technological innovations of robotic vehicles. It presents the concepts required for self-driving vehicles.



Tian Seng Ng

Robotic Vehicles: Systems and Technology

School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore, Singapore

ISBN 978-981-33-6686-2
ISBN 978-981-33-6687-9 (eBook)
https://doi.org/10.1007/978-981-33-6687-9

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2021

This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Singapore Pte Ltd. The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721, Singapore

In Memory of The Late Assoc. Prof. Mr. Teo Ming Yeong

Preface

Global warming has led to changes in the way we live. On the road, diesel- and gasoline-engine vehicles are gradually diminishing. New hybrid vehicles carry out their mission to roam the streets. Forging ahead, we can see electric cars making their way into popularity as well. Throughout the decades, many developments have evolved in automobile technology. Nowadays, as computer control technology becomes increasingly popular, the technological breakthrough has progressed to combine with computing technology for more precise and complete automation. The time has arrived for automakers to further their development of automated control in the zero-emission vehicle. Software advancement aspires to create a simulation platform to replace real-time testing, with rewarding economic benefits. In academia, the study of nano-electronics finds its use in automobiles to perform at high speed in the limited space of the vehicle compartment. Smart sensors are becoming entwined with the machine vision world. Autonomous control techniques for drones, robots, and automobiles see their future when combined with machine vision technology for implementation in land and air vehicles or machine systems. Much research over the years has produced significant improvements in the development of aerial and land vehicle technology. Presently, many enterprises and software companies venture to establish the concepts for engineering a self-driving car. With the latest machine learning techniques such as deep learning, platforms for artificial intelligence arise to drive the functionality of an autonomous system. With the success of semiconductor and software technology, the integration of smart sensors, algorithms, and computer modules helps to increase the reliability of the self-driving car. Cars can select their routes automatically to avoid congested traffic and bad weather. The autonomous vehicle has mobile eyes all round to detect objects as well as to avoid collisions. The coming generations of vehicles may even respond to human speech and action to control and drive the car. Our future road mobility envisages a voice-commanded zero-emission drive engine that operates on several tasks autonomously, without forgetting safety reliability in the changing era. The evolution from hybrid vehicles to electric vehicles, developing further into autonomous cars, is becoming a reality.


Modern technology in the automobile has brought us to the present advancements in automatic control vehicular systems, progressing towards the new millennium. Future road transportation will depend heavily on research investments in the domain of automobile technology. Advancement in vehicular sensing depends heavily on the current technology in smart cameras and other sensing capabilities. This book introduces the new features in automobiles and studies the factors involved in the autonomous vehicle. It focuses on the present technology at large. The system embraces machine vision, artificial intelligence, machine learning, and deep learning. It forecasts the latest development of modern technology in automated cars to expose readers to the engineering features, functions, simulation, embedded system, connectivity, sensing, control, processing, and communication of the autonomous vehicle. Moreover, the book also presents the control technology of the six-legged walking robot. Autonomous robots can participate in search and rescue missions, where tasks are impossible for humans. We deploy them in environments that pose threats to human safety, such as bomb disposal and collapsed buildings. These robots can enter the debris to detect signs of human life. In the air, several types of rotor-propelled craft are paving the way. They have the capability of delivering human organs to transplant patients. Amazon has provided air parcel and fast food delivery using these rotorcraft. Such options cut down time and cost. Swarms of rotor copters are applied to surveillance and military missions. The adoption of modern technological successes inspires us to develop and build the vision of futuristic robotic machines and vehicles.

Singapore

Tian Seng Ng

Acknowledgements

The author would like to express his sincere gratitude to Mark Pitchford for the preparation of some of the content on vehicle safety. He would also like to thank Associate Professor Ng Heong Wah for his expertise and guidance on the automobile system topics. He is the leading professor for the solar car teams in NTU, which have won numerous titles in overseas competitions. Sincere gratitude goes to Assistant Professor Mir Feroskhan of the aerospace division and to Associate Professor Nickols Francis Malcolm John, who offered his expertise in joining Prof. Ng in supervising the solar car team members. Moreover, A/P Nickols Francis has also contributed to the creation of the ground robotic system, which walks on its six legs. Not forgetting the support team assisting the professors: Deputy Director Mr. Roger Tan Kay Chia, Assistant Director Mr. Iskandar Shah Salleh, the laboratory supporting staff Mr. Ang Teck Meng, Mdm Agnes Tan Siok Kuan, and Mr. You Kim San, who work in the Robotic Research Centre, and many others working to help in the engineering laboratory. Appreciation goes to my family and friends for their overall support and patience. Without these people, the writing of the book would not have been successful. Finally, I would like to thank NTU, School of MAE, Division of Aerospace Engineering, for allowing me to make this work a reality.


Abbreviations

Automobile Standards (Courtesy of SAE International)

J941 Driver's Eye Locations
J287 Driver Hand Control Reach
J1757/1 Standard Metrology for Vehicular Displays
J1757/2 Optical Metrology for Automotive HUD
J1757/3 Power Metrology for Vehicle Display
J2364 Navigation Function Accessibility While Driving
J2365 Calculation of the Time to Complete In-Vehicle Navigation Tasks
J2395 In-Vehicle Message Priority
J2396 Definitions and Measures-Related Driver Visual Behavior Using Video Techniques
J1050 Describing and Measuring the Driver's Field of View
J2399 Adaptive Cruise Control (ACC) Operating Characteristics and User Interface
J2400 Forward Collision Warning Systems: Operating Characteristics and User Interface
J2678 Navigation Function Accessibility While Driving Rationale
J2802 Blind Spot Monitoring System (BSMS): Operating Characteristics and User Interface
J2808 Road/Lane Departure Warning Systems: Human Interface
J2830 Process for Comprehension Testing of In-Vehicle Icons
J2831 Design and Engineering for In-Vehicle Alphanumeric Messages
J2889/1 Measurement of Minimum Noise Emitted by Road Vehicles
NHTSA FMVSS No. 101: Controls and Displays
ISO 15008 Ergonomic Aspects of Transport Information and Control Systems
ISO 16505 Ergonomic and Performance Aspects of Camera Monitor Systems

Contents

1 Introduction
   1.1 Preliminary
   1.2 Highlighting Features of the Book
   1.3 Book Organization

2 Autonomous System Connectivity
   2.1 Interconnected Cars
   2.2 Vehicular Network and Communication
   2.3 Vehicle Tracking
   2.4 Cloud Processing
   2.5 Vehicle Safety
   2.6 Vehicle Security
   References

3 Embedded System Development
   3.1 Virtual Machine
   3.2 Embedded SoC
   3.3 Embedded Safety Processor
   3.4 SoC Functional Safety
   References

4 Autonomous Vehicle Data Processing
   4.1 Vehicle OS Structure
   4.2 Processors
   4.3 Ethernet
   4.4 Vehicle Databus
   4.5 Remote Processing
      4.5.1 Internet of Things
   References

5 Vehicle Components
   5.1 Brakes and Steering
   5.2 Electric Power Steering
      5.2.1 Cyberspace Security
   5.3 Vehicle Suspension System
      5.3.1 Regenerative Braking
   5.4 Electronic Control Units
   Reference

6 Vehicle Navigation Computing
   6.1 Wireless MCU
   6.2 Mapping
   6.3 Cloud Computing
   6.4 Edge Computing
   6.5 Car Navigation

7 Vehicle Test Drive and Simulation
   7.1 Vehicle Components Testing
   7.2 Vehicle Simulation and Testing
      7.2.1 Automated Driving Attestments
   Reference

8 Advanced Chip Driven Electronics
   8.1 Automotive Electronics
   8.2 Memory Safety
   8.3 Embedded Chip Security
   8.4 Embedded Computing Board

9 Sensors and Vision System Fusions
   9.1 Vision Hardware
   9.2 Image Processing
   9.3 Vision Chip
   9.4 Embedded Vision
   9.5 Sensors Fusion
   Reference

10 V2V and V2X Communications
   10.1 Driving Vehicle Connectivity
   10.2 Vehicular IoT
   10.3 5G
   10.4 Internet of Vehicles
   10.5 Interconnected Vehicles
   References

11 Deep Learning and Neural Network Algorithms
   11.1 Artificial Intelligence
   11.2 Neural Network
   11.3 Deep Learning
   11.4 Neural Network Accelerator
   Reference

12 ADAS in Autonomous Driving
   12.1 Advanced Driver Assistance Systems
   12.2 Imaging Radar
   12.3 LiDAR
   12.4 Software Component
   12.5 Hardware Component
   References

13 Electric Vehicle
   13.1 HEV/EV
   13.2 Powertrain and Engines Management
   13.3 Battery, Converter and Inverter
   13.4 Transmission and Fuel Injection Module
   13.5 Braking System
   13.6 Vehicle Charging System
   References

14 Vehicular Advancements
   14.1 Vehicular Compliances and Expectations
   14.2 Traffic Management
   14.3 Emission Reduction
   14.4 Future of Driverless Car
   Reference

15 Electric Mantis Robot
   15.1 The Electric Mantis Robot
   15.2 The 3 Degrees of Freedom Mechanism
   15.3 Robot Calibration
   15.4 Mantis Robot Walking Methodology
   15.5 Programming Methodology
   15.6 Autonomous Mantis Robot
   Reference

16 Rotorcrafts Control Theory
   16.1 State-Space Equation Formulation
   16.2 Controllability and Observability
   16.3 Robust Linear Quadratic Regulator
   Reference

17 Aerial Copters
   17.1 Aerodynamics
   17.2 Derbe Quadcopter
   17.3 Quadcopter
   17.4 Hexacopter
   17.5 Tri-coaxial Copter
   References

18 Air Autonomous System
   18.1 Micro Air Technology
   18.2 VTOL Air Taxis
   18.3 Unmanned Air System
   18.4 Go-Green Air Mobility
   References

Index

List of Figures

Fig. 2.1 Car features and developments (Courtesy of www.intelligent-aerospace.com)
Fig. 2.2 Automotive HDBaseT (Courtesy of Valens)
Fig. 2.3 Vehicle tracker (Credit ANTZER TECH)
Fig. 2.4 Automotive performance (Courtesy of Heavy Reading)
Fig. 2.5 Hypervisors and gateways (Courtesy of Blackberry QNX)
Fig. 2.6 Potential car (Credit LDRA Ltd)
Fig. 2.7 BlackBerry’s seven pillar (Courtesy of Blackberry QNX)
Fig. 2.8 Rear ECU (Courtesy of OpenSynergy)
Fig. 2.9 Vehicle fleet (Courtesy of DARPA)
Fig. 3.1 Graphical virtualization workload (Courtesy of Intel)
Fig. 3.2 System on chip (Courtesy of Imagination Technologies)
Fig. 3.3 NVIDIA Xavier (Credit Nvidia)
Fig. 3.4 Embedded technology (Credit TI)
Fig. 3.5 Automotive arm processor
Fig. 4.1 Data centricity (Credit RTI)
Fig. 4.2 AGL virtualization architecture (Credit Linux Foundation)
Fig. 4.3 Automotive architecture
Fig. 4.4 Vehicle generic architecture
Fig. 4.5 Electric car POF optical connectivity (Credit KDPOF)
Fig. 4.6 Control and infotainment software (Courtesy of Wittenstein)
Fig. 4.7 Autonomous system data (Credit RTI)
Fig. 4.8 Data bus architecture (Credit RTI)
Fig. 4.9 Centralized processing module (Credit Xilinx)
Fig. 4.10 Data bus flow (Credit RTI)
Fig. 5.1 Electric power steering (Courtesy of Continental)
Fig. 5.2 GNSS receiver
Fig. 5.3 Inclined spring force vector (Credit Ford)
Fig. 5.4 Vehicle suspension system (Credit Designboom)
Fig. 5.5 EV regenerative suspension system (Credit www.caradvice.com.au)
Fig. 6.1 Wireless data processing infrastructure
Fig. 6.2 Renesas’ 28 nm automotive flash MCU
Fig. 6.3 Edge computing platform (Credit Kontron)
Fig. 6.4 Car bus system
Fig. 7.1 Electric motor management (Courtesy of NI)
Fig. 7.2 FPGA power electronics models (Credit NI)
Fig. 7.3 NI hardware testing platform (Credit NI)
Fig. 7.4 ANSYS simulation construction (Courtesy of ANSYS)
Fig. 7.5 Virtual simulation architecture
Fig. 7.6 Virtual vehicle models
Fig. 8.1 IGBT module (Credit Infineon Technologies)
Fig. 8.2 Automotive ARC processor application (Credit Synopsys)
Fig. 8.3 Automotive ICs (Credit Infineon Technologies)
Fig. 8.4 Automotive embedded security (Credit Infineon Technologies)
Fig. 8.5 AI computing board
Fig. 9.1 Automobile vision system (Credit Bosch)
Fig. 9.2 Automobile ambient sensing (Courtesy of Maximintegrated)
Fig. 9.3 Self-driving vehicle safety sensor suite (Credit Uber)
Fig. 9.4 IEEE-SA P2020—automotive image quality standard (Courtesy of Algolux)
Fig. 9.5 Vision chips (a, b)
Fig. 9.6 Smart sensor (Courtesy of Octavo Systems)
Fig. 9.7 Embedded vision processing architecture (Credit Synopsys)
Fig. 9.8 Embedded vision applications (Courtesy of Avnet)
Fig. 9.9 Perception and planning
Fig. 10.1 Vehicle connectivity (Courtesy of UCLA)
Fig. 10.2 V2V engineering (Courtesy of GM)
Fig. 10.3 Vehicle synchronization (a, b) (Courtesy of Rohde & Schwarz)
Fig. 11.1 Neural network vehicle control system (Courtesy of ANSYS)
Fig. 11.2 Deep neural network latency (a, b) (Credit Xilinx)
Fig. 11.3 Dynamic function exchange (Credit Xilinx)
Fig. 11.4 PowerVR 2NX NNA (Credit Imagination Technologies)
Fig. 11.5 NN accelerator in SoC (Credit Imagination Technologies)
Fig. 11.6 Activation functions (Credit SAS)
Fig. 12.1 ADAS control features (Courtesy of Renesas)
Fig. 12.2 Radar CMOS sensor (Credit TI)
Fig. 12.3 AWR1642 device (Credit TI)
Fig. 12.4 LiDAR system
Fig. 13.1 Electric vehicle charging system (Courtesy of Cypress Semiconductor)
Fig. 13.2 Battery system (Credit KIT IPE)
Fig. 13.3 On-board AC-DC converters (Credit Texas Instruments)
Fig. 13.4 Off-board charger (Credit Texas Instruments)
Fig. 13.5 Vehicle electrification
Fig. 13.6 48 V power supply for automated driving (Credit Ford)
Fig. 13.7 Fuel injection chipset (Credit STMicroelectronics)
Fig. 13.8 Vehicle charging station (Courtesy of www.iot-now.com)
Fig. 15.1 Mantis robot
Fig. 15.2 Electric mantis circuit layout
Fig. 15.3 Sections of a robotic leg
Fig. 15.4 Stretched legs
Fig. 15.5 Legs at two different heights
Fig. 15.6 Calibration of supporting legs: a calibration, b re-calibrated
Fig. 15.7 Left legs motion methodology
Fig. 15.8 Right legs motion methodology
Fig. 15.9 Robot retrack paths
Fig. 15.10 Symmetrical calibration
Fig. 16.1 Rotorcraft control block diagram
Fig. 17.1 Derbe quadrotor diagram
Fig. 17.2 Shrediquette Derbe (Source SHREDIQUETTE)
Fig. 17.3 Derbe quadrotor control (a, b)
Fig. 17.4 Quadrotor
Fig. 17.5 Quadrotor control (a, b)
Fig. 17.6 Hexacopter
Fig. 17.7 Hexacopter simulation (a, b)
Fig. 17.8 Tri-coaxial copter
Fig. 17.9 Tri-coaxial copter configuration
Fig. 17.10 Tri-coaxial copter simulation (a, b)
Fig. 17.11 Tri-coaxial copter attitude and position (a, b)
Fig. 17.12 Effect of tuning the integral gain
Fig. 18.1 The hummingbird (Credit Purdue University)
Fig. 18.2 The mosquito (Credit Harvard University)
Fig. 18.3 The black hornet (Credit FLIR)
Fig. 18.4 Hybrid VTOL air taxis (a, b, c)
Fig. 18.5 VTOL air taxi (Credit Volocopter)
Fig. 18.6 E-VTOL air taxis (a, b, c)
Fig. 18.7 Flying car (Credit SkyDrive)
Fig. 18.8 Ducted fan UAV
Fig. 18.9 Next generation UAS (Credit FlightWave)
Fig. 18.10 E-VTOL air vehicle (Credit Uber Technologies)
Fig. 18.11 Arsenal Airplane
Fig. 18.12 Hydrogen-fueled airplane (a, b)
Fig. 18.13 Autonomous electric air car (Credit EmbraerX)

List of Tables

Table 1.1 Five levels of vehicle autonomy
Table 4.1 Automotive Ethernet
Table 5.1 Various cortex processors [1]
Table 12.1 Sensor application
Table 13.1 Light electric vehicle
Table 13.2 Automobile electrical architecture
Table 15.1 Timing units for different performances
Table 15.2 Program parameters for different performances
Table 15.3 Robot legs’ variables
Table 16.1 Model parameters’ specifications
Table 17.1 Rotor copter aerodynamic constants
Table 17.2 Derbe matrix gains
Table 17.3 Quadrotor matrix gains
Table 17.4 Hexacopter PID controller matrix
Table 17.5 Quadcopter K-gain after pole placements
Table 17.6 Tri-coaxial copter PD controller
Table 17.7 Similar PD controller
Table 17.8 Integral gains
Table 17.9 Robust PD control matrix
Table 17.10 Robust PD control matrix after fine tuning
Table 18.1 Design assurance level

List of Flow Charts

Flow Chart 15.1 Mantis forward strokes
Flow Chart 15.2 Electric mantis feeler
Flow Chart 15.3 Timing flowchart
Flow Chart 15.4 Electric mantis main flowchart
Flow Chart 15.5 Motion settings for alternate legs (Connector B)
Flow Chart 15.6 Cycling motion flowchart

List of Programs

P15.1 Mantis Feeler Program
P15.2 Mantis Timing Program
P15.3 Left Center Leg
P16.1 LQR with Pole Placement
P17.1 Aerodynamics Coefficients Computation Program
P17.2 Derbe—Robust PD Control Matrix Computation
P17.3 Quadrotor—Robust PD Control Matrix Computation
P17.4 Tri-Coaxial Copter—Robust PD Control Matrix Computation

Chapter 1

Introduction

Land and air autonomy provide a convenient transportation system for urban and rural districts. Our present automobile establishment and development reduce travelling time. Research and innovations in the domain lay the foundation of our new inventions for the century. The six-legged robot assists in dangerous and impossible tasks that cannot be handled by humans. The aerial rotor-propelled copter displays its advantage of a shorter delivery time in its missions. On the road, vehicles are improving to achieve go-green mobility without any air contamination. Furthermore, autonomous cars are becoming a reality.

1.1 Preliminary

At the turn of the twenty-first century, humans face numerous challenges in developing advanced robotic systems ranging from land robots to autonomous vehicles and drones. These complex applications profoundly affect and change our daily life. The self-driving capability of these systems opens a new chapter in today's engineering and technological world. In the age of the transportation revolution, driverless vehicles are becoming a reality. On the road, surveys have found that human error is the main contributor to accidents. The primary objective of developing an autonomous vehicle is to reduce the number of accidents caused by humans. Today, a large cluster of bodies works hand in hand for a successful future in the autonomous machines industry. In the private sector, they are the partners, consultants, manufacturers, and distributors for robots, automotive, and drones. The government sector involves academia, researchers, scientists from various institutions, and regulatory bodies. The public sector includes public transportation companies. With the confidence of the upcoming autonomous land and air vehicle companies and start-ups, our dream of seeing automated machines in action, whether on the road or in the sky, will soon become a reality.


Automated driving is not an easy task to perform. Decisions made by self-driving cars have to be fast, accurate, and error-free within acceptable tolerances. It incorporates path planning for the vehicle as well as environmental modelling. However, the abundance of new vehicular technology and the redundancies in the sensor and actuator operating techniques lead the way to future vehicles on the road. To allow the autonomous vehicle to perform at its full capacity on the streets, automakers brought the ADAS vision and machine learning to the automobile system. The new combination of technology raises vehicle autonomy to the level of safety strictly anticipated by drivers, passengers, and road users. Automated systems face many challenges before they are near perfection. The adapted artificial intelligence must manage complex data flow and state. It must be built securely for flexible deployment with safety features. A cloud-based system allows the smooth flow of network traffic with reliability in real time. Modular systems, practice, and a distributed data bus ease the system integration of complex software.

There are five levels of autonomy for future vehicles. Levels one and two depend on human monitoring and intervention. They are driver assistance and partial automation, respectively. The next three levels comprise conditional, advanced, and full automation. In these higher levels, the system monitors the driving environment. Only levels zero to three allow driver intervention. Levels four and five drive the vehicle autonomously (Table 1.1).

The in-vehicle system comprises several components in an automated car. First is the environmental perception by radar, LiDAR, and cameras. Next, the GPS information and the status of the vehicle, for example, the inertia, the CAN bus unit, etc., are made available for sensor fusion. The analysis of the vehicle detail content results in the decision to energize the actuators for control. The activated actuators then control the steering and cruising of the car. Communication of the vehicle with the outside environment, using 5G, LTE, or ITS-G5, ensures a safe journey. Monitoring and infotainment, the human-machine interface, etc., make the self-driving vehicle more attractive and comfortable.

Table 1.1 Five levels of vehicle autonomy

SAE level | Description            | Capability                                               | Control
0         | No automation          |                                                          | Human driver
1         | Driver assistance      | Adaptive cruise control/lane keeping and parking assist  | Human driver and vehicle
2         | Partial automation     | Traffic jam assist                                       | Vehicle
3         | Conditional automation | Full stop and go, highway driving, self parking          | Vehicle
4         | High automation        | Automated driving                                        | Vehicle
5         | Full automation        | Driverless vehicle operation                             | Vehicle
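The in-vehicle pipeline just described (perception by radar, LiDAR, and cameras; sensor fusion; decision; actuation) can be pictured as one control loop. The following is a minimal sketch of that flow in C, not the book's implementation; every type and function name here is hypothetical, and the two-second braking rule is an arbitrary illustration.

```c
#include <stdio.h>

/* Hypothetical types standing in for real sensor and actuator interfaces. */
typedef struct { double obstacle_dist_m; double ego_speed_mps; } Perception;
typedef struct { double steer_rad; double throttle; double brake; } Command;

/* Fuse radar, LiDAR, and camera range estimates into one environment view.
 * Toy rule: trust the most conservative (closest) obstacle estimate. */
static Perception fuse_sensors(double radar_m, double lidar_m, double cam_m,
                               double speed_mps)
{
    Perception p;
    p.obstacle_dist_m = radar_m;
    if (lidar_m < p.obstacle_dist_m) p.obstacle_dist_m = lidar_m;
    if (cam_m   < p.obstacle_dist_m) p.obstacle_dist_m = cam_m;
    p.ego_speed_mps = speed_mps;
    return p;
}

/* Decide actuator commands from the fused perception: brake if the
 * obstacle is closer than a two-second gap, otherwise cruise gently. */
static Command plan(const Perception *p)
{
    Command c = { 0.0, 0.0, 0.0 };
    if (p->obstacle_dist_m < 2.0 * p->ego_speed_mps)
        c.brake = 1.0;
    else
        c.throttle = 0.3;
    return c;
}

int main(void)
{
    /* One sense-fuse-plan-act iteration with made-up readings. */
    Perception p = fuse_sensors(35.0, 34.5, 40.0, 20.0);
    Command c = plan(&p);
    printf("steer=%.2f throttle=%.1f brake=%.1f\n",
           c.steer_rad, c.throttle, c.brake);
    return 0;
}
```

In a real vehicle, each stage runs on its own ECU and exchanges data over the vehicle bus, which is why the later chapters devote so much attention to the data bus and the embedded platform.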


1.2 Highlighting Features of the Book

Road mobility involves various areas. These include automotive mapping technologies, sensor technology, embedded systems, the electric drivetrain, security, safety, hardware and processing power optimization, data analytics, vehicular and storage sustainability, and mobility services. The book incorporates the essentials of an autonomous vehicle and its automatic navigational system. The vehicle operating system forms the primary infrastructure to input, process, command, and control the autonomous vehicle. Vehicle model simulation in the software environment enables auto-car builders to test performance to a satisfactory level before manufacturing and exporting to the automobile industry. Present artificial intelligence techniques introduced into the automobile allow it to perform analysis and decision-making for the autonomous vehicle. All activities running in the interior of the car are evaluated and acted upon to control the vehicle in a time-critical manner. New sensing technology involves machine vision in the advanced driver assistance systems. Furthermore, the book also introduces the hardware components for vehicle navigation. Altogether, it comprises the hardware and software necessary to build an autonomous vehicle. With the rapid advancement in connected cars, vehicle autonomy, and e-mobility, we lay the path for our future road transportation technology.

• In the book, we explore the usage of connectivity in an autonomous vehicle system. V2V and V2X expand the vehicle's range of communication, thereby improving safety for pedestrians and motorcyclists.
• The embedded system, data bus, and network are equally important in an automated car system.
• The book studies vehicle components such as the Electronic Control Unit (ECU) and the automatic embedded wireless controller.
• It shows how the autonomous vehicle processes data and mapping, which in turn navigate the car automatically.
• It includes the advanced hardware chips of the electronics driving the AI deep learning accelerator and neural network.
• The book deals with advanced ADAS in autonomous driving.
• Sensor and vision system fusions are popular topics that we do not neglect in autonomous vehicles.
• The integration of the vehicular models in ANSYS for autonomous driving simulation and testing saves time and money.
• It highlights traffic management, emission reduction, and the future of road mobility.
• We also learn of the electric vehicle with its electric steering capability.
• On land, the six-legged robot navigates the environment well.
• Air mobility sheds light on the future air autonomous system. The book narrates the accounts of several types of rotorcraft.


1.3 Book Organization

• Chapter 2 introduces the interconnected system of the car. Cloud computing adds a layer of extra information for capturing and analysis in the vehicular networking platform. Vehicle safety and security also depend on its connectivity. It equips readers with general knowledge of the autonomous system for automobiles.
• Chapter 3 highlights the embedded operating system and its processors for driving vehicles automatically. We discuss safety functionalities in the virtual embedded vehicular platform.
• Chapter 4 enhances the understanding of the operating system for data processing, the existence of hypervisors and the data bus for internal vehicle networks, and the Ethernet backbone for communication. The internet of things is widely present in today's technology.
• Chapter 5 studies the electric power steering for controlling the vehicle, with an understanding of the dynamics, control, and stability analysis of the automated vehicle, and a brief on the vehicular mechanisms.
• Chapter 6 illustrates the wireless controller infrastructure and the microcontroller for the vehicle system, and spotlights mapping for an autonomous vehicle in car navigation. The vehicle system incorporates cloud and edge computing to enable vehicle mapping.
• Chapter 7 shows the various vehicle models for integration into the simulating environment. Several platforms are available for simulation, vehicular testing, and attestments.
• Chapter 8 presents advanced electronics for chipset data extraction in vehicular systems.
• Chapter 9 covers image processing, embedded vision, vision sensors, and sensor fusion techniques in machine vision. The vision chips and hardware are presented for their contribution of data and information.
• Chapter 10 works on the technology in the vehicular infrastructure for tapping an abundance of details. V2V and V2X grasp the connectivity advantages of using 5G, IoT, and the Internet of Vehicles to claim information.
• Chapter 11 includes the artificial intelligence techniques in algorithm processing and analysis for determining vehicle driving. The themes are machine learning, deep learning, CNNs, and the neural network.
• Chapter 12 narrates the ADAS system. Vision sensors and computations are the primary highlights of driving the autonomous system.
• Chapter 13 involves the electric vehicle system. Functions such as brakes, transmissions, clutches, engines, batteries, and powertrains are discussed.
• Chapter 14 presents the latest mobility development in transportation, future traffic, and car management. Car compliances and standards lead to accident reductions and increased safety consciousness.
• Chapter 15 gives an account of the six-legged walking mechanisms. It explains and illustrates the unconventional robot walking methodology with indicative diagrams and software programs.


• Chapter 16 explains the modern control technique for flight control systems. The linear quadratic regulator method, in combination with the pole placement technique, regulates the air systems with optimum control and robustness.
• Chapter 17 touches on the rotor-propelled flying machines. We discuss the various forms of rotorcraft systems from the control and aerodynamic perspectives.
• Chapter 18 mentions the various air autonomous systems for military and commercial applications. The developmental effort aims for go-green technology.

Chapter 2

Autonomous System Connectivity

Vehicle communications are vital for unforeseen circumstances. Data from every vehicle combine to yield an open gateway of information about every travelling vehicle. This enhances the safety of automobiles in avoiding possible collisions. Alerts and warnings save the day for connected cars or vehicles. With the automobile's vast sensor features, a broad vehicular network and communication create better reliability and safety for the autonomous vehicle system. However, with vehicle connectivity, we should not compromise security. The vehicle becomes more vulnerable to a potential cyber-attack.

2.1 Interconnected Cars

Communication support and in-car connectivity enhance the safety features in autonomous driving. Smart antennas connected to a TCU process the data received from the cloud. Automotive communication platforms such as JamaicaCAR support 2D and 3D applications. The extensive libraries provide access to the board net, internet, GPS information, and many other sources. Apps can communicate directly and securely. It is a combination of the Jamaica virtual machine with Java technology. It enables an open-access network with internet ability and GPS information to access services such as parking places, hotel rooms, and vehicular fleet communication.

Even without reaching their full capacity, connected cars already have more advantages. Firstly, it is possible to monitor the condition of the car and the services needed to maintain its driving reliability. Drivers can foresee tyre and brake health, fuel sufficiency, and fluid alerts, taking preventive actions to counteract such situations before taking out the car. Connected cars can inform car dealers for remote diagnostics through the vehicle, curtailing traffic accidents or vehicle breakdowns. Secondly, connected cars can link to insurance companies through telematics solutions. Insurers can suggest lower premiums by rewarding drivers with lower driving risk and higher safety observations, monitored through driver behavior. If a crash occurs, the insurers can send a medical and rescue team to the scene immediately. In a way, it offers insurance to the vehicle and life. Lastly, car cloud computing offers road safety. The more sensors we have, the merrier it is in road vehicle automation. Sensors such as lane change alert, blind zone alert, lane departure warning, lane-keeping assist, forward collision alert, and automatic braking may help to prevent accidents (Fig. 2.1).

Fig. 2.1 Car features and developments (Courtesy of www.intelligent-aerospace.com)
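Forward collision alert, one of the sensor functions listed above, is commonly built on time-to-collision: the gap to the lead vehicle divided by the closing speed. The sketch below is a minimal illustration of that check; the 2.5 s threshold and the function names are assumptions for the example, not values from any particular product.

```c
#include <stdio.h>

/* Warn when time-to-collision (TTC) drops below a threshold.
 * gap_m: distance to the lead vehicle; closing_mps: ego speed minus
 * lead speed (positive means we are gaining on it). */
static int forward_collision_alert(double gap_m, double closing_mps)
{
    const double ttc_threshold_s = 2.5;   /* illustrative threshold */
    if (closing_mps <= 0.0) return 0;     /* not closing: no alert  */
    return (gap_m / closing_mps) < ttc_threshold_s;
}

int main(void)
{
    /* 30 m gap, closing at 15 m/s -> TTC = 2 s -> alert fires. */
    printf("alert=%d\n", forward_collision_alert(30.0, 15.0));
    return 0;
}
```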

2.2 Vehicular Network and Communication Automated car interconnectivity comes along with various options for communication with one another. The three primary communication modes are namely the Dedicated Short-Range Communications (DSRC), the Cellular Vehicle-to-Everything (C-V2X), and the Satellite connectivity. The new Cellular Vehicle-to-Everything CV2X that has a transmission range of 450 m communicates directly via short-range communications or cellular networks. DSRC, which is in line with the IEEE standard 802.11p, has a data transmission rate of only 27 Mbps can communicate at only 225

2.2 Vehicular Network and Communication

Automated car interconnectivity comes with various options for communication with one another. The three primary communication modes are Dedicated Short-Range Communications (DSRC), Cellular Vehicle-to-Everything (C-V2X), and satellite connectivity. The new C-V2X, which has a transmission range of 450 m, communicates directly via short-range communications or cellular networks. DSRC, which is in line with the IEEE 802.11p standard, has a data transmission rate of only 27 Mbps and a range of only 225 m. It is a secured, wireless, bi-directional communication, which operates on one of seven 10 MHz channels in the 5.9 GHz band, spread over one control channel and six service channels. The control channel sounds service announcements and safety messages, while the service channels cater to non-safety-related data traffic such as infotainment and e-commerce. With the oncoming 5G technology, we can achieve data rates of 10 gigabits per second in communication. It also has an improved latency that makes it desirable for V2V operation, while the more lagging latency of the cellular network suggests its suitability for V2I deployment. However, in rural areas with sporadic connections, low transmission speed, and high packet losses, satellite connectivity sees benefits in its deployment.

Vehicle-to-vehicle (V2V) communication utilizes a short-range messaging system, supporting real-time safety systems that avoid vehicle collisions. Vehicle-to-infrastructure (V2I) is the communication link between cars and infrastructure such as road signs and traffic signals, potentially reducing congestion. Vehicle-to-pedestrian (V2P) alerts pedestrians to potential accidents using mobile apps on smartphones or smartwatches. Vehicle-to-cloud (V2C) provides communication for automobiles to access in-vehicle services for infotainment or Over the Air (OTA) software updates, amongst others.

The seamless control capability of the automotive network inside the vehicle covers video, audio, network, control, TCP/IP, etc. UNICENS, the unified, centralized network stack software, is open source and available free from Microchip. It enables the communication of all networks from one central node. Besides, it is easy to use without the burden of network configuration, so one can concentrate on the application instead. HDBaseT (Fig. 2.2) is a network technology designed to support the high-speed backbone of the connected car. The PCI switch, utilized in the ring network topology, connects the many sensor hubs and electronic control units (ECUs) together. HDBaseT automotive technology enables load balancing and redundancy.

10

2 Autonomous System Connectivity

Fig. 2.3 Vehicle tracker (Credit ANTZER TECH)

2.3 Vehicle Tracking ANTZER TECH Company from Taiwan introduces the low power wide area network (LPWAN) LoraWAN and NB-IOT Vehicle Tracker [1] to support vehicular telemetry and communication in connected cars and fleet management. The RIFA series have announced the GPS-tracker built with Gyro and G-sensor. It is equipped with voice communication ability. Thus, it can fulfill the internet of vehicle (IoV) for fleet command and control system, and vehicle data collection for vehicular tracking in the telemetry system. Moreover, the CAN to ADR functions to locate the vehicle position with accuracy in situations of weak GPS signals. It is known as the automotive dead reckoning system. The 3G/4G communication system supports OBDII and J1939 protocols. The latest 5G technology allows lower latency, higher throughput, more connectivity as desired by the autonomous automobile (Fig. 2.3).

2.4 Cloud Processing

The vehicle sends information such as telemetry and mappings to the cloud for processing. In the cloud, there are management apps, the network, and monitoring apps. An optimized network improves bandwidth and latency and prevents network jitter and packet losses. Figure 2.4 shows the latency and bandwidth of connected vehicles as compared to other devices and components. The introduction of the eSync system is important to vehicular security and vulnerability management. It is an end-to-end cloud service that updates the firmware and software of the automobile. The human-machine interface, drivers, OS, and kernels can all receive service from the eSync system. It can reach many different platforms and devices through a single over-the-air (OTA) update. Updates of the electronic control units (ECUs) are also available across a varying range of vehicular networks such as LIN, CAN, FlexRay, and Ethernet.


Fig. 2.4 Automotive performance (Courtesy of Heavy Reading)

2.5 Vehicle Safety

Why are cars increasingly accessible and more vulnerable to threats from the outside world? Connectivity and the increasing usage of software in automobiles open vulnerabilities to hackers. We must mitigate the risks of hacking by intruders for the autonomous driving vehicle. Both the software and hardware of the car require protection. Assuring vehicle safety and security falls into the range of the ADAS, vehicle infotainment, the various ECUs, and the V2X or vehicle-to-everything communications. The exploited vulnerabilities even include the firmware update process. These vulnerabilities, if breached, will cause harm to the internal vehicular network. Coding in the automobile system is in C, C++, MISRA C++, AUTOSAR, etc. Millions of lines of code need to be secured against breaches of the vehicle network. A coding error received will destroy the functionality of the safety missions. Safety-critical systems include vehicle transmission, steering, airbag, braking, and collision avoidance. Furthermore, an attacker may access the vehicular system through the GPS and the diagnostic port. Security for the safety missions requires testing, with its effectiveness measured. The functional safety standard ISO26262 applies to the automobile industry. The single vehicular network contains the ADAS (advanced driver assistance systems), V2X (vehicle-to-everything) communications, and infotainment. Solutions having separate kernels, gateways, and hypervisors provide isolation and time separation. The hypervisor enables us to combine several platforms of different characteristics


Fig. 2.5 Hypervisors and gateways (Courtesy of Blackberry QNX)

to work under one microcomputer, allowing for entertainment display and visualization. This is BlackBerry's QNX Hypervisor (Fig. 2.5). Automakers also use it to isolate the safety-critical from the non-safety-critical electronic control units (ECUs) [2] (Fig. 2.6), but they cannot eliminate the potential vulnerabilities in the car network. BlackBerry's 7-Pillar recommendation (Fig. 2.7) for automotive cybersecurity can help mitigate risk. Chipmakers establish the root of trust in the system-on-chip (SoC) hardware; for example, the electronic control units utilize certificates and keys for authentication. Another safety lookout is to use binary code scanning. We can analyze the binary software components for any potential weaknesses open to intruders. Furthermore, we can check for MISRA compliance, insecure API usage, and cyclomatic complexity. Security through a virtual private network is another way of protecting data privacy. A trusted VPN service keeps you safe from hackers, snoopers, and cybercriminals. It keeps your information and data transmission safe and secure while you connect to the internet or any public network. Therefore, when your phone or computer links to the vehicle for car information and control, it has already established an encrypted connection with the remote VPN server of the service provider.

2.6 Vehicle Security

Internal and external connectivity poses threats to the automobile. In the vehicle, security functions to protect the ECUs and sensors.


Fig. 2.6 Potential car (Credit LDRA Ltd)

Fig. 2.7 BlackBerry’s seven pillar (Courtesy of Blackberry QNX)


Fig. 2.8 Rear ECU (Courtesy of OpenSynergy)

Besides, the data exchanges and interactions among the multiple ECUs must be foolproof to protect personal information and privacy. Internally, the USB connections and the ECUs for lighting, engine, steering, braking, airbag, access, and ADAS all pose threats. A hypervisor helps to integrate many different ECUs as a modular system. Figure 2.8 illustrates an example of a rear ECU: it shows an HEV ECU and a BMS ECU formed together as one rear ECU of the vehicle. The advantages of having a modular system with an integrated hypervisor are that it helps to save cost, provides isolation in case of a virus attack, supports modular software updates, and reduces network complexity. The system also allows us to integrate AUTOSAR-based applications and software from different vendors. Besides the AUTOSAR operating system, FreeRTOS and bare-metal applications are operable under the platform. There are many potential attack entry points to the car's wireless transmissions too. Remote links include vehicle-to-everything bi-directional transmissions, the wireless key, TPMS, Bluetooth, and smartphone engagements. IoT extends internet connectivity to local devices and sensors. With embedded computing technology, the various connected devices can communicate among themselves. This enables device monitoring and remote control over the network. For fleet control of vehicles, it is important to create a mesh network protocol with a distributed processing environment. This environment should include formal verification to mitigate the risk of vulnerabilities throughout the swarm network. Furthermore, it must defend itself against protocol-based attacks. The base of the open communication protocol must be reliable. In implementing the vehicular network solution, we should also investigate the encryption and authentication of the traffic among the nodes. The sensitivity of the autonomous vehicular network system must include detection of anomalous behaviors and discrepancies, which is not possible from one platform alone. Hypervisor platforms running SDKs integrate multiple functions with different real-time, boot-time, security, and safety requirements on a single piece of hardware. Each SDK performs a unique activity: ADAS, gateway, infotainment, cockpit, connectivity, IoT, and domain controller are the variety of activities involved. For example, the COQOS Micro SDK supports braking and powertrain, the Radio Tuner and Voice SDKs support infotainment and entertainment, the COQOS Hv SDK supports ADAS and the gateway, while the Blue SDK supports the cockpit as well as connectivity.


Fig. 2.9 Vehicle fleet (Courtesy of DARPA)

Figure 2.9 shows an autonomous vehicle fleet with a secure and sound interconnected system for communication with one another as well as with the control station.

References

1. https://www.embedded-computing.com/news-releases/antzer-tech-introduces-lorawan-and-nbiot-vehicle-tracker
2. https://www.embedded-computing.com/guest-blogs/build-security-into-the-connected-car-development-life-cycle

Chapter 3

Embedded System Development

Embedded systems-on-chip (SoCs), safety processors, and virtual machines are the developing opportunities in the vehicular embedded system. The embedded technology must adhere to the automotive safety standard ISO26262. SoC functional safety ensures that we meet the expectations of the ISO26262 ASIL (automotive safety integrity level).

3.1 Virtual Machine

In the vehicular system, there are many entities to support. ECU support, ADAS, the instrument cluster, and infotainment all require high performance and fast boot times. Besides, safety and security are critical to handling failure. The system faces many challenges in hardware sharing. The featured areas of sharing lie in GPU and device sharing, CPU core sharing, and interrupt request routing. We create a virtual machine to host several entertainment and media processing workloads in the guest operating system. It can hold many virtual CPUs, restricted RAM, and virtualized timers and interrupts, and is set up for a secondary paging table. It has access to pre-configured hardware devices such as PCI devices, memory-mapped IO, and interrupts. As shown in Fig. 3.1, each workload has its own characteristics and requirements that are unique to each different function. Some perform 3D rendering and media processing, while others handle GPU computation and visual display for the ADAS platform. Intel graphics drive the next-generation Internet of Vehicles entities.

3.2 Embedded SoC

The neural network accelerator not only improves system performance but also runs calculations at minimal power. The various computations run on either a CNN accelerator, VPU, DSP, CPU, or GPU (Fig. 3.2).


Fig. 3.1 Graphical virtualization workload (Courtesy of Intel)

Fig. 3.2 System on chip (Courtesy of Imagination Technologies)

Hardware accelerators designed for specific purposes process data at much lower power and with higher throughput as complexities increase. The NVIDIA platform [1] is powered by four high-performance artificial intelligence processors, coupling two Xavier SoC processors and two next-generation discrete GPUs with deep learning and computer vision hardware accelerators. Sensor data experience three stages of processing before we fuse them for their applications: the low-level, intermediate-level, and high-level processing stages. Pixel processing occurs at the low-level stage, where hundreds of millions of pixels are processed in one second. The second stage processes thousands of objects per second. The high-level processing stage completes its object recognition task at a speed of about twelve objects per second. The next segment of computation, in the control logic functions, makes decisions before sending them to the actuator or visual display. The advent of the programmable MPSoC helps to leverage the sensor fusion task together with machine learning techniques using the OpenCV image processing libraries in a hardware-optimized architecture. The all-programmable chips provide higher efficiency at higher speed and lower cost, leveraging the SWaP-C comparison. For all the connectivity, the FPGA caters to the hardware programmability feature while the SoC builds the software algorithm.


Fig. 3.3 NVIDIA Xavier (Credit Nvidia)

The advanced, high-performing NVIDIA Xavier system-on-chip and DRIVE software provide for the numerous deep neural network AIs for robust control of autonomous vehicles. These advanced learning tools can pave the way for localization, sensor perception, and path planning in self-driving vehicles. The NVIDIA DRIVE software utilizes the power of the GPU to simulate the cameras, LiDAR, and radar for validating the hardware- and software-in-the-loop. In the system, the GPU performs matrix multiplication. The energy-efficient hardware runs parallel computing and powers the AI to run its sensor fusion processing for its self-driving functions. For example, the NVIDIA DRIVE AGX Pegasus requires no pedals or steering wheel for highly automated vehicles. Figure 3.3 shows the NVIDIA Xavier system-on-chip device.

3.3 Embedded Safety Processor

The embedded real-time machine learning technique employs a neural network process to analyze the image recognition system for the time-critical decisions of self-driving cars. The latest spiking neural network (SNN) models have more advantages than conventional CNNs. Their lower power consumption and increased throughput enable their usage in the FPGA processor to handle greater numbers of video channels at much higher frame rates. These FPGAs have better responsiveness and determinism than GPUs. Moreover, their parallel processing ability allows for reprogrammable tasks in the multiple processing cores of the systems-on-chip (SoCs). Figure 3.4 illustrates the embedded processing architecture. The embedded technique provides for all the vehicular applications we can propose. Errors arising from the embedded processor are a big concern in vehicle safety. The safety processors must adhere to the ISO26262 standard for vehicle autonomy to drive the system to the autonomous driving stage; any failure between the processing modes of operation is critical to the autonomous vehicle at travel.


Fig. 3.4 Embedded technology (Credit TI)

Adherence to safety requirements adds complexity to the redundancy of the system-on-chip (SoC) design when designing a safety processor for the automobile. A built-in self-test during power-on is necessary to ascertain that the hardware is free of error. Besides, the ability to isolate memory and peripherals on detection of a fault is also mandatory to ensure safe vehicle operation. The safety processor must handle failures to reduce risks and mitigate the ICs' errors. Error correction codes are necessary to enforce data integrity on memories and caches. As far as high bandwidth is concerned, the vehicle utilizes Gigabit Ethernet (GigE), MIPI, LVDS, and JESD204B. Other interfacing standards such as I2C, UARTs, SPI, and CAN are also established in automobile connectivity. The Mobile Industry Processor Interface (MIPI) Alliance issues the A-PHY specification standard. The specification optimizes wiring, weight, and cost. The MIPI [2] interface network for cameras and sensors has low power consumption. The physical layer specs also provide high-speed data, control data, and power-sharing on the same wiring.

Arm Processor

Multi-processor SoC ICs leverage system performance in terms of parallelism and high bandwidth and provide a programmable logical environment. Figure 3.5 shows the Tetra series Arm processor for automotive purposes. It works with a power of 4.1 W and runs at a speed of 800 MHz. With a storage capacity of 2 GB, its four-core NXP i.MX6 processor provides HDMI video, USB, RS-232, and Ethernet connections, and a MicroSD interface.
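As an illustration of the power-on self-test mentioned above, the sketch below runs a cut-down memory march test over an ordinary buffer: write a pattern ascending, verify it descending, and repeat with the inverted pattern. Production BISTs are hardware-assisted and far more thorough; this only shows the shape of the idea.

#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

// Simplified memory self-test: write/verify alternating patterns in both
// directions (a cut-down march test; real BIST is hardware-assisted).
bool march_test(volatile uint32_t* mem, std::size_t words) {
    const uint32_t patterns[] = {0xAAAAAAAAu, 0x55555555u};
    for (uint32_t pat : patterns) {
        for (std::size_t i = 0; i < words; ++i) mem[i] = pat;  // ascending write
        for (std::size_t i = words; i-- > 0;)                   // descending verify
            if (mem[i] != pat) return false;                    // stuck-at fault
    }
    return true;
}

int main() {
    std::vector<uint32_t> buf(1024);   // stand-in for a RAM region under test
    std::printf("memory test %s\n",
                march_test(buf.data(), buf.size()) ? "passed" : "FAILED");
}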

3.4 SoC Functional Safety

Bugs have increased with the ever-increasing complexity of the silicon-on-chip SoCs in an advanced automated vehicle.


Fig. 3.5 Automotive arm processor

Automotive software providers have started to meet the challenge of counteracting bugs before simulation verification can begin. This helps to save a fraction of the time and money spent retracing back to the stages where bugs happen. Advancement in chip technology makes the smaller transistors run at faster clock speeds, therefore producing heat. Furthermore, electromagnetic interference, crosstalk, neutrons, and alpha particles give rise to faults in transistors due to die shrink. These problems also create bit errors in dynamic random-access memory (DRAM) systems. To remedy these faults, hardware and software lockstep are necessary. Software replicas can perform the safety-critical check. Although the approach cannot compensate for Bohrbugs [3], it can identify and recover from Heisenbugs. Bohrbugs always cause failure during operation, while Heisenbugs may not necessarily result in an erroneous operation. As one remedy, a system could use fraternal replicas, which perform the same calculations using different algorithms. Another way to detect a fault is to use identical replicas. In this model, two or more identical computations run on different threads and different memory. A majority vote for the same conclusion among these replicas enhances confidence in error detection. In the replication scheme, the system designer can interpose middleware into synchronizing points in the communication paths. The middleware replicates the messages from the client to the servers and receives results from the servers. We can check the subsystem to be within permitted tolerances. In the implementation, the server model divides into fraternal and identical models. The fraternal model is further classified into the monitor and peer fraternal models. In the identical server model, the same SoC in each different server runs on different processing cores. In the peer fraternal model, different compilers compile the same source code. Different levels of monitor functionality are used in the monitor fraternal model. These measures ensure that the ISO26262 ASIL (automotive safety integrity level) rating is within the limit for regulating automotive functional safety. In summary, car manufacturers achieve functional safety by determining, analyzing, and mitigating risk hazards.
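A minimal sketch of the identical-replica scheme follows: the same computation runs on three threads and the result is accepted only on a two-out-of-three majority. The stand-in compute() function and the vote logic are illustrative; a real system would also place the replicas on different cores and memory.

#include <cstdio>
#include <thread>

// Identical replicas: run the same computation on three threads and
// accept the result only if a majority agrees (2-out-of-3 vote).
static long compute(long x) { return x * x + 1; }   // stand-in safety function

int main() {
    long r[3];
    std::thread t0([&] { r[0] = compute(42); });
    std::thread t1([&] { r[1] = compute(42); });
    std::thread t2([&] { r[2] = compute(42); });
    t0.join(); t1.join(); t2.join();

    if (r[0] == r[1] || r[0] == r[2])
        std::printf("accepted: %ld\n", r[0]);
    else if (r[1] == r[2])
        std::printf("accepted: %ld (replica 0 outvoted)\n", r[1]);
    else
        std::printf("no majority: enter fail-safe state\n");
}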


References

1. https://www.nvidia.com/en-us/self-driving-cars/drive-platform/hardware/
2. https://www.embedded-computing.com/guest-blogs/mipi-worlds-journey-from-mobile-to-automotive-interfaces
3. https://www.cs.rutgers.edu/~rmartin/teaching/spring03/cs553/papers01/06.pdf

Chapter 4

Autonomous Vehicle Data Processing

Innovation in the autonomous vehicle fosters an increase in the algorithms and software for the onboard hardware, electronic control units (ECUs), sensors, and actuators. As a result, a large amount of data harbours in the vehicle's electrical and electronic systems. Heterogeneous data sources arise from the control commands and the many different sensors. We must account for information from the GPS, cameras, radar, LiDAR, control unit, and even error messages in the vehicle data processing. Through the vehicular OS or QNX platform, the autonomous car utilizes the sensor data to compute its acceleration and steering angle and navigates through its projected path simultaneously. The amount of vehicle data varies in volume and frequency. Solving the dataflow problem poses a challenge to autonomous vehicle developers. Safety- and time-critical systems involve the autonomous driving system and the ADAS or advanced driver assistance systems. We employ multicore processors and GPUs to perform the internal vehicle computations. Functional safety must be present in the software for vehicle data processing. Automobile engineers have to ensure the vehicle software architecture is reliable through several stages of testing to provide safety in communications and information processing in a time-critical manner. Altogether, the vehicle dataflow requires a distributed architecture that is fast, reliable, and scalable to fulfill its mission in the vehicle application. Figure 4.1 illustrates data centricity in vehicles supporting higher autonomy. We require the velocities of both the front and the following vehicles to compute the safe distance between them. Hybrid fusion is more effective at processing data nearer to the sensors. Distributed systems reduce wiring, allow for redundancy, and spread the workload. Equation 4.1 below gives the safe distancing formula, with L the average vehicle length, ρ the rear vehicle's response time, a_a and a_b the maximal acceleration and braking of the vehicles, respectively, T_f the time for the rear car to reach a full stop if it applies maximal acceleration during the response time and thereafter maximal braking, and v_r and v_f the velocities of the rear and front vehicles.


Fig. 4.1 Data centricity (Credit RTI)

d_min = L + T_f (v_r − v_f) + ρ (a_a + a_b) (T_f − ρ/2) − (a_b T_f^2)/2,
where T_f = ρ + (v_r + ρ a_a)/a_b.    (4.1)
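A small numerical check of Eq. 4.1, under the stated assumptions (the rear car accelerates at a_a during the response time ρ and then brakes at a_b, while the front car holds v_f; the example figures are arbitrary):

#include <cstdio>

// Safe following distance per Eq. 4.1: rear car accelerates at a_a for the
// response time rho, then brakes at a_b; front car is assumed to hold v_f.
int main() {
    const double L   = 5.0;    // average vehicle length, m
    const double rho = 1.0;    // rear vehicle response time, s
    const double a_a = 2.0;    // maximal acceleration, m/s^2
    const double a_b = 4.0;    // maximal braking, m/s^2
    const double v_r = 10.0;   // rear vehicle velocity, m/s
    const double v_f = 8.0;    // front vehicle velocity, m/s

    const double T_f = rho + (v_r + rho * a_a) / a_b;   // time to full stop
    const double d_min = L + T_f * (v_r - v_f)
                       + rho * (a_a + a_b) * (T_f - rho / 2.0)
                       - a_b * T_f * T_f / 2.0;
    std::printf("T_f = %.2f s, d_min = %.2f m\n", T_f, d_min);
}

With these figures the rear car stops after T_f = 4 s, and the minimum safe gap comes out at 2 m.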

The chapter gives an overview of the vehicle operating system, its embedded processor, the Databus, the Ethernet, and remote processing.

4.1 Vehicle OS Structure

The real-time operating system in the automotive architecture must be reliable, scalable, and modular, besides being safe and secure. Open-source virtualization platforms for automated cars bring customization and reuse and reduce time to market. We define virtualization as the abstraction or partitioning of multiple resources into virtual execution environments. It brings about the development of the AGL software-defined car architecture. The software-defined vehicles in the autonomous system require virtualization to execute hundreds of different functionalities. Automotive Grade Linux (AGL) serves well as an automotive operating system to support the various functions of the autonomous vehicle. It provides an isolation framework for the many different safety and security levels required for execution. This spans from car connectivity to the automatic driving features of the automobile. It can drive many different functions such as telematics, the instrument cluster, the heads-up display (HUD), infotainment and IVI systems, Heating, Ventilation, and Air Conditioning (HVAC), the rear-view camera, navigation, advanced driver assistance systems (ADAS), and safety-critical functions. The system supports scalability, flexibility, interoperability, and configuration during run-time. Software companies build AGL using the hardware board support package, framework and APIs, middleware, the application Linux kernel, a software development kit (SDK), and reference applications.


Fig. 4.2 AGL virtualization architecture (Credit Linux Foundation)

The hypervisors, automotive functions, and virtual machines are interchangeable, which creates versatile modules in the system architecture. We can fully customize the OS to become a complete system. With AGL virtualization (Fig. 4.2), we can also support multiple CPUs, software licenses, and hypervisor units for deployment. The AGL platform runs functional-safety-critical software inside a container or hypervisor while running a legacy OS (e.g. QNX) and Linux simultaneously. The architecture makes it possible to deploy different layers of critical functionality with real-time responses. Automakers can isolate the many safety-critical functions from the rest of the system. AGL powers V2V and V2C connectivity to collect and share a common open database on traffic situations, decisions, road conditions, and intersections. This virtualization platform creates a heterogeneous open-source development platform ideal for the automotive system. With such benefits, we can deploy it in millions of cars. The hypervisor is computer hardware, software, or firmware that executes and runs virtual machines. It creates a hardware abstraction for the OS, automotive functions, and applications. With its virtualization in embedded devices, it stands out for its size, weight, power, portability, isolation, and handling of hardware obsolescence. Hypervisors, containers, and system partitioning enable the execution of multiple function platforms with many different functional safety requirements. The Docker containers of a vehicle manage system dependencies. The partitioning of the virtual firewall secures Java code for non-critical functions. The multicore structure offers safety for autonomous driving. The purpose of the hypervisor is to monitor the incoming data. It offers not only device virtualization, consolidation, and separating time or space partitioning but also security for communication. How do we select the hypervisor? We have to look into the expandability of the ecosystem, modularity, and open support structure. Both the QNX Hypervisor 2.0 virtualization software and BlackBerry's QNX Software Development Platform 7.0 build the human-machine interface graphical screen. The QNX Hypervisor 2.0 permits the Android and Linux operating systems to function separately in the cockpit of the automobile. The new software platform runs VxWorks with hypervisor technology (Fig. 4.3a) for the RTOS. As outlined, BlackBerry QNX is becoming popular in the automotive industry for providing functional safety for embedded software. Automakers leverage the software architecture by running safety-critical functions

Fig. 4.3 Automotive architecture


like the control, sensing, and actuation of the vehicle alongside the infotainment and telemetry functions, which are considered less critical. This new platform provides Ethernet-based communication and AI with advanced sensing and computing features. The framework offers connected and autonomous driving a safe and reliable vehicle system for deployment on highway roads. Figure 4.3 illustrates the architectures for the hypervisor, SAFERTOS, and automotive software. The RTOS layout of Fig. 4.3c supports task separation, isolation, and sophisticated task monitoring known as SAFE Checkpoints.

4.2 Processors

A very high-speed processor, or millions of instructions per second (MIPS) of processing, is required to run the latest sensor fusion and advanced driver assistance systems (ADAS). Artificial intelligence techniques are deployed to gain advantages in machine vision and radar detection and for driver condition evaluation. Thus, the vehicle demands very high-speed processing. Additional accelerators such as digital signal processors can run deep learning tasks or complicated neural network algorithms. The image processing workload poses a challenge to vision system developers, especially in today's computing technology, where machine vision processors rely heavily on neural networks and deep learning computations. Deploying these processing algorithms for precision and accuracy can compromise processing time. Fortunately, with the advent of fast, low-power processors, we can reduce the latency in image processing and transmission while retaining accuracy in image recognition. The FPGA-based processor can pick bad pixels for correction and perform Bayer interpolation. The modern GenICam camera describes the camera's properties in an XML descriptor file. As such, the machine vision software, when used in conjunction with the cameras, provides high-speed, low-latency computing in the ECU or server through the fiber-optic interfaces.

4.3 Ethernet

In today's market, the highlighted issues for the automotive industry are pervasive networking together with artificial intelligence. The electric powertrain requires noiseless performance over the sensor signal, control, and actuation paths of hybrid and electric vehicles. The introduction of Ethernet (Table 4.1) serves this purpose. Inside the car, the Ethernet (Fig. 4.4) forms the vehicle's internal backbone. Ethernet connectivity provides a high-speed, noiseless communications backbone over a ubiquitous physical network. The automotive Gigabit Ethernet plastic optical fiber offers a high-quality initiative and reliability for communication. The GEPOF transceiver for electric and autonomous driving provides high connectivity with low latency and low jitter at an operating speed of 100 Mbps.

Table 4.1 Automotive Ethernet

OSI layer          Automotive Ethernet
7. Application     HTTP, SMTP, FTP, SOME/IP, etc.
6. Presentation
5. Session
4. Transport       TCP, UDP
3. Network         IP
2. Data Link       100BASE-T1, 1000BASE-T1
1. Physical

Fig. 4.4 Vehicle generic architecture

The communication bus of the transceiver can withstand heat, temperature, loading constraints, and mechanical and electromagnetic noise over its transmissions. The radiation-free harness architecture of the POF-based communication protocol (Fig. 4.5) provides galvanic isolation between communication modules. The isolation is necessary for hazardous operating voltages above 25 V AC or 60 V DC between the converters and the different operating modules. Undisturbed data communication is achievable for the many different domain interactions in the electric vehicle. The Ethernet from Microchip [1] provides a range of specific vehicular applications. For example, it caters for vehicular diagnostics, in-vehicle connectivity and backbone, telematics, infotainment, and Advanced Driver Assistance Systems (ADAS). Testing is necessary to ensure error-free communication. Network interaction, Ethernet, Internet Protocol (IP), gateways, and switches are some of the components


Fig. 4.5 Electric car POF optical connectivity (Credit KDPOF)

for testing. Several testing characteristics are available: load and scalability testing for vehicle-installed cameras, latency tests for ADAS equipment, and quality tests for media files. Automakers conduct availability and reliability testing across a wide variety of scenarios to ensure the consistency and predictability of the critical powertrain, chassis, and other vehicle functionalities. Next, conformance testing consists of negative testing and performance testing. The former tests the vehicle for functional safety when the Ethernet transmission breaks down. The latter checks the maximum load capacity of the vehicular network bandwidth, namely whether the car can still deliver the brake function message upon reaching its full network capacity. Besides, testers also need to check for the correct boot time and latency and validate the sleep and wake-up cycles. The Ethernet must be resistant to electromagnetic interference. The transmission rates for 100BASE-T1 and 1000BASE-T1 are 66.66 MHz and 750 MHz, respectively. The link uses one unshielded twisted-pair cable and is full-duplex. The DiSTI Corporation has released the latest version of GL Studio, 6.2, for 2D/3D graphical user interfaces.

4.4 Vehicle Databus

Control buses in automobiles link up all their sensors, actuators, ECUs, etc., communicating through optical, electrical, or wireless protocols. There are many different protocols in today's car communication; let us name a few. CAN serves the network to control a variety of in-vehicle functions, namely the brakes, engine management, window and seat operation, etc. (Fig. 4.6). FlexRay utilizes two independent data channels for mission-critical data transmission. Infotainment relies on an electrical or optical ring topology to transport data using MOST. LIN provides a cheaper alternative to CAN.
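As a concrete taste of bus access, the sketch below sends one raw CAN frame through Linux SocketCAN. The interface name, arbitration ID, and payload are placeholders, and error handling is pared down; it illustrates the mechanism rather than any particular ECU's message set.

#include <cstdio>
#include <cstring>
#include <unistd.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <linux/can.h>
#include <linux/can/raw.h>

// Send one raw CAN frame via Linux SocketCAN.
int main() {
    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
    if (s < 0) { perror("socket"); return 1; }

    ifreq ifr{};
    std::strcpy(ifr.ifr_name, "can0");             // placeholder interface
    ioctl(s, SIOCGIFINDEX, &ifr);                  // resolve interface index

    sockaddr_can addr{};
    addr.can_family = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    bind(s, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    can_frame frame{};
    frame.can_id = 0x123;                          // placeholder arbitration ID
    frame.can_dlc = 2;
    frame.data[0] = 0x01;                          // hypothetical command byte
    frame.data[1] = 0xFF;
    write(s, &frame, sizeof(frame));               // one frame onto the bus
    close(s);
}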


Fig. 4.6 Control and infotainment software (Courtesy of Wittenstein)

A databus in a vehicular system has requirements on latency, security, reliability, frequency, and data volume (Fig. 4.7). A high-speed data bus performs multi-tasking functions such as planning and navigation, mapping, and sensor fusion tasks. The autonomous vehicle data bus features four different domains. Figure 4.8 shows the data bus architecture. The layered data bus consists of the machine, unit, site, and cloud data buses. The machine data bus carries vehicle control, sensor fusion, planning, etc. from the vehicle. The unit data bus harbours the V2V and V2X nearby sensings; information on the unit data bus comes from environmental weather, dynamic vehicle location, and other probe sensor data. The site bus taps data from traffic management and from environmental and road conditions suitable for fleet management. Each intelligent hierarchy level consists of a data model, discovery, and security domain. The site data bus requires

Fig. 4.7 Autonomous system data (Credit RTI)


Fig. 4.8 Data bus architecture (Credit RTI)

discovery control, data model translation, and subsystem export control. The vehicle data bus must accommodate various features of the autonomous system, including the cloud-based data bus for traffic flow and navigational mappings. There are many requirements for gathering data for analysis at different speeds in the vehicle. Safety and location data need 0.5 Mbps. Vehicle diagnostics need 10 Mbps. The camera and LiDAR need 1 Gbps. Autonomous computation requires 5 Gbps. The A-PHY [2] high-speed data interface in the autonomous car provides a speed of up to 24 Gbps (Fig. 4.9). This speed is required to process all the data coming

Fig. 4.9 Centralized processing module (Credit Xilinx)


Fig. 4.10 Data bus flow (Credit RTI)

from the vehicle’s external sensors. We also make embedded vision processing tasks easier from the high-speed data link infrastructure in the self-driving car. DDS data connect provides matching to the architecture without codes. It works by communicating states instead of sending messages. The data bus automatically discovers and connects publishing and subscribing applications. The Quality of Service control permits each different module in the bus to specify its update rates, reliability, data availability guarantees, and much more. It separates development and deployment designs, allowing for full location, chip, network, and OS transparency. Its flexibility, and robustness targets it to support vehicle autonomy at level four and five where highly and fully automation occurs. RTI data bus connects the vehicle components for application in the vehicular network system. Components function through dataflow interactions (Fig. 4.10), thereby arises the data-centric connectivity. RTI Connect Drive draws the Data Distributed Bus for data distribution in the interconnected modules in the vehicle platform for vision and machine learning processing applications. The platform allows us to integrate the automotive ecosystem such as ROS2, AUTOSAR Classic, and AUTOSAR Adaptive through open Data Distribution Service (DDS) standard. DDS implements a virtual abstraction of a global data space. The scalability, ease of integration and safety certification, deployment flexibility, and real-time performance of the interconnected data bus makes it the right choice for automakers. It includes the software development kit to develop autonomous drive applications, integrating software within the diverse operation in the system for interoperating functions. Linking fleet control, external vehicle communication, and data exchange, and provide software updates with ISO26262 ASIL D safety standard. It runs the legacy AUTOSAR on the NVIDIA processor without any operating system. Lastly, by employing the data-centric architecture, we can have the advantage of defining safety-critical messages and data integrity messages separately. Thus, it provides an automated vehicle with a framework for security. The centricity of the architect also enables continuous vehicle mobility even when the IP address changes. Furthermore, it also reduces the lines of codes in the ECUs.


4.5 Remote Processing

Allowing remote data processing and storage helps to free the in-vehicle processor for other critical functioning tasks. The Software as a Service (SaaS) platform enables the user to run the provider's application in the cloud. Next, Infrastructure as a Service (IaaS) provides the user with managed services to run their remote applications. Lastly, Platform as a Service (PaaS) is allocated for third-party developers to deploy their applications. These cloud-computing platforms bring adequacy and redundancy to managing large data storage and computations. The business models are available to users as a pay-per-use or as-needed remote facility. However, cloud computing technology depends on the network to reduce jitter, packet losses, and delays.

4.5.1 Internet of Things

Google's IoT Core provides services like Google Cloud Dataflow, Google BigQuery, and the Google Cloud Machine Learning Engine. These services serve the requirements of millions of devices for their data without too much complexity. The Amazon cloud platform also offers a choice for automotive information dataflow, and other cloud environments like Microsoft Azure serve the same IoT purpose. MQTT carries output data from the device layer to AWS IoT, which forwards it to SQS, Amazon's Simple Queue Service. An example of an IoT solution in the automotive industry would be to provide driving information, power consumption, and navigational data for partners in the driving community.
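A device-layer publish over MQTT can be as small as the sketch below, written against the Eclipse Paho C++ client. The broker address, client ID, topic, and payload are placeholders; a real AWS IoT endpoint would use TLS on port 8883 with device certificates.

#include <string>
#include <mqtt/async_client.h>   // Eclipse Paho MQTT C++ client

int main() {
    // Placeholder endpoint and client ID.
    mqtt::async_client client("tcp://broker.example.com:1883", "vehicle-42");
    client.connect()->wait();

    const std::string payload = R"({"speed_kph":50,"soc_pct":76})";
    client.publish(mqtt::make_message("fleet/vehicle-42/telemetry", payload))
          ->wait();              // block until the message is delivered

    client.disconnect()->wait();
}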

References

1. https://www.microchip.com/design-centers/automotive-solutions/automotive-products/connectivity/ethernet?utm_source=embedded-computing.com&utm_medium=Native&utm_campaign=AutoEthernet
2. https://www.embedded-computing.com/automotive/solving-critical-challenges-of-autonomous-driving

Chapter 5

Vehicle Components

The automobile suspension system, brakes and steering, electric power steering, and electronic control units are the primary components in vehicular construction. Cybersecurity safeguards the vehicle system so that it functions securely and free of hacking. Improvements to the vehicle suspension system reduce vibration for a better ride for passengers. Regenerative braking and ECUs boost the electrical powertrain of the automobile. Entrusted safety-critical automobile processors ensure no malfunctioning of the vehicle that could create any accidents.

5.1 Brakes and Steering

Fleet management for taxi and ride-sharing services on the road drives the demand for steering system durability. The steering system functions to control the steering wheel and road wheel angles and the steering assist feature. The increase in load or mileage in cars creates tension for steering system adaptation and improvement. The voltage applied to the braking or acceleration of the car depends on the road's environmental condition. In the event of system failure, we need to safeguard the vehicle as well as its surroundings. Certain functions of the car must remain operable to ensure safety in the highly responsive scenarios of the failure mode. By developing a highly sophisticated system, we counteract vehicle failures so that critical functions, such as the braking and steering of the car, operate safely. During brake failure, the front and rear wheels must remain free so that the vehicle stays steerable and stable, respectively. Also, during vehicle failure, the automobile must be gear-locked to a secure standstill. In the case of failure during car deceleration, we bring the car to a higher deceleration rate in line with the higher operating speed of the vehicle. Bosch's solution for a fail-degraded brake system is the combination of its electromechanical brake booster, the iBooster, and the ESC (Electronic Stability Control) or ESP (Electronic Stability Program) systems. Both are independently capable of performing vehicle brake functions in case of failure. Other companies like General


Motors and Ford also implement their own strategies in the brakes and steering system. At level 5 autonomy, we can eliminate the steering wheel and pedals. The new drive-by-wire technology drives a vehicle through the CAN bus without any driver or control robot in place. The drive-by-wire or steer-by-wire technique utilizes a steering angle sensor to feed signals back from the ECU to electric motors that actuate the commanded steering input angle. Electric motors replace the mechanical throttle links. In another innovative drive-by-wire application, a handicapped driver with minimal mobility has the option to use driving aids controlled by microprocessors for the acceleration, steering, and braking system. Car manufacturers can create redundancies for a fail-safe driving system as well. The torque vectoring technique, applied through the traction motors or brakes, can drive the front wheels in a controlled manner. By varying the traction forces, the vehicle remains steerable in times of emergency. Similarly, the smart braking system employs the brake-by-wire concept, which calls for the elimination of the clamps, hydraulic fluid, hoses, and vacuum booster, and of the process required to fill and bleed the system. Tactile feeling helps the driver to regulate over- or under-driving conditions. The TFD or tactile feel device produces the feel for the driver from the magnetic resistance force of the device. We can fit the device onto the vehicle between the steering controller and the wheel. An algorithm designed to counteract hysteresis feel, sense accelerator feel, and counter centrifugal slide serves well too. Additional control and sensing are not limited to wheel synchronization and directional control, variable steering ratio, steering feel, and steering wheel feedback control. EPS or electric power steering drives without a steering wheel; nevertheless, it needs a backup actuator in case of EPS failure for fail-safe operation.
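The torque vectoring idea reduces to biasing the torque split between the left and right wheels against a yaw-rate error. A minimal proportional sketch, with a hypothetical gain and saturation limit:

#include <algorithm>
#include <cstdio>

// Simplistic torque vectoring: split a torque request between the left and
// right wheels based on the error between desired and measured yaw rate.
struct WheelTorques { double left_nm, right_nm; };

WheelTorques vector_torque(double request_nm, double yaw_err_rps) {
    const double k = 200.0;    // hypothetical gain, Nm per rad/s of yaw error
    double delta = std::clamp(k * yaw_err_rps,
                              -0.4 * request_nm, 0.4 * request_nm);
    // A positive error (understeer) shifts torque toward the outer wheel.
    return {request_nm / 2.0 - delta / 2.0, request_nm / 2.0 + delta / 2.0};
}

int main() {
    WheelTorques t = vector_torque(400.0, 0.15);   // 400 Nm total request
    std::printf("left %.1f Nm, right %.1f Nm\n", t.left_nm, t.right_nm);
}

A production controller would add integral action, wheel-slip limits, and fail-safe gating; the sketch only shows the torque split itself.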

5.2 Electric Power Steering

The power steering control function breaks down into four different activities: the CSF, ACSF, ESF, and ASF for short. Firstly, the Corrective Steering Function (CSF) helps to keep the vehicle within its tracking lane. Secondly, the Automatically Commanded Steering Function (ACSF) covers categories A to E, with low-speed maneuvering, continuous lane-keeping systems, and automated lane change systems. Thirdly, the Emergency Steering Function (ESF) exists to avoid imminent collisions. Lastly, the Autonomous Steering Function (ASF) is capable of activating at least partly based on a signal generated off-board the vehicle. Infineon delivers innovative steering systems for automated driving. The EHPS has several goals in the system. It functions to avoid erroneous torque control, blocked steering, sudden loss of steering assistance, and too little steering assist. The lane assist system takes input from the torque sensor and steering angle. The torque generated by the three-phase motor of the EHPS keeps the car in line with the lane. When combined with a camera, the vehicle installed with EPS (Fig. 5.1) performs


Fig. 5.1 Electric power steering (Courtesy of Continental)

self-centered steering within the lane.

5.2.1 Cyberspace Security

The autonomous system falls prey to the threat of cyberattacks. It needs secure control over its network and infrastructure, with the ability to map out vulnerabilities across all autonomous platforms. The autonomous vehicle regulating policy incorporates a holistic system for safety, reliability, and privacy. The auto-drive needs to address five areas, especially for the vehicle control and steering system: threat intelligence, hardware security, software security, network security, and automotive Ethernet. All the sensors connecting to the ECUs transform their sensor information to control the vehicle. Sensor fusion combines signals coming from the air into the car with those from the car's other sensors. Vulnerability in cyberspace brings about unreliable received signals, which disrupt and can wreak destruction on the auto-driving vehicle. In the automotive radar system, signal jamming attacks cause objects to disappear from detection. Spoofing penetration alters the perceived object distance, resulting in catastrophe if an accident occurs. Signal jamming and spoofing are also possible with all types of GNSS receivers. A signal jammer creates interference to jam the GNSS signal. There may also be a case of signal deception, where fake signals are provided and the receiver is hacked and hijacked; this is called signal spoofing. Spoofing can affect the car in its power calculation, rate of acceleration and deceleration, navigation, and mapping. It can cause uncontrolled acceleration of the car from 30 to 100 kph. Another example of spoofing is an air suspension height change, which leads to inaccurate calculation of the car's speed and results in different routes being tracked. Protection must be executed for navigation, GPS, LiDAR, smart cameras, radar, and other sensors. Regulus has provided technology to prevent sensor hacking, thereby adding reliability and security. Regulus has successfully integrated the RF communication with a resilient chip in the GNSS receiver. The idea is to prevent signal spoofing, interference, jamming,


Fig. 5.2 GNSS receiver

and attacks on the automated vehicle. This upholds the ISO/SAE CD 21434 standard in road vehicle cybersecurity engineering. It leads to the fulfillment of sensor reliability, security, and robustness for automotive use. The signal entering the antenna goes through the low-noise amplifier, mixer, bandpass filter, and an analog-to-digital converter, aided by the frequency synthesizer. The GNSS receiver (Fig. 5.2) performs these RF front-end signal activities. The filtered signal passes through the microcontroller unit. Regulus' modified digital baseband microcontroller chip caters for signal acquisition, tracking, and navigation data processing. Finally, the signalling data enters the Ethernet for sensor fusion activity. Cybersecurity is a central concern for the electric power steering and its steering system.
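One inexpensive layer of spoofing defense is a plausibility gate: a genuine fix cannot imply speeds or accelerations beyond the vehicle's physics. A sketch with illustrative thresholds (a generic check, not Regulus' method):

#include <cmath>
#include <cstdio>

// Reject GNSS fixes whose implied speed or acceleration is physically
// impossible for a road vehicle: a cheap spoofing/jamming plausibility gate.
struct Fix { double x_m, y_m, t_s; };

bool plausible(const Fix& prev, const Fix& next, double prev_speed_mps) {
    const double kMaxSpeed = 70.0;   // ~250 km/h, illustrative ceiling
    const double kMaxAccel = 12.0;   // m/s^2, beyond any braking or launch
    double dt = next.t_s - prev.t_s;
    if (dt <= 0.0) return false;
    double v = std::hypot(next.x_m - prev.x_m, next.y_m - prev.y_m) / dt;
    return v <= kMaxSpeed && std::fabs(v - prev_speed_mps) / dt <= kMaxAccel;
}

int main() {
    Fix a{0.0, 0.0, 0.0}, b{40.0, 0.0, 1.0};   // a sudden 40 m jump in 1 s
    std::printf("fix %s\n", plausible(a, b, 14.0) ? "accepted" : "rejected");
}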

5.3 Vehicle Suspension System

A battery electric vehicle utilizes a traction motor for the drive and a second motor for the active suspension system. This arrangement eliminates the gearbox, universal joint, anti-roll bar, clutch, and transmission shaft. However, the electric motor introduces torque vibration that transmits through the vehicle's suspension system. Besides, the vertical force of the switched reluctance motor affects the anti-rollover behavior and lateral stability of the car, including the resonant vibration frequency of the vehicle and wheels altogether. Against the vehicle's high torque ripple and the unbalanced radial force vibrations, we use tuned spring-mass and magnetic dampers to damp the vibrations. A force-vectoring suspension system can absorb and reduce the vehicle vibrations. By implementing force-vectoring springs in the vehicle's rear suspension system, the vehicle reduces yaw and its sideslip angle. Overall, it improves the stability and response of the vehicle. With the force-vectoring mechanism, the vehicle can travel steadily at higher speed and increased loading. This is because more spring compression generates an increase in lateral


load, enabling the vehicle to be more stable as it travels. Force-vectoring springs with a twist beam improve the performance of the vehicle (Fig. 5.3). With softer bushings, we can improve the impact harshness of the car. Moreover, the usage of a Watt's linkage reduces oversteer and toe compliance. Figure 5.4 shows the active suspension system, which can recover up to 400 W of energy.

Fig. 5.3 Inclined spring force vector (Credit Ford)

Fig. 5.4 Vehicle suspension system (Credit Designboom)


5.3.1 Regenerative Braking

Regenerative braking offers 32% fuel savings, a 50% reduction in nitrogen oxide emissions, and a 30% reduction in CO2 emissions. The energy created by braking is stored in ultra-capacitors, which are light and cost-effective. The ECU software directs the stored energy to the electric motor to assist in acceleration or drive the drivetrain. Regenerative braking helps to increase the battery range by up to 480 km. It also decreases reliance on the friction brake, which causes tyre wear. While most companies focus on regenerative energy from braking, emerging regenerative energy also comes from the vehicle suspension system. Audi has developed the eROT system, where electric motors act as electromechanical rotary dampers. A lever arm transmits the wheel carrier movement to the electric motor through a gear series, and the motion converts into electricity during compression. In another novel suspension system, the impact of the wheel in a selective suspension system reacts only when it meets a threshold. A German system damps the vibration shock from the suspension by using an electronically controlled, mechanically coupled linear generator (Fig. 5.5). As such, the regenerative system converts the kinetic energy harvested from the vehicle's acceleration and the vertical suspension motion into electrical energy. It has the benefit of yielding regenerative energy of 15 kWh per 100 km, with a wheel suspension movement of ±25 mm per meter. The 48 V mild-hybrid drive saves 0.7 L of fuel for every 100 km of travel from the regenerative energy.
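The recoverable energy in one braking event is bounded by the change in kinetic energy times a conversion efficiency, which a few lines make concrete (the mass, speeds, and efficiency are illustrative):

#include <cstdio>

// Upper-bound estimate of energy recovered in one braking event:
// change in kinetic energy times a drivetrain conversion efficiency.
int main() {
    const double mass_kg   = 1600.0;
    const double v0 = 27.8, v1 = 8.3;   // 100 km/h down to 30 km/h, in m/s
    const double efficiency = 0.6;      // illustrative regen efficiency

    double dE = 0.5 * mass_kg * (v0 * v0 - v1 * v1);   // joules
    double recovered_Wh = efficiency * dE / 3600.0;
    std::printf("recoverable: about %.0f Wh per stop\n", recovered_Wh);
}

With these figures roughly 94 Wh comes back per stop, which shows why stop-and-go duty cycles benefit most from regeneration.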

Fig. 5.5 EV regenerative suspension system (Credit www.caradvice.com.au)


5.4 Electronic Control Units

An automated car employs hundreds of ECUs. The car's many ECUs drive the hardware of the HVAC, BMS, HEV, inverter, and MCU. These hardware systems include the software domain fusion for torque, stability, and battery control, and energy management. The demand for high-capacity ECUs in autonomous vehicles drives the requirement for solid-state drive (SSD) microcontrollers (MCUs) for signal processing. One central SSD connects to several ECUs distributed around the vehicle. The many ECUs in the vehicle driving system cater for the connections of the cameras, sensors, radars, and LiDARs, the car's infotainment, and the ADAS (advanced driver assistance systems) capability. In an example car architecture, one ECU controls the vehicle while a second ECU performs mapping, route planning, and sensor fusion functions. Parallel executions run on the multi-rate automotive control system. The vehicle engine requires each of its controllers to manage the fuel injection, engine noise, emissions, and many other components. Visteon's SmartCore domain controller enables the integration of instrument clusters, driver information applications, and Android O-based infotainment and heads-up displays into only one ECU instead of three. Therefore, the functional diagnostics of these ECUs play an important role in the automobile system. Virtual ECUs have several requirements to fulfill. The multicore processor must be able to allocate one or more cores to each execution environment. It must be a trusted computing module to isolate safety- and security-critical applications and assets (Intel Trusted Execution Technology and Arm TrustZone). The virtual hardware must support cache, memory, interrupts, and CPU for creating execution environments, such as Arm Virtualization Extensions, Intel VT-x, and AMD SVM. Cortex-R processors act as the central control systems and network controllers for ECUs. They enable the delivery of hard real-time, high-performance multi-core products. They function to control the Heating, Ventilation, and Air Conditioning (HVAC) system, combustion efficiency, and Spark Ignition (SI) and Compression Ignition (CI) engines. While the Cortex-R52 manages the control of the complete system, from charging, energy storage, drive source selection, and balancing to energy recovery and demand prediction, both the Cortex-M and Cortex-R processors manage the automobile electrification system for energy storage and charge management. Table 5.1 shows the various types of Cortex processors built for the many different functions of the autonomous vehicle. SoC ECUs must establish the root of trust; we accomplish this by authentication using certificates and keys.

Table 5.1 Various Cortex processors [1]

Processors                                     Functions                    Features
Cortex-A65AE, Cortex-R52                       Vision ADAS                  Control, computer vision, heterogeneous multi-core
Cortex-A76AE, Mali G76, Mali-C71, Cortex-R52   Autonomous driving           High-performance multi-cluster, functional safety, machine learning
Cortex-M7, Cortex-M0+                          Central body control         Efficient performance, scalable, low power
Cortex-R52                                     Powertrain                   Real time, homogeneous multi-core
Cortex-A76, Cortex-A55, Mali G52               Navigation and infotainment  Rich OS, security, energy scheduling

Reference

1. https://community.arm.com/developer/ip-products/system/b/embedded-blog/posts/a-startersguide-to-arm-processing-power-in-automotive

Chapter 6

Vehicle Navigation Computing

Vehicle components such as the engine, brakes, steering, batteries, smart sensors, cameras, etc. all send information to the cloud system through their embedded wireless microcontrollers. Cloud-based computing aids in integrating all the gathered sensor information into mappings that secure a safe journey. Edge computing, however, reduces latency and provides a faster means of decision-making for autonomous vehicle control and driving. Sensor fusion and analytics at the edge have faster processing speeds, responding to detections and steering the automobile away to safety in real-time. Cloud computing, in turn, serves to send vehicle information and conditions to users on a non-time-critical basis so that vehicle maintenance can be arranged in advance.

6.1 Wireless MCU

The embedded central processor collects information from the engine, steering, batteries, and brakes, besides the sensory data from smart cameras, ultrasonic sensors, LiDAR, and radar. The central processor links to the gateway through Gigabit Ethernet. The vehicle antenna and gateway provide communication between vehicular units. Therefore, strict intrusion detection and firewall protections laid upon these components form the foundation for the security of the vehicle. The gateway splits the line between the vehicle's smartphone and infotainment domain and the vehicle management unit, together with the braking and driving system. The advantage of the backend receiving infrastructure (Fig. 6.1) is that it enables the retrieval of road conditions and real-time vehicle data for online services, such as driver assistance systems. Furthermore, the connection information from the cloud allows manufacturers to obtain information about the reliability and wear and tear of the components used. Online diagnostics are conducted to improve vehicular hardware and software capability. Updates and apps are downloadable into the vehicle.


Fig. 6.1 Wireless data processing infrastructure

The automotive industry has acquired the over-the-air (OTA) flash MCU. The built-in flash memory updates the ECU software wirelessly and automatically. The new RH850/E2x series of flash MCUs (Fig. 6.2) incorporates up to six 400 MHz CPU cores and 9600 MIPS of processing capability. The automotive control MCU, featuring a built-in flash ROM of up to 16 MB, can update certain arbitrary areas during program operation.
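Updating flash areas during operation is commonly realized with a dual-bank scheme: the inactive bank is programmed and verified while the active bank keeps executing, and the swap takes effect at the next reset. A schematic sketch, with a placeholder checksum standing in for a real CRC:

#include <cstdint>
#include <cstdio>
#include <vector>

// Schematic dual-bank OTA update: program the inactive bank while the active
// bank keeps running, verify, then mark the new bank active for the next boot.
struct Bank { std::vector<uint8_t> image; uint32_t crc = 0; bool active = false; };

uint32_t crc_stub(const std::vector<uint8_t>& d) {   // placeholder checksum
    uint32_t c = 0;
    for (uint8_t b : d) c = (c << 1) ^ b;
    return c;
}

bool ota_update(Bank& standby, const std::vector<uint8_t>& new_image,
                uint32_t expected_crc) {
    standby.image = new_image;                // "program" the inactive bank
    standby.crc = crc_stub(standby.image);    // verify after write
    return standby.crc == expected_crc;       // only then schedule the swap
}

int main() {
    Bank a{{1, 2, 3}, 0, true}, b;
    std::vector<uint8_t> update{4, 5, 6};
    if (ota_update(b, update, crc_stub(update))) {
        a.active = false; b.active = true;    // swap takes effect at reboot
        std::printf("update staged; bank B boots next\n");
    }
}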

6.2 Mapping

Vehicle autonomy must address the components necessary to pursue movement from sensing to perception. Sensor fusion produces semantic understanding, leading to the autonomous decision for vehicular activation of the actuators. Altogether, the components necessary to create mappings for autonomous driving comprise sensor sensing, sensor fusion integration, the localization technique, and object classification into the world model. The sensing components are the primary sensors, including the cameras, radars, image and LiDAR sensors, GPS, and ECU fusion. Next, we integrate the sensors to form a hypothesis from all the collected sensor data, which then locates the vehicle on a map. Object classification comes from the semantic understanding of the movement of data in the 3D images developed from mappings. The system needs to solve for 3D mapping, VSLAM, electronic horizon predictive awareness, lightweight data sizes, and high-density maps for autonomous driving. Finally, we place the resulting data into the ever-growing


Fig. 6.2 Renesas’ 28 nm automotive flash MCU

huge knowledge database called the world model. With localization information on the map, we can predict route planning ahead of time.

6.3 Cloud Computing

Cloud-based cars aid in the development of maps for route selection. Car cloud computing allows vehicle-to-everything (V2X) communications to predict a sound and safe route and to avoid traffic jams. With such benefits in its development, it hints that cloud-operated vehicles will be coming to the road soon. Therefore, we must ensure security in the cloud-computing platform in parallel with building the autonomous vehicle technology as a complete system. We must mitigate the risks of having the autonomous car hacked by intruders, or of the autonomous vehicle being controlled by someone else through an IoT device. Situations where hackers intrude on DNS servers through vulnerable IP addresses lead to disaster for the vehicle. Moreover,



Moreover, cloud computing in the autonomous vehicle must be switchable, for example, to an airplane mode that disconnects it from the car driving system. This prevents vulnerabilities that might allow a virus to enter through the cloud-based vehicular system and take control of the cloud-enabled vehicle. Remote diagnostics enables data consolidation in the central monitoring station, which sends alerts to the regional technical centre for a fault in an individual vehicle's ECU. We can also relay this piece of information to the service centre if local engineers are unavailable. The automotive cloud satisfies this demand through OTA software updates and communication. The system analyses data in the cloud for better services and diagnostics. Besides transferring data automatically in the diagnostic process, the cloud also supports manual activation from the user interface. Data input to the cloud engages AI algorithms that continuously scan the data for anomalies. Therefore, we ensure that the vehicle condition and states are current to the customer at any instant. It promises a safe and secure journey to the patrons. The real-time model of the cars, which utilizes AI, can develop maintenance routines to serve a fleet of vehicles in the future.
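As a toy stand-in for the cloud-side anomaly scan described above, the following Python snippet flags telemetry samples that deviate strongly from the mean. Production systems use learned models; this simple z-score test only illustrates the idea, and the coolant readings are invented.

import statistics

# Flag samples more than `threshold` standard deviations from the mean.
def anomalies(samples, threshold=2.0):
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples)
    return [x for x in samples if abs(x - mean) > threshold * stdev]

coolant_temp = [88, 90, 89, 91, 90, 88, 131, 89, 90]  # one faulty reading
print(anomalies(coolant_temp))  # -> [131], flagged for the service centre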

6.4 Edge Computing

Artificial intelligence has come to the age where it transforms embedded computing to occur at the edge. Data assets come from real-time vehicle mapping, predictive maintenance, autonomous driving, usage-based insurance (UBI), and personalized concierge services. Driving decisions and data analytics can occur at the edge to shorten the distance data must travel between the vehicle and the cloud-computing platform, which motivates deploying an edge computing device in the vehicular system. Machine learning and neural network acceleration occur at the edge because of several benefits. Processing data at the edge requires lower operating power than processing it in the cloud. Since the data need not travel to the cloud and back to the edge device, latency is reduced and security increases. The result is a more reliable system with higher bandwidth and improved performance at the edge device. The next generation of artificial neural network acceleration will be ubiquitous in SoCs. How can we achieve data acceleration at the edge? Smart cameras have the capability for multi-core processing, and the sensor also provides onboard data storage. The sensor's smart platform encrypts data and provides a secure data exchange with the cloud using secured protocols. It involves a combination of FPGA, CPU, and GPU under one processing platform or built into the smart camera. The FPGA plays the role of instant capture and computation of the raw image. The CPU can be multi-threaded for optimization and handles the decision and logic workload. The GPU performs intensive parallel computation, especially for machine vision applications; it meets the high-resolution demands of 3D vision system processing, from quality inspection to shape measurement. Having all these advantages and prominent features, we perform data analytics, decision-making, and execution at the edge rather than sending them to the cloud.



Fig. 6.3 Edge computing platform. Credit Kontron

The outcome is accelerated data processing and sensor fusion without risking proprietary data leakage, and it retains abundant network bandwidth for other network control applications. Edge analytics caters to the immediate-response analyses of the autonomous vehicle. Embedded artificial intelligence can run on the edge devices installed on the autonomous vehicle. The AI processor is capable of processing high-resolution sensor information in real-time. The real-time sensory data comes from the cameras, GPS, LiDAR, radar, range sensors, and other on-board sensors of the vehicle. A mobile computing unit designed for automobiles can run the video capture in real-time. The COM Express device is capable of video and data operation with a storage function, making it suitable for mission-critical operations that demand communication with a fast response time. Figure 6.3 shows the edge-computing platform, where the edge device is deployed in the vehicle.
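The following Python sketch captures the edge pattern discussed here: analyse each frame locally and ship only a compact summary upstream. The capture, detection, and publish functions are placeholders standing in for the FPGA, GPU, and uplink roles respectively; nothing here is a real vendor API.

import json, time

def capture_frame():                 # FPGA/camera role: raw image capture
    return [[0] * 640 for _ in range(480)]

def detect_objects(frame):           # GPU role: heavy parallel vision work
    return [{"label": "car", "confidence": 0.93}]

def publish(summary: str):           # uplink: kilobytes, not megabytes
    print("to cloud:", summary)

for _ in range(3):                   # a few iterations for illustration
    frame = capture_frame()
    detections = detect_objects(frame)
    if any(d["confidence"] > 0.9 for d in detections):
        publish(json.dumps(detections))   # raw frame never leaves the vehicle
    time.sleep(0.1)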

6.5 Car Navigation

The Robot Operating System (ROS) middleware (Fig. 6.4) performs perception, motion planning, and control. Real-time vehicle motion planning for autonomous driving in urban environments poses a challenge for developers; the motion planning methods and control techniques involved are the main drivers of the actuators. Mobileye, together with cloud computing power, is exploiting its advanced driver-assistance software to build a map of the environment in real-time. The software quality must adhere to the international vehicle standard ISO26262. Mobileye's Road Experience Management is an autonomous mapping and localization platform. The platform comprises three layers: the harvesting agent or camera-equipped vehicle, the map server or cloud, and the map-consuming agent or autonomous vehicle. Firstly, the harvesting agent collects and transmits road data about landmarks and the driver's path geometry. Secondly, the map server agent aggregates and reconciles the several road-segment data and streams them to the cloud. Lastly, the map-consuming agent localizes the autonomous vehicle on the road map from the landmarks stored in it. The deployment of vehicular mapping technology enables seamless motion planning of the vehicle path in real-time.
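Since the chapter names ROS explicitly, a minimal rospy node may help show how perception feeds control through the middleware. The topic names (/scan, /cmd_vel), the message types, and the simple stop rule are common ROS conventions assumed for illustration; they are not taken from this book.

#!/usr/bin/env python
# Minimal ROS node wiring a laser scan (perception) to a velocity
# command (control). Requires a ROS installation to run.
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

def on_scan(scan, pub):
    cmd = Twist()
    cmd.linear.x = 1.0 if min(scan.ranges) > 2.0 else 0.0  # stop when close
    pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("simple_planner")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, on_scan, callback_args=pub)
    rospy.spin()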


Fig. 6.4 Car bus system


Chapter 7

Vehicle Test Drive and Simulation

ANSYS provides a testing platform for automobiles before they drive on real ground. In the ANSYS simulation platform, we verify vehicle component simulations, sensor data tests, and integrated vehicle drives under various scenarios and weather conditions. In addition, established automobile safety standards ensure that the vehicle powertrain and its electrical architecture function smoothly on the run.

7.1 Vehicle Components Testing

Figure 7.1 shows the car layout of the electric motor, ECU, engine, battery, and inverter. The electric motor connects to the electronic control unit. As the motor is the autonomous vehicle's primary asset, automobile builders must ensure and verify its reliability. NI adopts model fidelity for FPGA-based simulation of the electric motor. One advantage of the FPGA simulation (Fig. 7.2) is the fast PWM input into its models. Moreover, car engineers can utilize tools such as PSIM, Multisim, PLEXIM, and SimPowerSystems for electrical modeling. The non-linear motor models enable power testing from 500 W to 25 MW. It is possible to test the ADAS ECU that runs over CAN, FlexRay, and Ethernet for its radar, camera, and V2X communication system. The NI hardware test platform allows hardware-in-the-loop, validation, and production testing. Figure 7.3 shows the testing of sensor fusion embedded software for both the radar and camera systems.
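As a toy stand-in for the FPGA-based motor models above, the following fixed-step Python simulation integrates a first-order DC motor model, the same fixed-time-step style used in hardware-in-the-loop testing. All parameter values are illustrative.

# Fixed-step simulation of a first-order DC motor model (illustrative).
J, b, Kt = 0.01, 0.1, 0.05       # inertia, friction, torque constant
dt = 1e-4                         # fixed time step, as in HIL simulation
omega = 0.0                       # rotor speed (rad/s)

for step in range(10000):         # 1 s of simulated time
    current = 5.0                 # commanded motor current (A), e.g. from PWM
    torque = Kt * current
    domega = (torque - b * omega) / J
    omega += domega * dt          # forward-Euler integration

print(f"steady-state speed = {omega:.2f} rad/s")  # -> Kt*I/b = 2.5 rad/s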




Fig. 7.1 Electric Motor management (Courtesy of NI)

Fig. 7.2 FPGA power electronics models (Credit NI)

Fig. 7.3 NI hardware testing platform (Credit NI)




7.2 Vehicle Simulation and Testing

Simulation is a safe and quick development tool, but the choice of engine limits the physics model and realism. We must note that fixed time steps and real-time simulation serve different purposes. In simulation testing, we have the simulated image, sensor simulation, software, processing platform, and the actuated simulated vehicle; in actual road tests, we have the sensors, software, processing platform, and vehicle actuation. Modular testing allows one to test each component individually, such as lane testing, sensor fusion, and object detection. To be consistent with real-world testing, the simulator must artificially recreate the correct sensor delays and latencies of the real world. Actuation command limits, vehicle physics, and world physics in the simulator must be the same as in the real world.

Sensor and Data Simulation

Siemens' PreScan simulation environment produces physics-based simulated raw sensor data for testing driving scenarios and traffic situations. These simulated data are fused in Mentor's DRS360 platform to create a high-resolution model of the vehicle's environment and driving conditions. The DRS360 chipset runs on a Xilinx Zynq FPGA and tests and refines the algorithms for object recognition. It takes advantage of several sensors like LiDAR, radar, and other vision-sensing technologies, so we need not test the sensor data in a realistic environment but can perform the same tests in simulation. Engineers can utilize the advanced sensor data processing for verifying and validating self-driving cars. Automotive teams need to demonstrate that the vehicle can keep to its lane besides detecting traffic signs. Rural roads, where many curves are unavoidable, are conditions the autonomous car must also face in the simulation model. A 3D simulated environment and its enhanced features allow different driving activities to be carried out; in an urban environment, scenarios of crowded road conditions and winter weather apply. With the MATLAB and Simulink automated-driving tools, we can test the virtual vehicle in the simulation model. The complex simulated environment suitable for navigating highways helps engineers acquire the skills to fine-tune algorithms and sensors to fit the real-world environment. Tuning the adaptive algorithms in response to virtual sensor data lets engineers gather experience across many different weather conditions. Deep learning and artificial intelligence technologies accompany these tools to tackle the increasing driving difficulties. With MATLAB Simscape driveline modeling, we can input the factors affecting the vehicle dynamics to simulate the driveline; the software accepts road condition, headwind, and grade inputs. The C implementation is compatible with the OPAL-RT system. We can calculate the traction force of an autonomous vehicle from Eq. (7.1). It applies to the vehicle dynamic model under simulation to test its performance, and the real-time profile provides realistic inverter power testing and battery utilization.

F = mgC_{rr} + \frac{1}{2}\rho C_{D} A v^{2} + ma + mg\sin(\theta)        (7.1)

where m represents the mass of the vehicle, C_{rr} denotes the coefficient of rolling resistance, ρ is the air density, C_{D} denotes the aerodynamic drag coefficient, A is the vehicle frontal surface area, v the velocity, a the acceleration, and θ the angle of road incline. The first term on the right-hand side defines the force required to overcome rolling resistance. The second term represents the drag force at the equivalent speed of the driving cycle. The third term is the mass inertia of the vehicle. The fourth term is the force required to propel the car on a non-zero grade.
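A numeric sketch of Eq. (7.1) in Python follows; the vehicle parameters (mass, drag coefficient, and so on) are illustrative values, not figures from the book.

import math

# Traction force per Eq. (7.1); default air density and gravity assumed.
def traction_force(m, Crr, CD, A, v, a, theta, rho=1.225, g=9.81):
    rolling = m * g * Crr                      # rolling resistance
    drag = 0.5 * rho * CD * A * v ** 2         # aerodynamic drag
    inertia = m * a                            # mass inertia
    grade = m * g * math.sin(theta)            # climbing force on an incline
    return rolling + drag + inertia + grade

# 1500 kg car at 25 m/s, mild acceleration, 2% grade -> about 1.44 kN
print(traction_force(m=1500, Crr=0.01, CD=0.3, A=2.2, v=25,
                     a=0.5, theta=math.atan(0.02)))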



Vehicle Simulation

The autonomous vehicle is a mixture of hardware, software, electronic, and mechanical components; therefore, testing it is always a difficult challenge. Manufacturing engineers innovate high-quality solutions to mitigate risks in car mobility. ANSYS [1] develops an open simulation platform especially for autonomous vehicle simulation. The platform validates the many tasks and performances of the vehicle as if it were driving a road test on actual roads. Six dominant fields exist in the ANSYS simulation construction (Fig. 7.4) for automated-driving test and development: driving scenario and systems simulation, software and algorithm development, functional safety analysis, sensor simulation, electronics and hardware simulation, and semiconductor simulation. The developed platform integrates the embedded systems with software, hardware electronics, and physics to determine the autonomous vehicular system. Under the simulation roof, we can compute the vehicle dynamics.

Fig. 7.4 ANSYS simulation construction (Courtesy of ANSYS)



Control algorithms decide and generate the actuator inputs to control the vehicle. The sensing and processing models identify the surroundings for recognition. The platform includes a driving scenario model, which animates the vehicle motion together with other objects in the test drive. Engineers can experience high-fidelity data analytics, vehicle dynamics, physics studies, embedded software code development, connectivity optimization, diverse sensor models, functional safety analysis, and world scenarios under one roof. The platform simulates the environment through these ecosystems to verify the functional safety requirements. The automated functional safety analysis allows engineers to evaluate failures and design a suitable system that mitigates component failures, saving costs and time in the race toward autonomous driving in the automobile industry. ANSYS offers a variety of SCADE solutions, which integrate easily with machine learning and neural network software. ANSYS SCADE Suite applies model-based development to critical embedded software. SCADE Architect is applicable to systems architecture design, where the SCADE system develops the software architecture. ANSYS Medini Analyze is introduced for functional safety analysis; its cybersecurity tool identifies and addresses cyber threats with a fast, efficient, and accurate approach. Altogether, these tools support a safety-critical mission in compliance with ISO26262, IEC61508, and ARP4761 to develop an automotive solution. IEC61508 defines the functional safety standard for electrical, electronic, and programmable electronic safety-related systems. The model-based construction not only has optimum signal integrity but also addresses structural and thermal issues as well as electromagnetic reliability. The development tools come complete with functional Simulink models that convert into the SCADE model via a gateway. Scenarios based on the software requirements are run to perform testing. Besides hardware-in-the-loop simulation, the platform also conducts radar, signal processing, and semiconductor testing. The virtual simulation model helps to test road scenarios with human-machine interface software for control and communication with the automobile. In this fast-performing driving analysis, it is easier to locate faults as they would occur in actual road driving conditions. For the complete simulation of this sophisticated machine, the platform also caters for ADAS simulation; ADAS utilization has enriching functions to assist the vehicle in autonomous driving. The vehicle simulation covers the car sensors from perception all the way to path planning, executing the vehicle's motion to drive safely. We can capture the dynamics of the virtual vehicle for analysis and study. The automatic code generator from the ANSYS model-based design software meets the most stringent safety requirements of its embedded software applications. Code generation completes after verification runs on the functional model, and the source code is in terms employed in the real-world automotive system. VRXPERIENCE is an autonomous vehicle simulator that helps to perform sensor testing in closed-loop simulation. We can conduct camera sensor simulation in a virtual driving scenario under various road and weather conditions. Figure 7.5 presents the ANSYS virtual simulation architecture, and Fig. 7.6 shows the virtual vehicle models.
The Drive Scenario Model creates a 3D model of the world virtually, with the animated car, objects, and surroundings; the motion simulation of the test drive runs in the time domain.



Fig. 7.5 Virtual simulation architecture

Fig. 7.6 Virtual vehicle models

The Vehicle Dynamics Model computes the velocity, position, and orientation of the test car and produces the mechanical model of the vehicle with a sub-model for vehicle attributes. The virtual platform creates the multi-physics model of the vehicle components for simulation. The Vehicle Component Model simulates the response of the brakes and steering of the vehicle under test to the actuator input. Signal Processing and Sensor Fusion identify the driving conditions and objects from sensor data. Sensor Models such as LiDAR, radar, V2X, PMD, GPS, ultrasonic sensors, and cameras sense the ambient surroundings of the vehicle for the sensing simulation in the virtual world.



Control Algorithms and HMI make critical decisions and display their information. The software, code, and algorithms all adhere to the ISO26262 functional safety standard of the vehicle.

7.2.1 Automated Driving Attestments

With more advanced and highly automated vehicles soon to be on highways and streets, cybersecurity to safeguard the assets of the car is vital; it serves to protect the occupants of the autonomous automobile. The ISO26262 safety standard imposes many different requirements, which keep adapting to the ever-changing technology of the systems that go along with the car. One example of such compliance concerns the vehicle powertrain or its electrical architecture. Real-road automation takes place after vehicle simulation has been conducted in the laboratory; it is the final test run after confirming that every component and the integration of the vehicle system function accordingly without any failures. In 2016, self-driving car tests in actual road environments failed on average once every 3 h of automated driving. These failures forced the driver to take over the wheel to regain control of the car; otherwise, letting it self-drive would have been catastrophic. Car compliance tests are insufficient to show that automatic car driving is safe. We will discuss more in Chap. 14 under the vehicular compliances and expectations section.

Reference

1. https://www.ansys.com/blog/autonomous-cars-adas/?utm_source=open_systems_media&utm_medium=enewsletter&utm_campaign=18-sys&utm_content=1290&utm_term=driving_towards

Chapter 8

Advanced Chip Driven Electronics

Semiconductors like the Schottky barrier diode (SBD) and the SiC MOSFET are popular in meeting the demands of the automotive industry. Thousands and millions of these semiconductors embedded in the automotive electronic chip drive the functions of the IC. Advanced electronics hardware is one of the most critical components of an autonomous vehicle. It supports communication, image and data capture, system control, artificial intelligence, and mobility. Thus, the electronics must be able to overcome not only vibrational and mechanical shock but also electrical and thermal overstress. Renesas Electronics introduces the automotive video decoder IC TW9992. The chip takes in the rear camera image through single-ended and differential composite video input signals. Automatic contrast adjusts the image brightness and contrast visibility. The digital video output signal flows from the MIPI-CSI2 interface to the head unit's system on chip (SoC), which processes the video signal for the rearview mirror or the LCD screen of the dashboard. Unfortunately, the core processor manages infotainment as well; often, the SoC cannot provide timely rear camera images while booting up, and it may freeze up during operation. One solution is a highly integrated video processor with an additional video decoder, LCD controller IC, and scaler: during start-up, the video processor bypasses the SoC to display the rearview video. Advanced chip-driven electronics manipulate automation, and the upcoming developments bring self-driving vehicles a step closer to reality. With the press of a button in the driver's smartphone app or a control activation inside the car, the automobile can easily be commanded to park itself automatically.

Improvements

German chipmaker Infineon has developed an IGBT power module (Fig. 8.1) for optimized cooling. It can drive 700 A with reduced heat conduction and switching losses. The IGBT is a double-side molded structure cooler module optimized for the automotive inverter. It offers double-sided cooling, over-molded modules, and even the removal of package layers to optimize thermal management. Wide-bandgap silicon carbide (SiC) power electronic devices can meet the range and charge-time performance demands of electric vehicles.




Fig. 8.1 IGBT module (Credit Infineon Technologies)

While SiC has a bandgap (Eg) of 3.3 eV, gallium nitride (GaN) offers a slightly wider Eg of 3.4 eV. Both SiC- and GaN-based power modules serve the DC/DC converter and the drive-inverter charger in an electric vehicle. SiC handles a higher current in a smaller chip volume because of its higher thermal conductivity, and it has a breakdown electric field of 2.8 MV/cm. Hence, a SiC-based power inverter can operate at a higher temperature with a fast switching rate, reduced gate charge, and lower losses. That is the reason for the investment in developing high-voltage SiC devices rated at about 10 kV. However, SiC has a primary challenge of high electromagnetic interference due to its fast switching transients. GaN devices and the super-compact silicon carbide (SiC) inverter have the advantage of space reduction, miniaturizing the power electronics in electric vehicles. The hybrid power combination of a pair of SiC Schottky barrier diodes and a pair of silicon IGBTs promises a reduction of 30% in volume and 2 kg in weight. This combination contributes to the primary theme of improving energy and fuel efficiency, and its commercialization holds benefits for both electric and hybrid electric vehicles. GaN FETs are an excellent choice for high-voltage, high-frequency power FETs. They can switch more than 1 kV at megahertz frequencies and suit automobile applications in high-power on-off switches and ultrahigh-frequency switching power. With the integration of low and high voltage under the various switching frequencies of the same package, chip isolation enhances safety by ensuring that no electrocution occurs in the electric vehicle.

8.1 Automotive Electronics

Figure 8.2 classifies several automotive parts according to their usage and application. Vehicle parts include infotainment such as multimedia, dashboard instruments, and navigation. Automotive powering parts are the HVAC, EMS, transmission, ADAS, electronic power steering, body control module (BCM), CAN bus, and lighting. Subcomponents streamline to radars, cameras, displays, switchers, DSPs, MCUs, and FPGA supplies. Automotive electronic components play a part in automotive networking, linear voltage regulators, DC/DC converters, and sensors. Automotive parts manufacturers design automotive electronics to suit the conditions of highly critical automotive operations. One example is the electronic power steering.



Fig. 8.2 Automotive ARC processor application (Credit Synopsys)

Most of its electronics have low standby current with high accuracy and stability in a robust environment. In vehicular automation, automotive electronics applies to the sensory systems in powertrain management. Sensors are deployed in clutches and brakes, engines, and automated transmissions. Today, the value of sensors has increased to provide emissions reduction, powertrain improvement, and passenger comfort. In-vehicle sensors must be able to withstand vibration and shock, variations in temperature, and harsh environments.

(1) HVAC: The HVAC system electronic chips are the TLE7257SJ for networking, the TLE8366EV for the switchers, and the TLS105B0 for the linear regulator.

(2) Powertrain: The TLE4473G V53 is the voltage regulator recommended for the power steering of the electric vehicle. The TLS115B0 is a sensitive tracker for sensors applicable to the power steering of the powertrain system. We can apply the tiny chip TLE9222PX in the electronic power steering network, while the TLE7368-2E serves as the switcher of the power steering for its reliability. The cooling of the SMD packages improves thermal management in the vehicle. STMicroelectronics has created SiC MOSFETs that can withstand junction temperatures as high as 200 °C; they boost efficiency and solve the heat dissipation problem by providing more cooling, thereby increasing reliability and durability. Mitsubishi Electric, meanwhile, has improved heat dissipation by connecting the heat sink and the power semiconductor modules with solder.

(3) BCM: Network ICs apply to network communications and CAN programming. BCM automotive network transceiver ICs achieve a higher communication rate; they offer a flexible payload for greater efficiency and robustness with increased data-rate. BCM components for switchers are the TLF50281EL and TLE83862EL, which offer 0.5 A of switching current. Besides, the TLS835D2 offers low quiescent current and fast regulation, which suits the BCM linear voltage regulatory system.

(4) Dashboard: The voltage regulators qualified for the dashboard are the TLS202B1 and TLS835D2, for CAN, MCU, etc., which enable power savings. The TLF51801ELV for dashboard switchers supplies a high current of up to 10 A. The suitable component categories for dashboard network ICs offer flexibility of up to 2 Mbit/s bandwidth. Figure 8.3 shows a variety of automotive ICs for different utilizations; the leftmost four chips are for the BCM, and the rightmost four chips apply to dashboard functions.

Fig. 8.3 Automotive ICs (Credit Infineon Technologies)

As can be seen, autonomous driving utilizes a mixture of analog and digital ICs for implementation. Their usage spans from ADAS to night vision in the automated computerized car systems. Hardware such as FPGAs and GPUs can operate at a very low voltage of 1 V at up to 10 A. However, the automotive circuit engineer has to make the electronics comply with the ADAS standard for noise immunity. At present, switching regulators are steadily replacing linear regulators for reasons of low heat dissipation and higher efficiency in the chips and automotive converters. We must attenuate and reduce both the conducted and the radiated emissions from EMI. One approach is to utilize magnetic and metallic shields against emissions radiated to space, while emissions conducted on PCB traces are tamed by adding filters and ferrite beads. All these measures aim to attenuate the EMI to levels acceptable to the regulatory bodies. An example is the LT8645S, a high-input-voltage-capable monolithic synchronous buck converter from Analog Devices.



8.2 Memory Safety

Failures happen quite often inside memory devices. A system power failure or noise spike may occur during operation; incorrect data reads and writes or charge loss may damage the flash cells, creating system and component faults. In the memory IC, there may be latent reliability failures. These may cause catastrophic failures in the automotive drive controlling the vehicle. Cypress is a leader in fail-safe commodity memories. Its EnduraFlex architecture allows NOR flash memory to achieve long retention and high cycling endurance for code and data storage. It enables the partitions to have 25 years of data retention and 1.28 million P/E cycles for 512 MB. A one-time-programmable configuration can assign each region to either high-endurance or long-retention partitioning, where each region consists of several sectors, up to a maximum of five regions. The safety-critical functions in NOR flash inherit different diagnostics for the embedded data: the memory system provides a cyclic redundancy check for memory error detection, an error correction code, and safe boot and reset operation for secure data recovery and error reporting. Furthermore, the prevention of inadvertent writes to the device enhances data integrity, establishing a safe memory interconnect to the microcontroller unit.
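The cyclic redundancy check mentioned above can be illustrated in a few lines of Python: store a CRC with the data and recompute it on every read. This is a toy model of the principle, not the Cypress implementation.

import zlib

def write_block(data: bytes):
    return data, zlib.crc32(data)          # payload plus stored CRC

def read_block(data: bytes, stored_crc: int) -> bytes:
    if zlib.crc32(data) != stored_crc:     # recompute and compare
        raise IOError("memory error detected: CRC mismatch")
    return data

block, crc = write_block(b"calibration table v3")
corrupted = b"calibrat1on table v3"        # a single flipped character
read_block(block, crc)                     # passes silently
try:
    read_block(corrupted, crc)             # detected on read-back
except IOError as err:
    print(err)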

8.3 Embedded Chip Security

The Trusted Platform Module (TPM) SLI 9670 automotive security chip from OPTIGA is an unusual piece of automotive embedded electronics: a secured, tamper-resistant microcontroller with advanced hardware security technology. The firmware is loaded with a rich set of security functions. Authentication, encryption and decryption, secured logging time, key management, and signature functions are the various embedded key features. The proven security solution is qualified to the AEC-Q100 standard, making its way into ECUs, gateways, telematics, and other multimedia applications. It provides tight security and protection for remote car access, car sharing, over-the-air updates, infotainment, and fleet management (Fig. 8.4).

Fig. 8.4 Automotive embedded security (Credit Infineon Technologies)



Fig. 8.5 AI computing board

8.4 Embedded Computing Board

The market already offers an embedded computing board capable of edge computation for artificial intelligence in self-driving car applications. The Real AI Processor (RAP) inference chip supports long short-term memory networks, recurrent neural networks, convolutional neural networks, and reinforcement-learning neural networks. It includes adders, multipliers, arithmetic and logic units, and signal processing functions for decision-making performance. From AlphaICs, the board delivers 60 tera operations per second (TOPS) at 40 W. AI frameworks such as Caffe2, TensorFlow, PyTorch, and ONNX are the supporting tools to use with the board. Driverless car applications can utilize the processing power of the board for future road transportation (Fig. 8.5).
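As one of the framework paths listed above, a model trained in PyTorch can be exported to ONNX for consumption by an inference accelerator. The tiny network and the file name in this sketch are illustrative, not a model from the book.

import torch
import torch.nn as nn

# Toy perception network; shapes chosen so the export is self-contained.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(), nn.Flatten(),
                      nn.Linear(8 * 62 * 62, 10))
model.eval()
dummy = torch.randn(1, 3, 64, 64)          # one RGB frame
torch.onnx.export(model, dummy, "perception.onnx", opset_version=13)
print("exported perception.onnx")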

Chapter 9

Sensors and Vision System Fusions

We can divide car sensing into two parts: exteroceptive sensors and proprioceptive sensors. The former sense the car's surroundings, whereas the latter detect the interior of the car; examples of the latter are wheel speed, inertial measurement, and driver attentiveness. Modern vision sensing capability adds an eye to the autonomous driving of the car. The developing 3D technology combines optical, imaging, and software technology to provide high-quality 2D and 3D visualizations. Best-in-class optical systems with tilt and picture-generating unit (PGU) technologies display innovations that include selective dimming, curved surfaces, enhanced optics management, center displays with HS haptics, and a new interior rearview e-mirror that enables switching from mirror mode to display mode. Augmented reality provides a digital layer of additional information. The instrument cluster is installed with an integrated infrared driver-monitoring camera designed for facial recognition, head tracking, and eye-gaze tracking.

9.1 Vision Hardware

Cameras: Today's cameras are cheaper and better in terms of dynamic range and resolution. The camera leads the way in object classification and texture interpretation, and we can extract a rich source of raw data from it. Automakers fit the cameras for headlight detection and lane departure warning (LDW). However, cameras generate a massive amount of data and also rely on clear visibility. Several types of cameras find their capabilities and usefulness in self-driving vehicles: stereo cameras, monocular cameras, imaging cameras, smart cameras, RGB cameras, infrared cameras, and depth-sensing cameras.

Sensors: An additional form of data provides a higher level of safety to pedestrians on the streets and drivers on the road. Besides the vehicle LED headlights, we include further sensors.




Fig. 9.1 Automobile vision system (Credit Bosch)

These sensors (Fig. 9.1), such as radars, LiDAR, infrared, ultrasonic, and a laser scanner, resolve the scene complexity for fast vehicle responses and control. Ultrasonic sensors are useful only at short range. Radar, however, detects objects at longer distances; although it can withstand bad weather conditions, it has low resolution. On the other hand, light detection and ranging (LiDAR) has higher resolution but loses sight in heavy rain and snow. LiDAR can detect objects up to a distance of three football fields away, although short-range LiDAR is also available in the market. It transmits millions of pulses per second over a 360° area and calculates the distance from the time a pulse takes to reflect off an object. Radar detects the distance and size of an object, and it serves two key technologies in an autonomous vehicle: the automatic emergency braking system (AEBS) and adaptive cruise control (ACC). In comparison, cameras detect what the objects are. The camera and radar sensors sense blind spots and serve the automatic braking system. Combining automatic emergency braking (AEB) with lane departure warning (LDW) can provide a collision-free system. Combining the sensor information reflects the scene accurately and outputs more information than any single sensor provides. The sensors are capable of creating safe surroundings, as shown in Fig. 9.2. In this way, we can include robustness and redundancy in the vision system of the autonomous car. The sensor measurements in vehicles are now capable of regulating the fuel injection and the engine timing, predicting the travelling route and traffic jams, avoiding impending collisions, and, at the same time, identifying objects and pedestrians. The deployment of the vehicle sensors will generate at least four TB of data every day. There are four elements of sensing in the road path: path driving, scene semantics, moving objects, and free space; the autonomous vehicle must possess the ability to demonstrate all these sensing capabilities. Figure 9.3 shows the safety sensor suite for self-driving vehicles. An example of car sensing is the A8 model, which functions with eye-viewing features. It encompasses several smart cameras in combination with LiDAR and radar sensors. The A8 model benefits from a 360° safety view, which features a front infrared camera and a monocular camera at the top of the windshield.


Fig. 9.2 Automobile ambient sensing (Courtesy of Maxim Integrated)

Fig. 9.3 Self-driving vehicle safety sensor suite (Credit Uber)




Four 360° cameras are mounted at the front, the rear, and the two wing mirrors. The car also includes a long-range radar at the front, four mid-range radar sensors at the corners, and twelve ultrasonic sensors around the vehicle perimeter. A further single laser scanner at the front sweeps a 145° field of view within milliseconds, registering obstacles up to 262 feet away. The front cameras detect bumps in the road and predictively adjust the suspension system. This active suspension minimizes rolling movement when cornering and pitching movements during braking or accelerating.

9.2 Image Processing

The monocular cameras or vision sensors capture images in sequence for processing. Vision processing estimates the camera position and pose. 3D reconstruction builds the final image from the output of the video processing to find the height, width, and distance of a stationary object, whereas pattern recognition and motion analysis techniques are applied to detect moving objects. Engineers can process the images through centralized, decentralized, or hybrid sensor fusion processing. However, the raw data from the smart sensors increase the complexity of sensor fusion. Information from the cameras must go through image acquisition, image pre-processing, feature extraction, segmentation, and object analysis, and finally an expert system decides the course of action. Under the IEEE-SA P2020 standard (Fig. 9.4), one must consider fulfilling several imaging standards for safety: IQ requirements and specifications, LED flicker standards, image quality for viewing and computer vision, image quality safety, the camera subsystem interface, and customer perception of image quality.

Fig. 9.4 IEEE-SA P2020—automotive image quality standard (Courtesy of Algolux)
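A minimal OpenCV sketch of the processing chain described in this section, from acquisition through pre-processing, feature extraction, and object analysis, may make the stages concrete. The synthetic frame and the deliberately simple stages are illustrative only, not a production pipeline.

import cv2
import numpy as np

# Synthetic frame stands in for a camera grab (acquisition).
frame = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.rectangle(frame, (100, 80), (180, 160), (255, 255, 255), -1)  # "object"

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)        # pre-processing
blurred = cv2.GaussianBlur(gray, (5, 5), 0)
edges = cv2.Canny(blurred, 50, 150)                   # feature extraction
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
for c in contours:                                    # object analysis
    x, y, w, h = cv2.boundingRect(c)
    print(f"object at ({x},{y}), size {w}x{h}")       # feeds the expert system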



9.3 Vision Chip

The vision chip, an SoC, plays an important role in the 360° surround view of an automated vehicle. It takes in images from multiple smart CMOS cameras to compute the image processing algorithms. As a result, the car can recognize pedestrians and detect lane departures and other oncoming obstacles. Its output commands the various automotive control devices such as the steering wheel, the brakes, and the accelerator. We can achieve these controls with the power of the MCU from Renesas. The chip features parallel image processing and rendering to replace the image signal processor of each camera. The automotive automated technology includes the automated parking system and advanced driver assistance technology. Renesas Electronics Corporation has released the R-Car V3H system-on-chip processor. The SoC processes artificial intelligence and computer vision applications for level 3 and level 4 autonomy of automated vehicles. It is capable of performing unconditional emergency braking, highway and traffic jam piloting, and remote parking. Its dual image signal processor runs Dense Stereo Disparity, Dense Optical Flow, and Object Classification algorithms. Moreover, the SoC also contains a convolutional neural network block to accelerate deep learning computations. With its IMP-X5+ image recognition engine, it handles stereo front camera optimization, boosting its performance five times at only 0.3 W. The low-power system on chip is capable of handling the cognitive tasks to drive and park the car autonomously with the aid of its 3D surround view. OmniVision Technologies has teamed with Smart Eye to integrate hardware sensors with a software library. The former introduces its latest two-megapixel image sensor; the latter uses deep learning and artificial intelligence in its algorithm for driver monitoring with augmented reality displays. The image sensor opens a wide viewing area for precise and clear image tracking (Fig. 9.5). The smart sensors [1] have no external pins at all. Another highlight of the sensor is that it can perform its operations all by itself, including sourcing its own energy for computation and conducting wireless communications. The energy buffer boosts the supply from microamperes to milliamperes, or the equivalent in voltage (Fig. 9.6).

Fig. 9.5 Vision chips: a 2-Mpixel image sensor (Credit OmniVision Technologies); b R-Car V3H SoC (Courtesy of Renesas)



Fig. 9.6 Smart sensor (Courtesy of Octavo Systems)

9.4 Embedded Vision Autoliv’s 77 GHz radar system sees oncoming vehicles and autonomously manages lane changes during highway driving. Cameras, radar, and range sensors combine to have a 180° angular view in the front and rear of the car. The front and rear radars provide different functions for the vehicle. The front corner radars support object and free space detections, while the rear corner radars contribute to object detection, blind-spot warning, and back cross-traffic alert. The features assist in the autonomous functions of the vehicle for a collision-free system. The next-generation embedded vision computes in the edge and cloud. It requires processors, algorithms, and tools. The learning algorithms for autonomous vision processing need to run in a secure partition to minimize interference, disruption, and latency in controlling the autonomous vehicle. As a result, the machine vision platform, NXP S32V, runs on LynxSecure 6.0, a real-time separation kernel hypervisor to achieve security. Figure 9.7 illustrates the embedded vision processing architecture. The EV6x embedded vision processor runs on the safety enhancement environment to promote safety. With safety securely tagged in the embedded vision processor, automakers can then take a step further to introduce the Advanced Driver Assistance Systems (ADAS) with deep machine learning techniques to drive the full capability of the autonomous driving systems. Figure 9.8 shows the ADAS features employed by embedded vision.

9.5 Sensors Fusion

In the new vehicular frontier, camera and vision fusion form the primary technology for navigation in the autonomous driving system. An extended view of the spatial and temporal ambient surroundings becomes available for vehicle execution. Most important of all, fusion can detect, interpret, and recognize objects, pedestrians, and traffic signs more easily to guide the autonomous vehicle to drive more safely. Sensor fusion creates a better image that gives complete detail of the environment.



Fig. 9.7 Embedded vision processing architecture (Credit Synopsys)

The sensors do not depend on each other individually but act in combination. Sensor fusion is not difficult to achieve, because the sensors contribute complementary data to each other through the fusion; the only disadvantage arises under conditions where the cameras have low visibility. Sensor fusion not only performs simultaneous localization and mapping (SLAM) of the car but also provides target recognition to track objects of interest. It requires a combination of multiple surround cameras to build up ambient visibility. Furthermore, it has other advantages besides lowering the overall power consumption of sensor processing: it ensures a high degree of operation in uncertain environments and weather such as heavy rain, smoke, dust, mist, and fog. Figure 9.9 shows the sensor fusion for the perception and planning of the automobile. The variety of sensors deployed includes the ultrasonic sensor, the radar, the LiDAR, the proximity sensor, and the cameras. However, each sensor has its capabilities and shortcomings; we need to integrate them to collect the best outcome for all possible car situations in real-time. This harnesses the time-critical safety aspects and concerns of the vehicle and its surroundings. An example is the reduction of noise by combining two overlapping camera images. Co-operative sensor fusion operates as a co-operative sensor network. A competitive fusion combination increases the robustness of the perception and is fault-tolerant, while co-operative and complementary fusion provides extended and more complete views. A competitive configuration collects independent measurements of the same target



Fig. 9.8 Embedded vision applications (Courtesy of Avnet)

Fig. 9.9 Perception and planning

from each sensor. There are two possible kinds of competitive data combination: the fusion of data from different sensors, or the fusion of measurements from a single sensor taken at different instants. Co-operative sensor fusion provides a full image of the environment with its redundant information but has the disadvantage of decreased accuracy and reliability. Stereoscopic vision is an example of a co-operative sensor fusion technology.



Resources like processing power, application requirements, and memory are the criteria that determine the type of algorithm for the fusion. Altogether, smart sensor fusion applies to object classification, SLAM, and target tracking.
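A classic instance of competitive fusion is combining two independent range measurements of the same target, weighted by the sensors' variances (the standard minimum-variance rule). The radar and LiDAR figures in this Python sketch are illustrative.

# Inverse-variance weighting of two independent measurements.
def fuse(z1, var1, z2, var2):
    w1, w2 = 1 / var1, 1 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1 / (w1 + w2)        # always below either input variance
    return fused, fused_var

radar_range, radar_var = 25.4, 0.9   # radar: long range, coarse
lidar_range, lidar_var = 24.8, 0.1   # LiDAR: fine resolution
print(fuse(radar_range, radar_var, lidar_range, lidar_var))
# -> estimate near the LiDAR value, with reduced uncertainty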

Reference

1. https://www.embedded-computing.com/guest-blogs/what-are-the-components-of-a-smart-sensor#

Chapter 10

V2V and V2X Communications

Automobiles provide wireless or Wi-Fi-based communication links to other vehicles or infrastructure. The wireless system adds information such as speed, direction, and position to other oncoming vehicles as a warning or highlight; it helps to alert the driver in unforeseen circumstances. Especially in non-line-of-sight and long-range sensing, it can outperform smart cameras and radars. The vehicle-to-everything (V2X) technology consists of two types of wireless communication units: onboard and roadside. The former carries the V2X system within the vehicle and displays information on vehicular resources for the benefit of energy-saving. The latter provides information exchange with the road infrastructure. In other words, these form the vehicle-to-vehicle (V2V), vehicle-to-pedestrian (V2P), and vehicle-to-infrastructure (V2I) systems. V2X technology manifests in fleet management, green-light traffic optimization, and emergency braking. The information and data flowing to devices capture important details and aspects of the vehicle, locating it and highlighting situations for traffic management and warnings of vehicle status. The NXP SAF5400 [1] is a single-chip standalone modem for V2X communication.

10.1 Driving Vehicle Connectivity

For smart and safe applications, vehicle connectivity, known as vehicle-to-everything (V2X), provides a connection to devices, infrastructure, and other vehicles. The idea of V2X is to alleviate traffic congestion and improve road safety. Moreover, vehicle-to-X communication gateways, smart wireless charging, and intelligent security prevent vulnerability attacks. A V2X communication data packet is generated every 20 ms to 100 ms. Emergency braking, hazardous road conditions, and the radio channel load, which depends on the signal power sensed while monitoring the radio channel, are the situations that generate the data contents. Linking vehicles with city infrastructure (V2I) helps to improve the driver's awareness.




This enables the driver to avoid heavy traffic areas and mitigate traffic accidents. Alerting drivers to upcoming bus stops and any pedestrian traffic reduces the speed of the car and improves safety as well as pedestrian awareness. Providing drivers with information about the traffic ahead can help to reduce traffic jams by prompting route changes. The vehicle's time and position feed vehicle warnings in high-risk areas. All these advantages help to smooth traffic flows and enhance the safety of pedestrians. We cannot achieve this vast amount of benefits without the car connectivity network: eCall, Wi-Fi, 802.11p, DSRC, and the coming 5G technique all have a role to play in the mobile communication system. The dedicated short-range communication (DSRC) system allows neighboring vehicles to share data on their speed, location, brake status, and environment. It utilizes radio frequency communication to perform detection within a distance of a quarter-mile. The system exercises its advantages in dynamic cruise control interaction with other cars for driving efficiency and comfort. A vehicle with DSRC installed will have the braking status and speed information of the vehicle in front; it can identify a car braking hard ahead for much more resilient driving, and it avoids collisions. The vehicle clouds link to the internet cloud through cellular base stations [2] (Fig. 10.1). Seamlessly, we can share the information among them through the edge cloud; therefore, there is a communication link between each of the vehicles in different vehicle clouds. The edge cloud version provides faster communication for traffic and vehicle awareness functions. The computation and distribution of data are faster in the edge cloud than in the internet cloud because of the shorter data flow distance.

Fig. 10.1 Vehicle connectivity (Courtesy of UCLA)



The edge cloud is therefore the preferred choice for consumers.

10.2 Vehicular IoT

The IoT solution collects data from the telemetry for analysis. The system can detect vehicle fuel consumption from its emission rate, and it monitors tyre pressure, where there is a possibility to improve gas mileage. As such, the instant telematics data enables a proactive and pre-emptive maintenance program. A smart SIM offers the ability to switch seamlessly between cellular providers and networks, accounting for changes in network types and providers. Often, software fixes and application updates maintain the life cycle of the fleet program; with this scalable module, over-the-air upgrades are possible for any new configuration. While the traditional subscriber identity module (SIM) card is widely applicable in handsets, it is a poor match for the IoT supply chain and creates logistic issues when connected to IoT devices. The card may need servicing, yet its location, for instance at height, and its environment may make it hard to access. Reports have shown that the eUICC is accelerating along the path of IoT opportunity in the cellular market. The eUICC is a SIM in the form of a card or an embedded IC incorporated into the IoT device. The chip embedded inside the IoT device is not easy to remove; this is an advantage in terms of small form factor and because it cannot easily be physically destroyed. Besides its employment in car telematics, its usage spans the stock-keeping units (SKU) of production lines. Vehicles are traceable through the wireless connection once we switch on the eUICCs of the devices. In-car cleverness installed with telematics benefits drivers by connecting to support sources for assistance using the eUICC. Two groups in the cellular markets drive the IoT technology: the unlicensed and the licensed (cellular-based) spectrum. Presently, IoT technology spans mainly the NB-IoT and LTE-M efforts, which suit the low-bandwidth, low-power requirements of IoT LPWA, the low-power wide-area network. Endless identities, such as ankle bracelets, livestock, bicycles, and many other mobile assets, are trackable for location information. The DynaGATE 10-06 is an IP67, heavy-duty IoT gateway for automotive applications. It features internal batteries that can sustain minutes of uninterrupted operation during a power failure. It runs on the NXP i.MX 6UltraLite Cortex-A7 processor, with 512 MB RAM and 4 GB eMMC. The DynaGATE 10-06 connectivity capabilities range from an internal LTE Cat 1 modem with dual Micro-SIM support, Wi-Fi, and Bluetooth Low Energy to a dedicated GPS with optional dead reckoning and two Fast Ethernet ports on rugged M12 connectors. The DynaGATE 10-12 and the DynaGATE 10-06 come with an Oracle Java SE Embedded 8 Virtual Machine and Everyware Software Framework, a commercial, enterprise version of Eclipse Kura, the Java/OSGi edge-computing platform for IoT gateways, which adds advanced security, diagnostics, provisioning, remote access, and full integration with Everyware Cloud. Both systems are carrier pre-certified,



with integrated LTE Cat 1 cellular, GPS, Wi-Fi, BLE, E-Mark, and SAE/J1455 certifications, operating at environmental temperatures ranging from −40 to +85 °C.

10.3 5G

Vehicle connectivity technology via mobile apps is spreading widely in the market. Today's 3G technologies, such as UMTS, HSPA, and HSPA+, and the LTE of 4G are employed extensively in V2V and V2X communication. However, 5G technology supersedes the present 4G technology, and employing 5G in automobiles has many benefits. With the increasing data bandwidth for vehicular sensing, 5G will expedite the flow of information to and from the cloud platform and the vehicle network for sharing. In this way, 5G technology can increase the effectiveness of ADAS functionality. Only the next-generation 5G network will keep the autonomous vehicle ahead in information dissemination and workflow efficiency. Vehicles can give power back to homes during unforeseen home emergencies that cut off the electricity supply; this is vehicle-to-home, or V2H, technology. Besides, electric cars can return electricity to the grid infrastructure, known as V2G or vehicle-to-grid technology. The vehicle-to-vehicle (V2V) technique lets vehicles alert one another to each other's presence on the road. For pedestrian safety, the car can communicate with pedestrians to alert them to oncoming vehicles (V2P). The automobile's vehicle-to-infrastructure (V2I) communication signal senses roadside infrastructure such as traffic lights and traffic congestion. An example of V2D, the vehicle-to-device communication link, is transmitting information to a cyclist's smartphone. With vehicle-to-everything commercialization, the 5G C-V2X technology will accelerate the automobile market with its promises. It is a secure, low-latency, high-data-rate communication system that guarantees packet delivery, a shared communication infrastructure, and an established ecosystem; it also achieves a V2X solution that enhances safety for the vehicle and its surroundings. Optimizing the 28 GHz gallium nitride on silicon carbide (GaN-on-SiC) front-end modules (FEMs) for fixed wireless base stations yields a smaller, cheaper, more powerful, and more efficient millimeter-wave phased array. The 0.15 µ GaN-on-SiC process can deliver an effective isotropic radiated power (EIRP) of 65 dBm. The QPF4001 and dual-channel QPF4002 models of the GaN FEMs integrate up to two MMICs; each monolithic microwave integrated circuit consists of a three-stage low-noise amplifier, a three-stage power amplifier, and a low-loss transmit/receive (T/R) switch. The 5G technology also finds usage in vehicular localization. As observed, 5G is rising in popularity for localization where radio frequency signals decline, because it establishes a line-of-sight (LOS) view for detection. Moreover, GPS jitter or a jammer can stop the vehicle from being located or interfere with the communication system. In contrast, 5G gathers all information external to the vehicle to localize, maneuver, and secure its route to its travel destination.



The equipment inside the vehicle, like the RSU and the OBUs, supports the same wireless radio protocols. The detected raw data are formatted into C-ITS messages before transmission out of the car. Information from GPS, LiDAR, inertial sensors, radar, etc. is sent using MQTT to the LDM servers through the 5G network. The QoS, or system reliability application, of the vehicle tracks the bitrate and latency of the transmitted data. The Sierra Wireless AirLink MP70 [3] of Martti connects to the 5G wireless network, and the data from the sensors are stored in the server.
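A minimal sketch of publishing sensor readings over MQTT, as in the setup above, might look as follows. The broker address, topic, and payload fields are assumptions for illustration, and the snippet assumes the paho-mqtt 1.x client API.

import json
import time
import paho.mqtt.client as mqtt     # assumes paho-mqtt 1.x

client = mqtt.Client()
client.connect("ldm.example.com", 1883)   # hypothetical LDM broker
client.loop_start()

reading = {
    "ts": time.time(),
    "gps": {"lat": 60.17, "lon": 24.94},
    "speed_mps": 13.9,
}
# QoS 1 asks the broker to acknowledge delivery, trading latency for reliability
client.publish("vehicle/42/sensors", json.dumps(reading), qos=1)
client.loop_stop()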

10.4 Internet of Vehicles

Vehicular communication depends on the latest wireless communication technology (ITS-G5 and LTE support up to level 2) or the future 5G technology to enable V2V (vehicle-to-vehicle) and V2I (vehicle-to-infrastructure) communication; 5G communication capability supports level 4 autonomous driving. DSRC enhances V2V and V2I applications on the roadway. The V2V dedicated short-range communication (DSRC) provides a low-cost, low-latency means to connect cars. V2V utilizes DSRC to communicate with other vehicles to exchange information such as a vehicle's speed, heading, and braking status. The internet of vehicles (IoV) has low latency and high reliability for road safety. The vehicular cloud stores the incoming data and processes the cooperative data for fleet coordination, together with the pedestrians crossing the streets. The to-and-fro flow of data between the cloud and the vehicles handles V2V communication. The information processed in the vehicle cloud addresses the geographical location to provide traffic awareness beyond the line of sight. Drivers can access this abundance of traffic information in advance to predict traffic situations and adapt to current traffic conditions. The V2V effort improves traffic flow by informing nearby vehicles of incoming traffic to avoid collisions. Thus, the highway traffic patterns analyzed from the collected data provide a better vehicular navigating opportunity. It executes constant detection and regulates neighboring vehicle speeds to target zero accidents. After discussing the many advantages of IoV, we need to look at its complexities. IoV depends on internal car sensing besides relying on its external sensors. The vehicle alarm reports and beacons, the driver and automobile states from infotainment messages, the cockpit and automotive sensors, and the actuators all contribute to IoV. The brakes, accelerator, steering wheel, driver's state of health, voice tone, alertness, heart-monitor seat, etc. supply a large amount of information for processing in the cloud.

10.5 Interconnected Vehicles

A connected vehicle gives the user access to a wealth of connections to the vehicular environment. The passenger or driver has the freedom to download, access,


transmit, receive, and share information. To carry out these activities, the vehicle system is installed with vehicle-to-infrastructure (V2I), vehicle-to-human (V2H), vehicle-to-sensor (V2S), vehicle-to-vehicle (V2V), vehicle-to-network (V2N), and vehicle-to-broadband cloud (V2B) communication systems. V2I manages traffic light signaling, timing priorities, and traffic congestion. Examples of V2I applications are message hopping for roadside emergency notifications, dynamic ramp metering, and speed warnings. V2V ensures collision avoidance and safety first. It includes co-operative adaptive cruise control (CACC), co-operative collision warning, and co-operative intersection collision avoidance. CACC is an extension of Adaptive Cruise Control (ACC) with additional V2V communication that potentially increases roadway efficiency. Figure 10.2 shows the intersection-vehicle collision warning system. It is part of the V2V engineering initiative where both vehicles crossing the T-junction give warning messages, alerts, and seat vibrations to warn the drivers of a potential collision at the junction ahead. V2N and V2B tap real-time traffic management, routing, and cloud services. V2H or vehicle-to-pedestrian (V2P) technology gives warnings on the road to pedestrians, motorcyclists, and bicyclists. With such a vast deployment of car connectivity, it is necessary to authenticate the crowdsourced data. We take this measure in several steps, protecting the internet cloud and the vehicle cloud from spoofed data attacks. The general flow proceeds as follows for each vehicle [2].

1. Collect and process sensor data (e.g., video streams, LiDAR, radar, and wireless signals)
2. Generate keys (rotation for privacy)
3. Sign computed sensor data (to share with neighbors)
4. Upload signed data to edge cloud (for persistent storage)
5. Broadcast signatures to surrounding vehicles (pointers to data)
6. Verify incoming signatures (each vehicle validates neighbors do exist)
7. Access edge cloud (download signed data)
8. Validate neighbor vehicles' data against local records
9. Broadcast to neighbor vehicles an agree or disagree message.

Fig. 10.2 V2V engineering (Courtesy of GM)
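A minimal sketch of steps 2, 3, and 6 in this flow, using Ed25519 signatures from the Python cryptography library. The source does not specify the actual signature scheme, so the algorithm choice and the payload format here are assumptions.

```python
# Sketch of key generation, signing sensor data, and verifying a
# neighbor's signature (steps 2, 3, and 6 above). Ed25519 is one
# reasonable choice; the payload is a stand-in for computed sensor data.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Step 2: generate a key pair (rotated periodically for privacy).
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Step 3: sign the computed sensor data before sharing it.
sensor_blob = b'{"object": "pedestrian", "range_m": 12.4}'
signature = private_key.sign(sensor_blob)

# Step 6: a neighboring vehicle verifies the incoming signature.
try:
    public_key.verify(signature, sensor_blob)
    print("signature valid: neighbor's data accepted")
except InvalidSignature:
    print("signature invalid: data rejected")
```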

The major applications of the connected car are infotainment, telematics, and navigation. The automobile market includes, but is not limited to, the following global companies.

1. Audi
2. BMW AG
3. Continental
4. Delphi Automotive
5. Denso
6. Ford Motor Company
7. Bosch
8. Gemalto
9. General Motors
10. Harman
11. Hyundai Motor Co.
12. Infineon
13. Mercedes-Benz
14. Microsoft Corporation
15. NXP
16. Sierra Wireless
17. Toyota Motor Corporation
18. Valeo
19. Verizon Communications
20. ZF.

Cloud-based networking supports V2V and V2X communications. With cloud connectivity, together with natural language understanding (NLU) and contextually aware databases, we can accommodate reliable information and build an automated vehicle with a vast amount of sensing activity feeding AI decisions, leveraging car connectivity for driving automation development. Automotive assistants like Microsoft Cortana, Google Assistant, and Amazon Alexa may be woven into the car's connections for a wider variety of automated functions based on car sensor data. These assistants use AI to bring about more automated solutions through car connectivity in the wireless network. Therefore, we must ensure reliable and trustworthy communication in the car network. Enhanced Mobile Broadband (eMBB) caters to the in-vehicle experience; examples are augmented reality and handling the high bandwidth of the increasing number of sensors without disruption. In contrast, Ultra-Reliable Low Latency Communication (URLLC), on the external side of the car network flow, provides a smooth communication link for V2V and V2X in the automated driving system. Finally, Massive Machine Type Communication (mMTC) provides communication links carrying car information between cars, cities, and homes. There are two types of vehicle synchronization. Figure 10.3 illustrates both GNSS direct synchronization and local out-of-coverage synchronization [4].


Fig. 10.3 Vehicle synchronization (a, b) (Courtesy of Rohde & Schwarz)

The former, with satellite deployment, supplies the time and frequency information for synchronization, while the latter self-coordinates without network coverage.

References

1. https://www.embedded-computing.com/guest-blogs/v2x-on-the-road-with-chips-software-stacks-and-design-kits
2. http://onlinelibrary.wiley.com/doi/10.1002/itl2.16/full
3. https://m2mconnectivity.com.au/product/mp70-high-performance-vehicle-router/
4. https://www.eetasia.com/wp-content/uploads/sites/2/downloads/RS_WP_EN_202001_02.pdf

Chapter 11

Deep Learning and Neural Network Algorithms

The latest vehicle development incorporates many different kinds of technologies: computer vision, machine vision, machine learning, deep learning, and IoT technologies. At the root of artificial intelligence are machine learning and deep learning. With the combination of machine learning and deep learning, automakers implement the control logic for the vehicle's control loop, thereby achieving a completely autonomous driving system. The deep learning algorithms support the functional safety of the car. A DOER-CHECKER principle is followed to check the application safety of each of the vehicle's components, such that they adhere to the ISO26262 objectives at ASIL D. A few predictive algorithms run simultaneously for lane selection, distance keeping, collision avoidance, driver condition evaluation, and automotive engine prediction.

11.1 Artificial Intelligence

AI technology is responsible for autonomous vehicle perception, analysis, and planning. Artificial intelligence embeds machine learning, representation learning, and dives deeper into the concept known as deep learning. The Parker AutoChauffeur offers a deep-learning computing platform known as DRIVE PX Pegasus. Electronics engineers implement system-on-chips (SoCs), such as AI accelerators, in conjunction with automotive software. These integrate into a system that can react to the oncoming danger of collisions and overcome the challenge of traffic flow on the road. In the same sense, artificial intelligence may stop the car from functioning based on an alcohol detection sensor; likewise, severely worn brakes and damaged lighting may prevent the vehicle from starting. In HMI control, the new automotive vision concept allows the car to sense signs of driver tiredness using biometric information from the intelligent camera. That, in turn, prompts the AI to take control of the car in place of the driver when tiredness is detected. Movidius Myriad 2 is a vision processing unit (VPU) deep neural-network accelerator. It includes a software development kit


for developing vision functions. Developers can call the open-source computer vision (OpenCV) and image processing libraries, and NI Vision Builder, to assist in their machine vision applications. The artificial intelligence decides by voting, rejecting false positive or negative sensing data. The Random Sample Consensus (RANSAC) algorithm utilizes this voting protocol to separate inliers from outliers and find the best-fit data. Adaptability, flexibility, and scalability have brought the reconfigurable chip into the automotive industry. Xilinx components have great potential for the utilization of artificial intelligence in autonomous cars. Its programmable SoCs and FPGAs are deployed widely in automotive subsystems such as sensor nodes and ADAS control modules. In these chips, neural network programs [1] (Fig. 11.1) execute to identify dangers and undesirable conditions in the autonomous drive system. The NNs are also deployable on GPUs, benefiting from their high computing performance in AI-driven cars. However, GPUs suffer from the shortcomings of high power consumption, large heat dissipation, and increased form factor. The single-instruction, multiple-data architecture inherent in GPUs relies on batching of the data (Fig. 11.2a), which introduces latency and reduces determinism in AI inference applications. In comparison, an FPGA enables sensor inputs to execute individually, which improves throughput and latency and provides a more deterministic executing system (Fig. 11.2b). Offering the programmable logic configuration to the vehicular architecture also allows for over-the-air updates (termed OTA silicon) and remote reconfiguration. Besides OTA, another contributing factor of the programmable chip is the dynamic function exchange capability.
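To make the RANSAC voting idea concrete, here is a minimal sketch that fits a line to noisy 2D points by random sampling and inlier voting. The thresholds and iteration count are illustrative, not tied to any particular automotive stack.

```python
# Minimal RANSAC sketch: candidate models are proposed from random point
# pairs, and every point "votes" by counting as an inlier if it lies
# close to the candidate line; the model with the most votes wins.
import random

def ransac_line(points, iterations=100, inlier_tol=0.5):
    best_model, best_inliers = None, []
    for _ in range(iterations):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        if x2 == x1:
            continue  # skip degenerate vertical pairs
        m = (y2 - y1) / (x2 - x1)  # candidate slope
        b = y1 - m * x1            # candidate intercept
        inliers = [(x, y) for x, y in points if abs(y - (m * x + b)) < inlier_tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (m, b), inliers
    return best_model, best_inliers

# Points on y = 2x + 1 with small noise, plus two gross sensing outliers.
points = [(x, 2 * x + 1 + random.gauss(0, 0.1)) for x in range(20)]
points += [(5, 30.0), (12, -4.0)]
model, inliers = ransac_line(points)
print("fitted slope/intercept:", model, "| inliers kept:", len(inliers))
```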

Fig. 11.1 Neural network vehicle control system (Courtesy of ANSYS)


Fig. 11.2 Deep neural network latency (a, b) (Credit Xilinx)

Dynamic function exchange, or DFX (Fig. 11.3), enables every single piece of the programmable blocks to perform many mutually exclusive functions, including OTA update scanning. A camera-based detection system requires a dataset to develop further artificial intelligence for the system. We utilize these datasets to train the machine learning or deep learning algorithm. In addition, the machine vision algorithm must meet a minimum performance standard: we need to define a metric set and use the dataset for performance evaluation.

Fig. 11.3 Dynamic function exchange (Credit Xilinx)


11.2 Neural Network

Artificial neural networks (ANNs) find application in many autonomous driving features. Their uses inside the vehicle include driver recognition, alertness monitoring, and gaze tracking, besides seat occupancy. Outside the vehicle, they apply to detecting road signs and obstacles, vehicle path tracking, and traffic analysis. The computer vision techniques for facial detection, gesture recognition, and scene recognition require deep learning. CNNs have layers consisting of convolution filters. These convolutional neural network layers map the input to the output. In practice, we deploy them in image processing and sensor fusion functions in an autonomous vehicle. Engineers employ CNNs to make image processing more efficient, and a few of these connected layers trained on large datasets improve accuracy.
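A minimal sketch of such a stack of convolution layers, written in PyTorch. The layer sizes, input resolution, and class count are illustrative rather than a production ADAS network.

```python
# Tiny CNN: convolution filters map an input camera frame to class scores.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # RGB frame -> 16 feature maps
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 64x64 -> 32x32
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 32x32 -> 16x16
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 10),                 # e.g., 10 object classes
)

frame = torch.randn(1, 3, 64, 64)  # one 64x64 RGB frame
scores = model(frame)
print(scores.shape)  # torch.Size([1, 10])
```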

11.3 Deep Learning

Algorithms can bring improvements and even perform better than human eyes. A deep learning technique takes in labeled data and outputs a predictive model. The middle section is the training model, which calculates on and learns from the incoming data; the process is repeated through the deep learning model. AI in autonomous cars continuously renders the car's environment to predict possible outcomes for the automobile. For example, labeled bounding boxes go through deep learning to produce predicted bounding boxes with their probabilities. Feature engineering is the process of writing code for detecting the target pixels in the image for the labeled data; it is performed before deep learning. Various featuring processes such as classification, localization, pose estimation, and segmentation can be involved in deep learning. TensorFlow for Poets can perform classification at high speed without a GPU. An object localization API computes the location of the target object. TensorFlow Lite is a machine learning library that is smaller, easier, and faster, and applies to embedded and mobile devices. NVIDIA's full software stack caters to deep neural network learning in the vehicle platform for sensor processing, perception, map localization, and path planning. Self-driving simulation features automotive-grade hardware running AUTOSAR Classic applications and a high-performance NVIDIA DRIVE AGX Xavier kit using neural networks to automate driving.
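A short sketch of the TensorFlow Lite interpreter flow mentioned above. The model file name is a placeholder; any single-input image classification .tflite model would follow the same pattern.

```python
# On-device inference with the TensorFlow Lite interpreter.
# "classifier.tflite" is a hypothetical model file.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="classifier.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one dummy frame shaped to whatever the model expects.
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

probabilities = interpreter.get_tensor(output_details[0]["index"])
print("predicted class:", int(np.argmax(probabilities)))
```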

11.4 Neural Network Accelerator

As dataset size increases, computational requirements increase along with it. The PowerVR product range provides solutions for vision, AI, and


embedded graphics. The PowerVR 2NX NNA (Fig. 11.4), or neural-network accelerator (Fig. 11.5), in a system-on-chip is a complete solution built from the ground up for neural network inference, spanning software, tools, and hardware IP. Its precision flexibility enables it to run at 69% power, meaning the highest inferences/mW at the lowest power consumption. It boosts its performance with 60% more inferences/sec while running at only 54% bandwidth. Moreover, it offers the highest inferences/mm², making it the most cost-effective. Compared with a DSP, it requires only 25% of the bandwidth. The IP core on the SoC enables efficient neural network inference for edge computation. Its flexible adjustment of weights and activations (Fig. 11.6) promises optimum accuracy and precision. By employing the PVR model, developers build applications on the Imagination DNN API and port them to the 2NX neural network accelerator for online inferencing. It applies to the automotive industry for easy prototyping, running at high speed with reduced power. Tremendous computing power is required to process the artificial intelligence functions that turn autonomous systems into intelligent machines.

Fig. 11.4 PowerVR 2NX NNA (Credit Imagination Technologies)


Fig. 11.5 NN accelerator in SoC (Credit Imagination Technologies)

Fig. 11.6 Activation functions (Credit SAS)

Reference

1. https://www.ansys.com/about-ansys/advantage-magazine/volume-xii-issue-1-2018/drivesafely

Chapter 12

ADAS in Autonomous Driving

With the United States gradually accommodating self-driving vehicles on public roads and Singapore passing legislation allowing cars to drive with no human driver, artificial intelligence has tremendously spurred the technology in automated and assisted driving. During testing, self-driving cars have driven ten billion miles in simulation. In actual driving, the UK set a milestone in self-driving vehicle history when a modified Nissan Leaf completed a 370 km journey autonomously. The car, equipped with a radar sensor, seven cameras, and eight LiDAR sensors, feeds data to six electronic control units. In general terms, the autonomous vehicle system must be capable of performing several functions, from processing to control. In an automated car, the new functionalities explored are Active Distance Assistance, Active Lane Keeping Assist, Active Blind Spot Assist, Active Lane Change Assist, Active Speed Limit Assist, Active Emergency Stop Assist, Active Brake Assist, Traffic Sign Assist, Active Parking Assist, Remote Parking Assist, and Car-to-everything Communication. ADAS, autonomous driving, vehicle dynamics, braking, steering, powertrain controls, etc. are the many embedded software control systems. These embedded systems strictly conform to the vehicle safety standard that applies across the automotive industry. In accommodating these control systems, autonomous vehicles have several sensing tasks. The following are the tasks in a self-driving car.

i. Sensing is the processing of the raw frame-by-frame data from the vehicle cameras.
ii. Perception is defined as the collection of the data for object detection, classification, and positioning.
iii. Mapping identifies the safe driving areas and objects within a mapped area.
iv. Localizing is the pairing of information from the vehicle's GPS.
v. Route/path planning determines the short- and long-term driving paths, which includes incident reaction.
vi. Motion planning provides vehicle navigating control strategies appropriate for selected routes.
vii. Vehicle control issues the braking, throttling, steering, and suspension commands while driving.


viii. Driver interaction provides feedback to the driver, senses driver intent, and hands off control.

The raw data from the various sensors is converted to define the conditions for the safe operation of the autonomous vehicle. Operations such as drive actuation, decision-making, and function activation operate simultaneously to control the self-driving vehicle. Algorithms for perception, localization, trajectory planning, and control decisions drive the autonomous vehicle software suite. Moreover, it also combines the fusion of non-vision-based sensors for improved safety. EB robinos [1] is a functional software architecture for machine learning in autonomous driving. The input comes from the various sensors of the vehicle, and the output goes to the actuators for driving. These, in turn, control the vehicle steering, motor, engine, pedal, and clutch after some decisive computation. In the middle section, situation analysis constructs the path planning, which leads to motion control management: the lateral, longitudinal, and trajectory control of the autonomous vehicle. HMI integrates with the motion management unit for interactive control. The functional specification is important for the software architecture and requires verification. Throughout the whole architecture, safety management and error compensation tackle failures in the system.

12.1 Advanced Driver Assistance Systems

There are four levels of control for automated car technology. Level 1 enables automated driving without the feet controlling motion. Level 2 dispenses with hands on the steering wheel. Level 3 drives the car without human vision. Level 4 autonomous driving materializes to free human concentration from controlling the car. More complicated technology advances with each level of the automated driving system. Advanced Driver Assistance Systems, or ADAS for short, is the latest technology used in autonomous driving vehicles and is associated with level 5 autonomy. Automatic cruise control is the technology behind ADAS. Another such technology incorporated is the lane departure warning. ADAS also facilitates rain sensing, blind-spot monitoring, crosswind stabilization, glare-free high beam, adaptive light control, hill descent control, emergency driver assistance, night vision, and wrong-way driving warning. Many other similar goals for the autonomous features are lane-keeping assist, autonomous emergency braking, the speed assist system, evasion, driver monitoring, and pedestrian detection. All these systems utilize the entire range of sensing capabilities. The variety of sensors includes smart and monocular cameras, proximity, ultrasonic, and infrared sensors, GPS, LiDAR, radar, V2X connectivity, mirrors, and displays to enhance the driver's visibility in reaction to an emergency. The data from the sensors is fused to produce an analytical vision for real-time car navigation. A clearer mapping of the environment will henceforth enable the autonomous vehicle to decide the correct travelling direction. Besides, electric brakes are installed for automatic holding and releasing of the braking system. The feature helps to


perform auto parking and driving out of the parking slot. The speed warning system reminds drivers of the speed at which the car is travelling. The incorporation of these technologies improves driver safety and the driving experience. The functioning of these advanced technologies requires multiple cameras in the system. Table 12.1 shows the various sensors utilized for the different levels of vehicle automation.

Table 12.1 Sensor application

Sensor       L1    L2    L3    L4
Camera       Yes   Yes   Yes   Yes
LiDAR        Y/N   Y/N   Yes   Yes
Radar        Yes   Yes   Yes   Yes
Secure V2X   No    No    Y/N   Yes

Sensor fusion technology increases the level of autonomy. Automakers utilize microcontrollers for ADAS sensor fusion. The ADAS from Mobileye manages Forward Collision Warning and Lane Departure Warning and logs the events in the management center. The active lane-keeping assist exerts vibrations on the steering wheel to warn the driver when he drives off his lane. The side-assist radar helps the car to change lanes safely. A light shining in the exterior side mirror warns the driver of any blind-spot dangers during lane changing. The stop-and-go and brake assist prevent rear-end collisions. The rear camera of the car, together with the display in the dashboard, provides visibility into the blind spot. From the incorporation of this handful of sensors, the smart automobile becomes autonomous with various ADAS control features (Fig. 12.1). The variety of sensors, together with

Fig. 12.1 ADAS control features (Courtesy of Renesas)


the connected sensors for V2X communications and telematics, delivers the real-time autonomous performance of the vehicle.

12.2 Imaging Radar

Of all the sensors, radar is the only one that keeps operating when visibility fails. Visual sensors suffer from obscured viewing when the line of sight is blocked, but radar can see through objects to detect targets outside the cone of sight. Imaging radar can operate with a wider field of view without compromising its image resolution. It can detect at longer distances, up to 350 m in range. Also, it has the benefit of measuring Doppler, or radial velocity, in every frame. Doppler offers a fourth dimension for differentiation, fulfilling the automotive system requirement to separate 3D objects. It utilizes a frequency modulation technique to determine the ranges of different objects. The imaging radar has a powerful embedded antenna operating over a wide bandwidth. It functions as a low-power programmable core at 77 GHz. The mm-wave radar CMOS sensor (Fig. 12.2) offers high accuracy and resolution, contributing to closed-loop monitoring and control with its radar information. From Arbe Robotics, the high-resolution, ultra-low C-SWAP radar brings real-time 4-D imaging to life. It provides a wide field of view of 100° azimuth by 30° elevation. It has a detection distance of up to 300 metres while preserving an accuracy of about 10–30 cm. Based on 4-D algorithm modelling, the radar locates objects 25 times per second while creating an imaging environment (SLAM) 50 times per second. It produces optimal sensitivity by filtering random noise occurrences and achieves low sidelobes by calibration. In an autonomous vehicle, the high-resolution 4-D SoC radar is an embedded vision processor found in ADAS. The automotive radar enhances vision capability for automated vehicles and provides rear-end collision avoidance. Texas Instruments has produced the AWR1642

Fig. 12.2 Radar CMOS sensor (Credit TI)


Fig. 12.3 AWR1642 device (Credit TI)

device, a single-chip FMCW radar sensor capable of operating between 76 and 81 GHz with up to 4 GHz of chirp bandwidth (Fig. 12.3).
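The standard FMCW relation ties range resolution to sweep bandwidth, ΔR = c/(2B). The short computation below applies it to the 4 GHz bandwidth quoted above.

```python
# FMCW range resolution: delta_R = c / (2 * B), where B is the chirp
# sweep bandwidth.
c = 3.0e8   # speed of light, m/s
B = 4.0e9   # sweep bandwidth, Hz

delta_R = c / (2 * B)
print(f"range resolution: {delta_R * 100:.2f} cm")  # ~3.75 cm
```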

12.3 LiDAR

LiDAR, or light detection and ranging, works by reflecting laser pulses off objects to reconstruct a 3D representation of them. LiDAR utilizes visible, ultraviolet, or infrared light depending on its application. It transmits light energy and receives the bounced-back returns, which sensors detect for analysis with software to discern a target's location, speed, and size. When applied to vehicular sensing, it can provide a 3D view of the vehicle's 360° surroundings. The distribution of the scan points is sparse, and the sensor is costly; however, we can apply more beams to reduce the sparsity of scan points. The FOV, orientation, wavelength, speed, pulse rate, software, and spacing affect the resolution. Cloud, weather, fog, murky water, or the limited power of the device affects the light beam. We can apply a short pulse width to improve the resolution but need to space the pulses for increased sensitivity. Figure 12.4 illustrates the operation of the LiDAR system. With c representing the speed of light (divided by 2 for the round trip of the reflected light), ΔT the minimum pulse width, and K a factor for receiver sensitivity and material, we calculate the depth resolution by the formula K × ΔT × c/2. Depending on K and ΔT, we get different resolutions, frame rates, and ranges.

Fig. 12.4 LiDAR System
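A small worked example of the relations above: distance from time of flight (with the division by 2 for the round trip) and depth resolution from the text's formula. The K value and pulse width chosen here are illustrative assumptions.

```python
# Worked example of the LiDAR relations described in the text.
c = 3.0e8  # speed of light, m/s

# Time-of-flight ranging: a pulse that returns after ~167 ns.
t_flight = 167e-9
distance = c * t_flight / 2          # divide by 2 for the round trip
print(f"target distance: {distance:.1f} m")        # ~25 m, as cited above

# Depth resolution per the text's formula K * delta_T * c / 2,
# for an assumed 1 ns minimum pulse width and K = 1.
K, delta_T = 1.0, 1e-9
depth_resolution = K * delta_T * c / 2
print(f"depth resolution: {depth_resolution * 100:.1f} cm")  # 15 cm
```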


Car manufacturers use LiDAR mainly in combination with other types of sensors for integration and fusion in the vehicular system. LiDAR has several advantages. The sensor is deployed to locate path geometry for the driver, roadside signage, and other static scene semantics. It can differentiate between snow and rain and see further, providing more features and road signs for the mappings that create the virtual environment. Additionally, it integrates with artificial intelligence, known as deep learning, to enhance understanding of its surroundings and identify nearby obstacles to avoid a collision. Furthermore, it can predict a much safer route for the autonomous car to travel. It has a dynamic field of view to highlight dangers and sound warnings to the car driver and the surroundings when it encounters a risk of collision. MapLite [2] is a framework created by MIT that incorporates LiDAR sensors with GPS data. The software tool functions for autonomous vehicles' environmental viewing and mapping; the technology navigates unmapped roads more than 100 feet in advance. Using time-of-flight measurement, the LiDAR emits nanosecond laser pulses at surrounding objects. Recording the time for the pulse to travel and reflect from objects allows for the computation of the LiDAR sensing. By detecting the timing, distance, and strength of the laser pulse returns, objects can be located as far as 25 m away. However, the Luminar LiDAR, operating at 1550 nm wavelength with 50 times higher resolution than the standards, can detect at about 200 m range. To widen the LiDAR field of view, a moving mirror helps to sweep the laser signal up to an angle of 220°. It has an accuracy of ±2 cm down to the millimeter range and runs at a frequency of 10 Hz. Although slow, it detects well in the dark with a range of up to 250 m. As such, it is deployed in high-definition mapping for prediction, classification and tracking, detection, and localization. There are many difficulties when detecting data under bad weather conditions: sensors become noisy and covered with slush, and low visibility makes trajectory planning difficult. We have to design careful steps to follow when validating environmental perception and collaborative sensing. The major part of controlling the automated vehicle comes from the algorithm and software detections. Different combinations of GPS, LiDAR, and radar, together with inertial sensors, produce different configurations acceptable for vehicle sensing.

12.4 Software Component

In the vehicle architecture, automobile builders need to define the software specification for the system design. Then, they can develop the software architecture for implementation. Next comes the software functionality check and verification. The software module testing complies with the requirements in the ISO26262 standard. All these stages must complete before integration and testing. A high-definition coding platform is desirable for an autonomous driving system. Some new features in the advanced code cater for functions such as multimedia, automated driving, continual over-the-air updates, image computation, and V2X communication. Adaptive AUTOSAR defines a platform for driving automotive control with


the sophistication of the reconfigurable electronic control unit, advanced driver assistance systems, media streaming, and over-the-air software updates via the internet. The platform allocates the interfaces for service-oriented architectures and APIs. A POSIX operating system hosts AUTOSAR programming in the C++ language under the object-oriented approach.

12.5 Hardware Component

A computer vision accelerator aids automated driving for self-driving vehicles. Sensors, data fusion, and mappings are the necessary components for autonomous vehicle driving. Automatic parking utilizes mapping, cameras, and ultrasonic sensors. For lane switching, the vehicle needs to look at least 20 m ahead at the sides. Situation analysis and strategy evaluation take place before vehicle actuation. HD video processing performs the computation, while the HW acceleration IP is a necessary component for ADAS. A classical CV hardware accelerator computes its depth analysis with high quality and lesser requirements, providing maximum optimization without compromising the hardware. Compared to the digital signal processor, it provides higher power efficiency and minimizes overhead. Deep learning assists in behavior prediction, while reinforcement learning helps in the vehicle's route planning. With the combination of deep learning technology and classical computer vision, we can yield superior performance. As deep learning goes embedded, we anticipate a low-power-consumption ADAS. As GPUs consume high power, they are replaceable with dedicated embedded hardware. A configurable ASIC helps to achieve high power efficiency. The fixed-point processor, the operating GOP/s, and the capacity and bandwidth of the memory are the computational resources that influence the power consumption. We can reduce parameters and operations by building efficient convolutional blocks to avoid redundancy and achieve our power goal. Memory access consumes more energy than computation, so if we implement a smaller DNN that fits in the cache, it will help the processor save a lot of energy.

References

1. https://www.embedded-computing.com/articles/from-logistics-regression-to-self-driving-cars-chances-and-challenges-for-machine-learning-in-highly-automated-driving
2. https://news.mit.edu/2018/self-driving-cars-for-country-roads-mit-csail-0507

Chapter 13

Electric Vehicle

Global emission contamination from fossil fuels is raising the world's temperature. Humanity hopes to bring down this contamination of the atmosphere by having a clean and affordable transportation system. To reduce vehicle gas pollution, to protect and create a green environment, and to stop the fuel energy resource from depleting, humanity has to look for an alternate energy source to replace fuel. Carmakers invented an electric mechanism to power cars, which gave birth to the electric vehicle. Charging an electric vehicle is necessary when its energy source nears depletion. When using a 15 A charging socket, it takes 8–9 h to charge the car. A full charge at a fast-charging rate of 1.5 h can provide the electric vehicle with 140 km of travel. In the present market, carmakers are manufacturing more battery-operated cars in conjunction with improving battery management units, for the benefits of higher efficiency and better power electronics drives. The electrification of the vehicle requires discrete components, a converter and inverter, and a power module for the vehicle's auxiliary system. While the plug-in hybrid vehicles (PHEVs) or hybrid vehicles reduce carbon dioxide emissions while maintaining a long driving range, the complete battery-operated electric vehicle (BEV) produces zero emissions at the cost of reduced driving distance. However, battery costs are falling, and quality is improving with battery cell technology, which increases the driving distance for electric vehicles. Furthermore, the regenerative braking system provides efficiency by boosting the powertrain of the electric vehicle, while the chipset fuel injection ECU can reduce emissions.

13.1 HEV/EV

Electric vehicle deployment on the road is a brilliant idea. There are four primary categories of electric vehicles: mild-hybrid electric vehicles (MHEV), hybrid electric vehicles (HEV), plug-in hybrid electric vehicles (PHEV), and battery electric


Table 13.1 Light electric vehicles

High power (10 kW to 30 kW), high voltage (48–144 V):
a. Low-speed electric vehicles
b. E-golf carts
c. E-forklifts
d. Light utility vehicles
e. E-motorbikes
f. E-three wheelers

Low power (1–10 kW), low voltage (24–72 V):
g. E-rickshaws
h. E-bikes
i. E-scooters

vehicles (BEV). The mild hybrid is known as the 48 V electric vehicle. With the electric vehicle gradually developing its upgrades and performance, the final electric vehicle provides numerous functions and features such as voltage stabilizing, recuperation, start-stop, fuel saving, sailing, electric parking, and complete battery-charged electric driving with an onboard charger. Even so, the MHEV meets the 95 g/km CO2 emissions per car. Gradually, the market will adopt electrified cars as the charging infrastructure grows. Plans for a dynamic induction charging system may be introduced to allocate charging facilities at airports, car parks, and taxi stands. Even so, numerous challenges concerning plug-in electric vehicles (PEVs) stand in the way. PEVs bring about many constraints that impede their adoption: the cost of the battery system, its range limitations, charging, and battery density. Solutions to vehicle electrification involve the careful planning of an efficient energy management system for the vehicle engine. Power emulation by a solar grid system sources the high-power, high-voltage energy requirements for battery energy storage in electric vehicles. Maximum power point tracking attains the highest efficiency for solar inverters. Evaluation of lithium-ion battery self-discharging procedures creates better insight into electric vehicle energy management. Automotive energy, battery testing, power device modeling, and cell self-discharging methods are the primary areas of study in electric vehicle technology. With improved knowledge of DC-to-DC converter design, we can enhance the throughput for the vehicle actuation. Discussing the types of electric vehicles, we have the gasoline-electric hybrid vehicle (HEV), the full-electric vehicle (EV), and the light electric vehicle (LEV). The LEV consumes a lower amount of power compared to the EV. It is further classified into low- and high-power types according to its input voltage requirement. Table 13.1 indicates the various types of light electric vehicles.

13.2 Powertrain and Engines Management

Electric modules are the primary drivers for conducting the electrical energy supply to the vehicle. They include the combustion engine, transmission, hybrid, electric drive, control technology, batteries, and fuel cell. The major contribution to the


vehicle performance comes from the converter and inverter units. The automobile has several functionalities to power up; every system in the automobile requires the 12 V and 48 V electrical architecture. Table 13.2 lists the clusters in the vehicle system that require low-voltage power supplies.

Table 13.2 Automobile electrical architecture (Credit TI)

Infotainment cluster:
1. Communication systems
2. Radio and TV
3. Navigation
4. Instrumentation
5. Display sets
6. Audio unit

Lighting and body cluster:
1. HVAC system
2. Vehicle security access
3. Electrical heating
4. Exterior lighting
5. Air blower
6. Window defrosting

Powertrain cluster:
1. Fuel pump
2. Water pump
3. Cooling fan
4. Supercharger
5. Generator
6. Transmission or engine control

Chassis and safety cluster:
1. Electric power steering
2. Vacuum pump
3. Suspension system
4. Brake system
5. Passive and active safety
6. Roll stabilization

ADAS cluster:
1. Sensor fusion
2. Camera systems
3. Radar, LiDAR, ultrasonic sensors, etc.

The driver modules affect the power conversion efficiency. The high performance of these power drivers reduces power losses and boosts power efficiency. Infineon provides the power conversion modules and battery management systems for the electrification of auto powertrains. The most popular AC traction motors are permanent-magnet electric motors. Motors with neodymium iron boron (NdFeB) magnets can achieve higher efficiency than any other type of motor. Higher efficiency for traction motors means they can reduce the cost of battery capacity for the same vehicle range. The reduced size and weight of the drives further improve vehicle dynamics and reduce prices. As the speed of the vehicle increases, the power losses increase due to the weakening of the magnetic field. By optimizing the number of windings per phase with voltage and inverter current control, we control the losses and improve the power efficiency of

98

13 Electric Vehicle

the vehicle. Automakers go forward to improve the magnetic flux density of the permanent magnet motor by adding structural layers to the PM materials. Driveline efficiency is the power loss from the engine to the wheels. Low powertrain efficiency is when high power loss occurs between the engine and the wheels. The wheel bearing exposes to the high temperature and pressure through the tyre. An average condition of a 3200 lb car runs at 180mph at a temperature of 180°. Fahrenheit produces more than 1.8 g of vertical and lateral loads. The friction of cars with loads is three times more than that without load. To addon, frictional torque produces as speed increases. Factors affecting bearing frictional losses are vehicle load, amount and type of lubricants, sealings, materials, and temperature. To minimize bearing friction for sufficient durability, we can adjust these factors as low vehicular frictional losses boost fuel economy with lesser emissions.

13.3 Battery, Converter and Inverter

Batteries provide the source for powering electric vehicles [1] (Fig. 13.1). Instead of lead-acid batteries, manufacturers use lithium-ion batteries for improved battery density and operating range. However, the lithium-ion battery contaminates the environment, so automakers turn to low-cost lead-acid batteries for light electric vehicles (LEVs). On the other hand, power management and energy consumption, as well as voltage regulation, all depend on the choice of power products. These include, but are not limited to, gate driver ICs, microcontrollers, low-voltage power MOSFETs, and high-voltage SJ MOSFETs. Battery electronics components include the temperature, current, and voltage sensors for the cells, the thermal management system controller, the battery module

Fig. 13.1 Electric vehicle charging system (Courtesy of Cypress Semiconductor)


Fig. 13.2 Battery system (Credit KIT IPE)

with high-voltage interlocks, and the isolation monitoring system. Consumers are concerned with the battery charging rate and the battery life cycle. The battery system faces the challenge of electromagnetic interference. The electric motor current flow, the switching frequency of the power converters, and unshielded bus bars and cables all contribute to electromagnetic noise. Disruption in electrical power devices and the different operating frequencies of the vehicle components can lead to battery management failure. For example, the microprocessor operates at GHz frequencies, the DC-DC converter operates at 20–100 kHz, and the CAN bus operates between 250 kHz and 1 MHz. Besides high-frequency disruption, other factors affecting battery charging and discharging are thermal and electrochemical effects. The electrical cells of the battery deteriorate due to the high-temperature thermal impact of high-frequency charging. Graphene is a material that prolongs the battery life and increases the energy density, much as supercapacitors do. A coupled electro-thermal model-based BMS algorithm and a battery cell-balancing algorithm used for joint state estimation optimize charging and improve efficiency. The battery management system supervises the cells' current, voltage, and temperature (Fig. 13.2). It detects faults, estimates the state of health (SOH) and state of charge (SOC), and controls charge and discharge. Battery self-discharging optimization units allow for improved performance of the electric vehicle. By efficient handling of battery thermal management, we can optimize the powertrain of the electric car. The BMS includes the vacuum insulation layers of the dynamic insulating system (DIS) for battery thermal management. Thermal management for the battery increases range and durability. Hutchinson utilized a vacuum insulation panel using ultrathin VIP material and PCsMART for long thermal storage, meaning we can start the car at whatever temperature. DIS keeps the battery cool, saves its capacity by 30%, extends its range, and reduces CO2 emission by about 25–40%. By reducing losses and boosting efficiency, we can reduce the size of the battery for the electric vehicle. When we reduce the load of the vehicle, we can reserve this space volume for increasing the battery packs in a larger electric vehicle. Moreover, keeping the car lighter and with more space will, in turn, reduce emissions. The BMS and its sensors must be EMC compliant to ensure no data corruption. Transmission of


the battery data over-the-air to the cloud for diagnostics improves battery safety and performance and prolongs battery lifespan. Different vehicle sizes draw different amounts of power. Light-duty vehicles require 0.2 kWh/km, while medium-duty vehicles consume 1 kWh/km, and HD vehicles need 1.3 kWh/km. The latest solution for lightweight liquid-cooled lithium-ion (NMC) battery packs offers up to 440 kWh for heavy-duty electric vehicles. Heavy-duty electric vehicles require high-power charging stations and higher cooling power for their batteries. EV charging, in turn, can cause several impacts: power harmonic distortions, voltage regulation issues, voltage drop, and transformer losses. Furthermore, cell technology has the limitation of reduced charge and discharge rates at sub-zero temperatures, hence its regenerative energy and power de-rating factor. All these factors call for the innovation of new cell technology to overcome the hot and cold weather conditions and the increased charging capacity that affect the performance of the electrically operated vehicle. Figure 13.3a shows an easy and cost-effective AC-DC converter that utilizes diode rectification. Figure 13.3b is a Totem Pole PFC with a SiC converter supplied on-board. Figure 13.4a illustrates the 6-switch active AC-DC converter, an off-board charger. The Vienna rectifier (Fig. 13.4b) boosts the DC link. With an efficiency greater than 99%, it offers a bi-directional set-up option. However, the AC-DC converter requires about 3–4 power supplies to function. The off-board charger utilizes an AC 3-phase 480 V power supply to convert to DC voltage with a current of 500 A, from 50 to 350 kW. Typical charging time is between 30 and 60 min, adding a range of 60–80 miles per hour of charging. With today's technology, extreme fast charging at 800 V with an output of 400 kW is available; the time to charge for a 200-mile distance takes just 7.5 min. But the power converter or high-speed DC charger has high heat capacity and thermal conductivity, which requires thermal management. Equation (13.1) governs the power wasted by a charger. For example, a charging system of 300 kW output with an efficiency of 90% creates a power wastage of 33.33 kW.

Pwaste = Pout (1/η − 1)    (13.1)
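A quick check of the text's example against Eq. (13.1):

```python
# Pwaste = Pout * (1/eta - 1), from Eq. (13.1).
def charger_power_waste(p_out_kw, efficiency):
    return p_out_kw * (1.0 / efficiency - 1.0)

# 300 kW output at 90% efficiency wastes about 33.33 kW as heat.
print(f"{charger_power_waste(300.0, 0.90):.2f} kW wasted")
```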

For maintaining battery lifespan, the onboard vehicle battery must also be thermally regulated during operation and charging. High temperature causes accelerated degradation, while low temperature decreases the power and capacity of the battery, thereby reducing range. The liquid cooling system helps to reduce the temperature of the cable; liquid-cooled charging cables utilize thinner-gauge wire to reduce the cable weight by 40%. We can also cool the DC contacts of the electrical connector of the vehicle charger. Full immersion of the battery cells in a dielectric fluid provides battery coolant, while conductive looped cold plates cool battery packs.


Fig. 13.3 On-Board AC-DC converters (Credit Texas Instruments)

13.4 Transmission and Fuel Injection Module

The 48 V electrification capability offers numerous advantages (Fig. 13.5). The 48 V technology not only boosts acceleration and creates higher low-end torque from the electric motor, but also allows for a quick restart at high load and high speed. It has a higher chance of recuperating energy in comparison to 12 V systems. With a power limitation of under 45 kW, it conducts at higher efficiency and lowers fuel consumption and CO2 emission. A vehicle loaded with a 48 V electrically heated catalyst helps to reduce the warm-up period of the catalytic converter, thereby reducing the emission rate. The ability to enable conduction under cold performance supports momentary loadings. Besides, the 48 V electrification system reduces engine loading during the transient phase. Another advantage is cable downsizing due to the reduction in 48 V electrical losses, because less current is drawn. The 48 V system provides the electrification of actuators and auxiliaries. It powers the electric supercharger, valve


Fig. 13.4 Off-Board charger (Credit Texas Instruments)

actuation, the oil or water pump, the AC compressor, the vacuum-less brake system, and the EPAS for heavy vehicles. Furthermore, it backs up the automated driving features for torque vectoring and the 110 V/220 V inverter. The low-speed continuous performance drives high duty-cycle loads on the 110 V/220 V inverter. Figure 13.6 shows the power supply for the autonomous vehicle. The allocation of the intelligent switch


Fig. 13.5 Vehicle electrification

Fig. 13.6 48 V power supply for automated driving (Credit Ford)

and the safety-critical and fail-resilient loads on the power supply system provides better power delivery to the automated driving system. The IC chipset powers the electronic control unit (ECU) for Port Fuel Injection (PFI). This ECU calculates the proper torque and speed for the vehicle. An alternator, a regulator with a magneto, or a battery powers the ECU. The ECU also handles emission control. The dedicated air/fuel mix sensor allows the ECU to set the injection time and flow to meet the air/fuel ratio target. Properly timed injection calculation by the ECU controls the emission. Air quality, pressure, battery voltage, throttle position, engine temperature, and vehicle speed measurements all enable the correction of air injection and fuel ignition control. The reset time of the MCU is 30 ms for every 150 ms of injection pump runtime. The L9177A chip qualifies for small-engine ECU injection control of the torque and speed of the vehicle. The L9177A evaluation board performs electronic fuel injection (EFI). The SPC572 MCU serves powertrain applications; its family member, the SPC572L, is a 32-bit microcontroller unit for emission control of bi-cylinder motors (Fig. 13.7).


Fig. 13.7 Fuel injection chipset (Credit STMicroelectronics)

13.5 Braking System

The electronic braking system of the car consists of the integration of electronics, mechanics, and hydraulics. Automatic electronic braking, or AEB, is slowly gaining popularity in the market for the autonomous vehicle. Brakes can steer the car via the traction motor using torque vectoring; besides, differential braking can manipulate the front axle steering. It is integrated for fault-tolerant control of the autonomous vehicle. However, there exist physical limitations on the lateral acceleration and the curvature magnitude. The brake-by-wire concept gets rid of the mechanical parts, the hydraulic fluid, and the filling and bleeding systems of the mechanical components. The mechanical systems overridden include clamps, hoses, and the vacuum booster. The development of an autonomous car braking system demands the following priorities [2].

• The first priority is vehicle deceleration: in failure mode, we must achieve adequate deceleration. The higher the speed of the autonomous vehicle, the higher the deceleration it requires in the degraded mode.
• The second priority is vehicle stability: during a braking maneuver in any first failure mode, we must avoid locking the rear wheels.
• The third priority is keeping the vehicle steerable: during a braking maneuver in any first failure mode, we must avoid locking the front wheels.
• The fourth priority lies in the stationary position: a vehicle brought to a standstill needs to be secured there in any first failure mode for an indefinite time interval (typically solved with an independent park brake/gear lock actuator).

With the reduction in brake usage and wear on brake pads and rotors, automakers are installing rust-resistant brake rotors. Bosch combines its Electronic Stability Control (ESC) system with the electromechanical brake booster, or iBooster. The solution provides a fail-degraded steering and braking system for its electrical and electronic architecture. Replacing the vacuum brake booster with an iBooster lets the actuator decelerate the vehicle independently of the brake pedal. In


time of brake failure, either the ESC actuator or the iBooster performs to modulate the braking pressure to steer the vehicle during deceleration. According to Hyundai, its Integrated Mobis Electronic Brake (iMEB) controls the Front Collision Avoidance (FCA), Advanced Smart Cruise Control (ASCC), Electronic Parking Brake (EPB), and ADAS functionality. It combines the hydraulic brake pressure supply and the brake control system into a single unit. With regenerative braking technology, it reduces weight and improves braking responsiveness. Nissan, meanwhile, has come up with an e-Pedal. The e-Pedal removes the need to shift from one pedal to the other, creating a more comfortable driving experience: the model starts, accelerates, decelerates, and stops the vehicle using only the accelerator pedal. Lifting the foot off the pedal slows the car by regenerative braking. These brake blending technologies offer an independent electronic boost for powertrain systems that raises efficiency in the regenerative braking system.

Tyre Monitoring

From the tyre pressure monitoring system, information to the car dynamic controller adjusts for the tyre friction to output the most efficient and optimized vehicle control. Moreover, it can detect tyre wear rates to generate feedback to tyre firms for value-added service. Sensors are utilized to detect temperature and measure pressure. Inside the vehicle, a radio receiver collects this information and sends it to the mobile phone by Bluetooth, where an app advises the driver on the condition of the tyre. An improvement is to use rubber-based sensors to monitor temperature and tread depth, which alerts the driver with less delay than the TPMS method. Furthermore, tyre pressure monitoring allows us to adjust the tyre contact with the ground for optimal grip on uneven, wet, and slippery roads.

13.6 Vehicle Charging System

Today, fast electric vehicle charging is evolving around the world. The electric vehicle requires an AC charger. The onboard AC charger consists of two stages: the power factor correction (PFC) stage and the DC-DC power conversion stage. Automakers emphasize raising the power factor and lowering the DC-link capacitance at the output of the charger. Thus, they are looking into correcting the harmonic distortion for improved power efficiency and correcting the power factor for good power quality. Just 15 min of DC charging with a 150 kW charger gives the electric vehicle a 200 km driving range. High-power 300 kW charging utilizes a DC voltage of up to 1000 V and a maximum current of 500 A. However, a global charging standard is required for all kinds of battery-powered electric vehicles. A fast DC charger requires liquid cooling solutions to deliver high-power-density charging. The AC-DC and DC-DC converters convert the incoming 3-phase AC mains into the required DC voltage for the charging vehicle. The utilization of high-speed power electronics devices such as CoolMOS and CoolSiC MOSFETs enables high-frequency switching design for


Fig. 13.8 Vehicle charging station (Courtesy of www.iot-now.com)

high power density and energy-efficient switches with low losses. Billing, vehicle, and battery status all go through a secured data channel to provide information to the e-mobility user. The OPTIGA™ TPM 2.0 security controller enables secure data transmission for remote maintenance of EV charging stations. Figure 13.8 shows an off-board charger, namely the vehicle charging station.
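As a rough cross-check of the fast-charging figures in this section (ignoring charging losses), we can combine the 150 kW/15 min example with the 0.2 kWh/km light-duty consumption quoted earlier in the chapter:

```python
# Energy delivered by a 15 min session on a 150 kW DC charger, and the
# range it buys at the chapter's 0.2 kWh/km light-duty consumption.
charger_kw = 150.0
charge_minutes = 15.0
consumption_kwh_per_km = 0.2

energy_kwh = charger_kw * charge_minutes / 60.0    # 37.5 kWh delivered
range_km = energy_kwh / consumption_kwh_per_km     # ~188 km, close to the 200 km claim
print(f"{energy_kwh:.1f} kWh delivered -> roughly {range_km:.0f} km of range")
```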

References

1. https://www.embedded-computing.com/articles/resolving-power-issues-in-smart-charging-automotive-systems
2. https://www.automotive-iq.com/chassis-systems/articles/innovative-strategies-for-blending-in-evs-regenerative-braking-systems

Chapter 14

Vehicular Advancements

TRI, the Toyota Research Institute, has advanced its development of the autonomous car through many stages. Their vehicle features a new LiDAR system from Luminar, which provides a long sensing range, a much denser point cloud, and a reconfigurable field of view, meaning it can concentrate the measurement points where sensing is needed most. Additionally, TRI has created a second vehicle control cockpit on the front passenger side with a drive-by-wire steering wheel and pedals for acceleration and braking. The system allows the engineers to transfer vehicle control between the driver and the autonomous system under various challenging scenarios. Furthermore, machine-learning algorithms that can learn from expert drivers to coach novice drivers are in development. Voice control, virtual positioning systems, and emission reduction are some of the topics discussed in the chapter.

14.1 Vehicular Compliances and Expectations

Our present car technology comes with features such as navigational advice, tyre pressure monitoring, remote keyless entry, adaptive lighting control, eCall or ERA-GLONASS, and personal occupancy detection systems (PODS) for airbag systems, but these are not sufficient. Soon cars will be equipped with infotainment, traffic warnings, and services information, as well as safety features. Self-driving cars have to adhere to their hardware and software compliances. Each stage of the product development cycle makes a new impact on the standard of the product. The toolchain requires hardware and software integration, safety requirements, traceability, testing, verification, and validation through to end-of-life-cycle progress. These development processes require risk mitigation and ISO26262 compliance. The ISO26262 functional safety standard gives rise to the classifications of the Automotive Safety Integrity Level (ASIL). A


DOER-CHECKER principle is followed to check the safety of the vehicle's components, such that they adhere to the ISO26262 objective at ASIL D. ASIL D applies in situations with a high probability of occurrence, where controllability by the driver is lower than 90% and failure leads to fatality in accidents. The tool serves as assistance to record how the automotive hardware and software design mitigates risks to acceptable system and component levels. The level of complexity increases as the safety level increases from ASIL B or C to D. Inadvertent braking, front and rear cameras, radar, EPS, and inadvertent airbag deployment are examples of ASIL D compliance for real-time, decisive, driver-safety-critical concerns. Accelerators performing sensor-processing functions require ASIL B compliance. General CPUs have their ranges of ASIL requirements from B to D. Jama software can review expectations with key stakeholders to create better products faster. Therefore, it is a useful tool for the automotive industry to stage and document its processes and developments for compliance before products are finished. Several requirements are involved in the autonomous functioning of the vehicle. Device interfacing, scalability, SWaP-C (size, weight, power, and cost) constraints, ISO26262 standards, ASIL-C (Automotive Safety Integrity Level) safety-critical functions, withstanding the vehicle's harsh environments (AEC-Q100 qualification and certification), and evolving performance challenges are critical to the real-time processing of the vehicular data. The AUTomotive Open Systems ARchitecture (AUTOSAR) is the world's leading vehicle software architecture. AUTOSAR facilitates coding safety, security, reliability, and portability. The AUTOSAR C++14 coding standard helps to comply with the ISO26262 functional safety standard for road vehicles. Functional safety in the automotive industry needs ASIL (ISO26262) certification for control systems, telematics, the HUD, and the instrument cluster. The standard highlights the electrical and electronics functional safety aspects in the development process and final product. It defines the absence of unreasonable risk due to hazards caused by the malfunctioning of electrical or electronic systems. Risk management handles software functional safety; we mitigate the risks following the functional safety requirements and system design. The tool qualification package supports dynamic data flow coverage, unit or low-level testing, programming rules checks, and assembler and structural coverage analysis in alignment with the ISO26262 requirement standard. Failure intentions are communication failure, unintended writing of memory and registers, and wrong processor execution time. The AUTOSAR watchdog stack allows deadline and flow control monitoring with live supervision to detect system failures. Increased memory size and processor complexity lead to errors that accumulate in the caches, registers, memory, and other processing peripherals. These errors arise from crosstalk, EMI, and cosmic rays, resulting in bit, byte, word, row, and column flips. ISO 21434 ensures all engineers arrive at a secure electronics architecture with documentation for all their modeling and verification activities. The Chinese enterprise Baidu is capable of driving level four car autonomy. However, with the number of accidents involved in the developing stage of HAVs and AVs, goals need to be set to ascertain safety in the autonomous vehicle industry. Especially to reach level 5 vehicular autonomy, more stringent sets of criteria have

14.1 Vehicular Compliances and Expectations

109

yet to be finalized. SOTIF (Safety of The Intended Function) names the PAS 21448 ISO intends to counteract the accidents. The underlying statement spells the validation and verification of systems with complex sensing and algorithms, whose limits in performance could cause safety hazards in the absence of malfunction. It will serve to demonstrate the functional safety and the complexity of the systems involved. The IQPC (International Quality and Productivity Centre) will guide on how to apply safety requirement completeness for artificial intelligent controlled vehicles on the road. With these, we can bring autonomous vehicle simulation and testing to a higher level of safety consciousness with the aim of mitigating road accidents. Car manufacturers adopted verification and validation challenges before vehicle mass production. Early detection of parts failure brings a positive impact on the resulting economy and manufacturing business. Software architecture standards, interoperability of modeling tools, virtual ECUs with scalable fidelity, MDD the generative model-driven development workflow, and test framework standards are solutions to the vehicle attestments. The adoption of these solutions helps to save time, cost, and improves the quality of the automobile.

14.2 Traffic Management

Reports estimate that 91% of car accidents are due to human negligence; autonomous vehicles can reduce the number of road accidents by 90%. FMS, short for fleet management solutions, optimizes road management to support operators and drivers for much safer operation. A fleet management operator can track multiple autonomous vehicles (commercial or military) using an intuitive web interface from any authorized computer. Constant feedback on driver behavior, together with continuous monitoring and assessment of the fleet condition, helps gather proof of any rule violations and maintain environmental regulations. Prediction of vehicle conditions gets the fleet prepared for any failures in advance. Thus, we can manage vehicle failures that would lead to traffic breakdown by executing solutions and remedies beforehand. In ports, LiDAR sensors employed in the profiler measurement system keep traffic flowing freely. Two LiDAR sensors mounted above the road on either side scan the 2D profile of cars passing through; they scan the side and top of the vehicle. A third LiDAR installed above the middle of the road scans the front and top of the car. It also calculates the speed and length of the vehicle. The combined 3D point cloud computes the dimensions and classifies the automobile, as sketched below. The lane profiler system then assigns the car to the lane specified for loading and unloading.
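As a rough illustration of that last step, the MATLAB sketch below classifies a vehicle from a merged point cloud by its bounding-box dimensions. The function name, the input format, and the class thresholds are assumptions for illustration, not taken from the profiler system itself.

function [dims, vclass] = classifyVehicle(pts)
% classifyVehicle  Illustrative sketch only; not the actual profiler code.
% pts: N-by-3 matrix of [x y z] LiDAR points (metres) merged from the
% three profiling sensors, with x along the lane direction.
dims = max(pts) - min(pts);      % [length width height] bounding box
if dims(1) > 10                  % assumed length threshold for trucks
    vclass = 'truck';
elseif dims(3) > 2.0             % assumed height threshold for vans
    vclass = 'van';
else
    vclass = 'car';
end
end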


14.3 Emission Reduction

As we can see, predictive maintenance and energy analysis bring about efficiency in driving the car. We implemented filters to reduce the toxic exhaust gases produced from diesel, gasoline, and other fuel mixtures. While electronic temperature sensors bring about the thermal protection of vehicle components, the differential pressure detector [1] senses the pressure difference across the particulate filter; at a predefined pressure delta, it initiates a regeneration process to burn off the PM accumulated in the filter. The greenhouse gases and regulated pollutants include N2O, methane, CO2, NMOG, and NOx. Before the full implementation of totally electric vehicles, present fuel technology for driving cars intends to filter its emissions down to notably reduced limits. The accepted reductions for diesel are nitrogen oxide (NOx) emissions down to 80 mg/km and carbon dioxide (CO2) emissions down to 95 g/km. However, the automobile society has decided, in the years to come, to reduce CO to 4.2 g/mi and NOx to 57.14 mg/km. Moreover, NMOG + NOx emissions shall be limited to an acceptable level of 30 mg/mi. The European Commission stated in 2016 that 12.5 g of CO2 emission is saved per 100 km of travel when we reduce the weight of the vehicle by 1 kg. The wiring harness weighs about 30 kg in a car; by using lightweight electric wires, we can reduce the car weight by up to about 10 kg. Plastic bearings also reduce vehicle weight. They provide better noise control by absorbing vibrations, withstand high temperatures, and resist corrosion. The stricter new limit imposed by 2025 for emissions is 70 g/km. Rotational inertia such as unsprung tyre mass and cast-iron brake discs accounts for more than 40 kg in a car. It wastes about 0.2 litres of fuel per 100 km and causes carbon dioxide emissions. An aluminium disc brake, however, saves 20 kg of weight, thereby lowering fuel consumption and reducing emissions. Furthermore, its lighter disc weight improves acceleration and deceleration. Automakers apply Plasma Electrolytic Oxidation (PEO) coating over aluminium alloy disc brake rotors to reduce wear, corrosion, and thermal stress. High emissivity coatings improve radiative heat loss. However, the alloy softens when no heat dissipation occurs in fully coated rotor discs. To remedy the situation, automakers developed ventilated rotors with the coating applied to the rubbing surfaces only.
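As a quick worked example of the weight figures above, the following MATLAB lines estimate the yearly CO2 saving from the quoted European Commission figure; the annual mileage is an assumed value.

co2_per_kg_per_100km = 12.5;   % g CO2 saved per 100 km per 1 kg removed
harness_saving_kg    = 10;     % lightweight wiring harness (approximate)
disc_saving_kg       = 20;     % aluminium brake discs
annual_km            = 15000;  % assumed yearly mileage
total_kg = harness_saving_kg + disc_saving_kg;
g_saved  = co2_per_kg_per_100km * total_kg * annual_km/100;
fprintf('Approx. %.1f kg CO2 saved per year\n', g_saved/1000)  % about 56 kg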

14.4 Future of Driverless Car

In the cities, automated cars lead the way to our future in mobility. With the upcoming replacement by electric vehicles, there will be no need for a fuel tank in the car but a rechargeable battery. Moreover, headlights will no longer be necessary as infrared and radar sensors replace them. Primary changes will occur in the transportation industry. With the establishment of the autonomous vehicle, transport companies may be interested in offering riding services. The ride-sharing market may create a transport-as-a-service (TaaS) offer or a car-as-a-service (CaaS) offer to passengers


and riders. Companies like Uber, Lyft, and others may host such transportation availability to customers at large. With the rise of the pay-as-you-go service, transport as a service (TaaS) will reduce the number of private cars on the streets. It will also do away with businesses such as car mechanics, car dealers, automobile shops, fuel refilling stops, and car wash stations. On the other hand, car sharing reduces traffic and improves resource utilization, while ride-sharing increases congestion. Car sharing allows owners the opportunity to offset vehicle costs and earn revenue as they rent their cars to others. Future innovations include piezoelectric plates placed underneath roadway surfaces. The embedded plates generate energy when cars and trucks drive over them; we can use this scavenged road energy to light the streetlights along the highway. There will be no more traffic police stationed on the streets, nor will there be any traffic lights on the roads. Road policy and traffic infrastructure will change. Transportation costs fall, and the absence of traffic jams makes transport more convenient. As a result, people and commuters will travel further out of the city to the countryside. Driverless cars will curb pollution and deliver goods more cheaply at your convenience. Thus, they create a better, healthier living environment for people. There are countless opportunities for nanotechnology in the automotive industry. Efforts in nanotechnology research may see applications in miniature electronics, powertrains, batteries, fuel cells, lubricants, catalysts, energy management, wear-resistant tyres, coating and paint quality, self-healing paints, anti-glare windows and mirrors, lighter and stronger materials, interior comfort, hydrogen and fuel-cell-powered cars, etc. Highly successful commercial initiatives in nanomaterials, nanolithography, nanotools, and nanoparticles bring broad applications to automakers.

Voice Control

Automakers are coming to a stage where they require more adaptive and intuitive interfaces between software and humans. Futuristic vehicles may come with voice-activated commands, where the system recognizes and responds to a person's voice to activate control actions in the car. The new Dragon® software integrates voice recognition into the automobile for a voice command and control system. Thus, drivers will be able to command the vehicle to safety in times of critical response incidents. However, responding to spoken text spans a range of understanding: different pitches, dialects, and accents. It includes reading common idioms and verbal oddities to interpret. Also, voice assistants must be able to filter out all the background noise in the vehicle to receive the correct interpretation.

Virtual Positioning System

The virtual positioning system (VPS) drives the future of navigating techniques. Together with the existing navigating and positioning features, the VPS offers a live scenario of the moving objects in the surrounding environment. The VPS is an integration that incorporates visual cues, 3D, depth, and phone sensors, together with the camera sensing system. The VPS software tool decodes camera input, which


captures images of the road. With the encoding technology, we can trace the location coordinates, angle, height, and distance of the three-dimensional vehicle or any other obstructions on the road.

Reference

1. https://www.continental-corporation.com/en/press/press-releases/2018-01-30-hts-dps-119964

Chapter 15

Electric Mantis Robot

This chapter introduces the six-legged insect robot. Servo motors, batteries, sensors, aluminium materials, and microcontrollers build up the electronics and mechanisms of the robot. Programming algorithms, in alignment with the robot's walking illustrations, bring the insect robot to life. With sensors, the robot becomes an autonomous walking machine. The program algorithms have the advantage of solving the robot's unsymmetrical walking methodology through refinement and re-calibration of the insect robot. The robot can serve in rescue, survey, and reconnaissance missions.

15.1 The Electric Mantis Robot

The six-legged robot named 'Mantis' is driven by electrical servo motors. That is where the name comes from: the Electric Mantis [1]. Eight units of the Basic Stamp chips control the robot. We utilized the BS2sx processors for the platform. The mantis robot has three degrees of freedom on each leg. Six Stamp chips each drive a single leg of the six legs (Fig. 15.1). We programmed one microcontroller chip for the sensor input. Another is the primary controller IC for synchronizing the output motors and the input sensors. The control chip determines the driving of the six legs with relevance to the input sensors. Two infrared sensing devices connect as input sensors, one to each of the two long whiskers or feelers. They trigger a detection when they detect any object or obstruction. We fixed these sensors in the best location, at the ends of each side of the whiskers. The long feelers comprise light, small-diameter, non-magnetic metal rods. The ends of the left and right rods are curved to avoid any sharp edges.

Fig. 15.1 Mantis robot (plan, front, and side views)

An electrical circuit diagram for the robot mantis, shown in Fig. 15.2, clearly illustrates the control board layout. The first chip on the left represents the synchronizing chip, and the sensor chip on the right detects the input signals from the feelers. The bottom six chips are the servomotors' driving processors. We used eight 470 Ω resistors as the sensing resistors, each for driving the processors as well as for synchronizing signals between two BS chips, while two other 220 Ω resistors connect to extra input sensors. Ports P6 and P7 of the reflex chip connect to the IR sensors, whereas P8 to P15 of the synchronizing chip connect to time the signals for the rest of the chips. Port P15 of each of the processors communicates with the timing chip. All eight BS2sx units require power to function, although this is not shown in the circuit layout, as it is a standard requirement. For the servo driving processors, we left out the 18 servomotors' connections, as the designer can select the ports used; only one port needs to be connected to drive each servomotor. Altogether, four strokes of motion take the 'mantis' forward one step. For each leg, one servomotor controls the vertical movement, another servomotor executes the horizontal movement, and a third servomotor stretches the second joint of the leg. Pictures of the different views (refer to Fig. 15.3) of the legs' sections of the mantis robot illustrate the purpose of the three servomotors. In total, we required 18 servomotors for the actuation of the six legs of the 'electric mantis' robot. Each of the six BS2sx processors drives three servomotors for one of the legs.

Fig. 15.2 Electric mantis circuit layout

Fig. 15.3 Sections of a robotic leg

Fig. 15.4 Stretched legs

15.2 The 3 Degrees of Freedom Mechanism

Let us look at the robotic legs of the mantis robot. The legs of the mantis robot have three degrees of freedom: the front/back motion, the up/down motion, and the side stretch motion, where three servomotors are required to maintain the position of a leg of the mantis robot. The thin linking rod in each leg contributes an additional reflex: the stretching reflex of the leg. The end joint of the leg branches outward from the main body of the mantis robot when the servomotor pulls that link rod. We fixed the center servomotor in each leg in such a way that it rotates the front and back motion of the leg. The servo horns for the motors lie below the robot, where they attach to the main branch joints of the legs. The third servomotor has a linkage attached to the bottom of the leg to pull the leg up into the vertical position as the servo rotates. The mantis robot legs' stretching ability enhances and improves the capability of the six-legged walking mechatronics robot. We photographed a section of the mantis leg for a better understanding of the descriptions.

Fig. 15.5 Legs at two different heights (leg tip position values 1200 and 1450)

The stretched legs at different unit values of the servomotors are shown in Fig. 15.5. Smaller stretching unit values for the servo legs make the mantis robot stand higher; a larger value lowers the mantis. Therefore, the third degree of freedom for the legs increases the robot's flexibility. The vertical height of the mantis robot changes by adjusting the leg tip-positioning variable, 'w6', in the program. All the 'w6' variables in the program can be calibrated to 1450 units for the servos to lower the mantis's body and crawl at a lower level on the ground (Fig. 15.4).
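To make the three degrees of freedom concrete, here is a minimal forward-kinematics sketch in MATLAB. The joint convention and the link lengths L1 and L2 are illustrative assumptions, not measurements of the mantis leg.

function p = legTip(thetaH, thetaV, thetaS, L1, L2)
% legTip  Illustrative 3-DOF leg tip position in the body frame.
% thetaH: horizontal (front/back) swing, thetaV: vertical lift,
% thetaS: stretch of the second joint; all angles in radians.
r = L1*cos(thetaV) + L2*cos(thetaV + thetaS);   % radial reach
z = L1*sin(thetaV) + L2*sin(thetaV + thetaS);   % tip height
p = [r*cos(thetaH), r*sin(thetaH), z];
end

For example, legTip(deg2rad(10), deg2rad(-20), deg2rad(30), 0.05, 0.08) returns the tip position for a slightly swung, lowered, and stretched leg.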

15.3 Robot Calibration

We can perform calibrations for the legs of the mantis to make it walk with alternately opposite legs moving similarly. In doing so, the robot will walk in a straight path. We need to adjust the 'w5' variable in the programs; the parameter determines the front/back horizontal movement of the robot. As shown in the next figure, the robot mechanism powers up to stand firm in the following positions for the six legs. Subsequently, it will start to move forward with the forward strokes. As you can see, the left and right varying units for the servomotors are aligned 180° rotationally symmetric, as in Fig. 15.6a. The positioning angle of the right front leg may not be able to hold the weight in front. It creates a possibility of falling forward when standing or powering up. The unsymmetrical calibration also makes the robot walk with a side-curve motion towards the right. To ensure a firmer standing position, we calibrated another set of the 'w5' variables for the right legs. The mantis robot can stand more firmly on the ground with the right front leg standing further up to the front. Figure 15.6b shows the re-calibrated mantis supporting legs at start-up. Both pictures show the calibrated units for the servomotors' 'w5' variables for the horizontal motion. The re-calibrated legs' settings make the robot walk in a straight path. The symmetrical alternate legs' movements ensure the mantis walks in a straight line.

Fig. 15.6 Calibration of supporting legs: a calibration, b re-calibrated

The sets of units alongside the legs represent the motions of the positioning points for each leg. The dotted lines are the centers of the servomotors, calibrated to 1.5 ms (1875 × 0.8 µs) for each of the legs to branch out orthogonally from the body of the robot. When the servos rotated to the 1875-unit values, we removed the servo horns and fixed them back. The idea is to release the tension of the link joints of the legs, as they all connect to the servo horns. We adjusted the servo horns in such a way that the legs branch out perpendicular to the robot's body in plan view. This alignment centralizes all six legs at 90° to the body and serves as a reference point for calibrating all the positionings of the legs. Each leg swings from front to back over a length of 350 units for each servo. Therefore, it travels a horizontal angular distance of only 25.2° (0.072 × 350) for each leg, as each pulsating output unit represents an angle of 0.072° for the servomotors. The horizontal distance traveled refers to the variable 'w5' in the programs. Some legs begin at a positioning angle higher up, and some legs lower. The first left leg starts at 39.6° up, while the last right leg is at this angle too, 180° rotationally symmetric. The middle legs' four calibrated positions move over a shorter distance to prevent them from hitting their adjacent legs. The difference between each adjacent point is only 70 units, but for the front and hind legs, some motions have double the distance of the 70-unit travels because their adjacent points differ by 140 units of travel distance. Figure 15.6b shows that all six legs have near-symmetrical calibration. We only changed the right legs to the other set of three new values. Now, the right front leg has a larger angle from the start. The six legs' positioning values in this diagram can prevent the robot from toppling over towards the front. Therefore, it is standing in a firmer and more comfortable position. The more symmetrical values for the six legs can also compensate for the curving movement of the robot of Fig. 15.6a as it walks. However, these positionings are near the limit of the servos' safe range of operation, which is more likely to cause malfunction of the servomotors. Hence, we try our best to have a symmetrical walking robot while not exceeding the safe operating range of the servos. To do this, we can download the same programs as in the listings for the mantis robot's six legs, with these sets of calibrated positioning values, into the robot's BS2sx processor chips.

Remark: Angular Derivation [Each leg moves one angular degree in 2 ms/180 = 0.011 ms. The number of unit pulses required to turn one degree is 0.011 ms/0.8 µs = 13.89. A unit pulse from the BS2sx travels 1°/13.89 = 0.072°.]
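The derivation in the remark can be checked numerically; the short MATLAB lines below reproduce the 0.072°-per-unit figure, the 1875-unit center, and the 25.2° swing quoted above.

unit_us  = 0.8;                            % one BS2sx PULSOUT unit (us)
deg_per_unit = 180 / (2000 / unit_us);     % 2 ms spans 180 deg -> 0.072
center_units = 1500 / unit_us;             % 1.5 ms centre -> 1875 units
swing_deg = 350 * deg_per_unit;            % 350-unit swing -> 25.2 deg
fprintf('%.3f deg/unit, centre %d units, swing %.1f deg\n', ...
        deg_per_unit, center_units, swing_deg)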

15.4 Mantis Robot Walking Methodology

The mantis robot follows a specified set of walking procedures. For a start, we calibrated the servomotors with the servo horns centered at 1875 pulsing units; for example, the six legs are calibrated adjacent to the robot body. With the pulsed units starting at the mantis legs' centered position, we can calibrate the legs to whatever positionings we like, as seen previously in Fig. 15.6. We give a clear indication of the desired calibrations of the legs at the starting position.

Fig. 15.7 Left legs motion methodology

Figure 15.7 (left legs motion methodology) depicts the walking methodology of the mantis robot. The figure illustrates all the walking strokes in the programs of the mantis robot. Its illustrations correspond to Fig. 15.6b, where the legs lie in almost symmetrical positions. The forward cycles of the mantis are the anti-clockwise strokes, shown in the diagram with the mantis's forward direction (Fig. 15.7). All the clockwise directional strokes refer to the retreat movements. In Fig. 15.8 for the right legs, all the clockwise triangular cycling strokes likewise move the mantis backward. You may notice the travel distance for a cycle is only half a triangle; that is to prevent any servomotors from knocking against each other while moving. As a leg moves, the adjacent legs move as well. The angular calibrations of the cycling movements prevent the legs from hitting one another. A small space or angular allowance is necessary between the legs of the mantis. An appropriate calibration sets the front legs further forward at the starting standing position. Likewise, we calibrated the pair of hind legs to stand further back as the initial starting point. That leaves the pair of middle legs the freedom to move about without touching any of their neighboring front and back legs. The primary purpose of the calibration of the legs' positionings is to ensure the servomotors do not knock against each other while walking. The three bulky servos attached to each of the legs move together with the legs; therefore, they must stay a proper distance away from each other during the phases of walking. In each figure, the top indicates the forward direction, and the dotted lines indicate the forward motion for all legs. The numbers are the units that the servos travel; that is, they are the decision-determining stops in the programs. The full stops, or dark circle points, in the figure represent the positions of the leg strokes when the mantis robot senses an obstacle. The mantis will retrace and then turn. To our knowledge, each cycle consists of three strokes (2 stages) of movement. One of the strokes is the horizontal movement, and the other stage manipulates the upward and downward movements together. Thus, a continuous cycle of strokes in one direction moves the robot in that direction. Once it hits anything, the robot will move backward 2.5 cycles (5 stages) and turn for two cycles (6 strokes). The small loops in the figure depict the turning motions. Together, Figs. 15.7 and 15.8 illustrate the left and right leg movements of the mantis robot. Besides the forward strokes along the dotted lines, they also show the strokes for a left or right hit of the mantis robot (undotted lines). Let us analyze the left front leg when the robot detects an obstruction on the left. The left and right diagrams analyze the two different positioning points of the leg when the sensor detects an obstacle. The left diagram refers to the leg at the 2425-unit positioning value when the robot hits any obstruction or detects an obstacle in front. The right diagram shows the left front leg at the 2075-unit positioning value when the robot hits something. Both diagrams show the same backward motion strokes when it detects an obstacle. The heights of the strokes are indicated vertically. For the left front leg (hit-right) motion, the strokes differ between the diagrams. The left diagram shows the movement of the backward cycle indicated by the big loop (5 stages); next, it moves with the small cycle (4 stages). In the right diagram, the leg is at a different position (2075 units) when it hits an object.

Fig. 15.8 Right legs motion methodology

The program goes through the same movement of the backward loop for 2.5 cycles. After that, it continues to rotate with the small loop for four stages (downward stroke at 2215 units). You may analyze the rest of the diagrams with the same analogy; Fig. 15.8 gives the right legs' motion methodology. With the two figures, we can study and understand the combinations of the six legs' movements. We programmed each of the six leg programs according to this methodology, with the idea that, when hit, the robot retraces and turns away from the obstacle. For instance, when the robot hits something on the right, it retraces and rotates to the left. When it hits on the left, it moves directly backward and then rotates, this time to avoid the left. The diagrams illustrate the turning motions asymmetrically: when hit on the right, the robot turns more to the left; in contrast, when hit on the left, it avoids and curves through a smaller angle towards the right. That is to counteract the asymmetrical movement of the calibrated robot, which tends to curve slightly to the right. The simple diagrams in the figures explain clearly the complicated motion methodology of the mantis robot. They show the mantis's moving direction when it hits an object. The left side of the picture shows that when the robot hits an obstruction on the right, it retraces backward and then rotates toward the left. The next directional path on the right shows the tracks when it hits an object on the left. Thus, both figures show the methodology. To summarize, the straight backward path retraces 2.5 cycles in the program, while the turning route is the two-cycle rotating motion of the robot in avoiding the obstruction. We designed the small inner triangle cycles to counteract the curving movement of the robot, as it is calibrated very near to symmetry. However, we would replace the inner loops with the outer backward cycles if all six legs were calibrated to exact symmetry. (Comparing the angles on both sides of the robot at start-up, and looking back at Fig. 15.6, Fig. 15.6a shows the robot turning at an angle of 20.16° towards the right, whereas in Fig. 15.6b the turning reduces to 3.87°.) That demonstrates the flexibility the programmer has in the design of the programs. The programming methodology for the programs in the listing follows the path direction, with the effort of turning more towards the left when the mantis detects an object on the right. The mantis strokes seen in flow chart F.C 15.1 are the procedure for the motion of the robot. The flowchart includes the direct forward strokes that apply to the six legs' forward movements of the robot. It matches Fig. 15.7 for the front left leg movement methodology, shown as the dotted-line motion of the big triangular cycle. The 'w5' variable values only apply to this leg of the mantis robot; they are different for the other legs. The flowchart also includes the backward and rotating motion when the robot senses any obstruction in front. Thus, it represents all the strokes in Figs. 15.7 and 15.8. To understand further, next are the complete movements of the methodology with the programming techniques applied (Fig. 15.9).


Fig. 15.9 Robot retrack paths
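The retreat-and-turn behavior just described can be summarized in a few lines of MATLAB-style pseudocode. This is a sketch of the decision logic only; the cycle counts follow the stroke sequences of Figs. 15.7 and 15.8, and retreat and rotate are hypothetical stand-ins for the per-leg stroke programs.

function avoidObstacle(hitSide)
% avoidObstacle  Sketch of the mantis avoidance logic (illustrative only).
retreat(2.5);                 % back up 2.5 cycles (5 stages)
if strcmp(hitSide, 'right')
    rotate('left', 2);        % hit right: turn more to the left
else
    rotate('right', 2);       % hit left: smaller curve to the right
end
end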

15.5 Programming Methodology

The six-legged walking motion can be complicated to understand without the aid of pictures of the leg structures. We programmed the robot legs' movements in the robotic framework into each of the different BS2sx chips, each of which controls its relevant leg. Flowcharts for the movements of the legs are included to study the programming methodologies. First, we drew the electric mantis feeler flowchart, F.C 15.2. We can download the sensor's program into the BS2sx processor for the sensor input. The program keeps looping to carry out the sensing of any obstacle as the mantis continues walking. We make use of input ports 6 and 7 of the sensor chip for this purpose. A signal pulse other than a nil signal is sent to the timing chip when the left or the right whisker senses something; the variables B6 and B7 serve this function. Ports P12 and P13 of the sensor chip send the variable signals to P15 and P14 of the timing processor, respectively. The timing chip detects the 'B6' or 'B7' signal to coordinate the behavior of the robot. In response to the infrared activation of the sensor, the robotic legs change direction to move backward. The timing flow chart, F.C 15.3, illustrates the procedures of the timing program. As shown in the timing flowchart, there are three different settings of the variable w2 for the mantis robot to decide among, depending on whether the robot detects nothing, the left, or the right. If there is an obstacle in front, the program maneuvers the mantis to avoid the obstruction, as highlighted by the 'g' number of loops; otherwise, the robot continues to move forward. There is a maximum waiting time at the receiving ends of the other six BS2sx processors. This maximum pulse waiting time varies with the different types of BS2 chips used; for the BS2sx processor, it is 52.428 ms. Our maximum pulse width detected is approximately 7000 units, which is much less than the 65,535-unit (52.428 ms) range it can go. However, as the output timing pulses keep looping, the receiving ends receive several times more units than the actual w2 units pulsed out. The sensing resistors also play a part in the differences in their values. For example, from flow chart F.C 15.3, we know w2 is 650 units when pulsed out; however, the receiving ends will receive more than 5000 units each time w2 is pulsed out. We can verify the issue by debugging. After studying the timing and reflex chips, we move forward to learn how the servomotors manipulate the legs of the robot and to understand the walking sequence in real time. Firstly, we start with the overall flowchart, the electric mantis main flow chart F.C 15.4.


Flow Chart 15.1 Mantis forward strokes


Flow Chart 15.2 Electric mantis feeler

Upon start-up, the program resets all variables and positions all six legs of the mantis to stand firm on the ground. It then checks for the timing pulses to perform the slant-up and straight-down motion, followed by the leg swing motion to move forward a step. We use the timing pulses individually for the synchronization of the swinging movement and the slanting up-and-down motion. If either of the side feelers senses an object, the robot retreats five stages and rotates four stages. This turns the robot away from the object, and it then moves forward again. The block connector B of the sub-flowchart F.C 15.5 explains the running programs for all six moving legs. It starts at connector B and ends at connector A. Programming involves these steps when the sensor detects an obstruction in front. You may refer to connector B in the primary flowchart for the number of stages it moves upon a single detection of any obstacle. In combination with the signal from the sensor chip, the sensed signal is trapped in the internal loop for a 'g' number of times, as in the timing program and flowchart; the sketch below condenses this timing decision.
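For orientation, the timing chip's decision logic of flow chart F.C 15.3 (shown after this passage) can be written in MATLAB-style pseudocode. The pulse values and the loop bound follow the flowchart; pulseAll and pauseOneStage are hypothetical helpers standing in for the actual pulse output and delay routines.

% Sketch of the timing decision from F.C 15.3 (illustrative only).
if leftSense > 0 || rightSense > 0
    w2 = 950;                 % long pulse: obstacle detected
    for g = 0:9               % keep signalling over the avoidance stages
        pulseAll(w2);         % hypothetical: pulse the six leg chips
        pauseOneStage();      % hypothetical: wait one stage of motion
    end
else
    w2 = 350;                 % short pulse: continue walking forward
    pulseAll(w2);
    pauseOneStage();
end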


Flow Chart 15.3 Timing flowchart


Flow Chart 15.4 Electric mantis main flowchart


Flow Chart 15.5 Motion settings for alternate legs (Connector B)


The delay programmed in this inner loop allows enough time for the movement of the legs to complete one stage. One stage refers to the horizontal swinging motion or the slanting up-and-downward motion. Later, you will see that the pause delay in the timing program differs for different speeds and distances the legs travel. The flowchart is the same for the three alternate legs of the mantis; that is, the left front, the left back, and the right middle legs all go through the same loop of settings in the flowchart. The other three legs, the mirror alternate legs, go through the other path of settings. The flow chart illustrates the programming setting techniques for all six different legs; the six individual leg programs all refer to this same procedure. Next, we come to the cycling motions of the mechatronic mantis. The flowchart refers to all the legs of the mantis robot. The w5 value indicates the horizontal positioning of the legs. Each leg has its own desired w5 positioning values; however, the programming techniques are the same. You may refer to the full cycling motions flow chart, F.C 15.6. This flowchart is the sub-cycle motion block of the main flowchart, and the moving mechanisms of the mechatronic mantis refer to it. The flow procedure checks for the timing pulse and decides whether to swing or slant up the legs of the mantis. It depends on the desired position w5 reached, and also on the setting of the variable a.

15.6 Autonomous Mantis Robot

Most analyses of the mantis assume a symmetric system for a six-legged walking robot. In real time, the mechanisms and cooperation of the mantis robot may not function as a symmetric walking robot in the timed sequences. Depending on the stiffness of the fixed aluminium legs, the legs' location calibration, the alignment of the servo mechanisms, and the different characteristics of each servomotor, a symmetric walking robot for the six-leg mechanism may not be a reality. We had to fine-tune and adjust the mechanism by software programming to make the robot walk in a symmetrically sequential manner. That is where we can change some parts of the programs to correct the misalignment in driving the robotic mantis mechatronic system. Take, for example, three legs of the same motion executing at the same instant: one leg might be out of adjustment and not touch the ground at the same time as the other two legs. That creates a crippled walking mechanism, resulting in the robot walking in a curved manner when the program controls it to walk in a straight line in the forward direction. With the background in the mantis robot's timing and walking cycles, we can go into further detail to understand how to modify the programming for different walking techniques of the robotic mechanism. The legs of the mantis robot are coordinated to perform synchronized walking motions for alternate legs. However, the mantis walking techniques may vary depending on the user (programmer). We can adjust the walking robot to different speeds. Besides, we can program the mechatronic robot to walk at half the height, or control the mantis robot to crawl lower.


Flow Chart 15.6 Cycling motion flowchart (note: the w5 value is different for each leg)


Table 15.1 Timing units for different performances

Timing program functions           Full height (walking)   Half height (crawling)   Speed adjustment (2 times)
g—for next loop                    9                       9                        9
Pause delay to complete 1 stage    2400                    Half of 2400             Half of 2400
i—for next loop                    50                      25                       25

Table 15.1 shows the differences in programming to perform the different kinds of functions. The speed of motion is adjustable by modifying the time it takes to complete one cycle of the walking movement; that is, we observed one cycle of the legs' motion when setting the different speeds. Similarly, we can program the mechatronic robot to crawl: it will still be walking, but at a lower height. We can change the timing variables in the timing synchronization program to do this. We can make the mantis robot crawl lower by adjusting the height at which the mantis robot stands. Table 15.1 gives the timing, or synchronizing, for the walking as well as the crawling mantis. The mantis robot can also either walk or run by tuning to the different speeds. The mantis robot walking at twice the speed of the full-height-walking mantis may have the same timing synchronization as the crawling mantis at half height. However, both programs are no longer the same if the speed is increased to three times the walking speed of the mantis at full height. The number of 'i' loops coordinates a stage of the robotic motion. The pausing delay of the timing algorithm is for the legs to complete one stage. The synchronizing time is faster for the half-height crawling algorithm and the double-speed program, as the mantis takes only half the time to complete each stage of the cycle. You may notice that while the mantis is walking upright at full height, the legs take a longer time to complete the cycle of motion for each step. That is because a larger circle is required to complete the cycling motion. The slant-up and backswing movements, when walking at a lower height or crawling, require two times the variable 'b' to make up the same walking distance per stage of motion. The legs must extend further to lower the height of the robot when adjusting for the crawling function; therefore, the 'w6' variable is set to different values when performing the crawling movement. Additionally, the smaller 'w7' variable of the legs lowers the mantis body. Table 15.2 shows the variables and delay times for the six-leg mantis robot programs. Each program, for example the full-height program, refers to the six programs of the mantis's legs. The pause delay is the time to complete the smaller loop of motion for the mantis's legs. The variable 'b' represents the adjustable speed of the mantis robot. The other variables, d and c, are the parameters for the swinging and slanting-up movements of the legs. We utilized these variables in the programs for the six-leg mechatronic mantis. Thus, the speed is adjustable, besides the height of the mantis robot. A total supply voltage of 7.2 V dc is enough to drive the mechatronic mantis.


Table 15.2 Program parameters for different performances

Program parameters (six legs)     Full height                 Half height     Speed increase double
b                                 5                           5               10
a                                 According to each program   Half            Half
d                                 According to each program   Half            Half
c                                 350                         350             350
e                                 6250/c                      3750/c          6250/0.5c
During slantup                    w5 = w5 + b                 w5 = w5 + 2b    w5 = w5 + b
During backswing                  w5 = w5 + b                 w5 = w5 + 2b    w5 = w5 + b
Pause delay for small loop        1000 ms                     1000 ms         Half
W7 (leg vertical down position)   2850                        2350            2850
W6 (leg stretch)                  1200                        1450            1200
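To tie the two tables together, this short MATLAB sketch derives the half-height (crawling) settings from the full-height ones. The a and d values are the left-front-leg entries of Table 15.3 below, and the variable and field names are illustrative.

% Sketch: full-height vs. half-height settings (Tables 15.1 and 15.2).
fullH.b = 5;   fullH.a = 42;   fullH.d = 150;   % left front leg values
fullH.pause = 2400;  fullH.w7 = 2850;  fullH.w6 = 1200;
crawl = fullH;
crawl.a = fullH.a/2;      crawl.d = fullH.d/2;  % half-size steps
crawl.pause = fullH.pause/2;                    % faster synchronization
crawl.w7 = 2350;          crawl.w6 = 1450;      % lower body, stretched legs
% During slantup and backswing, the crawl program steps w5 by 2*b, not b.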

Nickel Metal Hydride (NiMH) rechargeable batteries are suitable for the purpose. We can fix six 1.2 V (nominal voltage), 3000 mAh rechargeable dc batteries on the mantis robot to carry along as the load. With the batteries on top of the robot, the 'mantis' travels autonomously. Table 15.3 shows the three variables for programming the six legs of the mantis robot. Each leg has its own values for the whole robot to move. We programmed both the left and right center legs to share some of the same varying values for the variable 'w5'.

Table 15.3 Robot legs' variables

Variables   Left front   Left hind   Right front   Right hind   Left center leg   Right center leg
W5          2425         1760        1550          2490         1825              1825
            2215         1550        1340          2280         1895              1965
            2075         1410        1200          2140         2035              2035
            2285         1620        1410          2350         1965              1895
            W5 + b       W5 + b      W5 + b        W5 + b       W5 − b            W5 − b
            W5 − b       W5 − b      W5 − b        W5 − b       W5 + b            W5 + b
a           42           42          28            28           28                28
            70           70          70            70           42                42
d           150          150         100           100          100               100
            250          250         250           250          150               150


Fig. 15.10 Symmetrical calibration

The realization of the second-to-last column of the variables, in P15.3, refers to the left middle leg. Using the program P15.3, we can substitute the values in the table to program the other five legs. Referring to Fig. 15.6, our sets of calibrated units for the horizontal walking legs (w5) are unevenly distributed across the robotic platform. In comparison, with the newly calibrated mantis robot of Fig. 15.10, we reduced the uneven turnings of the front section (head) and the tail section (back) of the long robotic platform during walking. The fine-tuned six legs are calibrated into a symmetrical form, and the pairs of three alternate legs move the mantis in a straight manner. We may suggest the w5 varying units of Fig. 15.10 to replace those of Fig. 15.6 to eliminate the curvy, or side-walking, effect. However, we can still program it to retreat more to the left or right when backtracking to avoid being trapped by front obstacles. In another notion, to eliminate the side-walking effect while moving straight forward, we can divide the skeleton spine of the robot into sections. This adds flexibility to the backbone of the robot's body. Besides, each divided section of the skeleton frame allows space or flexible rubber to be inserted in between, throughout the spine. The flexible gaps between the skeleton joints help the robot to maintain its flexibility during walking. Now, the mantis robot tends to move in a curvy manner, instead of side walking, under an uneven distribution of step distances or an unsymmetrical distribution of the walking legs in the program. Thus, we have achieved the naturalness and flexibility of


the turning mantis robot during walking. Another method to reduce the side-walking effect of the robot is to use rubber material to build the skeletal spine of the robot, accommodating the same program according to Fig. 15.6. The more stepping distance inserted on one side of the robot, the more it moves to the other side. The flexible body frame of the robotic insect is a step closer to mimicking the flexibility of the real insect while moving.

Mantis Programs

'{$STAMP BS2sx}
'P15.1 Mantis Feeler Program

'{$STAMP BS2sx}
'P15.2 Mantis Timing Program
'8 s. synchronizing pulse for
'electric mantis


'{$STAMP BS2sx}
'Electric Mantis:
'P15.3 Left Center Leg


Reference

1. T.S. Ng, Mechatronics Design and Robotics: Embedded Control, Wheeled & Insect Robots (LAP LAMBERT Academic Publishing, Saarbrücken, 2015)

Chapter 16

Rotorcrafts Control Theory

In a rotor-propelled drone, we can control the flying momentum through the attitude of the rotorcraft. The roll, pitch, and yaw constitute the attitude of a rotorcraft system. For vertical climbing of the rotorcraft, the altitude comes into play: the altitude control maneuvers the ascending and descending of the drone. In modern control theory, we introduce the state-space equation. The advantage of studying the state-space equation is that a single controller design can replace the many controllers implemented to control a system. Take, for example, the pitching motion of the drone or rotorcraft: a PID, or proportional-integral-derivative, controller controls the single pitching motion. For the rolling motion, another controller is required to perform the rolling maneuvering of the drone, and yawing requires yet another PID controller. The state-space control method designs a single controller format that accumulates all the controllers to maneuver the system with ease of implementation. The ABCD parameters are defined for a variety of rotorcrafts. The introduction of the LQR with the pole placement technique controls the rotorcrafts with robustness.

16.1 State-Space Equation Formulation

Modern control engineering introduced the state-space control method to manipulate the rotorcraft's performance in its yaw, roll, vertical, and pitch motions. The state-space formulations of the equations are

    ẋ = Ax + Bu
    y = Cx + Du        (16.1)

The ABCD matrices have the dimensions A (n × n), B (n × v), C (m × n), and D (m × v). The vectors U and X have dimensions v × 1 and n × 1, respectively. We can assume the direct transmission matrix D to be zero. We defined the state vector as


Fig. 16.1 Rotorcraft control block diagram

    X^T = [ψ̇ φ̇ θ̇ ż ẏ ẋ ψ φ θ z y x]        (16.2)

with the output vector defined as

    Y^T = [ψ φ θ z y x]        (16.3)

The rotorcraft controller is

    U = −Kx + U3        (16.4)

Figure 16.1 illustrates the control block diagram for the state-space feedback controller; we can observe the traditional PID control in the diagram. In Eq. (16.4), K represents the PID control gains, x refers to Eq. (16.2), and U3 is the biased [1], or residue, driving force serving to uphold the weight of the quadrotor to maintain it at the hovering position. The state-space matrix formulation is necessary for the controller to function. Take, for example, the Hexacopter; we state the ABCD matrices as follows:

    A = [ 0 0 0 0 0 0  0  0  0 0 0 0
          0 0 0 0 0 0  0 20  0 0 0 0
          0 0 0 0 0 0  0  0 20 0 0 0
          0 0 0 0 0 0  0  0  0 0 0 0
          0 0 0 0 0 0  0  0  0 0 0 0
          0 0 0 0 0 0  0  0  0 0 0 0
          1 0 0 0 0 0  0  0  0 0 0 0
          0 1 0 0 0 0  0  0  0 0 0 0
          0 0 1 0 0 0  0  0  0 0 0 0
          0 0 0 1 0 0  0  0  0 0 0 0
          0 0 0 0 1 0  0  0  0 0 0 0
          0 0 0 0 0 1  0  0  0 0 0 0 ]

    B = [  -ra      ra     -ra      ra     -ra      ra
            0       y       y       0      -y      -y
           -y      -y       y       y       y      -y
            n       n       n       n       n       n
            0      ro*n    ro*n     0     -ro*n   -ro*n
          -ro*n   -ro*n    ro*n    ro*n    ro*n   -ro*n
            0       0       0       0       0       0
            0       0       0       0       0       0
            0       0       0       0       0       0
            0       0       0       0       0       0
            0       0       0       0       0       0
            0       0       0       0       0       0 ]

    C = [ 0 0 0 0 0 0 1 0 0 0 0 0
          0 0 0 0 0 0 0 1 0 0 0 0
          0 0 0 0 0 0 0 0 1 0 0 0
          0 0 0 0 0 0 0 0 0 1 0 0
          0 0 0 0 0 0 0 0 0 0 1 0
          0 0 0 0 0 0 0 0 0 0 0 1 ]

and D is the 6 × 6 zero matrix, where mtotal represents the total weight of the Hexacopter. We can define the tilting roll and pitch angles of flight as 30° [1] if we want to fly it at this tilting angle. The B matrix representatives are

    n = Kf/mtotal,   ro = sin(30°),   ra = Kt/Jy,   y = D*Kf/Jp

Kf Jp

For the case of Tri-coaxial Copter, we can formulate the B matrix as follow, while the rest of the ABCD matrices remain.




    B = [  ra      −ra        ra          −ra           ra           −ra
            0        0      0.866y       0.866y      −0.866y       −0.866y
            y        y      −0.5y        −0.5y        −0.5y         −0.5y
            n        n        n            n            n             n
            0        0      ro*0.866n    ro*0.866n   −ro*0.866n    −ro*0.866n
          ro*n     ro*n     −ro*0.5n     −ro*0.5n    −ro*0.5n      −ro*0.5n
            0        0        0            0            0             0
            0        0        0            0            0             0
            0        0        0            0            0             0
            0        0        0            0            0             0
            0        0        0            0            0             0
            0        0        0            0            0             0 ]

But in the quadrotor, only the same A and C matrices apply. Our B and D matrices are as follows.

    B = [  -ra     ra     -ra     ra
            0     -y       0      y
           -y      0       y      0
            n      n       n      n
            0    -ro*n     0     ro*n
          -ro*n    0      ro*n    0
            0      0       0      0
            0      0       0      0
            0      0       0      0
            0      0       0      0
            0      0       0      0
            0      0       0      0 ]

and D is the 6 × 4 zero matrix.

In the Shrediquette Derbe quadrotor, we identified the same A and C matrices as for the Hexacopter. The D matrix is the same as in the case of the quadrotor. We identified the B matrix as follows; note that the D matrix is not the arm length D.

    B = [  -ra      ra     -ra      ra
            y      -y      -y       y       {D is 6.4 cm}
           -y      -y       y       y       {D is 0.9 cm}
            n       n       n       n       {sum of mass weighing 600 g}
           ro*n   -ro*n   -ro*n    ro*n     {sin(30) tilt flying angle}
          -ro*n   -ro*n    ro*n    ro*n
            0       0       0       0
            0       0       0       0
            0       0       0       0
            0       0       0       0
            0       0       0       0
            0       0       0       0 ]

The arm length D starts from the side edge of the fuselage, rotated at an angle of 30° outwards to the center of the motors; the arms are each 6.5 cm long. For the pitching moment, we defined the rod length as half the body length of the fuselage, 0.09 m from the center of the gravitational point of the Derbe quadrotor. The total weight of the Derbe quadrotor is 0.6 kg. The 30° build inclination of the fuselage enables the quad to fly at a tilting angle of the same angular degree. Therefore, the body of the Derbe quad heads straight horizontally with zero alpha angular force of attack. The Derbe quad fuselage configuration enables it to reduce drag as it flies straight forward. Therefore, we computed the rotorcraft flight-tilting angle as 30 angular degrees. For our system, we formulate the table of parameters for the rotorcrafts as follows. Most of our defined rotorcrafts have the total weight value given in Table 16.1.

Table 16.1 Model parameters' specifications

Parameter description                          Value     Unit
Distance from quad center to each motor (D)    0.2       m
Propeller force-thrust constant (Kf)           0.1188    N/V
Propeller torque-thrust constant (Kt)          0.0036    Nm/V
Gravitational constant (g)                     9.81      m/s2
Roll moment of inertia (Jr)                    0.0552    kg m2
Pitch moment of inertia (Jp)                   0.0552    kg m2
Yaw moment of inertia (Jy)                     0.11      kg m2
Total mass of rotorcraft (mtotal)              1.316     kg
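As a rough numerical illustration, the MATLAB lines below fill in the quadrotor B matrix from the symbol definitions above and the values of Table 16.1; the resulting numeric entries are ours, not figures quoted by the text.

% Quadrotor B matrix from Table 16.1 (illustrative computation).
Dm = 0.2;  Kf = 0.1188;  Kt = 0.0036;   % arm length and thrust constants
Jp = 0.0552;  Jy = 0.11;  mtotal = 1.316;
n  = Kf/mtotal;        % thrust-to-vertical-acceleration gain
ro = sind(30);         % 30-degree tilt flying angle
ra = Kt/Jy;            % yaw torque gain
y  = Dm*Kf/Jp;         % roll/pitch moment gain
B = [  -ra    ra   -ra    ra;
        0    -y     0     y;
       -y     0     y     0;
        n     n     n     n;
        0  -ro*n    0   ro*n;
     -ro*n    0   ro*n    0;
      zeros(6,4) ];    % last six rows are zeros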


16.2 Controllability and Observability

The LQR control system is controllable if all the states can be controlled, that is, transferred from the start and settled in their desired states in finite time. Our rotorcraft consists of 12 states in the X vector. The rank of a matrix is the dimension of the vector space spanned by its columns or rows. In MATLAB, we defined the rank of the controllability matrix as

V = [B A*B A^2*B A^3*B A^4*B A^5*B A^6*B A^7*B A^8*B A^9*B A^10*B A^11*B];
rank(V)

If matrix B has six rotating motors represented by its six columns, matrix V is a 12 × 72 matrix. We say the system is controllable when the rank of its matrix V equals the complete number of states in our X vector; for this system, the rank should be 12 for it to be considered controllable. In MATLAB, the rank of the observability matrix is defined as

W = [C; C*A; C*A^2; C*A^3; C*A^4; C*A^5; C*A^6; C*A^7; C*A^8; C*A^9; C*A^10; C*A^11];
rank(W)

The matrix W is a 72 × 12 matrix. We say the system is observable when the rank of its matrix W is equivalent to the complete number of states in the X vector. For this system, the rank must be twelve to fulfill the criterion for observability. Indeed, we have a rank of 12 for the matrix W. In other words, we can feed the 12 states of the X vector back from the output of the system. Our system is, therefore, observable.
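If the Control System Toolbox is available, the same checks can be cross-verified with MATLAB's built-in functions; this is an equivalent formulation, not a step the text itself performs.

Co = ctrb(A, B);   % controllability matrix, equivalent to V above
Ob = obsv(A, C);   % observability matrix, equivalent to W above
fprintf('rank(Co) = %d, rank(Ob) = %d\n', rank(Co), rank(Ob))  % both 12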

16.3 Robust Linear Quadratic Regulator

In the LQR control technique, the system may become unstable as it travels long distances or as the disturbances increase. The occurrence is due to the eigenvalues of the gains falling close to the origin. A stable control system does not necessarily represent a robust system: if influenced by external disturbances such as wind gusts, the control system might become uncontrollable from a controllable state. The system is prone to external factors, which change its behavior from a controllable to an uncontrollable system. To tune a robust control system, we must first start with a stable control system. Our fine selection of the values for the Q and R matrices for a stable six-rotor control system is

Q = diag([25, 8, 8, 400, 600, 600, 330, 330, 330, 9000, 3, 3]);


R = 0.01*eye(6, 6);

Using the MATLAB command

[K, P, E] = lqr(A, B, Q, R)

we check the gains, the poles, and the eigenvalues. As an example, the eigenvalues found for the quadrotor are all negative, as follows:

E = [−2.9349 + 1.8075i; −2.9349 − 1.8075i; −35.6805; −28.8749; −28.8749; −5.7074; −5.7074; −2.8373; −2.8373; −4.7859; −0.0707; −0.0707]

To prove the stability of the system, we need to find all the principal minors of the matrix solution 'P' and verify that their determinants are positive, so that P is positive-definite. Although we found the algebraic Riccati equation solution 'P' to be positive definite, the system stability alone is not enough to prove the robustness of the system. We tested the rotorcraft system by issuing Gaussian noise as disturbances into it. We found that as the tracking time or distance of the rotorcraft grows, or when the external disturbances increase, the platform is unable to sustain the large or continuous interrupting forces, and its stability weakens; the rotorcraft becomes uncontrollable. Our technique to improve the stability and robustness of the system involves the pole placement method, modifying the LQR closed-loop poles. After determining the poles of our choice from the given LQR eigenvalues, we can relocate all the poles as far to the left of the origin as possible. For example, we can replace all poles to the left of the origin such that they lie in the range between negative ten and negative forty. In this way, we improve the system stability by increasing the robustness, to counteract the large disturbance forces and to accommodate a larger commanded or tracking input for traveling a further distance. For example, we shift our desired closed-loop poles further to the left of the root locus plot by

Ep = [−10; −10; 0; 0; 0; −10; −10; −10; −10; −10; −13; −13]

so that E becomes

E = [−12.9349 + 1.8075i; −12.9349 − 1.8075i; −35.6805; −28.8749; −28.8749; −15.7074; −15.7074; −12.8373; −12.8373; −14.7859; −13.0707; −13.0707]

Now, all the poles are negative and far away from the origin. We deployed the matrices A and B, with our finalized Q and R settings, into the LQR control system. P16.1 illustrates the subprogram, where the matrix 'v' holds the eigenvectors as column vectors, and the 'd' matrix is the diagonal matrix of eigenvalues. We obtained the v matrix from the eigendecomposition of the closed-loop system matrix X. After this, we



We then replace the eigenvalues with our desired poles 'De' as the new eigenvalue diagonal matrix and substitute the new eigenvalue matrix back into the system to find the closed-loop gain Kf. The closed-loop gain found is in rectangular (complex) form; we may use the command 'abs' to convert it into its gain magnitude, taking care not to neglect the signs that belong in the results.

P16.1 LQR with Pole Placement

[K, P, E] = lqr(A, B, Q, R)
X = A - (B*K);
[v, d] = eig(X);
Ep = [-10;-10;0;0;0;-10;-10;-10;-10;-10;-13;-13]; % pole shifts, subject to changes
E = E + Ep
De = diag(E);
Xe = v*De*inv(v);
Kf = -B\(Xe-A)
Kf = abs(Kf)
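As a small sketch of the stability premise behind P16.1 (assuming A, B, Q, R are already defined): a Cholesky factorization of the Riccati solution P gives the same verdict as checking all twelve leading principal minors by hand.

% Sketch: verifying that the Riccati solution P is positive definite.
[K, P, E] = lqr(A, B, Q, R);
[~, notPosDef] = chol(P);   % second output is 0 when P is positive definite
if notPosDef == 0
    disp('P is positive definite: the stability premise holds.')
end
min(eig(P))                 % equivalently, every eigenvalue of P is positive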

The program P16.1 for the linear quadratic regulator with pole placement works well for the quadrotor but tends to have a rank deficiency in the six-propeller rotorcrafts. Therefore, it is difficult for us to find its control matrix for the PID controller when we compute for the six-propeller types of rotorcrafts. Nevertheless, once we have tuned a controller for the quadrotor, we can deploy it on the various kinds of rotorcrafts. The next chapter will bring us to show how the different models of the rotorcrafts can fly by the deployment of the original set of the controller derived from the quadrotor version of the rotorcrafts.

Reference

1. T.S. Ng, Rotorcrafts, in Flight Systems and Control. Springer Aerospace Technology (Springer, Singapore, 2018). https://doi.org/10.1007/978-981-10-8721-9_7

Chapter 17

Aerial Copters

Modeling air robot vehicles in simulation depends on the controller design as well as the aerodynamics surrounding the propelled craft. This chapter illustrates the modeling of the Quadrotor, Tricoaxial copter, Hexacopter, and Derbe copter, covering their controller designs and simulating them with respect to their aerodynamics. We found the rotorcrafts' aerodynamics using a computation program. The simulations verify the robustness of the linear quadratic regulator control design with the pole-placement method. The chapter also demonstrates an intuitive technique of deriving controllers from an original controller, compared against our former controller design.

17.1 Aerodynamics

Aerodynamic forces play a vital role in the flight of different air vehicles. The ability to keep an unmanned aerial vehicle in flight depends heavily on the aerodynamic forces interacting with the UAV. These forces act on the UAV's wings, control surfaces, bodily structure, and mainframe. Excluding the lift forces, we have the thrust and drag aerodynamic coefficients acting on an air vehicle. These two forces, when accurately calculated, stabilize the flight control of the UAV system; if they supply wrong information, the UAV cannot stabilize itself in flight, leading to a mishap. Even with accurately determined flight control gains, the UAV still cannot fly without an accurate calculation of its aerodynamic forces, because the aerodynamic coefficients are used in the flight equations. The aerodynamic thrust and drag coefficients determine the flight equations of motion for the roll, pitch, and yaw manipulations.





The program in P17.1 indicates how we calculate the copter's maximum speed. Each propelled copter structure, its frame pattern, and the shape of the UAV determine how its surfaces interact with the external forces on the copter system. Bodies of different length, breadth, and width produce different aerodynamic reactions. These aerodynamic forces depend on the dimensions of the rotorcraft, the altitude of flight, the maximum speed of the flying craft, and the drag or thrust of the UAV system. With these four parameters, we can determine the aerodynamic coefficients for drag and thrust. We calculate the copter drag from the program and then take the drag value as the thrust coefficient of the advancing flight system: when thrust equals drag, we have a steady, straightforward flight. Using this relation, we can calculate the aerodynamic drag coefficient of the UAV. The calculated drag and thrust coefficients are applied to the attitude equations, which in turn feed the flight acceleration equations that manipulate the motion of flight. This method of aerodynamic computation is accurate enough, when implemented in the system equations, to get the UAV flying steadily. However, several other aerodynamic disturbances can affect the UAV flight performance. Minor contributions are rotor vibrations, blade flapping, and other effects that do not exist at low flight velocities and only interfere at high velocities. Moreover, the airflow disrupted by the UAV structure and the dependence of thrust on the angle of attack are further considerations when accounting for the aerodynamic forces. By computing our UAV aerodynamic coefficients in this way, we can subdue these irregular aerodynamic interferences in the UAV control system. In a real situation, such aerodynamic forces may become significant when a rotor blade breaks or the airframe cracks. Fortunately, under the simulation mode, which closely approximates a real-time UAV flying model, we can neglect these forces and effects and still fly a stable UAV system. The aerodynamic constants we found for each type of rotor copter, at a tilting angle of 30°, are shown in Table 17.1. In conventional flight, where the engine thrust angle is small and considered negligible, we can use the approximate relations that thrust equals drag (T = D) and lift equals weight (L = W). With this, we can find the coefficient of drag using the formulas below.

Table 17.1 Rotor copter aerodynamic constants

Aerodynamic constants   Derbe quadcopter   Quadrotor   Hexacopter   Tri-coaxial copter   Units
Thrust constant (b)     1.3253e−6          4.6404e−6   3.4803e−6    3.4803e−6            Ns²
Drag constant (d)       1.2529e−8          1.617e−8    4.514e−9     8.19e−9              Nms²



T = D
T = 0.5 · A · V² · S · CD    (17.1)

A defines the air density, V the maximum velocity, and S the surface area of the rotorcraft. Finally, the aerodynamic constants found are T, which we take as the thrust coefficient, and CD, the drag coefficient. The air density the rotorcraft experiences at an approximate height of 160 m is taken as 1.18 kg/m³. Using the program P17.1, we find the thrust coefficient of the quadrotor to be 4.6404e−6 at the starting spinning speed of 0.1 km/h. The maximum velocity, or airspeed, comes from the program computation, which is 54.8 km/h. The surface area is the whole drone platform area: the sum of the four propeller areas, the control board area, and the parts of the rods not covered by the rotating propellers (RL1 and RL2) for the first and second X-length rod areas. We can compute the total surface area of the quadrotor as follows.

Total quad surface area = 4πr² + (L × W) + RL1 + RL2
RL1 = 0.015(0.4 − 0.15 − 0.2) = 0.00075 m²
RL2 = 0.015(0.4 − 0.2 − 0.2) = 0 m²    (17.2)

where L = 0.2 and W = 0.15 are the length and width of the control box, respectively. The diameter of the rod is 1.5 cm. Each X-length joint rod is twice a rod length, 0.4 m altogether (from the center of one motor to the center of the diagonally opposite motor). The propeller radius is approximately 0.1 m. We calculated the total surface area of the flying quad to be 0.162 m². After substituting all the values into Eq. (17.1), we solve for the subject, which is the drag coefficient.

For the Hexacopter:

Total Hexacopter surface area = 6πr² + (L × W) + RH1 + 2(RH2)
RH1 = 0.015(0.4 − 0.15 − 0.2) = 0.00075 m²
RH2 = 0.015(0.4 − 0.17 − 0.2) = 0.00045 m²    (17.3)

where L = 0.15 and W = 0.15 are the dimensions of the center prism box, RH1 represents the 2-length rod area, and RH2 the other X-length rod areas not covered by the rotating propellers and the prism box. The diameter of the rod is 1.5 cm. The six rods are each 0.2 m from the controller board center to the center of the propellers at the other rod ends. Each propeller has a radius of approximately 0.1016 m. Therefore, the total surface area of the Hexacopter we calculated is 0.2187 m².

For the Tricoaxial Copter:

Tricoaxial Copter total surface area = 3πr² + (L × W) + RT1 + 2(RT2)
RT1 = 0.015(0.2 − 0.1 − 0.075) = 0.000375 m²
RT2 = 0.015(0.2 − 0.1 − 0.0866) = 0.00021 m²    (17.4)



where L = 0.15 and W = 0.15 are the dimensions of the controller board, and RT1 and the two RT2 represent the rod areas not covered by the rotating propellers and the control board. The diameter of the rod is 1.5 cm. The three rods are each 0.2 m from the prismatic board to the center of the propellers at the other rod ends. Each propeller has a radius of approximately 0.1 m. The thickness of the controller box is 0.03 m. Therefore, we calculated the total surface area to be 0.12056 m².

For the Shrediquette Derbe Quadrotor:

Total Derbe surface area = 4πr² + (L × W) + 4RN = 0.13873 m²
RN = 0.005(0.065 − 0.0635) = 0.0000075 m²    (17.5)

In the Shrediquette customization of the Derbe quadrotor design, the fuselage dimensions are L = 0.18, W = 0.05, H = 0.05 m. The diameter, or width, of the rod is 0.5 cm. RN represents the uncovered rod area facing the sky when the propeller rotates. Each of the four rods is 0.065 m long from the fuselage board edge to the center of its propeller. The distance from the center of the frame to each motor is estimated at 0.12657 m. Each propeller has a radius r of approximately 0.0635 m. Using the program P17.1, we find the thrust coefficient of the Derbe quadrotor to be 1.3253e−6 units at the starting spinning speed of 0.1 km/h. The maximum velocity of the Derbe quad is 54.8 km/h. Using Eq. (17.1), we found the aerodynamic drag coefficient given in Table 17.1. The Tricoaxial Copter and Hexacopter both flew at 77.3 km/h. Note that we run the program P17.1 [1] in the 30° tilting-angle scenario for all the flying craft to find their aerodynamic constants together with Eq. (17.1).
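As a small numerical sketch of Eq. (17.1) for the quadrotor. The steady-flight force T_level below is an assumed placeholder; in the text it is the copter drag reported by program P17.1 at the maximum speed.

% Sketch: solving Eq. (17.1) for the drag coefficient CD of the quadrotor.
rho = 1.18;           % kg/m^3, air density at ~160 m altitude
V   = 54.8 / 3.6;     % m/s, maximum airspeed from P17.1 (54.8 km/h)
S   = 0.162;          % m^2, total quadrotor surface area, Eq. (17.2)
T_level = 1.0;        % N, thrust (= drag) in steady level flight; placeholder
CD = T_level / (0.5 * rho * V^2 * S)   % rearranged from T = 0.5*rho*V^2*S*CD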


P17.1 Aerodynamics Coefficients Computation Program

%% Calculating max copterspeed from max thrust and aerodynamic forces
% by Dr. William Thielicke aka Willa aka Shrediquette
% Using this program, the simulation for the copter drag and its max.
% velocity is conducted for a flying copter at a 30 deg. tilting angle.
clear all; clc;
% The first three rotorcopter types share the same parameters below.
prop_pitch = 7;        % inches
prop_diameter = 8;     % inches
cd_arm = 1.5;          % drag coefficient of the arms (source: Hoerner)
weight = 1.316 * 9.81; % N, weight of the copter; 1.828 kg for Hexacopter
rpm = 17760;           % 1200kv * 14.8v; use 1200kv motor; 14.8v battery
cd_frame = 0.47;
angle_tilted_body = 30/180*pi; % flies at 30 deg. tilted angle
% Tricoaxial Copter
area_arm = 0.009144;          % m^2, area of arm under the six propellers
area_frame_frontal = 0.0045;
area_frame_dorsal = 0.0225;
frame_width = 0.3464;         % total frame width from one prop. to another prop.
frame_length = 0.3;           % total frame length from one prop. to another prop.
% Hexacopter (control box dimension at 0.15 x 0.15 x 0.03 m)
area_arm = 0.009144;          % m^2, area of arm under the six propellers
area_frame_frontal = 0.0045;  % front of the frame, m^2
area_frame_dorsal = 0.0225;   % top of the frame, m^2
frame_width = 0.3464;         % Hexacopter width
frame_length = 0.4;           % Hexacopter length
% Quadrotor (control box dimension at 0.2 x 0.15 x 0.03 m)
area_arm = 0.006096;          % m^2, area of arm under the FOUR propellers
area_frame_frontal = 0.006;   % front of the quad frame, m^2
area_frame_dorsal = 0.03;     % top of the quad frame, m^2
frame_width = 0.4;
frame_length = 0.4;
% Derbe Shrediquette Drone (4th rotorcopter type)
cd_arm = 1.5;
weight = 0.6 * 9.81;
prop_pitch = 4;               % inches
prop_diameter = 5;
rpm = 24500;
cd_frame = 1;
area_arm = 0.00127;           % 6.5 cm arms with rod width 0.5 cm
angle_tilted_body = 0/180*pi; % initially tilted at -30 back
area_frame_frontal = 0.0025;
area_frame_dorsal = 0.009;
frame_width = 0.178;          % at 0.05 m + 2(0.064 m)
frame_length = 0.18;
% Initialize variables
angle = 0;                    % rad, pitch angle of the copter
vertical_force = weight;
while vertical_force >= weight
    % Initialize variables
    vertical_force = weight;
    copter_drag = 0;
    horizontal_force = 0;
    v = 0;                    % m/s, horizontal speed
    angle = angle + 1/180*pi;
    while horizontal_force >= copter_drag
        v = v + 0.1/3.6; % slowly increase speed while drag is smaller than horizontal force
        v_in_prop = sin(angle)*v; % relevant component of the propeller disk inflow velocity
        thrust = 4.392399*10^-8*rpm*((prop_diameter^3.5)/sqrt(prop_pitch))*...
            (4.233333*10^-4*rpm*prop_pitch - v_in_prop) * 4; % electricrcaircraftguy; change to 6 props for Hexacopter & Tricoaxial Copter
        v_out_prop = 147/3.6; % m/s, jet speed of the propeller (electricrcaircraftguy)
        area_frame_horizontal = abs(sin(angle-angle_tilted_body)*area_frame_dorsal) + ...
            abs(cos(angle-angle_tilted_body)*area_frame_frontal); % m^2
        downforce = ((frame_width/frame_length)/(frame_width/frame_length+2)) * ...
            sin(angle-angle_tilted_body)*area_frame_dorsal*0.5*1.293*v^2; % negative lift at positive pitch angles
        armdrag = cd_arm * area_arm * 0.5 * 1.293 * v_out_prop^2; % drag created by the arms of the drone
        remaining_thrust = thrust - armdrag;
        horizontal_force = sin(angle)*remaining_thrust/1; % horizontal thrust; divide by 1000 to collect the copter drag at start
        vertical_force = cos(angle)*remaining_thrust - downforce; % "lift", must offset weight
        copter_drag = cd_frame * area_frame_horizontal * 0.5 * 1.293 * v^2; % CD * area * 0.5 * rho(air) * velocity^2
    end
end
% Output result
disp(['Maximum velocity: ' num2str(v*3.6) ' km/h'])
disp(['Pitch angle: ' num2str(angle/pi*180) ' degrees'])

17.2 Derbe Quadcopter

Using P17.2, the LQR program with pole placement, we arrive at Table 17.2 for the robust PD controller gain matrix. Table 17.2 presents the control matrix gains for the Derbe configuration [2]. Table 17.2a is the control gain matrix K. Table 17.2b is the gain matrix Kf with the eigenvalue poles shifted by −10 towards the left of the root locus plot. Table 17.2d shows the improved robustness of the Derbe with pole placement at −40 units. The integral control setting (Table 17.2c) is 10% of the proportional gain values of the position control (Figs. 17.1 and 17.2).

Table 17.2 Derbe matrix gains: (a) PD controller gains K; (b) PD controller gains Kf; (c) integral gains for Kf; (d) improved PD gains Kf


Fig. 17.1 Derbe quadrotor diagram

Fig. 17.2 Shrediquette derbe (Source SHREDIQUETTE)




P17.2 Derbe—Robust PD Control Matrix Computation

A=[0 0 0 0 0 0 0 0 0 0 0 0;
   0 0 0 0 0 0 0 20 0 0 0 0;
   0 0 0 0 0 0 0 0 20 0 0 0;
   0 0 0 0 0 0 0 0 0 0 0 0;
   0 0 0 0 0 0 0 0 0 0 0 0;
   0 0 0 0 0 0 0 0 0 0 0 0;
   1 0 0 0 0 0 0 0 0 0 0 0;
   0 1 0 0 0 0 0 0 0 0 0 0;
   0 0 1 0 0 0 0 0 0 0 0 0;
   0 0 0 1 0 0 0 0 0 0 0 0;
   0 0 0 0 1 0 0 0 0 0 0 0;
   0 0 0 0 0 1 0 0 0 0 0 0];
B=[-0.0327   0.0327  -0.0327   0.0327 ;
    0.13774 -0.13774 -0.13774  0.13774; % length at 6.4 cm
   -0.1937  -0.1937   0.1937   0.1937 ; % length at 0.09 m
    0.198    0.198    0.198    0.198  ; % at 600 g weight
    0.099   -0.099   -0.099    0.099  ; % sin30 tilt flying angle
   -0.099   -0.099    0.099    0.099  ;
    0 0 0 0; 0 0 0 0; 0 0 0 0; 0 0 0 0; 0 0 0 0; 0 0 0 0];
C=[0 0 0 0 0 0 1 0 0 0 0 0;
   0 0 0 0 0 0 0 1 0 0 0 0;
   0 0 0 0 0 0 0 0 1 0 0 0;
   0 0 0 0 0 0 0 0 0 1 0 0;
   0 0 0 0 0 0 0 0 0 0 1 0;
   0 0 0 0 0 0 0 0 0 0 0 1];
Q=diag([25 8 8 400 600 600 330 330 330 9000 3 3]);
R=0.01*eye(4,4);
[K,P,E]=lqr(A,B,Q,R)
X=A-(B*K);
[v,d]=eig(X); %Z=diag(d) %E=eig(X)
Ep=[0;0;0;-10;-10;-10;-10;-10;-10;-10;-13;-13];
E =[-79.0573 + 0.0000i; % re-arrange the eigenvalues
    -49.7215 + 0.0000i;
    -49.1214 + 0.0000i;
     -2.9349 + 1.8075i;
     -2.9349 - 1.8075i;
     -4.9595 + 0.0000i;
     -4.8272 + 0.0000i;
     -4.7520 + 0.0000i;
     -4.0907 + 0.0000i;
     -3.9335 + 0.0000i;
     -0.0707 + 0.0000i;
     -0.0707 + 0.0000i];
E=E+Ep
De=diag(E);
Xe=v*De*inv(v);
Kf=-B\(Xe-A)
Kf=abs(Kf)
K2=[-Kf(1,1) -Kf(1,2)  Kf(1,3) Kf(1,4)  Kf(1,5) -Kf(1,6) -Kf(1,7) -Kf(1,8)  Kf(1,9) Kf(1,10)  Kf(1,11) -Kf(1,12);
    -Kf(2,1)  Kf(2,2) -Kf(2,3) Kf(2,4) -Kf(2,5)  Kf(2,6) -Kf(2,7)  Kf(2,8) -Kf(2,9) Kf(2,10) -Kf(2,11)  Kf(2,12);
     Kf(3,1) -Kf(3,2) -Kf(3,3) Kf(3,4)  Kf(3,5)  Kf(3,6)  Kf(3,7) -Kf(3,8) -Kf(3,9) Kf(3,10)  Kf(3,11)  Kf(3,12);
     Kf(4,1)  Kf(4,2)  Kf(4,3) Kf(4,4) -Kf(4,5) -Kf(4,6)  Kf(4,7)  Kf(4,8)  Kf(4,9) Kf(4,10) -Kf(4,11) -Kf(4,12)];
Kf=K2
H=inv(R)*B'
P=H'*Kf; %or P=B*Kf %same effect
D1=det(P(1,1))
D2=det(P(1:2,1:2))
D3=det(P(1:3,1:3))
D4=det(P(1:4,1:4))
D5=det(P(1:5,1:5))
D6=det(P(1:6,1:6))
D7=det(P(1:7,1:7))
D8=det(P(1:8,1:8))
D9=det(P(1:9,1:9))
D10=det(P(1:10,1:10))
D11=det(P(1:11,1:11))
D12=det(P)

Modelling of the Derbe System Inertias

In the Derbe quadcopter system, we can compute the linear and angular acceleration feedback. With this necessary information, we derive the 3D positions of the quadrotor in space, as well as its manipulated angles of rotation. The fuselage, a rectangular form with L × W × H equal to 0.18 × 0.05 × 0.05 m, has a mass M of 0.28 kg, while each arm of mass m, with propeller motor, weighs 0.08 kg and has a length D of 0.065 m; g defines the gravitational acceleration, equal to 9.81 m/s². The total mass is

mtotal = M + 4·m    (17.6)

The length D, defined from the edge of the Derbe quadrotor to the center of the propellers, is 0.065 m long. We equate the moments of inertia for the fuselage and arms as

Ixfuse = M·(W²/12 + H²/12)
Iyfuse = M·(L²/12 + H²/12)
Izfuse = M·(L²/12 + W²/12)
Ixrod = m·(0.064 + 0.025)²
Iyrod = m·(0.09)²
Izrod = m·(0.12657)²
Ix = Ixfuse + 4·Ixrod
Iy = Iyfuse + 4·Iyrod
Iz = Izfuse + 4·Izrod    (17.7)
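A minimal sketch evaluating Eqs. (17.6)–(17.7) with the masses and dimensions above:

% Sketch: Derbe total mass and moments of inertia, Eqs. (17.6)-(17.7).
M = 0.28;  m = 0.08;             % kg, fuselage and arm-with-motor masses
L = 0.18;  W = 0.05;  H = 0.05;  % m, fuselage dimensions
mtotal = M + 4*m;                % Eq. (17.6)
Ixfuse = M*(W^2/12 + H^2/12);
Iyfuse = M*(L^2/12 + H^2/12);
Izfuse = M*(L^2/12 + W^2/12);
Ixrod = m*(0.064 + 0.025)^2;
Iyrod = m*(0.09)^2;
Izrod = m*(0.12657)^2;
Ix = Ixfuse + 4*Ixrod
Iy = Iyfuse + 4*Iyrod
Iz = Izfuse + 4*Izrod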



Equation of Motion

roll = g·b·(0.064)·(w4² − w2² + w1² − w3²)
pitch = g·b·(0.09)·(w4² − w1² + w3² − w2²)
yaw = g·d·(−w1² + w4² − w3² + w2²)
altitude = g·b·(w1² + w2² + w3² + w4²)    (17.8)

Angular Acceleration Equation

θ̈ = pitch/Iy
φ̈ = roll/Ix
ψ̈ = yaw/Iz    (17.9)

Linear Acceleration Equation

ẍ = −(cos ψ · sin θ · cos φ)·altitude/mtotal
ÿ = (−cos ψ · sin φ)·altitude/mtotal
z̈ = −g + (cos θ · cos φ)·altitude/mtotal    (17.10)
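As a minimal sketch, one evaluation of Eqs. (17.8)–(17.10), reusing mtotal, Ix, Iy, Iz from the sketch above; the motor speeds and attitude angles are assumed example values:

% Sketch: one step of the Derbe model, Eqs. (17.8)-(17.10).
g = 9.81;
b = 1.3253e-6;  d = 1.2529e-8;   % Table 17.1 aerodynamic constants
w = [500 500 500 500];           % assumed motor speeds (example values)
phi = 0; theta = 0; psi = 0;     % assumed current attitude, rad
roll_m  = g*b*0.064*(w(4)^2 - w(2)^2 + w(1)^2 - w(3)^2);  % Eq. (17.8)
pitch_m = g*b*0.09*(w(4)^2 - w(1)^2 + w(3)^2 - w(2)^2);
yaw_m   = g*d*(-w(1)^2 + w(4)^2 - w(3)^2 + w(2)^2);
alt_m   = g*b*(w(1)^2 + w(2)^2 + w(3)^2 + w(4)^2);
theta_dd = pitch_m/Iy;  phi_dd = roll_m/Ix;  psi_dd = yaw_m/Iz; % Eq. (17.9)
x_dd = -(cos(psi)*sin(theta)*cos(phi))*alt_m/mtotal;           % Eq. (17.10)
y_dd = (-cos(psi)*sin(phi))*alt_m/mtotal;
z_dd = -g + (cos(theta)*cos(phi))*alt_m/mtotal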

Initially, the LQR control matrix gain K found for the Derbe configuration is not stable for the Derbe quad aerodynamics of Table 17.1. By shifting the poles away from the origin, we can obtain a more robust control. The pole placement method, in conjunction with the linear quadratic regulator control technique, yields a more robust system. By shifting all the eigenvalue poles to −10 on the left of the root locus plot, we rooted the stability of the system, and the Derbe quadrotor became controllable for the aerodynamic constants found in Table 17.1. Figure 17.3a shows the control output of the attitude and position of the Derbe quad without any external disturbances. In Fig. 17.3b, the simulation is disturbed by noise at a level of only 0.0653 units, and the four motors fluctuate at an average of approximately 300 units. As the noise level increases, simulating a stronger wind gust, the Derbe goes uncontrollable: the robustness of the controller is limited when the poles are pulled only −10 units away from the origin. However, we can place the pole values much further away, at −40 units from the original. Figure 17.3c shows the improved stability and robustness of the controlled attitudes when the quad is subjected to a wind disturbance equivalent to a noise level of 0.2 units. As the illustration shows, higher control inputs to the motors are required; the four motors fluctuate together at approximately 800 units in Fig. 17.3c. We calculated lower aerodynamic constants to fulfill this simulation: the aerodynamic b constant is simulated at 3.698e−7 Ns² and the d constant at 3.497e−9 Nms², i.e., the Table 17.1 values for the Derbe quadrotor divided by 3.5839. To compute the precise aerodynamics, we extract the first four diagonal constants of the control matrix gain Kf. Without any pole placement technique, our original diagonal constants are D1,2,3,4 {5.8697, 635.2717, 6.9128e+04, 5.7936e+06}. For the −10 pole controllable matrix gain, we have D1,2,3,4 {49.4239, 243.8331, 1.8617e+03, 1.5494e+05}. For the −40 pole controllable matrix gain, we have D1,2,3,4 {49.4239, 1.1088e+03, 3.3663e+04,



Fig. 17.3 Derbe quadrotor control: (a) without noise, (b) with noise, (c) with larger noise

2.7369e+06}. We compared each equivalent diagonal constant between the original controller matrix gains and the pole-placement controller matrix gains. Comparing the diagonal constants D1,2,3,4 of the Kf without pole placement against those of the Kf with −40 pole shifting, the ratios of the diagonal constants are D1,2,3,4 {8.42, 1.745, 2.0535, 2.1168}. Averaging these four ratios gives an estimate of 3.5839 units. The aerodynamic coefficients found for the thrust and drag are then the values of Table 17.1 divided by 3.5839.

17.3 Quadcopter

With the same A, C, and D matrices as before, we plug in our Q and R matrices and our pole placement Ep, as in Chap. 16. We computed the B matrix from our knowledge that the quadrotor tilts at 30° in flight. We can re-shuffle the eigenvalues E found, to obtain a robust controller. In Fig. 17.4, if we label the back rotor as motor one and go clockwise so that the right rotor is motor four, we arrive at the B matrix of program P17.3 to compute the gains of the controller matrix. The resulting gains are shown in Table 17.3.


Fig. 17.4 Quadrotor


Table 17.3 Quadrotor matrix gains: (a) PD controller gains Kf; (b) integral gains


P17.3 Quadrotor—Robust PD Control Matrix Computation

A=[0 0 0 0 0 0 0 0 0 0 0 0;
   0 0 0 0 0 0 0 20 0 0 0 0;
   0 0 0 0 0 0 0 0 20 0 0 0;
   0 0 0 0 0 0 0 0 0 0 0 0;
   0 0 0 0 0 0 0 0 0 0 0 0;
   0 0 0 0 0 0 0 0 0 0 0 0;
   1 0 0 0 0 0 0 0 0 0 0 0;
   0 1 0 0 0 0 0 0 0 0 0 0;
   0 0 1 0 0 0 0 0 0 0 0 0;
   0 0 0 1 0 0 0 0 0 0 0 0;
   0 0 0 0 1 0 0 0 0 0 0 0;
   0 0 0 0 0 1 0 0 0 0 0 0];
B=[-0.0327  0.0327 -0.0327  0.0327;
    0      -0.43    0       0.43  ;
   -0.43    0       0.43    0     ;
    0.09    0.09    0.09    0.09  ;
    0      -0.045   0       0.045 ; % flight tilting angle at 30 degree
   -0.045   0       0.045   0     ;
    0 0 0 0; 0 0 0 0; 0 0 0 0; 0 0 0 0; 0 0 0 0; 0 0 0 0];
C=[0 0 0 0 0 0 1 0 0 0 0 0;
   0 0 0 0 0 0 0 1 0 0 0 0;
   0 0 0 0 0 0 0 0 1 0 0 0;
   0 0 0 0 0 0 0 0 0 1 0 0;
   0 0 0 0 0 0 0 0 0 0 1 0;
   0 0 0 0 0 0 0 0 0 0 0 1];
Q=diag([25 8 8 400 600 600 330 330 330 9000 3 3]);
R=0.01*eye(4,4);
[K,P,E]=lqr(A,B,Q,R)
CLP = eig(A-B*K)
X=A-(B*K);
[v,d]=eig(X);
Ep=[-10;-10;0;0;0;-10;-10;-10;-10;-10;-13;-13];
E =[ -2.9349 + 1.8075i;
     -2.9349 - 1.8075i;
    -35.6805 + 0.0000i;
    -23.1874 + 0.0000i;
    -23.1874 + 0.0000i;
     -6.0157 + 0.0000i;
     -6.0157 + 0.0000i;
     -2.2344 + 0.0000i;
     -2.2344 + 0.0000i;
     -4.7859 + 0.0000i;
     -0.0707 + 0.0000i;
     -0.0707 + 0.0000i]; % rearrange our eigenvectors for robustness
E=E+Ep
De=diag(E)
Xe=v*De*inv(v); AA=(Xe-A);
Kf=-B\(Xe-A);
Kf=abs(Kf)
K2=[-Kf(1,1) -Kf(1,2)  Kf(1,3) Kf(1,4)  Kf(1,5)  Kf(1,6) -Kf(1,7) -Kf(1,8)  Kf(1,9) Kf(1,10)  Kf(1,11)  Kf(1,12);
    -Kf(2,1)  Kf(2,2) -Kf(2,3) Kf(2,4) -Kf(2,5) -Kf(2,6) -Kf(2,7)  Kf(2,8) -Kf(2,9) Kf(2,10) -Kf(2,11) -Kf(2,12);
     Kf(3,1) -Kf(3,2) -Kf(3,3) Kf(3,4)  Kf(3,5)  Kf(3,6)  Kf(3,7) -Kf(3,8) -Kf(3,9) Kf(3,10)  Kf(3,11)  Kf(3,12);
     Kf(4,1)  Kf(4,2)  Kf(4,3) Kf(4,4) -Kf(4,5) -Kf(4,6)  Kf(4,7)  Kf(4,8)  Kf(4,9) Kf(4,10) -Kf(4,11) -Kf(4,12)];
Kf=K2




Modelling of the System Inertias

In the Quadcopter system, the linear and angular accelerations detected give rise to the 3D positions of the quadrotor in space and its manipulated angles of rotation. The center controller box, a rectangular box with L × W × H equal to 0.2 × 0.15 × 0.03 m, has a mass M of 0.25 kg, while each rod of mass m, with propeller motor, weighs 0.2665 kg and has a length D of 0.2 m; g defines the gravitational acceleration, equal to 9.81 m/s². The total mass mtotal follows Eq. (17.6). The length D, extending from the quadrotor center to the center of the propellers, is 0.2 m. The moments of inertia for the control box and rods are equated as

Ixcb = M·(W²/12 + H²/12)
Iycb = M·(L²/12 + H²/12)
Izcb = M·(L²/12 + W²/12)
Irod = m·D²
Ix = Ixcb + 2·Irod
Iy = Iycb + 2·Irod
Iz = Izcb + 4·Irod    (17.11)

Equation of Motion

roll = g·b·D·(w4² − w2²)
pitch = g·b·D·(w3² − w1²)
yaw = g·d·(−w1² + w4² − w3² + w2²)
altitude = g·b·(w1² + w2² + w3² + w4²)    (17.12)

Angular Acceleration Equation. Refer to Eq. (17.9).

Linear Acceleration Equation

ẍ = (sin ψ · sin φ + cos ψ · sin θ · cos φ)·altitude/mtotal
ÿ = (−cos ψ · sin φ + sin ψ · sin θ · sin φ)·altitude/mtotal
z̈ = −g + (cos θ · cos φ)·altitude/mtotal    (17.13)

Suppose we use an 8 in. propeller with a pitch of 7 in., a 1200 kV motor, and a 3300 mAh 4S (14.8 V) battery. The quadrotor motors spin at up to 17,760 rpm to maneuver and balance. Using the aerodynamic computation program, we locate the drag of the quadrotor at the start of the flight, which we take as the aerodynamic thrust constant. If we use a motor-propeller sizing different from these specifications, the quadrotor maximum velocity and the aerodynamic drag can be re-computed using the same program. Then, we substitute these two parameters,



Fig. 17.5 Quadrotor control: (a) without disturbance, (b) with disturbance

together with the changes in surface area, into the formula of Eq. 17.1 to find our aerodynamic drag coefficient. The resulting control performance of the quadrotor is as shown in Fig. 17.5.
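As a minimal sketch, the static-thrust estimate inside P17.1 (the formula credited to electricrcaircraftguy in that program) can be evaluated directly for this motor-propeller combination at zero inflow velocity:

% Sketch: static thrust of the quadrotor's four 8x7 in. propellers at 17,760 rpm.
rpm = 17760;  prop_diameter = 8;  prop_pitch = 7;  % as specified above
v_in_prop = 0;                                     % m/s, static (hover) condition
thrust_per_prop = 4.392399e-8 * rpm * (prop_diameter^3.5/sqrt(prop_pitch)) * ...
                  (4.233333e-4 * rpm * prop_pitch - v_in_prop);  % N
total_thrust = 4 * thrust_per_prop  % roughly 90 N, well above the 12.9 N weight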

17.4 Hexacopter

See Fig. 17.6.

Modelling of the System Inertias

As above, we derive the 3D positions of the rotorcraft in space, as well as its manipulated angles of rotation, using the linear and angular accelerations fed back. The center controller box has the dimensions L × W × H of 0.15 × 0.15 × 0.03 m and a mass M of 0.292 kg, while each rod of mass m, with propeller motor, weighs 0.1707 kg and has a length D of 0.2 m; g defines the gravitational acceleration, equal to 9.81 m/s². The total mass is

mtotal = M + 6·m    (17.14)



Fig. 17.6 Hexacopter

The moments of inertia for the centralized controller and rods are equated as

Ixcc = M·(L²/12 + H²/12)
Iycc = M·(W²/12 + H²/12)
Izcc = M·(L²/12 + W²/12)
Irod = m·D²
Ix = Ixcc + 4·cos(30°)·Irod
Iy = Iycc + (2 + 4·cos(60°))·Irod
Iz = Izcc + 6·Irod    (17.15)

Equation of Motion

roll = g·b·D·0.866·(−w5² − w6² + w2² + w3²)
pitch = g·b·D·(−w1² − 0.5·(w2² + w6²) + w4² + 0.5·(w3² + w5²))
yaw = g·d·(−w1² + w2² − w3² + w4² − w5² + w6²)
altitude = g·b·(w1² + w2² + w3² + w4² + w5² + w6²)    (17.16)

where b stands for the thrust coefficient; d stands for the drag coefficient; D is the rod length; wn (n = 1–6) denotes the speed of each motor-propeller; and g defines the gravitational acceleration in meters per second squared.

Angular Acceleration Equation. Refer to Eq. (17.9).

Linear Acceleration Equation



ẍ = (sin ψ · sin φ − cos ψ · sin θ · cos φ)·altitude/mtotal
ÿ = (−cos ψ · sin φ + sin ψ · sin θ · sin φ)·altitude/mtotal
z̈ = −g + (cos θ · cos φ)·altitude/mtotal    (17.17)

The mtotal represents the total mass of the Hexacopter. In Table 17.4, Mn (n = 1–6) labels the motor for each row of the matrix; the entries are the LQR gains derived by the intuitive method from the gains computed with the A, B, Q, R matrices we input into the program for the four-propeller (quadrotor) version of the rotorcraft, with reference to Table 17.3. We use the program P16.1 to locate the LQR gains for the robust quadrotor system, and we then fill in the matrix table by intuition to obtain the PD controller matrix of the Hexacopter from the quadrotor PD controller gains we have found. The XYZ integral gains are 10% of the corresponding proportional gain values. Let us set the target positions on the Z, Y, X axes to 3, 10, 10 units, respectively. Figure 17.7 shows the Hexacopter flown with and without disturbance. All six motors draw the same amount of output from the controller; in windy conditions, however, the six propellers react to the interactions to control the Hexacopter and maintain it in a stable hovering position. The above intuitive method offers a simple solution to the complex problem of controlling a six motor-propeller set. The technique is fast and easy to implement on the Hexacopter, and we avoid the hassle of the complicated PD control matrix computation. Nevertheless, to apply the intuitive method, all parameters must be the same: the masses of both types of rotorcraft, the nose-tilting angle in flight, the moments of inertia, the rod lengths, and the type of propeller used must be identical. The factors affecting the aerodynamics, on the other hand, are the frame width, distance, and area, the propeller diameter and pitch, the flight tilting angle, the drag coefficients of the arm and frame, and the speed of the motor-propeller.
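As a small worked example of Eqs. (17.16)–(17.17): at level hover, z̈ = 0 requires the altitude term g·b·Σwn² to balance mtotal·g, which fixes the common motor speed. A sketch in the program's speed units:

% Sketch: Hexacopter hover motor speed from the altitude equation, Eq. (17.16).
% Setting zdd = 0 at level attitude gives g*b*sum(w.^2) = mtotal*g,
% i.e. sum(w.^2) = mtotal/b with all six motors at the same speed.
b = 3.4803e-6;                % thrust constant, Table 17.1
mtotal = 0.292 + 6*0.1707;    % kg, Eq. (17.14)
w_hover = sqrt(mtotal/(6*b))  % about 251 units per motor, which appears
                              % consistent with the ~255-unit motor inputs
                              % reported later in this chapter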

17.5 Tri-coaxial Copter

See Fig. 17.8.

Modelling of the System Inertias

Similarly to the Hexacopter system, we can compute the linear and angular acceleration feedback for the Tri-coaxial copter. With Irod and the Icc terms for the three axes as in Eq. (17.15), we can compute the inertias of the system as follows. Assuming the same control board as in the Hexacopter, the dimensions are 0.15 × 0.15 × 0.03 m and the weight of the control board M is 0.416 kg; each rod of mass m, with motor-propellers, now weighs 0.3 kg and has a length D of 0.2 m.

mtotal = M + 3·m    (17.18)

Table 17.4 Hexacopter PID controller matrix: (a) proportional–derivative control matrix (rows M1–M6); (b) integral control matrix (rows M1–M6)



Fig. 17.7 Hexacopter simulation: (a) without disturbance, (b) with disturbance

Fig. 17.8 Tri-coaxial copter

Ix = Ixcc + 2·cos(30°)·Irod
Iy = Iycc + (1 + 2·cos(60°))·Irod
Iz = Izcc + 3·Irod    (17.19)



Equations of Motion

A = g·b·(w1² + w2² + w3² + w4² + w5² + w6²)
R = g·D·b·cos 30°·(−w5² − w6² + w3² + w4²)
P = −g·D·b·(−w1² − w2² + cos 60°·(w3² + w4² + w5² + w6²))
Y = g·d·(w1² − w2² + w3² − w4² + w5² − w6²)    (17.20)

where b stands for the thrust coefficient; d stands for the drag coefficient; D is the rod length; wn (n = 1–6) denotes the speed of each motor-propeller; and g defines the gravitational acceleration. Here, A represents the altitude, P the pitch, R the roll, and Y the yaw in the movement equations.

Angular Acceleration Equations

θ̈ = P/Iy
φ̈ = R/Ix
ψ̈ = Y/Iz    (17.21)

Linear Acceleration Equations

ẍ = −g·sin θ
ÿ = −g·sin φ·cos ψ
z̈ = −g·cos θ·cos φ + H/mtotal    (17.22)

The mtotal represents the total mass of the Tricoaxial Copter.

I. Derivation of the PID Controller by Intuitive Method

By calling the LQR command in MATLAB, we compute, from the A, B, Q, R input matrices we specify, the LQR gains for the quadrotor. The reference program P17.3 performs the robust gain computation for the four-propeller rotorcraft. Initially, we run the program P16.1 to locate the LQR gains for the robust quadrotor system. Subsequently, we fill in the matrix table by intuition for the PD controller matrix of the Tri-coaxial Copter, using the quadrotor PD controller gains we have found. Tables 17.6 and 17.7 show the PD controllers derived by the intuitive method with reference to Table 17.5 (from the quadrotor [3]).

Table 17.5 Quadcopter K-gain after pole placements (rows M1–M4; gain columns for the rate states ψ̇, φ̇, θ̇, Ż, Ẏ, Ẋ and the states ψ, φ, θ, Z, Y, X)

Table 17.6 Tri-coaxial copter PD controller (rows M1–M6; gain columns ψ̇, φ̇, θ̇, ψ, φ, θ, Z, Y, X)

Table 17.7 Similar PD controller for the alternative configuration (rows M1–M6; columns as in Table 17.6)



The integral gains function to drive the Tri-coaxial Copter through three-dimensional space to its desired position in the XYZ coordinates. Therefore, we only need to handle the XYZ parameters when adjusting the integral gains; in other words, we tune the XYZ integrator gains to reach the desired destination for the Tri-coaxial Copter. We can set the integral gains for the XYZ parameters to 10% of the proportional controller gains, and fine-tune from there if 10% does not work. Table 17.8 shows the integral gains we derived.

Alternatively, we can manipulate the rotorcraft with another configuration. Figure 17.9 shows the directions we defined for the roll, pitch, and yaw. For this configuration, with the PD control matrix as in Table 17.7, the moment equations are:

H = −g·b·(w1² + w2² + w3² + w4² + w5² + w6²)
R = −g·D·b·cos 30°·(−w5² − w6² + w3² + w4²)
P = −g·D·b·(−w1² − w2² + cos 60°·(w3² + w4² + w5² + w6²))
Y = g·d·(w1² − w2² + w3² − w4² + w5² − w6²)    (17.23)

Linear Acceleration Equations

ẍ = (sin ψ · sin φ − cos φ · cos ψ · sin θ)·H/mtotal
ÿ = (cos ψ · sin φ + sin θ · sin φ · sin ψ)·H/mtotal
z̈ = g + (cos θ · cos φ)·H/mtotal    (17.24)

The angular acceleration equations are the same as Eq. (17.21).

Fig. 17.9 Tri-coaxial copter configuration

We fly the simulated Tricoaxial Copter with the PID parameter settings derived from the intuitive method, set at a 30° flight angle. Tables 17.6, 17.7, and 17.8 show the controller PID gains. We supposed the coaxial configuration of the Tricoaxial Copter could fly with the tilted aerodynamics coefficients computed at 30°.



Table 17.8 Integral gains

Parameter motor   Z      Y       X
M1                5.08   0       −1.08
M2                5.08   0       −1.08
M3                5.08   −0.51   0.54
M4                5.08   −0.51   0.54
M5                5.08   0.51    0.54
M6                5.08   0.51    0.54
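As a minimal sketch of the 10% rule behind Table 17.8 (the proportional Z, Y, X columns below are inferred from Table 17.8 through that stated relation, one row per motor M1–M6):

% Sketch: integral gains as 10% of the XYZ proportional gains (Table 17.8).
Kp_xyz = [ 50.8   0    -10.8;   % M1
           50.8   0    -10.8;   % M2
           50.8  -5.1    5.4;   % M3
           50.8  -5.1    5.4;   % M4
           50.8   5.1    5.4;   % M5
           50.8   5.1    5.4 ]; % M6
Ki_xyz = 0.1 * Kp_xyz           % reproduces the entries of Table 17.8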

However, our controller comes from the 48.59° tilted-flight reference [3], so it does not fully match our calculated aerodynamic coefficients for flight at a 30° tilting angle. The Tricoaxial Copter went out of control due to this mismatch of the aerodynamic coefficients. For this scenario, we assumed the thrust aerodynamic coefficient to be 1e−6 or 1.1e−6 Ns², and the drag aerodynamic coefficient to be 1e−8 or 1.1e−8 Nms². Figure 17.10 shows the output of the simulation. Alternatively, you can also compute the aerodynamic coefficients at a 48.59° tilted flight angle.

II. Derivation of the PD Controller from Control Matrix Computation

Fig. 17.10 Tri-coaxial copter simulation: (a) without disturbance, (b) with disturbance


P17.4 Tri-Coaxial Copter—Robust PD Control Matrix Computation

[K,P,E]=lqr(A,B,Q,R)
CLP = eig(A-B*K);
CLP =[ % realign the CLP
  -3.3593 + 1.8071i;
  -3.3593 - 1.8071i;
 -43.8319 + 0.0000i;
 -28.4121 + 0.0000i;
 -28.4113 + 0.0000i;
  -5.9993 + 0.0000i;
  -5.9993 + 0.0000i;
  -2.2394 + 0.0000i;
  -2.2394 + 0.0000i;
  -4.7714 + 0.0000i;
  -0.0707 + 0.0000i;
  -0.0707 + 0.0000i];
X=A-(B*K);
[v,d]=eig(X);
Ep=[-10;-10;-0;-0;-0;-10;-10;-10;-10;-10;-13;-13];
E=CLP+Ep
De=diag(E);
Xe=v*De*inv(v);
B=[ 0.0327 -0.0327  0.0327     -0.0327      0.0327      -0.0327;
    0       0       0.43*0.866  0.43*0.866 -0.43*0.866  -0.43*0.866;
    0.43    0.43   -0.43*0.5   -0.43*0.5   -0.43*0.5    -0.43*0.5;
    0.09    0.09    0.09        0.09        0.09         0.09;
    0       0       0.045*0.866 0.045*0.866 -0.045*0.866 -0.045*0.866;
    0.045   0.045  -0.045*0.5  -0.045*0.5  -0.045*0.5   -0.045*0.5;
    0 0 0 0 0 0;
    0 0 0 0 0 0;
    0 0 0 0 0 0;
    0 0 0 0 0 0;
    0 0 0 0 -1 1;  % substitution to avoid rank deficiency in Kf computation
    0 0 -1 1 0 0]; % substitution to avoid rank deficiency in Kf computation
Kf=-B\(Xe-A);
rank(Xe)
rank(Kf) % check for rank deficiency
Kf=abs(Kf);
K2= [ Kf(1,1) 0*Kf(1,2)  Kf(1,3) Kf(1,4) 0*Kf(1,5) -Kf(1,6)  Kf(1,7) 0*Kf(1,8)  Kf(1,9) Kf(1,10) 0*Kf(1,11) -Kf(1,12);
     -Kf(2,1) 0*Kf(2,2)  Kf(2,3) Kf(2,4) 0*Kf(2,5) -Kf(2,6) -Kf(2,7) 0*Kf(2,8)  Kf(2,9) Kf(2,10) 0*Kf(2,11) -Kf(2,12);
      Kf(3,1)   Kf(3,2) -Kf(3,3) Kf(3,4)  -Kf(3,5)  Kf(3,6)  Kf(3,7)   Kf(3,8) -Kf(3,9) Kf(3,10)  -Kf(3,11)  Kf(3,12);
     -Kf(4,1)   Kf(4,2) -Kf(4,3) Kf(4,4)  -Kf(4,5)  Kf(4,6) -Kf(4,7)   Kf(4,8) -Kf(4,9) Kf(4,10)  -Kf(4,11)  Kf(4,12);
      Kf(5,1)  -Kf(5,2) -Kf(5,3) Kf(5,4)   Kf(5,5)  Kf(5,6)  Kf(5,7)  -Kf(5,8) -Kf(5,9) Kf(5,10)   Kf(5,11)  Kf(5,12);
     -Kf(6,1)  -Kf(6,2) -Kf(6,3) Kf(6,4)   Kf(6,5)  Kf(6,6) -Kf(6,7)  -Kf(6,8) -Kf(6,9) Kf(6,10)   Kf(6,11)  Kf(6,12)];
Kf=K2
H=inv(R)*B'
P=H'*Kf; %or P=B*Kf; %same effect
D1=det(P(1,1))      % check for stability:
D2=det(P(1:2,1:2))  % a negative determinant indicates instability
D3=det(P(1:3,1:3))
D4=det(P(1:4,1:4))
D5=det(P(1:5,1:5))
D6=det(P(1:6,1:6))
D7=det(P(1:7,1:7))
D8=det(P(1:8,1:8))
D9=det(P(1:9,1:9))
D10=det(P(1:10,1:10))
D11=det(P(1:11,1:11))
D12=det(P)

In program P17.4, we compute the PD control matrix of the Tri-coaxial Copter by specifying the A, B, Q, R matrices as in Chap. 16. The theme of the program is to avoid rank deficiency in the robust controller Kf. The leading principal minors of the matrix solution 'P' are required to be positive for a stable system; however, the positive definiteness of the algebraic Riccati equation solution 'P' alone is not enough to prove the robustness of the system, so program P17.4 tackles the robustness directly. Depending on the arrangement of the propeller rotation directions, the primary principal minors 'D1'–'D4' may come out all negative; in other words, the six propellers' yawing directions affect the principal minors. If we reverse the rotation of each propeller, we can get the primary principal minors 'D1'–'D4' to be all positive. No matter which way we fix the propellers' rotation directions, as long as the yawing is balanced, we get the same PD control matrix from program P17.4. The resulting PD controller gain Kf is therefore a stable system with robustness. We collect the result of the program in Table 17.9 and, from there, fine-tune the matrix into Table 17.10. Here again, we can set our XYZ integral gains to be 10% of the proportional gain values.

Table 17.9 Robust PD control matrix

Table 17.10 Robust PD control matrix after fine tuning

Comparison Between Hexacopter, Quadcopter and Tri-coaxial Copter

In the simulation, we introduced a commanded input ZYX of 3, 10, 10, with full Gaussian noise perturbation at the 0.2 level. Without disturbance, the motor input stays steady at about 250 units.

The motor control inputs fluctuate at about the 250 level to stabilize the Tri-coaxial Copter under the simulated wind interference. The configuration of the motor-propellers in the Tri-coaxial Copter creates a very stable condition for the yawing of the platform: the result shows no yawing motion, as the yaw perturbation is subdued by the configuration. It thus inherits the advantage, over other types of rotor-copters, of complete control of its yawing movement without disturbance; the attitude of the Tri-coaxial Copter is only affected by its roll and pitch perturbations. For the controller derived by the intuitive method, when the Tri-coaxial Copter flies with estimated aerodynamic constants of thrust (b) and drag (d) at 1e−6 and 1e−8 units, respectively, it exerts more power from its controller: each motor input drives at the 450 level. Using the controller calculated by the LQR pole-placement method with the P17.1 aerodynamics computation at a flight angle of 30°, it requires only approximately 250 units to drive each of the motors. For the other types of rotorcraft, the intuitive controllers fly at the same low power consumption of 250 units. Therefore, only the Tri-coaxial Copter configuration deploying the intuitive control method draws a higher power consumption, which implies that the intuitive control method is impractical for the Tri-coaxial Copter configuration.

From testing flight at different tilting angles, we can deduce that at a higher tilted angular control, for example at 48.59°, the value of the aerodynamic thrust coefficient will be higher. The copter must input higher thrust due to the larger surface resistance force on the rotorcraft in the horizontal direction, which in turn increases the drag force; thus, the speed of the copter reduces, and it draws a higher power consumption to supply the higher thrust needed to counter the larger resistance force. In addition, it is not advisable to fly the rotorcraft at an angle higher than 45°, as it can overturn more easily. We conclude that angular flight larger than 30° is neither smoother nor economically practical.

The six-rotor copters, in comparison to the quadcopter and excluding the yaw motion, are slightly more aggressive in controlling their linear and rotational perturbations. In the Tri-coaxial copter, each motor exerts only 250 units to control the platform, whereas the Hexacopter exerts 255 units at its motor inputs, and the quadrotor powers 270 levels to each of its motors. Thus, the quadcopter configuration creates a tighter control in maneuvering to its target, as indicated in Figs. 17.5, 17.7, and 17.11 for the three different types of rotorcraft. We note that rotorcraft of smaller dimensions, size, and weight are not as robust as drones of larger size and weight. The Derbe quadrotor is an example of a smaller drone that flies under much lower wind disturbance than its counterpart. The smaller drone has the advantage of a lighter weight, which lets it fly with lower aerodynamic induction than its quadrotor counterpart at the same speed; however, it cannot take the large wind gusts experienced by the Hexacopter, Quadrotor, and Tri-coaxial copter in this chapter, and we need to fine-tune the controller to accommodate strong winds. The six-rotor drones fly at higher speeds and are subject to lower drag forces, so their drag coefficients are smaller, as illustrated in Table 17.1.

178

17 Aerial Copters

(a) Control Without Disturbance

(b) Robust Control

Fig. 17.11 Tri-coaxial copter attitude and position (a, b)

Effect of the Integral Gain The change in adjusting the integral term for the XY motion is as shown in Fig. 17.12. In the scenario with wind disturbance or with white Gaussian noise in the simulation, we reduce the integral gain for the X and Y travelling axes to 0.05% of the proportional gain value. There is a lower fluctuation for the XY axes of the Tri-Coaxial copter. The efforts of the motors in stabilizing the rotor copter remain fluctuating at about 450 units (see Fig. 17.10b). The settling time decreases as the rings of oscillations shortened throughout the simulation time.

17.5 Tri-coaxial Copter

179

Fig. 17.12 Effect of tuning the integral gain

References

1. https://william.thielicke.org/multirotors/copterspeed.m
2. https://recensionidroni.com/droni-aerodinamica-ecco-come-dovrebbe-essere-un-drone/
3. T.S. Ng, Rotorcrafts, in Flight Systems and Control. Springer Aerospace Technology (Springer, Singapore, 2018). https://doi.org/10.1007/978-981-10-8721-9_7

Chapter 18

Air Autonomous System

Innovations in air technology have brought aviation research into another new frontier. Academia, government, military, commercial, and industrial players have ventured into the ideas of new micro air technology, VTOL air taxis, next-generation air vehicles, and unmanned air systems. From micro flapping wings to the electric ride-sharing air car, they share the common idea of go-green mobility.

18.1 Micro Air Technology

For the next decade, new challenges in autonomous systems involve computing automation for navigating the surroundings by fly-by-wire control, digital adaptation maps, and joystick. At sea, the command and control system adjusts the speed and depth of the submarine under control. In the air, new hummingbird technology (Fig. 18.1) mimics the miniature flying bird: the control algorithm hovers and stabilizes the tiny mechanism. By implementing speech technology, the autonomous insect can understand language to activate its flying wings and control its direction of flight or hover. Without any vision integration, the artificial bird senses each touch, altering an electric current to change the course of its flight. Research on insect-like technology [1] is growing popular in universities worldwide. The DARPA program [2] involves research that taps deep into sensing, memory, signal processing, and actuation that mimic a biological insect; the insect's neural sensory system and physiology are studied. The new artificial intelligence computational hardware involves electromagnetics and chemistry for the neural sensor system, and the system satisfies SWaP-constrained requirements. Figure 18.2 shows a self-powered insect robot that resembles a mosquito. Six tiny solar cells weighing a total of 60 mg power the insect robot. The little UAV is 2.6 in. tall, weighs 259 mg, and has four wings with a wingspan of 1.4 in. that flap 170 times per second. The two pairs of wings are controlled by muscle-like plates that contract when a voltage is applied.




Fig. 18.1 The hummingbird (Credit Purdue University)

Fig. 18.2 The mosquito (Credit Harvard University)

FLIR has developed the Black Hornet (Fig. 18.3), a lightweight nano UAV weighing only 32 g. It is 168 mm long, flies at 21 km/h, and has a range of 2 km. The nano unmanned aerial vehicle has a flight duration of up to 25 min and can withstand up to 20 knots of wind turbulence. Equipped with electro-optical and thermal imaging sensors, it operates in smoke, fog, and even darkness to produce image snapshots and videos. It has the advantage of being deployable by soldiers in GPS-denied environments for surveillance and reconnaissance. The reconnaissance data sent to soldiers in the field provides immediate covert situational awareness.

Fig. 18.3 The black hornet (Credit FLIR)



18.2 VTOL Air Taxis

In the first few chapters, we saw autonomous robocars; now we move from self-driving cars to flying cars. The vertical take-off and landing electric multi-rotor vehicle is gaining popularity. A VTOL design not only omits the runway but also has electrically powered motors and saves energy by using its fixed wings for flight. Figure 18.4a shows an air taxi with eight rotors. It is a hybrid multi-rotor system with a fixed-wing aircraft: it tilts slightly when flying forward using the pair of fixed wings, while its multiple rotors help it perform sharp vertical or horizontal turns, or even stop in mid-air. Figure 18.4b shows an Airbus Vahana fixed-rotor hybrid VTOL in a test flight. Finally, we have the VTOL aircraft from Kitty Hawk, which works with twelve rotors and a pusher propeller for the fixed-wing vehicle. Volocopter GmbH flew its 18-rotor VTOL air taxi on a historic public flight on 25 September 2017 in Dubai. Singapore had the Volocopter tested at Seletar Airport [3]; it flew a distance of 1.5 km for 2 min at 40 m high, passing Singapore's Marina Bay (Fig. 18.5). Like land vehicles, these air vehicles are migrating to all-electric powered engines, named e-VTOL (Fig. 18.6). The electrical version of the VTOL (e-VTOL) can be 100 times quieter than a helicopter. Volocopter intends to expand its electric technology, passenger and payload capability, and flight system, including autonomous features in air taxis, to support urban air mobility. Although research and development are pursuing lighter battery-powered motor-prop systems for longer, quieter flights, we also have to address safety and reliability in the air system. Before these air taxis fill the sky, however, new laws must be established for sky commuters.

Fig. 18.4 Hybrid VTOL air taxis: (a) Credit Opener, (b) Credit Airbus, (c) Credit Kitty Hawk



Fig. 18.5 VTOL air taxi (Credit Volocopter)

Fig. 18.6 E-VTOL air taxis: (a) 4-rotor air vehicle (Credit Wind River), (b) 18-rotor air vehicle (Credit Volocopter), (c) 8-rotor flying car (Credit EHang)



Fig. 18.7 Flying car (Credit SkyDrive)

Figure 18.7 is the SD-03 model flying car from SkyDrive in Japan. It measures 2 m × 2 m × 4 m. The piloted eVTOL car is built with rotors for vertical take-off and landing, and it also has a computer-assisted control system that helps stabilize the aircraft and monitor its performance.

Fig. 18.8 Ducted fan UAV

Fig. 18.9 Next generation UAS (Credit FlightWave)



Fig. 18.10 E-VTOL air Vehicle (Credit Uber Technologies)

18.3 Unmanned Air System

Intelligent aerospace includes the ducted fan UAV for performing surveillance and reconnaissance in military missions. AeroVironment taps FlightWave Aerospace's vertical take-off technology for the next-generation tactical unmanned aerial system. Uber has developed an electric vertical-take-off-and-landing air vehicle. For an autonomous system, similar to in-flight certification, a certain Design Assurance Level (DAL) must be complied with. Designers must focus on attending to unpredictable external factors that result in system failure. For instance, in avionics, the system designer must ensure the flight control computer standard accords with DO-254/DO-178C DAL A. Table 18.1 applies to design assurance competency for land and air autonomous systems. Figure 18.11 shows an Arsenal airplane. In the military, the Arsenal airplane leads the way to the future air combat system. In today's air communication technology, the new 5G radio air interface, comprising the latest antenna technology with advanced coding and modulation techniques to establish a brand new framework for waveform communication, has increased its carrier bandwidth to 400 MHz.

Table 18.1 Design assurance level

DAL   System of failure                                                                              Passenger outcome        Level of danger
E     Internet, Entertainment, Infotainment                                                          Not bringing harm        Nil
D     Data recorder, Lamps and lightings, Data acquisition system                                    Some minor disturbance   Minor
C     Navigation unit, GPS, Environmental control system, Damper                                     Injuring passengers      Major
B     Auto throttle, Air landing system, Automated driving control, Display Unit                     Fatal injuries           Hazardous
A     Control computer, Remote control, Digital engine control, Autonomous system complete failure   Costing lives            Catastrophic



Fig. 18.11 Arsenal Airplane

UAS developers agree that unmanned air mobility must run several applications simultaneously in real time. They give priority to functional challenges like GPS, collision avoidance, and navigation before justifying safety-critical certification and running machine learning or artificial intelligence capabilities (Figs. 18.6, 18.7, 18.8, 18.9 and 18.10).

18.4 Go-Green Air Mobility

In the effort to go green, battery and hydrogen propulsion in aircraft are increasingly gaining popularity. The new turbofan design (Fig. 18.12a) achieves a zero-emission target for the green environment: powered by a liquid hydrogen tank driving the gas-turbine engine, the plane does not contaminate or emit any toxic gases into the surroundings. The hydrogen-fueled electric powertrain (Fig. 18.12b) reduces fuel consumption with improved efficiency and, at the same time, reduces maintenance cost. ZeroAvia finds applications for its powertrain in commercial aircraft, air taxis, rotorcraft, and cargo aircraft. Meanwhile, France unveiled the world's first electric airliner in June 2019. It travels 650 miles (1040 km) at a speed of 276 mph (440 km/h) at a height of 3000 m (10,000 ft) and, remarkably, runs on a single battery charge; it is expected to enter service in 3 years. With so much enthusiasm in the progress of electric airplanes, however, a mishap happened along the way: the Alpha Electro G2 electric airplane crashed in Norway in mid-August 2019 after losing power from its battery-powered engines while flying to the airport. The cause is under investigation. Nevertheless, research is looking at how to improve battery-powered

188

18 Air Autonomous System

Fig. 18.12 Hydrogen-Fueled Airplane (a, b)

(a) (Credit Airbus)

(b) (Credit ZeroAvia) Fig. 18.13 Autonomous Electric Air Car (Credit EmbraerX)

air mobility and re-adjust the capability of electric airplanes. Taking a step further, the autonomous electric-ride sharing air car (Fig. 18.13) is coming your way in the near future.
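As a quick consistency check on the electric-airliner figures quoted above, the short Python sketch below converts the stated units and derives the implied endurance, roughly 2.4 hours of flight on a single battery charge. The constants are standard unit conversions, not manufacturer data.

MILES_TO_KM = 1.609344
FEET_TO_M = 0.3048

range_km = 650 * MILES_TO_KM         # ~1046 km, matching the quoted 1040 km
cruise_kmh = 276 * MILES_TO_KM       # ~444 km/h, matching the quoted 440 km/h
altitude_m = 10_000 * FEET_TO_M      # ~3048 m, matching the quoted 3000 m

endurance_h = range_km / cruise_kmh  # ~2.4 h aloft on one charge

print(f"range {range_km:.0f} km, cruise {cruise_kmh:.0f} km/h, "
      f"altitude {altitude_m:.0f} m, endurance {endurance_h:.1f} h")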

