MATLAB Automated Driving Toolbox User's Guide



Table of Contents
Sensor Configuration and Coordinate System Transformations
Coordinate Systems in Automated Driving Toolbox
World Coordinate System
Vehicle Coordinate System
Sensor Coordinate System
Spatial Coordinate System
Pattern Coordinate System
Calibrate a Monocular Camera
Estimate Intrinsic Parameters
Place Checkerboard for Extrinsic Parameter Estimation
Estimate Extrinsic Parameters
Configure Camera Using Intrinsic and Extrinsic Parameters
Ground Truth Labeling and Verification
Get Started with the Ground Truth Labeler
Load Ground Truth Signals to Label
Load Timestamps
Open Ground Truth Labeler App
Load Signals from Data Sources
Configure Signal Display
Label Ground Truth for Multiple Signals
Create Label Definitions
Label Video Using Automation
Label Point Cloud Sequence Using Automation
Label with Sublabels and Attributes Manually
Label Scene Manually
View Label Summary
Save App Session
Export and Explore Ground Truth Labels for Multiple Signals
Sources vs. Signals in Ground Truth Labeling
Keyboard Shortcuts and Mouse Actions for Ground Truth Labeler
Label Definitions
Frame Navigation and Time Interval Settings
Labeling Window
Cuboid Resizing and Moving
Polyline Drawing
Polygon Drawing
Zooming, Panning, and Rotating
App Sessions
Control Playback of Signal Frames for Labeling
Signal Frames
Master Signal
Change Master Signal
Display All Timestamps
Specify Timestamps
Frame Display and Automation
Label Lidar Point Clouds for Object Detection
Set Up Lidar Point Cloud Labeling
Zoom, Pan, and Rotate Frame
Hide Ground
Label Cuboid
Modify Cuboid Label
Apply Cuboids to Multiple Frames
Configure Display
Create Class for Loading Custom Ground Truth Data Sources
Custom Class Folder
Class Definition
Class Properties
Method to Customize Load Panel
Methods to Get Load Panel Data and Load Data Source
Method to Read Frames
Use Predefined Data Source Classes
Tracking and Sensor Fusion
Visualize Sensor Data and Tracks in Bird's-Eye Scope
Open Model and Scope
Find Signals
Run Simulation
Organize Signal Groups (Optional)
Update Model and Rerun Simulation
Save and Close Model
Linear Kalman Filters
State Equations
Measurement Models
Linear Kalman Filter Equations
Filter Loop
Constant Velocity Model
Constant Acceleration Model
Extended Kalman Filters
State Update Model
Measurement Model
Extended Kalman Filter Loop
Predefined Extended Kalman Filter Functions
Planning, Mapping, and Control
Display Data on OpenStreetMap Basemap
Read and Visualize HERE HD Live Map Data
Enter Credentials
Configure Reader to Search Specific Catalog
Create Reader for Specific Map Tiles
Read Map Layer Data
Visualize Map Layer Data
HERE HD Live Map Layers
Road Centerline Model
HD Lane Model
HD Localization Model
Rotations, Orientations, and Quaternions for Automated Driving
Quaternion Format
Quaternion Creation
Quaternion Math
Extract Quaternions from Transformation Matrix
Control Vehicle Velocity
Velocity Profile of Straight Path
Velocity Profile of Path with Curve and Direction Change
Cuboid Driving Scenario Simulation
Create Driving Scenario Interactively and Generate Synthetic Sensor Data
Create Driving Scenario
Add a Road
Add Lanes
Add Barriers
Add Vehicles
Add a Pedestrian
Add Sensors
Generate Synthetic Sensor Data
Save Scenario
Keyboard Shortcuts and Mouse Actions for Driving Scenario Designer
Canvas Operations
Road Operations
Actor Operations
Preview Actor Times of Arrival
Barrier Placement Operations
Sensor Operations
File Operations
Prebuilt Driving Scenarios in Driving Scenario Designer
Choose a Prebuilt Scenario
Modify Scenario
Generate Synthetic Sensor Data
Save Scenario
Euro NCAP Driving Scenarios in Driving Scenario Designer
Choose a Euro NCAP Scenario
Modify Scenario
Generate Synthetic Detections
Save Scenario
Cuboid Versions of 3D Simulation Scenes in Driving Scenario Designer
Choose 3D Simulation Scenario
Modify Scenario
Save Scenario
Recreate Scenario in Simulink for 3D Environment
Create Reverse Motion Driving Scenarios Interactively
Three-Point Turn Scenario
Add Road
Add Vehicle
Add Trajectory
Run Simulation
Adjust Trajectory Using Specified Yaw Values
Generate INS Sensor Measurements from Interactive Driving Scenario
Import Road Network
Add Actor and Trajectory
Smooth the Trajectory
Add INS Sensor
Simulate Scenario
Export and Explore Sensor Data
Import OpenDRIVE Roads into Driving Scenario
Import OpenDRIVE File
Inspect Roads
Add Actors and Sensors to Scenario
Generate Synthetic Detections
Save Scenario
Export Driving Scenario to OpenDRIVE File
Load Scenario File
Export to OpenDRIVE
Inspect Exported Scenario
Limitations
Import HERE HD Live Map Roads into Driving Scenario
Set Up HERE HDLM Credentials
Specify Geographic Coordinates
Select Region Containing Roads
Select Roads to Import
Import Roads
Compare Imported Roads Against Map Data
Save Scenario
Import OpenStreetMap Data into Driving Scenario
Select OpenStreetMap File
Select Roads to Import
Import Roads
Compare Imported Roads Against Map Data
Save Scenario
Import Zenrin Japan Map API 3.0 (Itsumo NAVI API 3.0) into Driving Scenario
Set Up Zenrin Japan Map API 3.0 (Itsumo NAVI API 3.0) Credentials
Specify Geographic Coordinates
Select Region Containing Roads
Select Roads to Import
Import Roads
Compare Imported Roads Against Map Data
Save Scenario
Create Driving Scenario Variations Programmatically
Generate Sensor Blocks Using Driving Scenario Designer
Test Open-Loop ADAS Algorithm Using Driving Scenario
Test Closed-Loop ADAS Algorithm Using Driving Scenario
Automate Control of Intelligent Vehicles by Using Stateflow Charts
3D Simulation – User's Guide
Unreal Engine Simulation for Automated Driving
Unreal Engine Simulation Blocks
Algorithm Testing and Visualization
Unreal Engine Simulation Environment Requirements and Limitations
Software Requirements
Minimum Hardware Requirements
Limitations
How Unreal Engine Simulation for Automated Driving Works
Communication with 3D Simulation Environment
Block Execution Order
Coordinate Systems for Unreal Engine Simulation in Automated Driving Toolbox
World Coordinate System
Vehicle Coordinate System
Choose a Sensor for Unreal Engine Simulation
Simulate Simple Driving Scenario and Sensor in Unreal Engine Environment
Depth and Semantic Segmentation Visualization Using Unreal Engine Simulation
Visualize Sensor Data from Unreal Engine Simulation Environment
Customize Unreal Engine Scenes for Automated Driving
Install Support Package for Customizing Scenes
Verify Software and Hardware Requirements
Install Support Package
Set Up Scene Customization Using Support Package
Customize Scenes Using Simulink and Unreal Editor
Open Unreal Editor from Simulink
Reparent Actor Blueprint
Create or Modify Scenes in Unreal Editor
Run Simulation
Package Custom Scenes into Executable
Package Scene into Executable Using Unreal Editor
Simulate Scene from Executable in Simulink
Apply Semantic Segmentation Labels to Custom Scenes
Create Top-Down Map of Unreal Engine Scene
Capture Screenshot
Convert Screenshot to Map
Place Cameras on Actors in the Unreal Editor
Place Camera on Static Actor
Place Camera on Vehicle in Custom Project
Prepare Custom Vehicle Mesh for the Unreal Editor
Step 1: Setup Bone Hierarchy
Step 2: Assign Materials
Step 3: Export Mesh and Armature
Step 4: Import Mesh to Unreal Editor
Step 5: Set Block Parameters
Featured Examples
Configure Monocular Fisheye Camera
Annotate Video Using Detections in Vehicle Coordinates
Automate Ground Truth Labeling Across Multiple Signals
Automate Ground Truth Labeling of Lane Boundaries
Automate Ground Truth Labeling for Semantic Segmentation
Automate Attributes of Labeled Objects
Evaluate Lane Boundary Detections Against Ground Truth Data
Evaluate and Visualize Lane Boundary Detections Against Ground Truth
Visual Perception Using Monocular Camera
Create 360° Bird's-Eye-View Image Around a Vehicle
Train a Deep Learning Vehicle Detector
Ground Plane and Obstacle Detection Using Lidar
Code Generation for Tracking and Sensor Fusion
Forward Collision Warning Using Sensor Fusion
Adaptive Cruise Control with Sensor Fusion
Forward Collision Warning Application with CAN FD and TCP/IP
Multiple Object Tracking Tutorial
Track Multiple Vehicles Using a Camera
Track Vehicles Using Lidar: From Point Cloud to Track List
Sensor Fusion Using Synthetic Radar and Vision Data
Sensor Fusion Using Synthetic Radar and Vision Data in Simulink
Autonomous Emergency Braking with Sensor Fusion
Visualize Sensor Coverage, Detections, and Tracks
Extended Object Tracking of Highway Vehicles with Radar and Camera
Track-to-Track Fusion for Automotive Safety Applications
Track-to-Track Fusion for Automotive Safety Applications in Simulink
Visual-Inertial Odometry Using Synthetic Data
Lane Following Control with Sensor Fusion and Lane Detection
Track-Level Fusion of Radar and Lidar Data
Track-Level Fusion of Radar and Lidar Data in Simulink
Track Vehicles Using Lidar Data in Simulink
Grid-based Tracking in Urban Environments Using Multiple Lidars
Track Multiple Lane Boundaries with a Global Nearest Neighbor Tracker
Generate Code for a Track Fuser with Heterogeneous Source Tracks
Highway Vehicle Tracking with Multipath Radar Reflections
Scenario Generation from Recorded Vehicle Data
Lane Keeping Assist with Lane Detection
Model Radar Sensor Detections
Model Vision Sensor Detections
Radar Signal Simulation and Processing for Automated Driving
Simulate Radar Ghosts due to Multipath Return
Create Driving Scenario Programmatically
Create Actor and Vehicle Trajectories Programmatically
Define Road Layouts Programmatically
Automated Parking Valet
Automated Parking Valet in Simulink
Highway Trajectory Planning Using Frenet Reference Path
Motion Planning in Urban Environments Using Dynamic Occupancy Grid Map
Code Generation for Path Planning and Vehicle Control
Use HERE HD Live Map Data to Verify Lane Configurations
Localization Correction Using Traffic Sign Data from HERE HD Maps
Build a Map from Lidar Data
Build a Map from Lidar Data Using SLAM
Create Occupancy Grid Using Monocular Camera and Semantic Segmentation
Lateral Control Tutorial
Highway Lane Change
Design Lane Marker Detector Using Unreal Engine Simulation Environment
Select Waypoints for Unreal Engine Simulation
Visualize Automated Parking Valet Using Unreal Engine Simulation
Simulate Vision and Radar Sensors in Unreal Engine Environment
Highway Lane Following
Automate Testing for Highway Lane Following
Traffic Light Negotiation
Design Lidar SLAM Algorithm Using Unreal Engine Simulation Environment
Lidar Localization with Unreal Engine Simulation
Develop Visual SLAM Algorithm Using Unreal Engine Simulation
Automatic Scenario Generation
Highway Lane Following with RoadRunner Scene
Traffic Light Negotiation with Unreal Engine Visualization
Generate Code for Lane Marker Detector
Highway Lane Following with Intelligent Vehicles
Forward Vehicle Sensor Fusion
Generate Code for Vision Vehicle Detector
Automate Testing for Lane Marker Detector
Generate Code for Highway Lane Following Controller
Automate Testing for Highway Lane Following Controls and Sensor Fusion
Generate Code for Highway Lane Change Planner
Surround Vehicle Sensor Fusion
Build Occupancy Map from 3-D Lidar Data Using SLAM

Automated Driving Toolbox™ User’s Guide

R2021a

How to Contact MathWorks

Latest news:        www.mathworks.com
Sales and services: www.mathworks.com/sales_and_services
User community:     www.mathworks.com/matlabcentral
Technical support:  www.mathworks.com/support/contact_us
Phone:              508-647-7000

The MathWorks, Inc.
1 Apple Hill Drive
Natick, MA 01760-2098

Automated Driving Toolbox™ User's Guide

© COPYRIGHT 2017–2021 by The MathWorks, Inc.

The software described in this document is furnished under a license agreement. The software may be used or copied only under the terms of the license agreement. No part of this manual may be photocopied or reproduced in any form without prior written consent from The MathWorks, Inc.

FEDERAL ACQUISITION: This provision applies to all acquisitions of the Program and Documentation by, for, or through the federal government of the United States. By accepting delivery of the Program or Documentation, the government hereby agrees that this software or documentation qualifies as commercial computer software or commercial computer software documentation as such terms are used or defined in FAR 12.212, DFARS Part 227.72, and DFARS 252.227-7014. Accordingly, the terms and conditions of this Agreement and only those rights specified in this Agreement, shall pertain to and govern the use, modification, reproduction, release, performance, display, and disclosure of the Program and Documentation by the federal government (or other entity acquiring for or through the federal government) and shall supersede any conflicting contractual terms or conditions. If this License fails to meet the government's needs or is inconsistent in any respect with federal procurement law, the government agrees to return the Program and Documentation, unused, to The MathWorks, Inc.

Trademarks

MATLAB and Simulink are registered trademarks of The MathWorks, Inc. See www.mathworks.com/trademarks for a list of additional trademarks. Other product or brand names may be trademarks or registered trademarks of their respective holders.

Patents

MathWorks products are protected by one or more U.S. patents. Please see www.mathworks.com/patents for more information.

Revision History

March 2017      Online only    New for Version 1.0 (Release 2017a)
September 2017  Online only    Revised for Version 1.1 (Release 2017b)
March 2018      Online only    Revised for Version 1.2 (Release 2018a)
September 2018  Online only    Revised for Version 1.3 (Release 2018b)
March 2019      Online only    Revised for Version 2.0 (Release 2019a)
September 2019  Online only    Revised for Version 3.0 (Release 2019b)
March 2020      Online only    Revised for Version 3.1 (Release 2020a)
September 2020  Online only    Revised for Version 3.2 (Release 2020b)
March 2021      Online only    Revised for Version 3.3 (Release 2021a)


1 Sensor Configuration and Coordinate System Transformations

• “Coordinate Systems in Automated Driving Toolbox” on page 1-2
• “Calibrate a Monocular Camera” on page 1-9


Coordinate Systems in Automated Driving Toolbox

Automated Driving Toolbox uses these coordinate systems:

• World: A fixed universal coordinate system in which all vehicles and their sensors are placed.
• Vehicle: Anchored to the ego vehicle. Typically, the vehicle coordinate system is placed on the ground right below the midpoint of the rear axle.
• Sensor: Specific to a particular sensor, such as a camera or a radar.
• Spatial: Specific to an image captured by a camera. Locations in spatial coordinates are expressed in units of pixels.
• Pattern: A checkerboard pattern coordinate system, typically used to calibrate camera sensors.

These coordinate systems apply across Automated Driving Toolbox functionality, from perception to control to driving scenario simulation. For information on specific differences and implementation details in the 3D simulation environment using the Unreal Engine® from Epic Games®, see “Coordinate Systems for Unreal Engine Simulation in Automated Driving Toolbox” on page 6-11.

World Coordinate System

All vehicles, sensors, and their related coordinate systems are placed in the world coordinate system. A world coordinate system is important in global path planning, localization, mapping, and driving scenario simulation. Automated Driving Toolbox uses the right-handed Cartesian world coordinate system defined in ISO 8855, where the Z-axis points up from the ground. Units are in meters.
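As one illustration, in a programmatic cuboid driving scenario, road centers and actor positions are specified directly in world coordinates. The following is a minimal sketch; the waypoint and position values are illustrative, not taken from this guide:

    % Create a driving scenario; all positions are in ISO 8855 world
    % coordinates, in meters, with the Z-axis pointing up.
    scenario = drivingScenario;

    % Define a straight 50 m road segment by its center waypoints [x y z].
    road(scenario, [0 0 0; 50 0 0]);

    % Place the ego vehicle 10 m along the road, on the ground plane (z = 0).
    egoVehicle = vehicle(scenario, 'Position', [10 0 0]);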

Vehicle Coordinate System

The vehicle coordinate system (XV, YV, ZV) used by Automated Driving Toolbox is anchored to the ego vehicle. The term ego vehicle refers to the vehicle that contains the sensors that perceive the environment around the vehicle.

• The XV axis points forward from the vehicle.
• The YV axis points to the left, as viewed when facing forward.
• The ZV axis points up from the ground to maintain the right-handed coordinate system.

The vehicle coordinate system follows the ISO 8855 convention for rotation. Each axis is positive in the clockwise direction, when looking in the positive direction of that axis.


In most Automated Driving Toolbox functionality, such as cuboid driving scenario simulations and visual perception algorithms, the origin of the vehicle coordinate system is on the ground, below the midpoint of the rear axle. In 3D driving scenario simulations, the origin is on the ground, below the longitudinal and lateral center of the vehicle. For more details, see “Coordinate Systems for Unreal Engine Simulation in Automated Driving Toolbox” on page 6-11.

Locations in the vehicle coordinate system are expressed in world units, typically meters. Values returned by individual sensors are transformed into the vehicle coordinate system so that they can be placed in a unified frame of reference.

For global path planning, localization, mapping, and driving scenario simulation, the state of the vehicle can be described using the pose of the vehicle. The steering angle of the vehicle is positive in the counterclockwise direction.
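To make the sensor-to-vehicle transformation concrete, a point reported in a sensor's own coordinate system can be mapped into the vehicle frame with a rotation followed by a translation. This minimal sketch assumes an illustrative mounting location and yaw angle for a forward-facing sensor; these values are not from this guide:

    % Point detected by the sensor, in sensor coordinates [x; y; z], meters.
    ptSensor = [12.0; 1.5; 0.2];

    % Assumed sensor mounting: 3.7 m forward of the rear axle, 1.1 m up,
    % rotated 5 degrees (yaw) relative to the vehicle X-axis.
    mountLoc = [3.7; 0; 1.1];
    yaw = deg2rad(5);

    % Rotation about the vehicle Z-axis (ISO 8855, right-handed).
    Rz = [cos(yaw) -sin(yaw) 0;
          sin(yaw)  cos(yaw) 0;
          0         0        1];

    % Rotate into the vehicle axes, then translate by the mounting location.
    ptVehicle = Rz * ptSensor + mountLoc;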


Sensor Coordinate System

An automated driving system can contain sensors located anywhere on or in the vehicle. The location of each sensor defines the origin of its coordinate system. A camera is one type of sensor used often in an automated driving system. Points represented in a camera coordinate system are described with the origin located at the optical center of the camera.

The yaw, pitch, and roll angles of sensors follow an ISO convention. These angles have positive clockwise directions when looking in the positive direction of the Z-, Y-, and X-axes, respectively.
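Because a positive angle is clockwise when viewed along the positive direction of its axis, it corresponds to a standard right-handed rotation about that axis. As a minimal sketch, a sensor's rotation matrix can be assembled from its yaw, pitch, and roll angles; the ZYX composition order (yaw, then pitch, then roll) used here is an assumption, since this passage does not state the order:

    % Yaw, pitch, and roll of a sensor, in radians (illustrative values).
    yaw = deg2rad(10); pitch = deg2rad(-2); roll = deg2rad(1);

    % Right-handed elemental rotations about the Z-, Y-, and X-axes.
    Rz = [cos(yaw) -sin(yaw) 0; sin(yaw) cos(yaw) 0; 0 0 1];
    Ry = [cos(pitch) 0 sin(pitch); 0 1 0; -sin(pitch) 0 cos(pitch)];
    Rx = [1 0 0; 0 cos(roll) -sin(roll); 0 sin(roll) cos(roll)];

    % Compose as yaw, then pitch, then roll (ZYX order).
    R = Rz * Ry * Rx;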


Spatial Coordinate System

Spatial coordinates enable you to specify a location in an image with greater granularity than pixel coordinates. In the pixel coordinate system, a pixel is treated as a discrete unit, uniquely identified by an integer row and column pair, such as (3,4). In the spatial coordinate system, locations in an image are represented in terms of partial pixels, such as (3.3,4.7).

For more information on the spatial coordinate system, see “Spatial Coordinates”.

Pattern Coordinate System

To estimate the parameters of a monocular camera sensor, a common technique is to calibrate the camera using multiple images of a calibration pattern, such as a checkerboard. In the pattern coordinate system, (XP, YP), the XP-axis points to the right and the YP-axis points down. The checkerboard origin is the bottom-right corner of the top-left square of the checkerboard.

Each checkerboard corner represents another point in the coordinate system. For example, the corner to the right of the origin is (1,0) and the corner below the origin is (0,1). For more information on calibrating a camera by using a checkerboard pattern, see “Calibrate a Monocular Camera” on page 1-9.
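One way to make these pattern coordinates concrete is to generate checkerboard corner locations programmatically with the Computer Vision Toolbox generateCheckerboardPoints function. A minimal sketch, with an assumed board size and square size:

    % Corner locations for a checkerboard with 7-by-10 squares, where each
    % square is 25 mm on a side. boardSize counts squares, so the function
    % returns (7-1)*(10-1) = 54 interior corners.
    squareSize = 25; % millimeters
    worldPoints = generateCheckerboardPoints([7 10], squareSize);

    % The first corner is the pattern origin (0,0); X increases to the
    % right and Y increases downward, in the same units as squareSize.
    disp(worldPoints(1:3, :))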

See Also

More About

• “Coordinate Systems for Unreal Engine Simulation in Automated Driving Toolbox” on page 6-11
• “Coordinate Systems in Vehicle Dynamics Blockset” (Vehicle Dynamics Blockset)
• “Coordinate Systems”
• “Image Coordinate Systems”
• “Calibrate a Monocular Camera” on page 1-9


Calibrate a Monocular Camera

A monocular camera is a common type of vision sensor used in automated driving applications. When mounted on an ego vehicle, this camera can detect objects, detect lane boundaries, and track objects through a scene.

Before you can use the camera, you must calibrate it. Camera calibration is the process of estimating the intrinsic and extrinsic parameters of a camera using images of a calibration pattern, such as a checkerboard. After you estimate the intrinsic and extrinsic parameters, you can use them to configure a model of a monocular camera.

Estimate Intrinsic Parameters

The intrinsic parameters of a camera are the properties of the camera, such as its focal length and optical center. To estimate these parameters for a monocular camera, use Computer Vision Toolbox™ functions and images of a checkerboard pattern.

• If the camera has a standard lens, use the estimateCameraParameters function.
• If the camera has a fisheye lens, use the estimateFisheyeParameters function.

Alternatively, to better visualize the results, use the Camera Calibrator app. For information on setting
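As a rough sketch of the standard-lens workflow described above, the following code estimates intrinsics from checkerboard images and then uses them to configure a monoCamera sensor model. The file names, square size, and mounting height are illustrative assumptions, not values from this guide:

    % Checkerboard calibration images (illustrative file names).
    imageFiles = {'calib01.png', 'calib02.png', 'calib03.png'};

    % Detect checkerboard corner locations in the images.
    [imagePoints, boardSize] = detectCheckerboardPoints(imageFiles);

    % Corresponding pattern coordinates for a 25 mm checkerboard square.
    squareSize = 0.025; % meters
    worldPoints = generateCheckerboardPoints(boardSize, squareSize);

    % Estimate the camera intrinsics (per-image extrinsics come along).
    I = imread(imageFiles{1});
    camParams = estimateCameraParameters(imagePoints, worldPoints, ...
        'ImageSize', [size(I, 1), size(I, 2)]);

    % Configure a monocular camera model with the estimated intrinsics,
    % mounted 1.5 m above the ground (an assumed mounting height).
    sensor = monoCamera(camParams.Intrinsics, 1.5);

In practice, use 10 to 20 images of the pattern at varied positions and orientations for a reliable estimate.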