Multimodal Perception and Secure State Estimation for Robotic Mobility Platforms
Rui Jiang, Xinghua Liu, Badong Chen, Shuzhi Sam Ge
Enables readers to understand important new trends in multimodal perception for mobile robotics.

This book provides a novel perspective on secure state estimation and multimodal perception for robotic mobility platforms such as autonomous vehicles. It thoroughly evaluates filter-based secure dynamic pose estimation approaches for autonomous vehicles under multiple attack signals and shows that they outperform conventional Kalman filter results. As a modern learning resource, it contains extensive simulation and experimental results that have been implemented on various models and real platforms. To aid reader comprehension, detailed and illustrative examples of algorithm implementation and performance evaluation are also presented.

Written by four qualified authors in the field, the book covers sample topics including:

- Secure state estimation that focuses on system robustness under cyber-attacks (see the illustrative sketch following this description)
- Multi-sensor fusion that improves system performance by exploiting the complementary characteristics of different sensors
- A geometric pose estimation framework that incorporates measurements and constraints into a unified fusion scheme, validated using public and self-collected data
- How to achieve real-time road-constrained and heading-assisted pose estimation

This book will appeal to graduate-level students and professionals in the fields of ground vehicle pose estimation and perception who are looking for modern and updated insight into key concepts in robotic mobility platforms.
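To make the secure state estimation topic concrete, the sketch below shows one common idea in this area: a Kalman filter whose measurement update is gated by a chi-square test on the innovation, so that measurements corrupted by an injected attack signal are rejected rather than fused. This is a minimal illustrative example, not the method developed in the book; the motion model, noise covariances, threshold, and attack profile are all assumed for demonstration.

```python
# Hypothetical sketch: Kalman filter with chi-square innovation gating to
# reject attacked (spoofed) measurements. Parameters below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model (assumed)
H = np.array([[1.0, 0.0]])              # position-only measurement (assumed)
Q = 0.01 * np.eye(2)                    # process noise covariance (assumed)
R = np.array([[0.25]])                  # measurement noise covariance (assumed)
gate = 6.63                             # chi-square threshold, 1 dof, ~99% quantile

x_true = np.array([0.0, 1.0])           # true [position, velocity]
x_est = np.array([0.0, 0.0])            # filter estimate
P = np.eye(2)                           # estimate covariance

for k in range(200):
    # Simulate the true state and a measurement; inject a bias attack mid-run.
    x_true = F @ x_true + rng.multivariate_normal(np.zeros(2), Q)
    z = H @ x_true + rng.multivariate_normal(np.zeros(1), R)
    if 80 <= k < 120:
        z = z + 5.0                     # spoofed measurement offset (assumed attack)

    # Prediction step.
    x_est = F @ x_est
    P = F @ P @ F.T + Q

    # Innovation and its covariance.
    y = z - H @ x_est
    S = H @ P @ H.T + R
    d2 = float(y.T @ np.linalg.inv(S) @ y)   # normalized innovation squared

    # Gated update: skip the correction when the innovation is implausibly large.
    if d2 <= gate:
        K = P @ H.T @ np.linalg.inv(S)
        x_est = x_est + K @ y
        P = (np.eye(2) - K @ H) @ P

print("final position error:", abs(x_true[0] - x_est[0]))
```

In this sketch the gate simply drops suspicious measurements; filter-based secure estimators of the kind the book evaluates typically go further, but the residual test conveys the basic mechanism of detecting and excluding attacked sensor data.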