Resilient Perception for Robots in Challenging Environments: day & night, in smoke, dust clouds or inside a knee

old_uid | 12000 |
---|
title | Resilient Perception for Robots in Challenging Environments: day & night, in smoke, dust clouds or inside a knee |
---|
start_date | 2016/07/08 |
---|
schedule | 14h |
---|
online | no |
---|
summary | Long-term autonomy in robotics requires perception systems that are resilient to unusual but
realistic conditions that will eventually occur during extended missions. For example, unmanned ground
vehicles (UGVs) need to be capable of operating safely in adverse and low-visibility conditions, such as at
night, in the presence of smoke, on deformable terrain, or in vegetated environments. A
key to a resilient UGV perception system lies in the intelligent combination of multiple sensing modalities,
e.g. operating at different frequencies of the electromagnetic spectrum, to compensate for the limitations
of any single sensor type. For example, we show that by augmenting LIDAR-based traversability maps with
ultra-wideband (UWB) RADAR data we can enhance obstacle detection in vegetated environments, where
vegetation is often mistakenly interpreted as an obstacle by state-of-the-art obstacle detection techniques.
However, since distinct sensing modalities may react differently to certain materials or environmental
conditions, they may detect different targets even when they are spatially aligned. This can lead to catastrophic
fusion, where the outcome of standard Bayesian data fusion may actually be of lower quality than the individual
representations obtained using a single source of information. Therefore, we propose a new method to
reliably fuse data acquired by distinct sensing modalities, e.g. a LIDAR and a RADAR, including in situations
where they detect different targets, thereby providing "conflicting data". The method automatically identifies
conflicting data and produces accurate continuous representations of objects in the environment, with
uncertainty, using a machine learning technique called Gaussian Process Implicit Surfaces.
If that is of interest, we may also discuss recent developments in our research on experimental learning for
traversability estimation and stochastic motion planning for a UGV. This research comprises two main
components: 1) a near-to-far learning approach for estimating terrain traversability in the presence of
occlusions and deformable terrain, and 2) a method to learn stochastic mobility prediction models for
planning with control uncertainty on unstructured terrain. |
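To make the "catastrophic fusion" failure mode concrete, here is a minimal, hypothetical 1-D sketch (not from the talk; all numbers are invented): two Gaussian range estimates that actually describe different targets are combined by standard inverse-variance Bayesian fusion, and the fused estimate ends up highly confident yet placed where neither sensor saw anything.

```python
import numpy as np

# Hypothetical 1-D scenario: the LIDAR returns a range to vegetation it cannot
# see through, while the UWB RADAR penetrates the vegetation and returns a
# range to a rock behind it. Both estimates are modelled as Gaussians.
mu_lidar, var_lidar = 5.0, 0.2**2   # vegetation at 5.0 m
mu_radar, var_radar = 9.0, 0.3**2   # rock at 9.0 m

# Standard Bayesian fusion of two Gaussians (inverse-variance weighting).
var_fused = 1.0 / (1.0 / var_lidar + 1.0 / var_radar)
mu_fused = var_fused * (mu_lidar / var_lidar + mu_radar / var_radar)
sigma_fused = np.sqrt(var_fused)

# The fused estimate is more "confident" than either sensor (smaller variance),
# yet both actual targets lie several standard deviations away from its mean:
# a lower-quality result than either single-sensor estimate.
print(f"fused: {mu_fused:.2f} m +/- {sigma_fused:.2f} m")
```

This is why the proposed method must first detect that the two modalities are reporting conflicting data before any fusion is attempted.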
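The abstract attributes the continuous, uncertainty-aware object representation to Gaussian Process Implicit Surfaces without giving details. The core idea can be sketched in one dimension as generic GP regression over signed-distance-style observations: the posterior mean's zero crossing locates the surface, and the posterior variance flags where the model is uncertain. The data and hyperparameters below are invented for illustration; this is not the authors' implementation.

```python
import numpy as np

def rbf(a, b, ell=1.0, sf=1.0):
    """Squared-exponential covariance between 1-D point sets a and b."""
    return sf**2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

# Invented 1-D training data: observations positive outside an object,
# negative inside, and zero at its boundary (here at x = 2).
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 1.0, 0.0, -1.0, -2.0])

noise = 1e-2                              # observation noise variance
K = rbf(X, X) + noise * np.eye(len(X))    # training covariance
Xq = np.linspace(0.0, 4.0, 9)             # query points
Kq = rbf(Xq, X)                           # query/training cross-covariance

# Standard GP posterior: the mean gives a continuous surface estimate,
# the variance gives the per-point uncertainty of that estimate.
alpha = np.linalg.solve(K, y)
mean = Kq @ alpha
var = rbf(Xq, Xq).diagonal() - np.sum(Kq * np.linalg.solve(K, Kq.T).T, axis=1)

# The posterior mean crosses zero near x = 2: the implicit surface.
```

In the fusion setting, observations from both modalities would feed one such model, and the posterior variance provides the uncertainty the abstract mentions.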
---|
responsibles | Baumard |
---|