

Tracking objects with point clouds from vision and touch

Gregory Izatt, Geronimo Mirano, Edward Adelson, Russ Tedrake


Abstract

We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. GelSight can be treated as a source of dense local geometric information, which we incorporate directly into a conventional point-cloud-based articulated object tracker based on signed-distance functions. Our implementation runs at 12 Hz using an online depth reconstruction algorithm for GelSight and a modified second-order update for the tracking algorithm. We present data from hardware experiments demonstrating that the addition of contact-based geometric information significantly improves the pose accuracy during contact, and provides robustness to occlusions of small objects by the robot's end effector.
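
The core idea can be illustrated with a small sketch: returns from both the depth camera and the GelSight sensor are treated as 3D points and scored against the object's signed-distance function (SDF), and the pose is refined with a Gauss-Newton (second-order) step. The Python sketch below is purely illustrative and is not the authors' implementation; it assumes a known spherical object SDF and estimates translation only, and names such as gauss_newton_translation are hypothetical.

import numpy as np

def sphere_sdf(p, radius=0.05):
    # Signed distance from points p (N, 3) to a sphere at the origin.
    return np.linalg.norm(p, axis=1) - radius

def sdf_gradient(p, eps=1e-6):
    # Central-difference gradient of the SDF at each point, shape (N, 3).
    g = np.zeros_like(p)
    for i in range(3):
        d = np.zeros(3)
        d[i] = eps
        g[:, i] = (sphere_sdf(p + d) - sphere_sdf(p - d)) / (2.0 * eps)
    return g

def gauss_newton_translation(points, t0, iters=10):
    # Refine the object translation t so that sdf(points - t) is driven to zero.
    t = t0.copy()
    for _ in range(iters):
        q = points - t
        r = sphere_sdf(q)        # residual: SDF value at each fused point
        J = -sdf_gradient(q)     # Jacobian of the residual w.r.t. t
        # Second-order (Gauss-Newton) step: minimize ||r + J dt||^2.
        dt, *_ = np.linalg.lstsq(J, -r, rcond=None)
        t = t + dt
    return t

rng = np.random.default_rng(0)
true_t = np.array([0.10, -0.02, 0.30])

# Simulated camera points: noisy samples over the whole sphere surface.
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
camera_pts = true_t + 0.05 * dirs + 0.002 * rng.normal(size=(200, 3))

# Simulated GelSight points: a dense, low-noise patch near one contact spot.
patch = rng.normal(size=(300, 3)) * 0.1 + np.array([0.0, 0.0, 1.0])
patch /= np.linalg.norm(patch, axis=1, keepdims=True)
gelsight_pts = true_t + 0.05 * patch + 0.0002 * rng.normal(size=(300, 3))

# Fusing the two sensors is just stacking: both contribute SDF residuals.
fused = np.vstack([camera_pts, gelsight_pts])
t_est = gauss_newton_translation(fused, t0=np.zeros(3))
print("estimated translation:", t_est)

In the paper the same principle is applied to full articulated poses with the GelSight depth map reconstructed online; the spherical SDF and translation-only state here merely keep the sketch short.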

Information

title:
Tracking objects with point clouds from vision and touch
author:
Gregory Izatt,
Geronimo Mirano,
Edward Adelson,
Russ Tedrake
citation:
2017 IEEE International Conference on Robotics and Automation (ICRA)
shortcite:
ICRA 2017
year:
2017
created:
2017-05-29
keyword:
adelson
www:
http://persci.mit.edu/pub_abstracts/tracking-objects-with-point-clouds-from-vision-and-touch.html
pdf:
https://groups.csail.mit.edu/robotics-center/public_papers/Izatt16.pdf
pageid:
tracking-objects-with-point-clouds-from-vision-and-touch
type:
publication
 