Connecting Look and Feel: Associating the visual and tactile properties of physical materials

Wenzhen Yuan, Shaoxiong Wang, Siyuan Dong, Edward Adelson


Abstract

For machines to interact with the physical world, they must understand the physical properties of objects and materials they encounter. We use fabrics as an example of a deformable material with a rich set of mechanical properties. A thin flexible fabric, when draped, tends to look different from a heavy stiff fabric. It also feels different when touched. Using a collection of 118 fabric samples, we captured color and depth images of draped fabrics along with tactile data from a high-resolution touch sensor. We then sought to associate the information from vision and touch by jointly training CNNs across the three modalities. Through the CNN, each input, regardless of the modality, generates an embedding vector that records the fabric’s physical properties. By comparing the embeddings, our system is able to look at a fabric image and predict how it will feel, and vice versa. We also show that a system jointly trained on vision and touch data can outperform a similar system trained only on visual data when tested purely with visual inputs.
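To make the cross-modal association concrete, the sketch below illustrates one common way to realize it: separate CNN encoders map each modality (e.g., a color image and a tactile image) into a shared embedding space, trained with a contrastive objective so that embeddings of the same fabric are close and those of different fabrics are far apart. This is a minimal PyTorch sketch; the encoder sizes, embedding dimension, loss, and margin are illustrative assumptions, not the architecture or training objective used in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Small CNN mapping a 3x64x64 input to a unit-length embedding."""
    def __init__(self, embed_dim=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, embed_dim)

    def forward(self, x):
        h = self.features(x).flatten(1)
        return F.normalize(self.fc(h), dim=1)

def contrastive_loss(e_vision, e_touch, same_fabric, margin=0.5):
    """Pull matching (same-fabric) pairs together, push mismatches apart."""
    d = (e_vision - e_touch).pow(2).sum(1)
    return (same_fabric * d +
            (1 - same_fabric) * F.relu(margin - (d + 1e-8).sqrt()).pow(2)).mean()

# Toy usage: random tensors stand in for batches of visual and tactile images.
vision_enc, touch_enc = Encoder(), Encoder()
imgs, touches = torch.randn(8, 3, 64, 64), torch.randn(8, 3, 64, 64)
same_fabric = torch.randint(0, 2, (8,)).float()  # 1 if the pair shows the same fabric
loss = contrastive_loss(vision_enc(imgs), touch_enc(touches), same_fabric)
loss.backward()
```

At test time, comparing a query embedding against embeddings from the other modality (e.g., by nearest-neighbor distance) supports the kind of cross-modal prediction described in the abstract: looking at a fabric image and predicting how it will feel, and vice versa.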

Information

title:
Connecting Look and Feel: Associating the visual and tactile properties of physical materials
author:
Wenzhen Yuan,
Shaoxiong Wang,
Siyuan Dong,
Edward Adelson
citation:
IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
shortcite:
CVPR 2017
year:
2017
created:
2017-04-12
keyword:
adelson,
wenzhen,
shaoxiong,
siyuan
www:
http://persci.mit.edu/pub_abstracts/connecting-look-feel-associating-visual-tactile-properties-physical-materials.html
pdf:
https://arxiv.org/pdf/1704.03822.pdf
pageid:
connecting-look-feel-associating-visual-tactile-properties-physical-materials
type:
publication
 