Visually Indicated Sounds

Owens, A., Isola, P., McDermott, J., Torralba, A., Adelson, E.H., Freeman, W.T.


Abstract

Objects make distinctive sounds when they are hit or scratched. These sounds reveal aspects of an object’s material properties, as well as the actions that produced them. In this paper, we propose the task of predicting what sound an object makes when struck as a way of studying physical interactions within a visual scene. We present an algorithm that synthesizes sound from silent videos of people hitting and scratching objects with a drumstick. This algorithm uses a recurrent neural network to predict sound features from videos and then produces a waveform from these features with an example-based synthesis procedure. We show that the sounds predicted by our model are realistic enough to fool participants in a “real or fake” psychophysical experiment, and that they convey significant information about material properties and physical interactions.
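The paper gives the full model details; as a rough illustration only, the sketch below shows the general shape of the pipeline the abstract describes: a recurrent network maps per-frame visual features to sound features, and a waveform is then recovered by example-based synthesis, here approximated as nearest-neighbor retrieval from a bank of training sounds. The use of PyTorch, all class and function names, and every dimension are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): an RNN predicts sound
# features from video features; example-based synthesis then retrieves
# the training waveform whose features are nearest to the prediction.
import torch
import torch.nn as nn

class SoundFeaturePredictor(nn.Module):
    """LSTM mapping a sequence of visual features to sound features."""
    def __init__(self, visual_dim=512, hidden_dim=256, sound_dim=42):
        super().__init__()
        self.rnn = nn.LSTM(visual_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, sound_dim)

    def forward(self, frames):        # frames: (batch, time, visual_dim)
        h, _ = self.rnn(frames)       # (batch, time, hidden_dim)
        return self.head(h)           # (batch, time, sound_dim)

def example_based_synthesis(pred_feats, bank_feats, bank_waves):
    """Return the bank waveform whose sound features are closest
    (in Euclidean distance) to the predicted features."""
    dists = torch.cdist(pred_feats.flatten(1), bank_feats.flatten(1))
    return bank_waves[dists.argmin(dim=1)]

# Toy usage with random tensors standing in for real data.
model = SoundFeaturePredictor()
frames = torch.randn(1, 30, 512)        # visual features for 30 frames
pred = model(frames)                    # predicted sound features
bank_feats = torch.randn(100, 30, 42)   # sound features of 100 exemplars
bank_waves = torch.randn(100, 22050)    # 1 s waveforms at 22.05 kHz
wave = example_based_synthesis(pred, bank_feats, bank_waves)
```

Nearest-neighbor retrieval stands in here for the paper's example-based synthesis step; it keeps the sketch self-contained while preserving the two-stage structure of feature prediction followed by waveform recovery.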

Information

title:
Visually Indicated Sounds
author:
Owens, A., Isola, P., McDermott, J., Torralba, A., Adelson, E.H., Freeman, W.T.
citation:
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016
shortcite:
CVPR 2016
year:
2016
created:
2016-01-01
keyword:
adelson
www:
http://persci.mit.edu/pub_abstracts/visually-indicated-sounds.html
pdf:
http://persci.mit.edu/pub_pdfs/visually-indicated-sounds.pdf
pageid:
visually-indicated-sounds
type:
publication