Perceptual Science Group @ MIT

The Perceptual Science Group of the Department of Brain and Cognitive Sciences at MIT does research in human vision, machine vision, human-computer interaction, and touch sensing for robotics. Both the Adelson Lab and the Rosenholtz Lab are part of the Computer Science and Artificial Intelligence Lab (CSAIL), located in the Stata Center.


News…
How we acquire information from dynamic scenes: Benjamin Wolfe and Ruth Rosenholtz have a new paper setting out their Information Acquisition theory as applied to driving. “Toward a Theory of Visual Information Acquisition in Driving.”

Why vision works as well as it does, yet we so often miss the details: Ruth Rosenholtz has a new paper re-examining limited capacity and visual attention in light of the last decade of work on peripheral vision. “Demystifying visual awareness: Peripheral encoding plus limited decision complexity resolve the paradox of rich visual experience and curious perceptual failures.”

Modeling peripheral vision: Recent experimental work appears to challenge popular “pooling” models of crowding. Ruth Rosenholtz, Dian Yu, and Shaiyan Keshvari examine the evidence. “Challenges to pooling models of crowding: Implications for visual mechanisms.”

Modeling visual crowding: Shaiyan Keshvari and Ruth Rosenholtz test a unifying account of visual crowding.

Paper accepted to IROS 2014: Rui and Wenzhen's work on adapting the GelSight sensor for robotic touch has been accepted to IROS 2014. The work was done in collaboration with the Platt group at Northeastern University (NEU) and was covered by MIT News.

Taking a new look at subway map design: The Rosenholtz lab's Texture Tiling Model was used to evaluate subway maps for the MBTA Map Redesign Contest. Check out the FastCompany Design article, the Visual.ly article, and the CSAIL news article. The news was also picked up by a couple of other media outlets: Smithsonian Magazine and The Dish. Here's an older article about our research from Science Daily.

Tactile sensing for manipulation
If robots are to perform everyday tasks in the real world, they will need sophisticated tactile sensing. The tactile data must be integrated into multi-sensory representations that support exploration, manipulation, and other tasks.

(Workshop held July 15, 2017)


Giving robots a sense of touch
GelSight technology lets robots gauge objects’ hardness and manipulate small tools.


Fingertip sensor gives robot unprecedented dexterity
Armed with the GelSight sensor, a robot can grasp a freely hanging USB cable and plug it into a USB port.


GelSight — Portable, super-high-resolution 3-D imaging
A simple new imaging system could help manufacturers inspect their products, forensics experts identify weapons, and doctors identify cancers.


Artificial intelligence produces realistic sounds that fool humans
Video-trained system from MIT’s Computer Science and Artificial Intelligence Lab could help robots understand how objects interact with the world.


 