Differences

This shows you the differences between two versions of the page.

home [2017/07/14 18:33]
elmer
home [2017/10/19 22:19] (current)
elmer
Line 10:
</WRAP>
----
-<WRAP column 25% people>
+<WRAP column 20% people>
//**Faculty**//

Line 46:
<WRAP column 60%>
-Founded in 1994, the Perceptual Science Group of the Department of Brain and Cognitive Sciences at MIT does research in human visual perception, machine vision, image processing, and human-computer interaction. Both the Adelson Lab and the Rosenholtz Lab are located in Building 32.
+The Perceptual Science Group of the Department of Brain and Cognitive Sciences at MIT does research in human vision, machine vision, human-computer interaction, and touch sensing for robotics. Both the Adelson Lab and the Rosenholtz Lab are part of the Computer Science and Artificial Intelligence Lab (CSAIL), located in the Stata Center.\\
-\\
\\
----
-** Special Event — Saturday, July 15, 2017 **
-[[https://sites.google.com/view/rss17ts/overview|{{:rss17-1.png}}]]
-
-If robots are to perform everyday tasks in the real world, they will need sophisticated tactile sensing. The tactile data must be integrated into multi-sensory representations that support exploration, manipulation, and other tasks.
-This workshop asks the following questions:
-
-  * What kinds of tactile technologies are currently available, and what are needed?
-  * What type of representations are best for capturing and exploiting tactile data?
-  * How can tactile information be combined with other information to support specific tasks?
-  * Can learning help to provide suitable representations from high-dimensional sensory data?
-
-This workshop will bring together experts from the fields of tactile sensing, sensor design, manipulation, and machine learning. We expect that the pairing of theoretical and applied knowledge will lead to an interesting exchange of ideas and stimulate an open discussion about the goals and challenges of tactile sensing.
-\\
-[[https://sites.google.com/view/rss17ts/overview|More information...]]\\
-\\
-** Presented at MIT by **\\
-<WRAP left box>
-[[http://persci.mit.edu/people/adelson|{{:people:adelson.jpg?160|Ted Adelson}}]]\\
-[[http://persci.mit.edu/people/adelson|Ted Adelson]]\\
+//**News... **//
+<WRAP box>
+<WRAP box left 50%>**[[https://sites.google.com/view/rss17ts/overview|Tactile sensing for manipulation]]**\\
+If robots are to perform everyday tasks in the real world, they will need sophisticated tactile sensing. The tactile data must be integrated into multi-sensory representations that support exploration, manipulation, and other tasks.\\
</WRAP>
-<WRAP left box>
-[[http://people.csail.mit.edu/yuan_wz/|{{:wenzhen.png?160|Wenzhen Yuan}}]]\\
-[[http://people.csail.mit.edu/yuan_wz/|Wenzhen Yuan]]\\
+[[https://sites.google.com/view/rss17ts/overview|{{:rss17-1.png?250}}]]
+//      (workshop held July 15, 2017)//\\
</WRAP>
-<WRAP clear></WRAP>
-** Location **\\
-MIT Building 36 — Room 112 \\
-50 Vassar Street \\
-Cambridge, MA 02139 \\
-
-[[https://www.google.com/maps/place/50+Vassar+St,+Cambridge,+MA+02139/@42.3613361,-71.0942629,17z/data=!3m1!4b1!4m5!3m4!1s0x89e370abdb5abad9:0xf77ea85672e15a0!8m2!3d42.3613361!4d-71.0920689|Directions, via Google Maps]]
-
-[[https://www.google.com/maps/place/50+Vassar+St,+Cambridge,+MA+02139/@42.3613361,-71.0942629,17z/data=!3m1!4b1!4m5!3m4!1s0x89e370abdb5abad9:0xf77ea85672e15a0!8m2!3d42.3613361!4d-71.0920689|{{:rss17-2.png|}}]]
----
+<WRAP box>
+<WRAP box right 50%>**[[http://news.mit.edu/2017/gelsight-robots-sense-touch-0605|Giving robots a sense of touch]]**\\
+GelSight technology lets robots gauge objects’ hardness and manipulate small tools.\\
+</WRAP>
+{{youtube>small:BIW_jq3dOEE}}
+</WRAP>
----
-
-//**In Other News... **//
+<WRAP box>
+<WRAP box left 50%>**[[http://news.mit.edu/2014/fingertip-sensor-gives-robot-dexterity-0919|Fingertip sensor gives robot unprecedented dexterity]]**\\
+Armed with the GelSight sensor, a robot can grasp a freely hanging USB cable and plug it into a USB port.\\
+</WRAP>
+{{youtube>small:w1EBdbe4Nes}}
+</WRAP>
----
+<WRAP box>
+<WRAP box right 50%>**[[http://news.mit.edu/2011/tactile-imaging-gelsight-0809|GelSight — Portable, super-high-resolution 3-D imaging]]**\\
+A simple new imaging system could help manufacturers inspect their products, forensics experts identify weapons and doctors identify cancers.\\
+</WRAP>
+{{youtube>small:S7gXih4XS7A}}
+</WRAP>
+----
+<WRAP box>
+<WRAP box left 50%>**[[http://news.mit.edu/2016/artificial-intelligence-produces-realistic-sounds-0613|Artificial intelligence produces realistic sounds that fool humans]]**\\
+Video-trained system from MIT’s Computer Science and Artificial Intelligence Lab could help robots understand how objects interact with the world.\\
+</WRAP>
+{{youtube>small:0FW99AQmMc8}}
+</WRAP>
+----
+\\
**Peripheral vision, inference, and visual awareness**: An extended abstract is now available based on Ruth Rosenholtz' invited talk at the VSS 2017 Symposium, "The Role of Ensemble Statistics in the Visual Periphery." [[https://arxiv.org/abs/1706.02764|What modern vision science reveals about the awareness puzzle: Summary-statistic encoding plus decision limits underlie the richness of visual perception and its quirky failures]]
Line 104 → Line 103:
**Taking a new look at subway map design**: The Rosenholtz lab's Texture Tiling Model was used to evaluate subway maps for the MBTA Map Redesign Contest. Check out the [[http://www.fastcodesign.com/3020708/evidence/the-science-of-a-great-subway-map|FastCompany Design article]], [[http://blog.visual.ly/how-do-our-brains-process-infographics-mit-mongrel-shows-peripheral-vision-at-work/|Visual.ly article]], and the [[http://www.csail.mit.edu/node/2094|CSAIL news article]]. The news was also picked up by a couple of other media sources: [[http://blogs.smithsonianmag.com/smartnews/2013/11/how-much-of-a-subway-map-can-one-persons-brain-process/|Smithsonian Magazine]] and [[http://dish.andrewsullivan.com/2013/11/07/building-a-better-subway-map/|The Dish]]. Here's an older article about our research from [[http://www.sciencedaily.com/releases/2011/02/110202215339.htm|Science Daily]].
-
-
-
</WRAP>
<WRAP clear></WRAP>
 