From: http://news.com.com/MIT+discovery+may+improve+robotic+eyes/2100-1008_3-6178065.html?tag=nefd.top
Apr 20, 2007
To judge what material it is looking at, the human brain may be calculating the patterns of light and dark spots it sees, according to researchers from the Massachusetts Institute of Technology and the NTT Communications Science Labs in Japan.
It's not exactly known how the human brain represents visual information, but some believe it operates like a digital camera with a really sophisticated computer (http://news.com.com/Bringing+color+to+the+color-blind/2008-1008_3-6175622.html), said Lavanya Sharan, a member of the perceptual science group in brain and cognitive sciences at MIT.
The electrical engineering and computer science graduate student co-authored the paper "Image statistics and the perception of surface qualities," which will appear in the April 18 issue of Nature.
The paper argues that the brain takes something like a digital snapshot and then analyzes the bright and dark spots to determine texture and, subsequently, what type of material it's looking at, in addition to taking in information on color and shape.
"Practical applications of this work would extend to domestic robots or autonomous vehicles (http://news.com.com/Divining+AI%2C+and+the+future+of+consumer+robotics/2008-11394_3-6096186.html) that could understand the world they look at. But it's also important for understanding how human perception works. How the brain understands the color or the shininess of a surface can shed light on the workings of the visual system, which is a large open question," Sharan said.
"Let's say I am looking at a shiny, black material; because it's shiny and black it will have strong highlights. That highlight will be extremely strong and be a bright region in that image. The brain then measures. If you have more highlights than normal, then it assumes that the surface is black or shiny or both," she said.
The idea that brightness or whiteness represents shine is something artists have long used to illustrate texture in paintings. The classic shiny apple is painted with a white crescent on the part that is supposed to be exhibiting shine.
In life, that part of the apple is not actually white; it's red. But the brain registers that bright spot as shine, and uses that information to figure out that the object it sees is shiny.
To analyze this process at a level of detail useful to artificial intelligence, the MIT group plotted it on what Sharan called a "luminosity histogram."
The x-axis measured the different intensities of light seen by the brain; the y-axis plotted the number of points sharing a common intensity value. Think of it as counting the number of white, gray or black pixels in a single black-and-white digital image, only with many more intensity levels recorded.
From the histograms, the team determined that the brain relates each brightness level to how often it appears across the image, and uses the shape of that distribution to judge whether something is shiny, rough or wet.
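To make the idea of a luminosity histogram concrete, here is a minimal sketch in Python. It is not the researchers' code: it simply bins pixel intensities the way the article describes (intensity on the x-axis, pixel counts on the y-axis) and computes the distribution's skewness as one simple asymmetry statistic. The "matte" and "glossy" images are invented synthetic stand-ins, not data from the study.

```python
# Illustrative sketch only, not the authors' analysis.
import numpy as np

def luminosity_histogram(image, bins=256):
    """Count how many pixels fall into each intensity bin (intensities assumed in 0.0-1.0)."""
    counts, edges = np.histogram(image, bins=bins, range=(0.0, 1.0))
    return counts, edges

def histogram_skewness(image):
    """A simple asymmetry measure of the intensity distribution.
    A mostly dark image with a sparse tail of bright highlights gives positive skew."""
    x = image.ravel().astype(float)
    mu, sigma = x.mean(), x.std()
    return ((x - mu) ** 3).mean() / sigma ** 3

rng = np.random.default_rng(0)

# Hypothetical stand-ins: a "matte" surface with broad mid-gray variation,
# and a "glossy" surface that is mostly dark with a few very bright highlight pixels.
matte = np.clip(rng.normal(0.5, 0.1, (128, 128)), 0, 1)
glossy = np.clip(rng.normal(0.3, 0.05, (128, 128)), 0, 1)
highlight_mask = rng.random((128, 128)) < 0.02          # ~2% of pixels become highlights
glossy[highlight_mask] = rng.uniform(0.9, 1.0, highlight_mask.sum())

for name, img in [("matte", matte), ("glossy", glossy)]:
    counts, _ = luminosity_histogram(img)
    brightest_bin = np.nonzero(counts)[0].max()
    print(f"{name}: brightest occupied bin = {brightest_bin}, "
          f"skewness = {histogram_skewness(img):+.2f}")
```

On inputs like these, the glossy stand-in, being mostly dark with sparse strong highlights, should show a noticeably more positive skew than the matte one, which is the kind of statistical difference in the histogram that the article describes the brain exploiting.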
"We see this work as a stepping stone or the beginning for material perception. People who work in visual perception have so far concentrated on object recognition. But we want to stress that it is not only important to recognize the table, but also what material the table is made of," Sharan said.