Optical Sensor Breakthrough Gives Robots Human-Like Eyes

A new study argues that a sensor that mimics how the human eye detects light could improve vision for autonomous robots and self-driving cars.

Modern electronic cameras rely on sensors that produce an electrical signal whenever light falls on them.

By contrast, the nearly 100 million rod and cone cells in the retina, the light-sensitive membrane at the back of the eye, relay messages to the brain only in response to changes in light.

This makes human eyes considerably more efficient than electronic devices in terms of both energy and processing capacity, explained senior study author John Labram, a device physicist at Oregon State University in Corvallis.

Scientists have previously developed sensors that mimic the retina. These "retinomorphic" electronics, however, require complicated circuits that make them unfit for mass production.

Labram and his colleagues have now replaced these complex circuits with a simpler solution: light-sensitive materials known as perovskites, which are currently being developed for next-generation solar cells.


How This Innovation Works

The heart of the new sensor is an electrically insulating sheet of glass coated with the perovskite methylammonium lead iodide. This perovskite switches from strongly electrically insulating to highly electrically conducting when exposed to light.

The researchers sandwiched these layers between electrodes and found that the sensor produced a strong electrical response to light but generated no further signal until the illumination changed.

"This is the first single-pixel sensor that replicates the behavior of a biological retina as part of its fundamental design," Labram said. "From an industrial point of view, this could eventually have a huge impact on speed and power consumption."

Materials scientist Thomas Anthopoulos of the King Abdullah University of Science and Technology in Saudi Arabia, who did not take part in this study, said the sensors could be used in applications involving rapid image processing, including lidar, face recognition and autonomous vehicles.


Further Improvements

The researchers are now trying to build an array of these sensors to record real visual details, "starting with 10-by-10 resolution," Labram said. They also want to integrate their retinomorphic sensors with artificial intelligence "to better mimic the way that biological systems process stimuli," he said.

All in all, Labram noted that this study "is part of a bigger ongoing attempt to make machines more similar to humans." Traditional computers are programmed to execute calculations as a series of steps.

Still, scientists are increasingly designing so-called neuromorphic computers, which replicate the human brain by executing many computations simultaneously, he explained.

And just as retinomorphic sensors may prove more efficient than traditional optical sensors, neuromorphic computers may one day be much more powerful than conventional computers.

"The human brain consumes around 20 watts of power, and a home PC [personal computer] runs at around 100 watts," Labram said. "This doesn't sound like much, but a single PC cannot perform tasks the human brain can - for example, real-time learning. This sort of task would require a data center rather than a PC to achieve."

The scientists detailed their findings in a study published in the journal Applied Physics Letters.

Check out more news and information on Technology on Science Times.
