As any photographer knows, no matter how advanced the camera, autofocus has its flaws, especially compared with the human eye. That's because most autofocus mechanisms use contrast levels to judge how in or out of focus an image is, which takes time and battery power and rests on the assumption that maximum contrast equals best focus. A second system, phase-detection autofocus, is more accurate but relies on bulky, expensive hardware.
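To make the trade-off concrete, here is a rough sketch of how contrast-detection autofocus searches for focus. It assumes a hypothetical camera hook, capture_at(position), that returns a grayscale frame with the lens at a given position; the scoring function and the search loop are illustrative, not any manufacturer's implementation.

import numpy as np

def contrast_score(image):
    # Sharpness proxy: average gradient energy. A better-focused image
    # generally has stronger local contrast, so this score peaks near focus.
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx**2 + gy**2))

def contrast_detect_autofocus(capture_at, lens_positions):
    # Step the (hypothetical) lens through candidate positions, score each
    # captured frame, and keep the position with the highest contrast.
    # This trial-and-error hunting is why contrast detection costs time
    # and battery power.
    best_pos, best_score = None, float("-inf")
    for pos in lens_positions:
        score = contrast_score(capture_at(pos))
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos

Because the camera cannot know it has passed the peak until the score starts falling, it has to capture and compare several frames per focus attempt.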

Enter Johannes Burge, a postdoctoral researcher at the University of Texas, and his adviser, Wilson Geisler. The pair wanted to understand how our eyes focus so much more efficiently than a digital camera.

Through their research they discovered that humans (and other animals) extract key features from a blurry image and use that information to work out how far away an object is; the eye then focuses accordingly. Using mathematical equations, they built a computer simulation of the human visual system and presented it with a variety of photographs; the patterns of focus remained the same for every image. They then developed a system that takes an inventory of the features in a scene, requires no before-and-after comparison, and could be incorporated into even point-and-shoot cameras.
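The key idea is that a single blurry frame already contains enough information to estimate how defocused it is, so the lens can be driven to the right position in one move. The sketch below is only a loose illustration of that single-shot principle, not Burge and Geisler's actual algorithm: it uses a simple spectral feature (defocus suppresses fine detail) and a hypothetical calibration table mapping that feature to a lens correction.

import numpy as np

def defocus_feature(image):
    # Single-shot feature: fraction of spectral energy at high spatial
    # frequencies. Defocus blur removes fine detail, so this value drops
    # as the image gets blurrier.
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image.astype(float)))) ** 2
    h, w = spectrum.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2) / (min(h, w) / 2)  # normalized radius
    return spectrum[r > 0.25].sum() / spectrum.sum()

def estimate_lens_correction(image, calibration):
    # Look up the lens move implied by the feature in a single step, with no
    # before-and-after comparison. `calibration` is a hypothetical list of
    # (feature_value, lens_correction) pairs, sorted by feature value.
    features, corrections = zip(*calibration)
    return float(np.interp(defocus_feature(image), features, corrections))

Because the estimate comes from one frame and one table lookup, there is no focus hunting, which is what makes millisecond-scale focusing plausible.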

The algorithm hasn't been tested in an actual camera yet, but the researchers are confident it will work. The pair are applying for a patent on the technology and have already had interest from a major electronic-imaging company. They estimate that if the technology works, digital cameras may be able to focus accurately in as little as 10 milliseconds. Their study is published in the Proceedings of the National Academy of Sciences, and they will present their work at an International Society for Optics and Photonics conference later this month.

Via The Guardian