Whether you take photos with the best camera phone or a dedicated camera, it is almost always impossible to get everything from the foreground to the background in focus. A new breakthrough, however, could soon make this a real possibility.
Researchers at Carnegie Mellon University have developed a new type of camera lens that offers spatially selective focusing, allowing cameras to focus on an entire scene at once. A blog post about the development explains that the technology can capture photos “where every detail, near and far, is perfectly sharp – from the petal directly in front of you to the distant trees on the horizon.”
Unlike traditional cameras, which can only bring a single flat plane of a scene into perfect focus at a time, the new computational lens uses a mix of optics and algorithms to “adjust focus differently for each part of a scene.”
The resulting system uses two autofocus methods, contrast-detection autofocus (CDAF) and phase-detection autofocus (PDAF), to automatically determine “which parts of the image should be in focus,” essentially giving each part of the image its own focus button.
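To make the per-region idea concrete, here is a minimal sketch of how contrast-detection autofocus can score sharpness for each tile of an image independently. This is not the CMU team's actual method; it uses a standard CDAF metric (variance of a discrete Laplacian), and the function names and tiling scheme are illustrative assumptions only.

```python
import numpy as np

def laplacian_variance(tile):
    # Standard contrast-detection sharpness score: variance of a
    # discrete Laplacian. Higher values = stronger local contrast,
    # which CDAF interprets as "more in focus".
    lap = (-4 * tile[1:-1, 1:-1]
           + tile[:-2, 1:-1] + tile[2:, 1:-1]
           + tile[1:-1, :-2] + tile[1:-1, 2:])
    return float(lap.var())

def sharpest_focus_per_tile(focus_stack, tile=32):
    # For each tile of the image, pick the frame in a focus stack
    # (grayscale images taken at different focus settings) with the
    # highest contrast score -- in effect, a focus decision per region
    # rather than one for the whole frame. (Hypothetical helper, not
    # the researchers' algorithm.)
    h, w = focus_stack[0].shape
    best = np.zeros((h // tile, w // tile), dtype=int)
    for i in range(h // tile):
        for j in range(w // tile):
            scores = [
                laplacian_variance(f[i*tile:(i+1)*tile, j*tile:(j+1)*tile])
                for f in focus_stack
            ]
            best[i, j] = int(np.argmax(scores))
    return best
```

A phase-detection system would reach a similar per-region decision faster, by comparing light arriving from opposite sides of the lens instead of searching for peak contrast.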
What this could mean for future cameras
This computational lens has the potential to pave the way for a new range of smartphone cameras that capture sharper photos with less background or foreground blur. The technology also has other potential applications, such as improving how microscopes focus on objects, helping robots and self-driving cars see details from all distances, and making AR/VR much more realistic.
The team presented their results for the first time at the International Conference on Computer Vision earlier this year, where they received the “Best Paper Honorable Mention” award. While the technology is still in the research stage, it will be interesting to see how and when it makes its way into real cameras and consumer devices.