Thanks to improvements in camera sensors and in the neural cores built into mobile processors, such as those developed by Qualcomm and Apple, smartphone camera technology has come a long way. Be it 3D scans of objects and rooms, AI-optimized images, or cinematic Dolby Vision HDR videos, the best smartphone cameras nowadays put the best pro cameras to shame. Here are a few of the improvements that have made this level of performance possible.
3D depth-scanners and Lidar
3D depth-scanning became a huge deal last year, when a few Samsung Galaxy smartphones began including depth sensors. For instance, a Samsung app called 3D Scanner lets users move around a real object (say, a vase) and scan it into a 3D model, then animate that model by mapping a moving person's depth-tracked motion onto it.
This concept was soon eclipsed by Lidar sensors, starting with the ones Sony developed for Apple’s latest iPad Pro models. Lidar is now present in the iPhone 12 Pro and Pro Max, and while Apple itself hasn’t exactly given us a proper demo of the technology, we know from Polycam demos that a combination of Lidar and AI can easily create 3D scans of both objects and environments.
You simply press the record button and start walking around the object as the screen turns from blue to white, indicating a successful capture. Using Lidar’s depth measurements and Apple’s rapid on-device processing, Polycam refines its results in real time to properly reflect contours, all while the camera keeps filling in texturing gaps as you move.
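Polycam hasn’t published its pipeline, but the geometry underneath any Lidar scanner is well known: each pixel of the sensor’s depth map is back-projected into a 3D point using the camera’s intrinsics (focal lengths and principal point). The sketch below illustrates that step with the standard pinhole camera model; the function name and the toy intrinsic values are purely illustrative.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (in meters) into camera-space 3D points
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Stack into an (H*W, 3) array of XYZ points, dropping invalid (zero) depths
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

# Toy example: a 2x2 depth map reading 1 m everywhere,
# with the principal point at the center of the tiny frame
depth = np.ones((2, 2))
cloud = depth_to_point_cloud(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
```

Accumulating these point clouds from many camera poses as you walk around the object, and fusing them into a mesh, is what produces the finished 3D scan.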
Big sensors and next-gen computational photography
High-resolution camera sensors, such as the Samsung-developed units found in Xiaomi and Samsung flagships, are allowing smartphones to deliver results that rival those of the most expensive standalone DSLR cameras. Apple has clearly been taking notes, as it fitted the iPhone 12 Pro Max with the physically largest camera system ever to make its way into an iPhone. It increased each pixel’s size to 1.7 microns, notably larger than the 0.8-micron pixels on Samsung’s 108MP camera sensor. That improvement in pixel size alone does wonders for color accuracy and how clean the image looks.
As for computational photography, this technology benefits phones that aren’t fitted with massive camera sensors. Apple’s Smart HDR 3, for instance, uses machine learning and image segmentation to determine the appropriate exposure for separate parts of an image, while Deep Fusion merges the sharpest details from multiple exposures into the final, clean-cut photograph.
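Apple hasn’t disclosed how Smart HDR 3 or Deep Fusion work internally, but the core idea of merging several exposures is a classic technique called exposure fusion: each frame contributes most where its pixels are well exposed. Below is a minimal sketch of that weighting scheme, assuming grayscale frames normalized to [0, 1]; the function name and the sigma value are illustrative, not Apple’s.

```python
import numpy as np

def fuse_exposures(exposures, sigma=0.2):
    """Merge differently exposed grayscale frames into one image by
    weighting each pixel by its 'well-exposedness' (a Gaussian bump
    centered at mid-gray), then normalizing the weights across frames."""
    stack = np.stack(exposures)          # shape (N, H, W), values in [0, 1]
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0)       # per-pixel weights sum to 1
    return (weights * stack).sum(axis=0)

# Toy example: fuse an underexposed and an overexposed frame
dark = np.full((2, 2), 0.1)
bright = np.full((2, 2), 0.9)
fused = fuse_exposures([dark, bright])
```

Real pipelines add segmentation masks, multi-scale blending, and learned detail selection on top, but the weighted merge above is the basic move they all share.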
4K video with Dolby Vision HDR
In October, Apple announced that the iPhone 12 Pro and Pro Max were the “first camera[s] ever to record in Dolby Vision,” capable not only of 10-bit HDR recording, but also of live editing and simple streaming to Apple TV 4K devices and smart TVs with AirPlay 2 support. Previously, creating Dolby Vision content was a post-production process that required standalone computers, but the A14 Bionic’s speed lets it happen as the video is being filmed, at up to 60 frames per second.