We have heard the famous “next iPhone will get dSLR quality next year” claim for a few iterations now, since before the iPhone 5 if I recall correctly. The camera did make huge jumps in sharpness, colour accuracy, noise reduction and even image stabilization, but it is still not at dSLR level. Apple acquired LinX last year, a company that makes very small, mobile-phone-sized camera systems with 2, 3 or even 4 lenses.
Space inside the iPhone is very limited, so you can forget about a zoom lens that would give you something like a 35-105 mm dSLR lens. So if there is no space to spare, what can Apple do? Add more lenses to the iPhone in order to take better pictures? Exactly.
LinX managed to build camera components and algorithms that can clean up the image by removing artifacts, reducing the noise even more, and even use the 3D information from the multiple cameras to focus better (or faster). This is all fairly basic stuff. But what if we take it a step further?
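The “3D information” comes from the fact that two lenses see the same subject from slightly different positions, so its position shifts between the two images (the disparity), and that shift tells you how far away it is. A minimal sketch of the idea, assuming a simple pinhole-camera model; all the numbers are made up for illustration, not actual iPhone specs:

```python
# Sketch: estimating depth from two-camera disparity (pinhole model).
# Illustrative numbers only, not real LinX or iPhone parameters.

def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Depth (mm) = focal length (px) * lens baseline (mm) / disparity (px)."""
    if disparity_px <= 0:
        raise ValueError("object at infinity or invalid match")
    return focal_px * baseline_mm / disparity_px

# A feature that shifts 20 px between two lenses spaced 10 mm apart,
# with a focal length of 3000 px:
depth_mm = depth_from_disparity(focal_px=3000, baseline_mm=10, disparity_px=20)
print(depth_mm)  # 1500.0 -> the subject is about 1.5 m away
```

With a depth estimate per region, the camera can jump the focus motor straight to the right distance instead of hunting back and forth, which is why a second lens can mean faster focusing.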
Two Lenses? What for?
Apple could even use the multiple lenses to do something similar to what Nokia did with its PureView cameras, which took a 41MP photo and downscaled it to a very clean, close-to-dSLR 8MP photo. Imagine two 12MP sensors, for a total of 24MP of raw data, to which Apple could apply the LinX algorithms and its own processing already built into the iPhone 6S (and soon the iPhone 7) to clean up the image and then resize it down to 12MP. The resulting image could be dSLR quality, really, this time!
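The reason combining two sensors helps is plain statistics: sensor noise is random, so averaging two independent readings of the same scene cuts the noise standard deviation by roughly 1/√2. A minimal sketch with simulated Gaussian noise on a flat grey patch; this is an illustration of the principle, not LinX's or Apple's actual pipeline:

```python
# Sketch: averaging two noisy "sensors" reduces noise by ~1/sqrt(2).
# Simulated Gaussian sensor noise on a flat patch; illustrative only.
import random
import statistics

random.seed(42)

def noisy_patch(true_value: float, sigma: float, n: int) -> list[float]:
    """Simulate n pixel readings of a flat patch with Gaussian sensor noise."""
    return [random.gauss(true_value, sigma) for _ in range(n)]

N = 10_000
a = noisy_patch(128.0, 8.0, N)  # sensor 1
b = noisy_patch(128.0, 8.0, N)  # sensor 2
merged = [(x + y) / 2 for x, y in zip(a, b)]  # average the two exposures

print(statistics.stdev(a))       # ~8.0
print(statistics.stdev(merged))  # ~5.7, close to 8 / sqrt(2)
```

Downscaling the merged 24MP of data to 12MP averages neighbouring pixels as well, which removes yet more noise — the same trick PureView used to get its clean 8MP output.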
Remember when Apple acquired SnappyCam, the app with a custom JPEG algorithm that could snap 20 full-resolution images per second? Well, that became burst mode, so I’m very hopeful that LinX technology will be used in the iPhone 7S.
Why the iPhone 7S?
We usually see the greatest leap in camera technology with the S-year devices, and since the rumour mentions that Apple is trying to source the cameras from different manufacturers, the technology will probably not be ready by June, when the “final” version of the next iPhone is completed.
What about 3D? We know the Nintendo 3DS already has two cameras and can take 3D pictures. Will the iPhone 7S take 3D pictures? I doubt it. The 3DS can do this because of the distance between its lenses, the same principle at work in binocular creatures like us: the distance between our eyes is what gives us depth perception. The iPhone 7S could not replicate that, unless the lenses were at opposite ends of the iPhone 7S Plus… and how would you hold such a beast?
Remember Lytro, the camera company that offered post-capture focus, where you could shift the focus to a different point in the image later on your computer? Well, this could possibly be another feature: 3D Touch to launch the animation and refocus on the point you pressed. I think we are entering the realm of gimmicks now…
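One simpler way to fake this without Lytro's light-field optics (which capture the direction of incoming rays, not just their intensity) would be a focal stack: burst-capture a few frames focused at different distances, then on tap return whichever frame is sharpest at that point. A toy sketch under that assumption; the frame names and sharpness scores are hypothetical, and real code would measure local contrast around the tapped pixel:

```python
# Sketch: post-capture "refocus" by picking the best frame from a focal
# stack at the tapped point. The scores below stand in for measured
# local contrast at that pixel; purely illustrative.

def refocus(stack_sharpness: dict[str, float]) -> str:
    """Return the frame whose local sharpness at the tap point is highest."""
    return max(stack_sharpness, key=lambda frame: stack_sharpness[frame])

# Hypothetical local-contrast scores for three frames at the tapped pixel:
scores = {"near": 0.21, "mid": 0.87, "far": 0.34}
print(refocus(scores))  # mid
```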
Let’s leave this at the “multiple cameras for better pictures” scenario, which is where Apple has to gain the most from this type of technology. I sure look forward to any improvement in camera hardware and software from Apple!