Tom’s Guide
Technology
John Velasco

Apple could revolutionize the iPhone’s camera with this groundbreaking change

iPhone 15 Pro Max and iPhone 14 Pro Max next to one another.

Apple’s iPhones are frequently in contention for the best camera phones around, thanks in part to their excellent performance, ease of use, and a growing feature set that caters to enthusiasts. The iPhone 15 series is proof of this, particularly with the color science behind the iPhone 15 Pro’s camera and the enhanced pixel binning tech behind the iPhone 15’s 2x telephoto zoom. Things may get substantially better for future iPhones because it’s rumored that Apple could be working on its own in-house camera sensor. That’s a big deal.

This speculation comes to us courtesy of Bloomberg’s Mark Gurman, who reports that Apple is “eyeing an in-house strategy for camera sensors.” For those who have followed Apple through the years, it wouldn’t be a radical move given the company’s history: Apple pivoted to its own A-series chips for its iPhones and iPads, and more recently to M-series chips for its MacBooks.

Nearly all of today’s best camera phones use one of Sony’s camera sensors with great results, and Apple’s no stranger to this. The company doesn’t disclose the actual sensors used in its iPhones, but the iPhone 15 Pro is believed to use a Sony IMX803. Moving away from Sony’s camera sensors to its own could push future iPhones to greater heights in photography and videography.

Current iPhone 16 camera rumors suggest Apple’s next flagship phone will package a stacked CMOS image sensor from Sony, so it would likely be a while before we see an iPhone with an in-house developed camera sensor. Nevertheless, this move could have a ripple effect on Apple’s long-term plans.

Dramatically improved image processing

(Image credit: Future)

There’s a lot that happens in the fraction of a second between capturing a photo on an iPhone and what you end up seeing on screen. From analyzing the light conditions to gauging the distance between your subjects and the camera, complex processes run in real time and continue after you’ve taken the snapshot.

With its own in-house camera sensor, we could see dramatically improved image processing in future iPhones. Apple’s already shown the incredible depth of its Photonic Engine, which uses a combination of frame stacking and image processing to get the best results, no matter the lighting condition or which camera you end up using.

Switching to an in-house sensor could also expedite new image processing techniques, letting Apple develop and optimize them much faster than it could with the third-party sensors it has used in past iPhones.

One camera sensor to rule them all

(Image credit: Tom's Guide)

It’s not entirely out of the question for Apple to incorporate this in-house camera sensor in its other products. There are cost savings with this strategy, and it could potentially speed up production, meaning fewer shipment delays in the long run.

This could also reduce Apple’s need to bargain with other companies over contracts for manufacturing the components that make up the camera sensors in its iPhones. Additionally, there are savings to be had from larger component orders if Apple purchases them in bulk.

But the larger benefit of an in-house sensor is that other Apple gadgets, like the iPad, could leverage these camera sensors too. And since the standard iPhone series typically inherits the same cameras as the previous year’s Pro models, Apple could anticipate production needs well ahead of time.

Yet the biggest gain could lie in a new form of capturing content: spatial video.

Spatial video could be more immersive

(Image credit: Apple)

Gurman’s report details that the in-house strategy for camera sensors is “core to future developments in the mixed-reality and autonomous-driving industries.” While this doesn’t directly tie into the company’s plans for the forthcoming Apple Vision Pro headset, it does suggest the sensor could play a vital role in the future.

Spatial video is one of the exciting new, more immersive experiences expected to showcase how people use the Apple Vision Pro. Considering that the iPhone 15 Pro can now record spatial video, there’s a compelling argument for Apple to develop its own camera sensor: having more control over sensor development and processing is critical to delivering even more immersive experiences with future mixed reality headsets from Apple.

All of this makes for a convincing argument, but it would be a good while before it comes to fruition. For example, Apple has reportedly been working for years on an in-house 5G modem to replace Qualcomm’s, and although it was once believed to debut in the iPhone 15 series, that hasn’t happened yet. It’s less a question of whether Apple will develop its own camera sensor than of when it will happen.
