iPhone 14 Pixels: Why Isn’t The 48MP Sensor The Big Camera News This Year?

So, let’s have a conversation about pixels. More specifically, iPhone 14 pixels, and iPhone 14 Pro resolution in particular. The fact that the newest Pro models use a 48MP sensor rather than a 12MP one may grab the most attention, but it is hardly the most significant upgrade Apple has made to the camera this year.

In fact, to me, the 48MP sensor is the least significant of the four major camera upgrades this year. I’ll get to why I think the 48MP sensor matters so little, but bear with me, because there’s a lot to unpack first. The other three upgrades are:

  • The sensor size
  • Pixel binning
  • The Photonic Engine

One 48MP Sensor, Two 12MP Ones

In common usage, we refer to the iPhone’s camera as “the camera” and then list its three lenses (the “main,” “ultra-wide,” and “telephoto”). That’s because it’s what we’re used to seeing, and because it’s the illusion Apple presents in the Camera app for simplicity: one sensor with several interchangeable lenses is how DSLRs and mirrorless cameras work.

Of course, the truth is different. The iPhone actually contains three separate camera modules, each with its own lens and its own sensor. When you tap the 3x button, for example, you’re not just picking the telephoto lens; you’re also switching to a different sensor. And when you use the slider to zoom, the Camera app invisibly switches to the appropriate camera module and crops the image for you.
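To make that concrete, here’s a rough sketch of the “switch module, then crop” logic. This is hypothetical code of my own, not Apple’s implementation; the only real numbers are the native zoom factors Apple advertises (0.5x, 1x, 3x):

```swift
// Hypothetical sketch: mapping a requested zoom factor to one of the
// three fixed-lens camera modules, plus a digital crop for the remainder.
enum CameraModule: Double, CaseIterable {
    case ultraWide = 0.5, main = 1.0, telephoto = 3.0  // native zoom factors
}

func selectModule(for zoom: Double) -> (module: CameraModule, digitalCrop: Double) {
    // Use the longest lens whose native zoom doesn't exceed the request...
    let module = CameraModule.allCases.last { $0.rawValue <= zoom } ?? .ultraWide
    // ...and make up the difference by cropping the sensor image.
    return (module, zoom / module.rawValue)
}

// selectModule(for: 2.0) → (.main, digitalCrop: 2.0): main sensor, cropped 2x
```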

The main camera module sports a 48MP sensor, while the ultra-wide and telephoto modules both still have 12MP sensors. Apple was upfront about this when presenting the new models, but some people may have missed this critical detail:

For the first time ever, the Pro lineup features a new 48MP Main camera with a quad-pixel sensor that adapts to the photo being captured, and features second-generation sensor-shift optical image stabilization.

The 48MP Sensor Works Part-time

The main camera has a 48-megapixel sensor, but by default you’ll still only get 12-megapixel shots. Again, from Apple:

For most photos, the quad-pixel sensor combines every four pixels into one large quad pixel.

You can only shoot at 48 megapixels when all of the following are true:

  • You are using the main camera (not the telephoto or ultra-wide)
  • You are shooting in ProRAW (which is off by default)
  • You are shooting in decent light

If you do decide to proceed, here’s what you need to do: enable Apple ProRAW in Settings → Camera → Formats, then tap the RAW button in the Camera app. In most cases, though, you won’t…
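For third-party camera apps, the relevant switches live in AVFoundation. Here’s a minimal sketch, assuming an already-configured capture session on the main camera; the API calls exist (iOS 16’s maxPhotoDimensions and the ProRAW flags), but treat the whole thing as illustrative rather than production capture code:

```swift
import AVFoundation

// Sketch: opting a configured AVCapturePhotoOutput in to 48MP Apple ProRAW.
func enable48MPProRAW(on output: AVCapturePhotoOutput, device: AVCaptureDevice) {
    if output.isAppleProRAWSupported {
        output.isAppleProRAWEnabled = true
    }
    // Request the largest dimensions the active format supports
    // (8064 x 6048 on the iPhone 14 Pro's main camera).
    if let largest = device.activeFormat.supportedMaxPhotoDimensions
        .max(by: { $0.width * $0.height < $1.width * $1.height }) {
        output.maxPhotoDimensions = largest
    }
    // Actual capture still needs AVCapturePhotoSettings with a ProRAW pixel format.
}
```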

Apple’s Approach Makes Sense

Why provide us with a 48-megapixel camera if we’re just not going to use it?

The strategy Apple has adopted makes logical sense, given there are very few situations where shooting in 48MP is preferable to shooting in 12MP. It makes no sense to have 48MP as the default, as doing so results in far larger files that devour your storage space (some back-of-the-envelope numbers after the list below). There are only two cases where I would recommend taking a 48-megapixel photo:

  1. You intend to print the photo in a large size
  2. You need to crop the image very heavily

That second reason is also somewhat dubious, because if you need to crop that drastically, you might be better off using the 3x camera.
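About those storage-devouring files: a quick estimate (assuming a 12-bit RAW readout, which is my assumption, not a published spec) lands close to the ~75MB figure commonly cited for 48MP ProRAW files:

```swift
// Back-of-the-envelope: why 48MP ProRAW files are so large.
let pixels = 8_064 * 6_048           // ≈ 48.8 million pixels
let bitsPerPixel = 12                // assumed RAW bit depth
let megabytes = Double(pixels * bitsPerPixel / 8) / 1_000_000
print(megabytes)                     // ≈ 73 MB, versus a few MB for a 12MP HEIC
```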


Now Let’s Talk About Sensor Size

There are two primary distinctions between the cameras on smartphones and those on DSLRs and high-quality mirrorless cameras. The first is the quality of the lenses. In both size and cost, the lenses available for standalone cameras are in a different league from those found in smartphones; a professional or serious hobbyist photographer may spend several thousand dollars on a single lens. Smartphone cameras simply cannot compare.

The second is the size of the sensor. All other factors being equal, the larger the sensor, the higher the image quality. Due to their slim design and the multitude of other components they must accommodate, smartphones have significantly smaller sensors than dedicated cameras. Since capturing shallow depth of field naturally is challenging with a smartphone-sized sensor, Apple compensates with Portrait mode and Cinematic video.

Apple’s Big Sensor + Limited Megapixel Approach

There are both obvious and less obvious constraints on the size of the sensor that can fit in a smartphone, but Apple has traditionally used larger sensors than other smartphone companies, which is one reason the iPhone has been regarded as having the best smartphone camera for so long. But there’s another part to the explanation: for the best image quality, a smartphone’s pixels should be as large as possible.

It’s also why Samsung’s approach of packing 108 megapixels onto a sensor comparable in size to Apple’s 12-megapixel one comes at a cost: the more pixels you squeeze onto a small sensor, the smaller each one is, and the more apparent noise becomes, especially in low light. All of which is a long-winded way of explaining why I consider the larger sensor, pixel-binning, and the Photonic Engine significantly more important than the 48MP sensor.

  • iPhone 14 Pro/Max sensor is 65% larger

This year’s iPhone 14 Pro/Max has a main camera sensor that’s 65% bigger than the previous model’s. That’s obviously nowhere near what you’d get with a dedicated camera, but it’s huge for a smartphone! Yet, as discussed up top, quality would suffer if Apple packed four times as many pixels into a sensor that is only 65% bigger, and that is exactly why 12MP will remain the default resolution.
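As a sanity check on that 65% figure, you can work backwards from Apple’s published pixel sizes: the iPhone 13 Pro’s main sensor used 1.9µm pixels, and the 14 Pro’s binned quad pixels measure 2.44µm. Since both deliver 12MP images, the per-pixel area ratio mirrors the sensor area ratio:

```swift
// Verifying the "65% larger" claim from Apple's published pixel sizes.
let oldPixelMicrons = 1.9    // iPhone 13 Pro main camera pixel
let quadPixelMicrons = 2.44  // iPhone 14 Pro binned quad pixel
let areaRatio = (quadPixelMicrons * quadPixelMicrons) / (oldPixelMicrons * oldPixelMicrons)
print(areaRatio)             // ≈ 1.65, i.e. about 65% more light-gathering area
```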

  • Pixel-binning

Apple employs pixel-binning to produce 12MP photos from the main camera. The 48MP sensor is effectively used as a larger 12MP one: the data from each group of four pixels is combined into one virtual pixel (by averaging the readings). The following oversimplified example nevertheless serves to convey the main idea:
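Here’s that idea as a minimal Swift sketch. Real binning happens on the sensor itself and has to respect the quad-Bayer color filter, so a plain average over a grid of numbers is deliberately simplistic:

```swift
// Oversimplified 2x2 pixel binning: average each quad of sensor readings
// into one "virtual" pixel, turning a 48MP grid into a 12MP one.
func bin2x2(_ sensor: [[Int]]) -> [[Int]] {
    let rows = sensor.count / 2
    let cols = (sensor.first?.count ?? 0) / 2
    var binned = Array(repeating: Array(repeating: 0, count: cols), count: rows)
    for r in 0..<rows {
        for c in 0..<cols {
            let sum = sensor[2*r][2*c] + sensor[2*r][2*c + 1] +
                      sensor[2*r + 1][2*c] + sensor[2*r + 1][2*c + 1]
            binned[r][c] = sum / 4  // one quad pixel = the average of four readings
        }
    }
    return binned
}

// bin2x2([[10, 20], [30, 40]]) → [[25]]
```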

What exactly does this mean? Pixel sizes are measured in microns (one micron is one-millionth of a meter). The pixels on high-end Android smartphones are typically between 1.1 and 1.8 microns across. The individual pixels on the iPhone 14 Pro/Max sensor are 1.22 microns, but when binned for 12MP output, each 2×2 group behaves like a single 2.44-micron pixel. That’s a huge step forward. Without pixel-binning, the 48MP sensor would typically produce inferior results, especially in low light.

  • Photonic Engine

Of course, smartphone cameras can’t compete with dedicated cameras in terms of optics and physics, but they can in terms of computational photography. And for decades now, computational photography has been a staple of standalone cameras too. Changing metering modes, for instance, tells the DSLR’s internal computer to apply a different interpretation to the sensor’s raw data. Selecting a particular “photo mode” on a consumer DSLR or mirrorless camera likewise instructs the camera’s processor on how to manipulate sensor data to produce the desired effect.

As a result, computational photography plays a considerably more significant role in standalone cameras than is commonly believed. And when it comes to computational photography, Apple really shines.

Apple’s Deep Fusion approach to computational photography is now driven by the Photonic Engine, which applies that processing earlier in the pipeline, to uncompressed image data, and I can already see a tremendous improvement in the dynamic range of my photographs. (Next week’s iPhone 14 Diary entry will provide examples.) Not just in terms of tonal separation, but also in the savvy choices made about which shadows to lift and which highlights to rein in. The result is vastly improved photographs, with quality owed equally to software and hardware.

  • Wrap-up

To recap: a larger-than-average sensor (by smartphone standards) greatly improves photo quality. For most images, Apple’s pixel-binning technique effectively turns the 48MP sensor into a much bigger 12MP one, while still letting you tap the full resolution when you really need it. And the Photonic Engine applies Apple’s image processing earlier in the pipeline, with practical results that are already apparent to me. Once I’ve had a chance to put the camera through its paces over the next few days, I’ll write a more in-depth entry for the iPhone 14 Diary.