Based on materials from android-softwares.com
It was 2012. The smartphone market was already fairly mature, but the quality of pictures taken with mobile devices still left much to be desired. Most manufacturers were only beginning to take mobile photography seriously, and it still had a long way to go. And then the Nokia 808 PureView arrived.
It had Carl Zeiss optics, the first 41-megapixel sensor on the market, and powerful software. The 808 PureView was meant to redefine mobile photography. The following year, Nokia launched the legendary Lumia 1020 with 3-axis OIS and an enhanced camera app. Retaining the same 41 MP resolution, the Lumia 1020 used an improved sensor with back-illuminated technology. It also ran Windows Phone 8 rather than Nokia's Symbian.
With such a combination of hardware and software, the Lumia 1020 should have set the tone for the market for years to come. So why haven't we seen smartphones with the same technology since?
Diffraction, Airy Disks and Image Quality
There are many possible answers to this question. One of them involves diffraction, so we will have to wade a little deeper into the technical weeds.
Light waves normally travel in straight lines; they change direction when they are refracted at the boundary between media or reflected from a surface. Diffraction is a different phenomenon: when light waves meet an obstacle, they bend around its edges, and the bent waves inevitably interfere with one another.
Think of the obstacle as a wall with a small circular hole: the light waves passing through the hole will experience at least some diffraction, and the amount depends on the size of the hole. A larger aperture (which lets most of the light waves through) causes less diffraction, while a smaller aperture (which blocks most of the waves) causes more. Something similar happens in a camera lens. The two images below help illustrate the phenomenon.
As you can see, the diffracted light spreads outward in a concentric pattern. In a camera lens, as light passes through the aperture, a similar pattern forms on the sensor: a bright spot in the center surrounded by concentric rings. The bright central spot is called the Airy disk, and the whole pattern the Airy pattern, named after Sir George Biddell Airy, who gave the first theoretical description of the phenomenon in 1835. In general, a narrower aperture produces more diffraction and therefore larger Airy disks.
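To put rough numbers on that relationship: the diameter of the Airy disk out to its first dark ring is commonly approximated as 2.44 × λ × N, where λ is the wavelength of light and N is the f-number of the lens. The short sketch below applies this approximation for green light (λ ≈ 550 nm) at a few illustrative f-numbers; the specific apertures are examples chosen for the demonstration, not values from the article.

```python
# Approximate Airy disk diameter (out to its first dark ring): d ≈ 2.44 * λ * N,
# where λ is the wavelength of light and N is the f-number of the lens.
WAVELENGTH_UM = 0.55  # green light, roughly the middle of the visible spectrum

def airy_disk_diameter_um(f_number, wavelength_um=WAVELENGTH_UM):
    """Approximate Airy disk diameter on the sensor, in micrometres."""
    return 2.44 * wavelength_um * f_number

# Illustrative apertures only, from wide (f/1.8) to narrow (f/8).
for n in (1.8, 2.2, 4.0, 8.0):
    print(f"f/{n}: Airy disk ≈ {airy_disk_diameter_um(n):.2f} µm")
```

Stopping down from f/1.8 to f/8 roughly quadruples the disk diameter in this sketch, which is why narrow apertures are so punishing on tiny sensors.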
The size of the Airy disks and the spacing between neighboring disks play an important role in the overall detail and clarity of the final image. During shooting, light from every point in the scene passes through the lens, forming many Airy disks on the sensor.
Diffraction-limited optical systems
A camera sensor is essentially a pixel array. When a shot is taken, the sensor is illuminated and the pixels convert the light information into a digital image. On smaller, higher-resolution sensors where the pixel density is high, the diameter of the Airy disks can be larger than the size of a single pixel, causing the disks to spread out over several pixels, resulting in a noticeable loss in image clarity.
With a narrow aperture the problem is compounded as neighboring Airy disks begin to overlap. This is what 'diffraction-limited' means: the sharpness of the image is capped by diffraction itself rather than by the lens or the sensor. There are several interesting ways of dealing with this.
Ideally, each Airy disk should be small enough to stay within a single pixel instead of spilling over into its neighbors. On most recent flagships, the pixels are not much smaller than the Airy disks those systems produce. To keep the disks from smearing across many pixels on such small sensors, the resolution has to be limited. Raising the resolution without enlarging the sensor instead degrades image quality. Even worse, smaller pixels gather less light, which is why low-light shooting suffers as well.
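A rough sketch of that trade-off follows, using illustrative numbers only: a fixed sensor about 6.2 mm wide (roughly the 1/2.3-inch class), a 4:3 aspect ratio, square pixels, and an f/2.0 lens are all assumptions made for the example. As the megapixel count rises on the same sensor, the pixel pitch drops further below the Airy disk diameter, so each disk smears across more pixels.

```python
import math

# Illustrative fixed sensor: roughly 1/2.3-inch class, ~6.2 mm wide, 4:3 aspect ratio.
SENSOR_WIDTH_MM = 6.2
AIRY_DISK_UM = 2.44 * 0.55 * 2.0  # Airy disk diameter at f/2.0, green light (~2.7 µm)

def pixel_pitch_um(megapixels):
    """Pixel pitch for a 4:3 sensor with square pixels at the given resolution."""
    width_px = math.sqrt(megapixels * 1e6 * 4 / 3)  # horizontal pixel count
    return SENSOR_WIDTH_MM * 1000 / width_px        # micrometres per pixel

for mp in (12, 20, 41, 64):
    pitch = pixel_pitch_um(mp)
    print(f"{mp:>2} MP: pixel pitch {pitch:.2f} µm, "
          f"Airy disk spans ≈ {AIRY_DISK_UM / pitch:.1f} pixel widths")
```

On this hypothetical sensor, 12 MP works out to a pitch of about 1.55 µm, while 41 MP pushes it below 1 µm, with the f/2.0 Airy disk covering three or more pixel widths.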
It may seem counterintuitive, but a lower-resolution sensor can sometimes produce a better image simply because its larger pixels sidestep the problem described above.
However, overly large pixels cannot capture an image with sufficient detail. To reproduce the information coming from the source as accurately as possible, the sampling frequency must be at least twice the highest frequency present in the signal. In the English-language literature this statement is known as the Nyquist theorem; in the Russian-language literature, as the Kotelnikov theorem. Put simply, the sharpest photos come from a sensor that samples the image projected by the lens at least twice as finely as its finest detail.
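As a toy, one-dimensional illustration of that rule (not specific to cameras): the sketch below samples a 10 Hz sine wave above and below twice its frequency. Sampled fast enough, the dominant frequency is recovered correctly; undersampled, it aliases to a lower frequency and the original detail is lost. The signal and sampling rates are arbitrary choices for the example.

```python
import numpy as np

SIGNAL_HZ = 10.0   # highest frequency present in the "scene"
DURATION_S = 1.0

def apparent_frequency(rate_hz):
    """Sample a 10 Hz sine at rate_hz and return the strongest frequency seen."""
    t = np.arange(0.0, DURATION_S, 1.0 / rate_hz)
    x = np.sin(2 * np.pi * SIGNAL_HZ * t)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / rate_hz)
    return freqs[1:][np.argmax(spectrum[1:])]  # skip the DC bin

print("sampled at 25 Hz (> 2x):", apparent_frequency(25.0), "Hz")  # ~10 Hz: detail kept
print("sampled at 12 Hz (< 2x):", apparent_frequency(12.0), "Hz")  # aliased to ~2 Hz
```

A camera sensor does the same thing in two dimensions: pixels that are too coarse relative to the detail projected by the lens cannot record that detail faithfully.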
But the sampling theorem assumes an ideal signal, and diffraction in smartphone cameras ensures the signal is anything but ideal. So Nokia's sensor could hide some of its flaws thanks to its high resolution and oversampling, yet its photos were never as sharp as the megapixel count suggested.
So in a smartphone, where size constraints are unavoidable, the loss of image quality caused by diffraction becomes a pressing problem, especially for small, high-resolution sensors.
The evolution of cameras in smartphones
Smartphone cameras have come a long way, but they cannot override the laws of physics. While Nokia bet on a large sensor combined with very high resolution, the rest of the industry chose to limit sensor resolution to minimize diffraction-related problems. As the table below shows, the first Pixel, with its modest camera specs, suffers less from diffraction than the Lumia 1020, especially once you factor in how much sensors have improved over time.
Smartphone | Aperture | Sensor format (type, inches) | Airy disk size (µm) | Pixel size (µm)
Google Pixel / Pixel XL | f/2.0 | 1/2.3 | 2.7 | 1.55
Nokia Lumia 1020 | f/2.2 | 1/1.5 | 2.95 | 1.25
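The Airy disk figures in the table follow from the same 2.44 × λ × N approximation used earlier, assuming green light at about 550 nm. The sketch below reproduces them from the listed apertures and relates each disk to the listed pixel size, which is how the Lumia 1020's disadvantage shows up in practice.

```python
# Reproduce the table's Airy disk sizes from the apertures (λ ≈ 0.55 µm assumed)
# and compare them with the listed pixel sizes.
WAVELENGTH_UM = 0.55

phones = {
    "Google Pixel / Pixel XL": {"f_number": 2.0, "pixel_um": 1.55},
    "Nokia Lumia 1020":        {"f_number": 2.2, "pixel_um": 1.25},
}

for name, spec in phones.items():
    airy_um = 2.44 * WAVELENGTH_UM * spec["f_number"]
    print(f"{name}: Airy disk ≈ {airy_um:.2f} µm, "
          f"≈ {airy_um / spec['pixel_um']:.1f}x the pixel size")
```

In this sketch the Pixel's Airy disk is roughly 1.7 times its pixel width, while the Lumia 1020's is about 2.4 times, so each disk spreads over noticeably more pixels on the Nokia.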
Sensors, image processors, and AI-based software algorithms have evolved significantly over the past decade. But the best they can do is compensate for the image quality lost in a diffraction-limited optical system. The Lumia 1020's sensor broke new ground in 2013, yet the sensors in modern Android smartphones outperform it in every respect while taking up 40% less space.
Conclusion
Nokia's 41 MP sensor used oversampling to hide its flaws. Meanwhile, it is easier and cheaper to build a sensor with a more sensible resolution than to return to the megapixel wars. So the Android world developed in a more logical direction, and Nokia's 41-megapixel experiment became a byword, though not for the success of the approach.
For the foreseeable future, smartphones will continue to ship with 12-16 MP sensors, and better image quality will come from optimizing the hardware and software ecosystem rather than from ultra-high-resolution sensors.