Based on materials from androidauthority.com
How many cameras are optimal for a smartphone? At what point does the benefit end and multi-camera frustration begin?
2018 can rightfully be called the year of the triple camera, as both Huawei and LG released smartphones with distinctive main camera modules.
But Samsung decided that three was not enough, and last month it released the Galaxy A9 (2018) with four cameras. Putting such a camera setup in a non-flagship device is a bold move, but can it be considered the start of a new market trend? Let's try to figure it out.
What's missing with dual cameras?
Smartphones with dual cameras come in just five varieties: the second camera can be a depth sensor, a telephoto camera, a super-wide-angle camera, a monochrome camera, or a low-light camera.
Smartphones with a telephoto, super-wide-angle, monochrome, or low-light camera as the second module can generally gather the information needed for portraits and other effects, reducing the need for a dedicated depth sensor. A depth-only camera usually appears in cheaper phones, when the manufacturer wants to add that effect at minimal cost.
Choosing a dual camera gives you more flexibility than a single camera. However, it is hard to decide which type of second camera to go with.
You might want to fit more into the frame with a wide-angle camera. Maybe you need a low-light camera so your nightclub photos look presentable. Or perhaps you just want a good zoom. Whatever you want, when buying a smartphone with only two cameras, you are forced to choose.
What if you add another one?
Moving to three cameras gives you even more flexibility. For example, you no longer need to choose between a telephoto and a wide-angle camera: both can be added alongside the standard one.
The Huawei P20 Pro was the first to feature a triple camera: a 40MP main camera plus an 8MP 3x telephoto camera and a 20MP monochrome camera. The combination of optics and 'smart' software makes it possible to get great shots in low light and impressive zoom.
Huawei has long used a monochrome camera to improve low-light shots, but it also uses this camera to produce better digital zoom (which it calls 'hybrid'). In the P20 Pro, the combination of a 3x telephoto camera and a high-resolution 40MP sensor delivers comparable image quality at 3x and 5x zoom.
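To get a feel for how such 'hybrid' zoom could work, here is a rough, illustrative Python sketch: it simply switches between the optical telephoto frame and a centre crop of the high-resolution main frame, depending on the requested zoom factor. The function names and the 3x threshold are assumptions made for illustration; Huawei's actual pipeline fuses data from several sensors and is far more sophisticated.

    import numpy as np

    def center_crop(frame, factor):
        """Keep the central 1/factor portion of the frame (no upscaling)."""
        h, w = frame.shape[:2]
        ch, cw = int(h / factor), int(w / factor)
        top, left = (h - ch) // 2, (w - cw) // 2
        return frame[top:top + ch, left:left + cw]

    def hybrid_zoom(main_frame, tele_frame, zoom, tele_factor=3.0):
        """Toy sketch of 'hybrid' zoom: use the optical telephoto frame when it
        covers the requested zoom, otherwise crop the high-resolution main frame.
        This is a simplified assumption, not the vendor's actual algorithm."""
        if zoom >= tele_factor:
            # The telephoto already gives tele_factor x optical zoom; any extra
            # magnification comes from cropping the telephoto frame.
            return center_crop(tele_frame, zoom / tele_factor)
        # Below the telephoto's reach, crop the 40MP-class main frame:
        # plenty of pixels remain even after discarding the borders.
        return center_crop(main_frame, zoom)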
The LG V40 uses a triple camera module that, on paper, is even more flexible. The LG smartphone has a 12MP main camera, a 12MP camera with a 2x zoom lens, and a 16MP camera with a wide-angle lens. This setup gives you a radically different perspective from each camera in a single phone.
LG will apparently stick with this triple camera mix in its 2019 smartphones, while Huawei has already changed things up in the Mate 20 Pro. The company ditched the monochrome camera in this model, keeping the standard and zoom cameras and adding a 20MP camera with a wide-angle lens. As a result, the Mate 20 Pro retains the ability to take high-quality images at 3x and 5x zoom.
How about four main cameras?
Samsung's engineers decided that a fourth camera in the module could improve image quality even further, and the result was last month's Galaxy A9 (2018) with four cameras on the back. To the usual standard, telephoto, and wide-angle cameras, the Korean company added a 5MP sensor for measuring the depth of the frame.
This is a rather unusual move, as the wide-angle and zoom cameras capture depth information anyway. In fact, on the Galaxy Note 8, Galaxy S9 Plus, and Galaxy Note 9, the Live Focus feature works using the telephoto camera. A dedicated depth sensor can, however, produce an even more detailed depth map, which matters for portrait shots and bokeh effects.
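As an illustration of why a detailed depth map matters, here is a minimal Python sketch of depth-driven background blur: pixels far from the chosen focus plane are blended with a blurred copy of the image. This is only a toy version of the idea (using OpenCV's Gaussian blur), not Samsung's Live Focus algorithm, and the parameter values are assumptions.

    import cv2
    import numpy as np

    def synthetic_bokeh(image, depth, focus_depth, max_kernel=31):
        """Rough sketch of depth-driven background blur ('bokeh').
        `image` is assumed to be a color image of shape (H, W, 3);
        `depth` is a per-pixel depth map normalized to [0, 1]. Pixels far
        from `focus_depth` are taken mostly from a blurred copy."""
        # Pre-compute one strongly blurred version of the whole image.
        blurred = cv2.GaussianBlur(image, (max_kernel, max_kernel), 0)
        # Blend weight: 0 at the focus plane, 1 far away from it.
        weight = np.clip(np.abs(depth - focus_depth) * 2.0, 0.0, 1.0)
        weight = weight[..., None]  # broadcast over the color channels
        return (image * (1.0 - weight) + blurred * weight).astype(image.dtype)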
The Galaxy A9 (2018) combination is not the only option, either: other brands could just as well swap the depth sensor for a monochrome camera, a high-sensitivity camera for low-light photography, or another telephoto lens.
But the question arises: how much does adding one more camera to the module actually improve the resulting images? It is hard to say so far, and it creates a problem of its own.
Space issue
Unlike bulky DSLRs, smartphones have little free space for their cameras. The case also has to fit a screen, a chipset, a battery, and a host of other components, which leaves hardly any room to spare. For this reason, mobile sensors are already extremely compact, so they cannot gather as much light as the far larger sensors of full-size cameras. This forces developers to implement more sophisticated image-processing algorithms (such as image averaging) to narrow the quality gap with images from large sensors.
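For the curious, here is a minimal Python sketch of the frame-averaging idea: averaging N aligned frames from a small, noisy sensor keeps the signal while cutting random noise roughly by a factor of the square root of N. Real pipelines also align and weight the frames; the demo values below are made up purely for illustration.

    import numpy as np

    def average_frames(frames):
        """Average a burst of aligned frames to suppress random sensor noise."""
        stack = np.stack([f.astype(np.float32) for f in frames])
        return stack.mean(axis=0)

    # Hypothetical demo: 8 noisy captures of the same (flat gray) scene.
    rng = np.random.default_rng(0)
    scene = np.full((480, 640), 128.0, dtype=np.float32)
    burst = [scene + rng.normal(0, 10, scene.shape) for _ in range(8)]
    print(np.std(burst[0] - scene))               # noise of one frame, about 10
    print(np.std(average_frames(burst) - scene))  # about 10 / sqrt(8), i.e. ~3.5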
But if you add more cameras, there are several ways to find the space: shrink each camera's sensor even further, shrink the other components, or completely redesign the phone to fit the extra hardware. Go with the first option, and no image-processing algorithm will be able to hide the difference. Shrinking other components is also undesirable, since in many cases it degrades the device's specs (a smaller battery, for example, means shorter battery life). That leaves a complete redesign, where increased overall thickness or a larger camera bump may become a problem.
At the same time, adding another camera does not automatically improve image quality. If a company uses poor cameras in its models, a four-camera smartphone will still take poor pictures; all that changes is that the device will offer more perspectives with equally poor quality.
However, the current era of dual-camera smartphones is driven by the fact that the second camera brings more value (flexibility and more data for processing) than the problems described above.
The big question is figuring out the point at which the negatives begin to outweigh the positives. The photography industry offers one example: the Light L16 camera.
The L16 camera was released last year and packs as many as 16 (!) camera modules with lenses of different focal lengths (from 28 to 150 mm). Working together, these cameras can produce high-quality zoomed shots and images at up to 52MP. It sounds beautiful in theory.
In practice, things turned out less rosy. The sluggish shooting speed, the need to use the Light Lumen app for most settings, and the wild variation in the quality of the resulting frames caused understandable disappointment in reviews. Add poor low-light performance and the lack of video recording, and you get multi-camera frustration.
Software improvements may eventually help the Light L16 shed its shortcomings, but the initial result looks like a warning to the makers of Android smartphones: simply adding cameras creates more problems than it solves.
In the meantime, Huawei has demonstrated that fantastic results can be achieved with a block of three cameras. Triple modules currently seem to cover the hardware needs best, so we will see them in the flagships of 2019 as well. According to rumors, the Samsung Galaxy S10 will get just such a setup.
Do you also think it's too early for four cameras? Or not? Share your opinion in the comments!