Big Myths About Phone Cameras You Need To Stop Believing

The development of smartphone cameras ranks among the most remarkable innovation stories in recent history. Compared to where we were two decades ago, today's cameras offer photos and videos that come incredibly close to real life. A quick comparison of today's midrange smartphone camera with that of the first iPhone reveals stark differences in detail, color gamut, and overall picture quality. Even better, the latest flagships keep us excited about the possibility of higher-quality media in the future.

That said, it's easy to get the wrong idea about how your camera works or what it can do. On the one hand, most smartphone camera specifications are quantified (megapixel count, sensor size, and the number of lenses, for instance), so it may seem that simply increasing these numbers results in better picture quality. On the other, smartphone cameras are marketed with tempting features like "AI processing" and "astrophotography" that blur the line between marketing and reality. As this article explains, much of what is commonly believed about how smartphone cameras work is, well, inaccurate.

Of course, photo and video quality is also heavily influenced by the user and the shooting conditions. Taking photos in a well-lit setting, keeping the lenses clean, using gridlines, and capturing in high dynamic range are some of the steps that help bring out the best in your camera. But there's little you can do to change how the camera itself operates once it leaves the factory.

More lenses mean better photos

Over the past few years, there's been an industry-wide movement toward fitting multiple camera lenses on smartphones. This is not without good reason, as each lens type expands what's possible in smartphone photography. The iPhone 14 Pro, for instance, features a triple-camera setup on the rear: a 48-megapixel (MP) main sensor, a 12 MP ultrawide, and a 12 MP telephoto. The ultrawide camera captures a wider field of view than the main camera, and the telephoto lens is purpose-built to maintain high quality at longer focal lengths.

But that's about where it ends. While it's difficult to achieve all of this with a single lens, it doesn't follow that engineering several cameras onto a smartphone automatically improves the photography or videography experience. The Nokia 9 PureView perfectly illustrates this point, with its five-camera setup consisting of three monochrome sensors and two RGB sensors, all capped at 12 MP with the same f/1.8 aperture.

According to Nokia, the cameras were designed to work together, retrieving more light than could possibly pass through one lens, to produce "photos with superb dynamic range that capture detail and texture from both the highlights and shadows, plus incredible depth-of-field and vibrant, truest-to-life color." The phone also boasted RAW photography, which should have been of superior quality given the synergy of five lenses. Instead, the Nokia 9 PureView's disappointing real-world photography stands as proof that you don't automatically get better pictures the more lenses you incorporate.

More megapixels mean better photos

Ultimately, most smartphone users just want great photos and videos. Because manufacturers understand this, they use every tool at their disposal to promote each new phone as a revolutionary moment in smartphone photography. One of those tools is the megapixel count, which has climbed from less than five megapixels on a phone camera two decades ago to 200 megapixels on some of the latest flagships.

Simply put, a pixel is a single point of information in a digital image, and a megapixel is one million of them. The megapixel count is the foundational indicator of how much information the camera can hold in a shot. A higher megapixel count therefore means the camera can record more detail in an image or video frame, which also increases the file size of the media. This is why an image taken with a higher-megapixel camera retains more detail and quality than a lower-megapixel image when both are zoomed in to the same percentage.
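
For a sense of the numbers, here's a quick back-of-the-envelope calculation in Python. The 4000 x 3000 layout is one common way a 12 MP image is arranged, and the tenfold JPEG compression figure is a rough rule of thumb rather than a fixed ratio:

```python
# Rough arithmetic relating resolution, megapixels, and file size.

width, height = 4000, 3000          # a typical 12 MP photo
megapixels = width * height / 1_000_000
print(f"{megapixels:.1f} MP")       # 12.0 MP

# Uncompressed 8-bit RGB needs 3 bytes per pixel; JPEG typically
# compresses that by roughly 10x, so this estimate is coarse.
uncompressed_mb = width * height * 3 / 1024**2
print(f"~{uncompressed_mb:.0f} MB uncompressed, ~{uncompressed_mb / 10:.1f} MB as JPEG")
```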

But a higher count does not necessarily translate into better picture quality. Megapixels are distinct from other characteristics of the camera, such as aperture, sensor size, and ISO, all of which play a role in the overall output. Just as important with today's high-megapixel cameras is a technique called pixel binning, by which the data from multiple neighboring pixels is combined into one larger effective pixel.
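
Here's a minimal sketch of the idea behind 2x2 pixel binning, using a plain NumPy array as a stand-in for sensor data. Real cameras bin raw sensor readouts before color processing, so this only illustrates the grouping arithmetic:

```python
import numpy as np

def bin_2x2(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one output pixel,
    trading resolution for a cleaner signal."""
    h, w = img.shape
    cropped = img[:h // 2 * 2, :w // 2 * 2]  # drop any odd row/column
    return cropped.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

sensor = np.random.randint(0, 256, size=(8, 8)).astype(float)
binned = bin_2x2(sensor)
print(sensor.shape, "->", binned.shape)  # (8, 8) -> (4, 4)
```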

The lower the aperture, the better the picture

It's worth mentioning at this point that smartphone users have different ideas of what makes a photo great. Some consumers prize brightness, and hence prefer a lower f-stop (a wider aperture) to let more light in. Others want more contrast in their pictures, in which case the aperture matters less to them. Despite this subjectivity, it's safe to say that smartphone users are all for camera setups that make them look good.

The aperture on your smartphone's camera determines how much light reaches the sensor to produce the image. It is a hole in the camera lens, as the name implies, and its size is measured in "f-stops": the lower the f-stop number, the wider the aperture, and the more light is let into the camera. Thus, an f/1.8 lens lets in more light than an f/2.8 lens. And while professional cameras have variable apertures, smartphone apertures are fixed (with rare exceptions).
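
The light gathered scales with the area of that hole, which goes as the inverse square of the f-number. A couple of lines of Python make the f/1.8 versus f/2.8 comparison concrete:

```python
def relative_light(f_wide: float, f_narrow: float) -> float:
    """How many times more light the wider (lower f-number) aperture
    admits, since gathered light scales with 1 / f_number**2."""
    return (f_narrow / f_wide) ** 2

print(f"f/1.8 admits ~{relative_light(1.8, 2.8):.1f}x the light of f/2.8")  # ~2.4x
```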

It's obvious that wide-aperture cameras are better suited to nighttime photos, but the story is different for daytime shots. Lowering the f-stop increases the exposure of shots, which in some cases can make photos unnecessarily bright. However, a wide aperture also creates the possibility of background blur in photos, and this can produce spectacular results if done right. The challenge for manufacturers is to find an aperture that delivers impressively on both fronts.

Depth of field requires two camera lenses

While multiple cameras do not necessarily lead to better picture quality, one of the most heavily advertised use cases for a second camera is creating a background blur that accentuates the subject in the foreground. This second camera is typically a telephoto lens, though a monochrome sensor can also be employed.

The smartphone achieves the depth-of-field effect by taking pictures with both camera sensors simultaneously, applying a blur filter to the background image, and then overlaying the main image on this blurred background. Although the result can show some inaccuracies in edge detection and softness, it is generally good enough for the photo to be admired.
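
As an illustration, here's a minimal Python sketch of that compositing step, assuming a subject mask has already been worked out (phones derive it from the second sensor's depth data or a segmentation model; here it's just a hardcoded rectangle). The OpenCV blur stands in for whatever filter the phone actually applies:

```python
import numpy as np
import cv2  # OpenCV, used here only for the blur

# Synthetic stand-ins for a captured frame and its subject mask.
image = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
mask = np.zeros((480, 640), dtype=bool)
mask[120:360, 200:440] = True  # pretend this region is the subject

# Heavily blur the whole frame to serve as the background layer.
blurred = cv2.GaussianBlur(image, (31, 31), 0)

# Keep sharp pixels where the mask says "subject", blurred elsewhere.
portrait = np.where(mask[..., None], image, blurred)
```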

In any case, you don't always need this second lens to create such an effect in your photos. For instance, before Google's Pixel phones started featuring multi-camera setups, its single cameras achieved commendable results in depth-of-field effects. These flagship cameras were designed with computational processing algorithms to detect the main focus of a photo and blur its other parts. 

In addition, apps like Bokeh Lens on iOS and AfterFocus on Android can approximate these effects when taking shots. And if your single-camera phone has a "Pro" mode, there's a chance you can adjust the focus manually and blur everything beyond a certain distance. The possibilities are not endless, but you can certainly achieve background blur without a second camera.

AI Camera Smarts Always Make Photos and Videos Better

It's 2022, and the term "artificial intelligence" (AI) has become the ultimate marketing keyword in the tech space. AI photography can mean many different things — it's not an objective descriptor for one specific sort of feature. The presence or absence of "AI" in a smartphone's camera setup does not make or break the potential for it to capture top-notch photos and video.

It's true that AI technologies can optimize image quality by modifying advanced properties of the image, such as dynamic range, ISO, and white balance. They can also detect exactly what elements are in an image, and find ways to modify the photo based on these elements. In some smartphones, these technologies also set up parameters before a photo is captured. As such, they can generate photos that are considerably better than the original shots, while retaining picture quality and sharpness.
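
For a flavor of the kind of correction involved, here's a sketch of automatic white balance using the classic "gray world" heuristic, which assumes the average color of a scene should be neutral gray. This is a simple non-AI baseline, not what any particular phone ships; learned pipelines perform far more elaborate versions of the same adjustment:

```python
import numpy as np

def gray_world_wb(img: np.ndarray) -> np.ndarray:
    """Scale each color channel so its mean matches the overall
    gray level, removing a uniform color cast."""
    out = img.astype(float)
    channel_means = out.reshape(-1, 3).mean(axis=0)
    out *= channel_means.mean() / channel_means
    return np.clip(out, 0, 255).astype(np.uint8)
```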

The problem is that in some cases, AI algorithms modify images based on a preprogrammed definition of what a great photo should look like — a definition that might not be accepted by everyone. Thus, while some smartphone users may absolutely enjoy taking photos with an AI-influenced camera, others would rather disable this feature. In general, AI camera features will truly come of age when they can learn from the patterns and preferences of smartphone users, and subsequently edit photos to work for them.

Image Stabilization Leads to 100% Stable Videos

Among the several improvements recorded in smartphone camera technology, image stabilization remains one of the most significant. We can now record videos that account for and adjust to the instability of human movement. There's no doubt that image stabilization has improved the smartphone photography experience by leaps and bounds.

Two techniques currently dominate. Optical image stabilization (OIS) works by physically shifting the camera lens to counterbalance the movements of the smartphone user, guided by a gyroscopic sensor tuned to high levels of sensitivity and precision.

Electronic image stabilization (EIS), on the other hand, works in software, adjusting each frame to fit a control point, which creates artificial stability in the footage. Finally, hybrid image stabilization combines OIS and EIS to ensure that captured media is both stable and sharp.
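
To make the EIS idea concrete, here's a heavily simplified Python sketch: each frame is counter-shifted by the measured camera motion, then cropped to a stable window. The motion values are invented for illustration; a real pipeline reads them from the gyroscope or estimates them by matching features between frames:

```python
import numpy as np

MARGIN = 20  # pixels sacrificed on each edge for stabilization headroom

def stabilize(frame: np.ndarray, dx: int, dy: int) -> np.ndarray:
    """Counter-shift the frame by (-dx, -dy) and crop to the stable window."""
    shifted = np.roll(frame, shift=(-dy, -dx), axis=(0, 1))
    return shifted[MARGIN:-MARGIN, MARGIN:-MARGIN]

frame = np.zeros((480, 640, 3), dtype=np.uint8)
stable = stabilize(frame, dx=5, dy=-3)  # camera drifted right 5 px, up 3 px
print(stable.shape)  # (440, 600, 3)
```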

Unfortunately, we're not yet at the point where any of these image stabilization technologies can deliver completely stable video. Look closely and you'll still notice residual shake, which becomes more pronounced when the user's movements are particularly jerky. We can, nonetheless, appreciate what today's image stabilization tech achieves, even as smartphone manufacturers continue to propose new ideas.

For those interested in recording smoother videos, there's always the option of a gimbal. This handheld device uses gyroscopes to keep the phone steady and is more reliable than smartphone image stabilization tech.

All the cameras are used all the time

If you currently have a multi-camera phone, here's a simple experiment you can try. Open the camera app and, while focusing on an object, cover your smartphone's camera lenses one by one. You'll most likely find that covering certain cameras has no effect on the image capture, while covering the main camera obstructs the image.

If your smartphone has a range of camera modes, switch between these modes and cover the camera lenses one by one. Depending on the smartphone, camera algorithm, and image conditions, your camera might correspondingly alternate between lenses, meaning that your main lens does not take all the shots.

Smartphone manufacturers are consistently seeking to improve the functionality and output of their cameras using both software and hardware measures. The move towards multi-camera smartphone setups is an indication of this. Nonetheless, it's seldom the case that all camera lenses are working at the same time. Whether it's to capture normal images with great colors, obtain macro shots at high quality, or optimize photos for smoother editing, the specialization of each camera lens and type is what helps the smartphone user to achieve the desired result.

Even in cases where more than one camera is in operation, such as for depth-of-field shots, the others are most likely not in use. This further underscores the redundancy in smartphone setups with four or more cameras. Interestingly, both midrange and flagship phones feature this multiplicity of cameras while differing greatly in picture quality.

Optical Zoom and Digital Zoom are the Same Thing

Smartphone enthusiasts might recall the Samsung Galaxy S4 Zoom, the 2013 smartphone that featured a retractable camera lens. We might also recall that Samsung did not move forward with similar devices (cameras that are also phones, rather than the other way around), as they ran against the industry's movement toward more portable products. The phone remains a major reminder of the difference between optical zoom and digital zoom, and indeed of the superiority of optical zoom over digital zoom.

Just as in professional cameras, optical zoom works by physically adjusting the lens to increase the magnification of the subject. Image quality is not lost during optical zoom within the lens's limits of focal length and magnification. Digital zoom, on the other hand, operates by enlarging the pictorial information already captured by a group of pixels. The fewer the pixels available, the lower the quality of the image when zoomed in, until the image becomes visibly pixelated.
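
To see why, here's a minimal sketch of 2x digital zoom in Python: crop the center of the frame and stretch it back to full size. The interpolation invents no new detail, it only spreads the existing pixels thinner, which is what makes heavy digital zoom look soft or pixelated:

```python
import numpy as np
import cv2  # OpenCV, used here for the resize

def digital_zoom(img: np.ndarray, factor: float = 2.0) -> np.ndarray:
    """Crop the central 1/factor of the frame and upscale it back
    to the original resolution."""
    h, w = img.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = img[top:top + ch, left:left + cw]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)

photo = np.random.randint(0, 256, size=(3000, 4000, 3), dtype=np.uint8)
zoomed = digital_zoom(photo, 2.0)  # same pixel count, a quarter of the scene
```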

By fitting more megapixels into camera setups, smartphone manufacturers continue to push the boundaries of what can be achieved with digital zoom. At the same time, work on optical zoom lenses for smartphones has not been shelved. It will be exciting to see how smartphone companies navigate the engineering challenge of installing optical zoom cameras on phones with today's form factor, and whether the market embraces it should they eventually succeed.

Smartphones Always Give You RAW, Original Photos

In the nascent days of smartphone photography, both the cameras and their resulting photos were of inferior quality. Simply getting images from the camera into the phone's storage was challenge enough, leaving little opportunity for high-level image processing. Today, the story is different, as that opportunity has grown with advancements in camera software and hardware. If you take a picture on your smartphone, it's more likely than not that the resulting image passed through a series of image processing algorithms before being saved as a file on the phone.

Furthermore, this image processing occurs at different levels. In some smartphones, the algorithms are relatively straightforward: images are modified to reduce noise, increase brightness, and moderate saturation. In others, the algorithms are a bit more complex: image properties are edited based on the environmental conditions of the shot, to strike a balance between the beauty of the subject and the quality of the image. Yet other computational processes involve artificial intelligence and machine learning algorithms that sift through every pixel of the image and, as in the Google Pixel 6 Pro, optimize for skin tone. Certain smartphone companies are even driving the development of custom imaging chips: smartphone chipsets dedicated solely to image processing.
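
As a rough illustration of that first, straightforward tier, here's a toy Python pass that denoises, brightens, and boosts saturation. The specific operations and parameter values are stand-ins; real phone pipelines run far more sophisticated, sensor-specific versions of these steps:

```python
import numpy as np

def simple_pipeline(img: np.ndarray) -> np.ndarray:
    out = img.astype(float)
    # Crude denoise: average each pixel with two shifted neighbors.
    out = (out + np.roll(out, 1, axis=0) + np.roll(out, 1, axis=1)) / 3
    out *= 1.1                                  # mild brightness lift
    gray = out.mean(axis=2, keepdims=True)
    out = gray + 1.2 * (out - gray)             # push colors away from gray
    return np.clip(out, 0, 255).astype(np.uint8)
```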

It seems to be the case that consumers prefer photos that come out looking as good as possible and require no edits. This raises the question: do smartphone users want original photos in the first place?

Cameras work better with social media apps

Speaking of original photos, the rise and acceptance of social media imagery might be an indication that raw photos are not valued as much by smartphone users, especially young people. Taking pictures and videos using Instagram or Snapchat has become a preferred option for many smartphone users, and some even use these as their primary photography apps. Moreover, given the observed quality gap between pictures taken with Instagram on iPhones and on Android phones, smartphone users might be inclined to believe that Instagram for iOS has been built to raise the standards of iPhone cameras.

Well, not really. Beyond edits, filters, and stickers, these applications do nothing to improve picture quality. The distinction between Instagram photos taken with Android phones and those taken with iPhones has more to do with software-hardware integration than with the cameras themselves. Since iPhones pair one company's chipsets with its own operating system, it's easier to develop mobile apps that fully exploit that architecture. This is not the case for Android phones, where, for instance, Android 12 might be installed on a Samsung phone running an Exynos chipset. Until Android-powered smartphones become more standardized across the board, these apps will be limited in what they can achieve camera-wise.

Beyond the fact that social media apps do not improve camera quality, it's worth noting that smartphone users would be mistaken to base their expectations of future cameras on social media imagery. Whether that is the path future camera innovation takes remains to be seen.

Smartphone cameras have surpassed professional cameras

As excited as we are — and should be — about smartphone cameras, and as elegantly as they are advertised by smartphone manufacturers, they haven't quite gotten to the level of professional cameras yet. There's still a long way to go and some tough decisions to make.

Of course, smartphone cameras are already very good. Users can capture almost any type of photo or video with them. Some, like the iPhone 14 Pro, have even been promoted as capable of shooting full-length movies. In addition, smartphones are shipping with ever-improving features for photo, video, and zoom quality. The iPhone 14 Pro's "Cinematic Mode" brings to smartphones what was once thought achievable only on professional video cameras. Without question, it only gets better from here.

For now, though, smartphones haven't quite reached the mark. DSLR cameras are still more true to life than smartphone cameras, as they are better at capturing the patterns of color, light, and shade in photos. Also, professional cameras provide more accurate control over imaging properties such as exposure, focal length, bokeh, ISO, and white balance. Thus, even at equal resolutions, higher quality images are achieved with professional cameras than with smartphone cameras.

If nothing else, smartphone camera development is still driven by a mission to bring the full capability of professional cameras to the smartphone level. That threshold might be crossed sooner rather than later, which is why the future holds great things for smartphone camera technology.
