Will Phones Soon Finish Off the Camera Market?

By David Cardinal

Mobile photo rig for pro shooters

It’s no secret the world’s most popular camera is now the phone. In parallel, standalone camera sales have nose-dived. The point-and-shoot market, in particular, has been imploding, with only a few niches like action cameras, large-sensor photo-enthusiast models, and superzooms hanging on.

So far, there remains a relatively stable market for interchangeable lens cameras, albeit one that's moving rapidly from being DSLR-dominated to nearly all mirrorless designs. There's good reason to believe that market will eventually begin to collapse the way the point-and-shoot market has. Apple's emphasis on the impressive new camera features in its latest iPhones underlines the question of whether it's just a matter of time before the phone truly owns the market for general-purpose cameras. To get the answer, let's look at how far phones have come and what obstacles they still need to overcome.

But Wait, Everyone Told Me Phones Had Awful Lenses

There is a common line of analysis running around the internet that says phone cameras can never be much good because they have tiny plastic lenses. That argument isn't without merit, but it is a bit lazy. Plastic lenses have some unique properties, and when coupled with the processing power in a modern phone, the resulting images can be excellent. On the downside, phone lenses have a lot of distortion, but much of it is of the kind that can be fixed in software (vignetting and barrel distortion, for example).
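To give a sense of what that software correction involves, here is a minimal sketch using OpenCV. The camera matrix and distortion coefficients are illustrative placeholders, not a real phone calibration (actual phones ship per-module calibration data from the factory).

```python
# A minimal sketch of software lens correction with OpenCV. The camera
# matrix and distortion coefficients below are illustrative placeholders,
# not a real calibration.
import cv2
import numpy as np

def correct_lens(image: np.ndarray) -> np.ndarray:
    """Expects an 8-bit color image; returns a distortion- and
    vignetting-corrected copy."""
    h, w = image.shape[:2]

    # Hypothetical intrinsics: focal length ~0.9*w, principal point at center.
    camera_matrix = np.array([[0.9 * w, 0.0, w / 2],
                              [0.0, 0.9 * w, h / 2],
                              [0.0, 0.0, 1.0]])
    # Hypothetical barrel-distortion coefficients (k1, k2, p1, p2, k3).
    dist_coeffs = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])

    # Remap pixels through the lens model to undo barrel distortion.
    undistorted = cv2.undistort(image, camera_matrix, dist_coeffs)

    # Undo vignetting with a simple radial gain map (brighter toward corners).
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(xx - w / 2, yy - h / 2) / np.hypot(w / 2, h / 2)
    gain = 1.0 + 0.4 * r ** 2  # illustrative falloff model
    return np.clip(undistorted * gain[..., None], 0, 255).astype(np.uint8)
```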

Will purists complain that this isn't the same as an optically perfect image off the sensor? Sure. Will the market care about the difference? Not really. On the upside, unlike glass, plastic is easy to mold into any shape desired, which makes aspherical elements easy to produce. Even high-end DSLR lenses typically have only one or two aspherical elements, but phones can use them as needed. That lets phone cameras innovate in optics despite their tiny form factor.

Overall, this means that when there is enough light to give their small sensors a chance, modern phones can capture really excellent images (once they have been converted from their RAW format, either in the phone or later). Also, whether photo purists like it or not, 95 percent of all phone images are viewed on other phones, not in the form of large prints.

Portraits and Creating That Bokeh Magic

Larger sensors make it much easier to deliberately control the depth of field in an image, giving the photographer creative control over the overall look. Clever hardware and software are helping erase that gap between phones and larger cameras. First, depth estimation has become standard for the main camera on high-end phones. Whether it's accomplished with dual cameras (as on many models from Apple, Huawei, Samsung, and others) or with dual-pixel technology (as Google does), it allows synthetic blurring of images to simulate a variety of apertures.

Modern phones use a variety of techniques for estimating depth and creating portrait-like effects
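As a rough illustration of the dual-camera approach, here is a minimal stereo-disparity sketch using OpenCV's block matcher. It assumes two already-rectified grayscale frames (hypothetical inputs) and skips the calibration and machine-learning refinement real phones apply.

```python
# A minimal sketch of dual-camera depth estimation via stereo disparity.
# Assumes two already-rectified grayscale frames from the two rear cameras.
import cv2
import numpy as np

def estimate_nearness(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    # Block matching finds, for each pixel, how far it shifts between the
    # two views; larger disparity means a closer object.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

    # Return a 0..1 "nearness" map; a real pipeline would convert disparity
    # to metric depth using the calibrated baseline and focal length.
    disparity[disparity < 0] = 0.0
    return disparity / max(float(disparity.max()), 1e-6)
```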

Simulated shallow depth of field and background bokeh first appeared in various vendors' "Portrait" modes, which blur objects in the background (and, to a lesser extent, the foreground), along with other tweaks designed to make people look more appealing. Early versions suffered from serious artifacts and tended to be conservative about affecting the foreground. Now, Apple has stolen the spotlight with a more flexible way to control depth of field after shooting: its recently announced post-capture aperture slider. It should be noted, though, that Huawei has been doing the same trick for a while. The quality of the final output has a lot to do with the accuracy of the depth estimation and the quality of the post-processing. Apple, Google, Samsung, Huawei, and others all have high-performance, specialized "AI" chips to help with this task.
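Once a depth (or nearness) map exists, the synthetic blur itself is conceptually simple. The sketch below blends a sharp frame with a blurred copy, weighted by the estimated nearness; the blur radius stands in for the simulated aperture, and the specific values are illustrative rather than any vendor's actual algorithm.

```python
# A minimal sketch of a synthetic "Portrait" blur, assuming a 0..1 nearness
# map like the one above (1.0 = subject, 0.0 = distant background).
import cv2
import numpy as np

def synthetic_bokeh(image: np.ndarray, nearness: np.ndarray,
                    aperture: float = 1.0) -> np.ndarray:
    # A larger simulated aperture means a stronger background blur; the
    # Gaussian kernel size must be odd.
    radius = max(1, int(aperture * 10)) * 2 + 1
    blurred = cv2.GaussianBlur(image, (radius, radius), 0)

    # Keep the subject sharp and fade toward the blurred copy as the
    # estimated distance from the camera grows.
    weight = np.clip(nearness, 0.0, 1.0)[..., None]
    out = weight * image.astype(np.float32) + (1.0 - weight) * blurred.astype(np.float32)
    return out.astype(np.uint8)
```

Re-running this with a different aperture value is essentially all a post-capture slider needs to do, provided the depth map was saved alongside the photo.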

Phones have even pushed past traditional cameras in this area with AI-powered "beautification" capabilities. While aimed primarily at selfies taken with the front camera, the techniques can be applied more broadly. This is another example of how the powerful processors in phones, coupled with massive R&D investments, can deliver capabilities much sooner than traditional camera vendors can figure out how to provide them. Here, too, purists may cry foul. But most users are just fine with a certain amount of help from their cameras in making their photos look better.

Panoramas and HDR

You may be saying: sure, I understand the lens thing and the Portrait feature, but my large-sensor camera has much better dynamic range, and bracketing RAW files is awesome. Yes, I get it. I love shooting bracketed scenes with my Nikon D850 in RAW and carefully post-processing the images. No phone can equal that, at least not so far. But with various kinds of enhanced HDR, recent phones can come close enough for almost everyone. Early phone HDR was a simple bracket of two or three images, blended using a relatively simple tone-mapping algorithm. Google, Apple, and others have now gone well beyond that. They use a larger number of similarly exposed images as the basis for a final, fused image, and they take advantage of the phone's processing power and their own software expertise to isolate and track moving objects, so it's even possible to get HDR images of scenes with substantial amounts of motion.
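To make the burst-fusion idea concrete, here is a minimal exposure-fusion sketch using OpenCV's Mertens merge. Real phone pipelines align the frames and reject or track motion before merging; that step is omitted here.

```python
# A minimal sketch of multi-frame fusion in the spirit described above,
# using OpenCV's Mertens exposure fusion (no alignment or motion handling).
import cv2
import numpy as np

def fuse_burst(frames: list) -> np.ndarray:
    # Mertens fusion weights each pixel by contrast, saturation, and
    # well-exposedness, then blends the burst into a single image with no
    # exposure times or explicit tone-mapping curve required.
    fused = cv2.createMergeMertens().process(frames)  # float output, ~0..1
    return np.clip(fused * 255, 0, 255).astype(np.uint8)
```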


Phone panoramas aren’t perfect — you can see alignment glitches in this one captured with a Pixel 2 — but they do an impressive job in tough conditions and only take a few seconds to capture and process.

Similarly, panoramas used to require a specialized tripod plate, some clever technique, and expensive post-processing software. Now you can create a "good enough for the web" version by simply holding your phone up and moving it around according to the on-screen instructions. This isn't surprising to anyone except the makers of high-end cameras, who seem to have completely neglected to couple their great sensors with any kind of automation or user-friendly interface for creating these images. And, as I'm constantly reminded, hardly any standalone cameras incorporate GPS, so users have to jump through hoops to take advantage of one of the most powerful ways to organize and find images. Phone shooters get that for free.
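The software side of that hand-held workflow is now essentially a commodity. The sketch below uses OpenCV's built-in stitcher as a stand-in for what phones do on-device, assuming a handful of overlapping frames from a sweep.

```python
# A minimal sketch of hand-held panorama stitching: OpenCV's stitcher
# handles feature matching, homography estimation, and seam blending.
# Frames are assumed to overlap by roughly a third.
import cv2

def stitch_panorama(frames):
    stitcher = cv2.Stitcher_create()
    status, pano = stitcher.stitch(frames)
    if status != 0:  # 0 is cv2.Stitcher_OK
        raise RuntimeError(f"Stitching failed with status {status}")
    return pano
```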

Zoom: The Final Frontier

Zoom, or more accurately telephoto, is widely regarded by phone camera designers as the most challenging imaging problem they face. Multiple-camera designs and clever interpolation can mimic zoom between the focal lengths provided, but you still need actual telephoto lenses. Everyone wants to sell thin phones, and thin bodies don't provide enough depth for a long focal length lens married to a decent-size sensor. So even with two- and now three-camera designs, the telephoto lenses have been constrained to a relatively paltry set of focal lengths, all under 100mm (35mm equivalent).

When I captured this image of a definitely-not-stuffed impala with a Nikon DSLR, no phone could have focused and stopped the action in time. Now they can, but the 280mm focal length used is still beyond the reach of phone cameras. Copyright David Cardinal

This problem is important enough that it was a primary reason startup Light.co was received with so much enthusiasm (and funding). By placing a number of telephoto lenses sideways in the body and using mirrors (yes, it's done with mirrors), Light has been able to achieve a telephoto reach of up to 140mm so far. That's nothing to write home about by superzoom and ILC standards, but it is certainly well ahead of the phone market. However, the current Light model is still too thick to be sold as a flagship phone, because the folded optics require placing the sensor sideways, so the body has to be thicker than the sensor is wide. (Also, the current Light L16 is only a camera, so it is entirely stuffed with camera electronics, leaving no room for a phone.) The company will be coming out with some type of phone camera module featuring fewer cameras than the L16, so it will be interesting to see how it performs, what kind of telephoto reach it offers, and at what cost in phone size. There are also some upcoming hybrid phone camera designs, like the RED Hydrogen One, that will blur the boundary between phone and standalone camera photography.

In the meantime, computational imaging is also helping phones address their lack of telephoto and zoom capability. First, it allows the intelligent fusion of images from different focal lengths to synthesize intermediate focal lengths. Huawei and others are also using AI-driven super-resolution, which combines multiple frames shot in a burst into a single higher-resolution image. This is yet another technique made possible by modern phones' ability to capture and process images at 15-30fps.
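As a rough sketch of the burst super-resolution idea (not any vendor's actual pipeline), the code below aligns a handful of frames to the first one, upsamples them, and averages. The sub-pixel handshake between frames is what contributes the extra detail; real implementations use far more robust merging.

```python
# A rough sketch of burst super-resolution: align, upsample, and average
# several hand-held frames. Illustrative only.
import cv2
import numpy as np

def burst_superres(frames: list, scale: int = 2) -> np.ndarray:
    ref = frames[0]
    h, w = ref.shape[:2]
    ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-4)

    accum = np.zeros((h * scale, w * scale, 3), dtype=np.float32)
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        warp = np.eye(2, 3, dtype=np.float32)
        # Estimate the small translation between this frame and the reference.
        _, warp = cv2.findTransformECC(ref_gray, gray, warp,
                                       cv2.MOTION_TRANSLATION, criteria)
        aligned = cv2.warpAffine(frame, warp, (w, h),
                                 flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
        # Upsample and accumulate; the frames' sub-pixel offsets land on
        # different output pixels, which is where the recovered detail comes from.
        accum += cv2.resize(aligned, (w * scale, h * scale),
                            interpolation=cv2.INTER_CUBIC).astype(np.float32)

    return np.clip(accum / len(frames), 0, 255).astype(np.uint8)
```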

It’s Only a Matter of Time

Given the massive investment being poured into phones, it is only a matter of time before they replace every segment of the camera market they are physically capable of serving. Phones aren't the right solution for drones, robots, or cars, for example, and in many cases action cameras don't benefit enough from a display to justify a phone form factor. Of course, there will still be a need and a market for larger cameras, just as there is today for film, but increasingly it will be a matter of preference rather than necessity.

For several years, I’ve participated in a panel at the Electronic Imaging technical conference on what it will take for the phone to be the only camera needed. My presentation was simply a set of photos I couldn’t have taken without my standalone, high-end camera. Each year there are fewer slides in the talk.

In my case, the ergonomics of my Nikon DSLRs make me much more productive than shooting with a phone. Even if my phone produced the same images, it's more work to control during an extended shooting session, and given the form factor, there is only so much phone makers can do to address that. Of course, my phone is always in my pocket, so I find myself using it more and more as it improves each year. And for people whose first camera was a phone, it will be more intuitive than learning the controls on a traditional camera. So, yes, we'll always have "real" cameras, the same way we still have medium format and film cameras, but they will be increasingly few and far between.

Now read: Light.co Aims To Put a DSLR In Your Pocket, Mobile Photography Workflow, and Best Camera Apps of 2018.
