A Brief Explanation Of How Deep Fusion Works On Photos
Aadhya Khatri - Jan 22, 2020
Deep Fusion stirred people’s curiosity even before it was released in iOS 13.2. Now that users have had some time to familiarize themselves with it, let’s look at some of the tech’s underlying features.
Deep Fusion Is Not So Complicated
Like Apple’s Smart HDR, Deep Fusion relies on scene and object recognition, and on eight pictures taken before the user captures the scene.
Now let’s talk about the eight images. Four of them are taken with a short exposure, and the others are captured with a standard exposure. When the user presses the shutter button, a ninth picture is captured with a long exposure.
Since short exposures freeze motion and preserve high-frequency detail, the sharpest of these images is picked for the next step.
Three of the pictures taken with the standard exposure usually yield the best tones, colors, and other low-frequency data. They are then fused with the long-exposure shot to make one single photo with the best features of each.
This fused image and the sharpest of the short-exposure shots are then sent to a neural network, which picks out the best pixels from each and presents one final picture to the user.
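The steps above can be sketched in code. This is a toy illustration of the described pipeline, not Apple’s implementation: the frame format (flat lists of pixel values), the sharpness metric, and the final 50/50 blend standing in for the neural network are all invented for clarity.

```python
# Toy sketch of the Deep Fusion pipeline described above.
# All metrics and weights here are hypothetical stand-ins.

def sharpness(frame):
    # Proxy metric: total difference between neighboring pixels
    # (a rough measure of high-frequency energy).
    return sum(abs(a - b) for a, b in zip(frame, frame[1:]))

def fuse(frames):
    # Average frames pixel-by-pixel (stand-in for exposure fusion).
    return [sum(px) / len(px) for px in zip(*frames)]

def deep_fusion_sketch(short_frames, standard_frames, long_frame):
    # 1. Pick the sharpest short-exposure frame (detail reference).
    sharpest_short = max(short_frames, key=sharpness)
    # 2. Fuse the standard frames with the long exposure (tone/color).
    synthetic_long = fuse(standard_frames + [long_frame])
    # 3. Stand-in for the neural network: simple 50/50 per-pixel blend.
    return [(s, l) for s, l in zip(sharpest_short, synthetic_long)] and \
           [(s + l) / 2 for s, l in zip(sharpest_short, synthetic_long)]

# Usage with tiny one-dimensional "frames" of pixel values:
shorts = [[10, 200, 15, 190], [20, 30, 25, 35],
          [5, 250, 0, 255], [40, 50, 45, 55]]
standards = [[100, 110, 120, 130], [102, 112, 122, 132],
             [98, 108, 118, 128]]
long_exp = [104, 114, 124, 134]
result = deep_fusion_sketch(shorts, standards, long_exp)
print(result[0])  # 53.0
```

In a real pipeline each "frame" would be a full-resolution image and the per-pixel choice would come from a learned model rather than a fixed blend, but the flow is the same: select the sharpest short exposure, fuse the standard and long exposures, then merge the two.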
This approach helps eliminate noise, render colors accurately, and sharpen details.
The processing is done behind the scenes, so it has minimal impact on capture time, meaning an iPhone 11-series phone with Deep Fusion can shoot as fast as any other phone. The only things you will notice are a few extra pictures in the camera roll and a photo of noticeably higher quality.
According to Apple, you may occasionally see a picture take about a second to process. Deep Fusion is not available in burst mode, and only users of the iPhone 11 and 11 Pro have access to it.
It Just Works
Users cannot turn the feature on or off; Deep Fusion kicks in automatically whenever possible.
The feature does not work at all with Apple’s ultrawide-angle lens. With the main lens, the company says the feature kicks in under indoor light, in conditions not dark enough for the phone to offer night mode.
With the telephoto lens, the feature is active when you shoot something that is neither too dark nor too bright; those extremes usually call on the main sensor rather than the telephoto camera.
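The per-lens behavior described above can be summarized as a small decision function. The light scale and the thresholds below are invented for illustration; Apple has not published the exact cutoffs.

```python
# Hedged sketch of the activation logic described above.
# Light levels and thresholds are hypothetical, not Apple's values.

def deep_fusion_active(lens, light_level):
    """Return True if Deep Fusion would plausibly engage.

    lens: "ultrawide", "wide", or "telephoto"
    light_level: 0.0 (dark) to 1.0 (bright), an invented scale
    """
    if lens == "ultrawide":
        # Deep Fusion never runs on the ultrawide lens.
        return False
    if lens == "wide":
        # Medium/indoor light: darker scenes go to Night mode,
        # brighter scenes to Smart HDR (thresholds made up).
        return 0.2 <= light_level <= 0.6
    if lens == "telephoto":
        # Active except in very bright or very dark scenes,
        # which usually fall back to the main sensor.
        return 0.1 <= light_level <= 0.9
    return False

print(deep_fusion_active("ultrawide", 0.5))  # False
print(deep_fusion_active("wide", 0.4))       # True
```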
So far, many users have reported that the effect of Deep Fusion on the iPhone 11 lineup is subtle. In a test of an iPhone 11 running iOS 13.2 against one running iOS 13.1, which lacks Deep Fusion, the difference is hard to notice.
However, when zoomed in, users may notice finer details, though this is not something that will impress you at first glance, especially when viewing the photo on the phone. What we appreciate most is that it can only improve the image, never make it worse.
While the influence of Deep Fusion is only visible if you look really hard, it is still a nice addition to have, even if it does not work magic the way night mode does.