Rather than relying on the biggest sensor or multiple lenses, Google turned to software to boost the Pixel 4’s photographic capabilities. This allows it to excel at night-time photography, and now Google has given us an insight into exactly how it works.
Google’s latest flagship smartphone, the Pixel 4, has been out for just over a month now. It’s proven a capable competitor to the iPhone and Galaxy, especially in the camera department.
When it comes to photography, the Google Pixel 4 offers something very different to Apple and Samsung’s smartphones.
On Tuesday, Google published a blog post which explains in more detail how the Pixel 4 achieves such great night-time photos. We’ll summarise it below.
Why don’t normal smartphone cameras take good low light photos?
Nowadays, smartphones are much better at taking good photos in all sorts of lighting conditions. Generally, though, low light remains one of the areas they struggle with.
One of the most apparent shortcomings is the grainy appearance of photos taken in the dark, especially the night sky.
To capture accurate photos in low-light conditions you need to increase the exposure time or widen the aperture. Because smartphone cameras offer very little control over aperture, a longer exposure is the go-to method for low-light photography.
Unfortunately, the longer the exposure, the blurrier a photo will become if there is even a slight bit of movement from the subject or camera itself.
Additionally, individual pixels can give false readings in low-light conditions due to electrical interference within the sensor itself. These are known as ‘warm pixels’, because they register a signal as if they had been exposed to light.
This gives dark areas of photos the typical ‘grainy’ appearance which can ruin low-light photos.
How does the Pixel 4 get around these issues?
According to Google’s blog post, the Pixel 4 has a number of ways of addressing the issues which typically affect low-light photography.
Rather than relying on a single long exposure like most smartphone cameras, the latest version of Night Sight takes multiple consecutive short-exposure shots to reduce motion blur. On their own these photos would appear very dark and noisy but, when aligned, averaged and processed by a software algorithm, they produce a much clearer low-light shot.
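The averaging step can be sketched in a few lines. This is an illustrative simplification, not Google's pipeline: Night Sight also aligns the frames to compensate for hand and subject motion before merging, which is skipped here. The key idea is that random sensor noise differs from frame to frame, so averaging N frames shrinks it by roughly the square root of N while the real scene, identical in every frame, is preserved.

```python
import numpy as np

def average_frames(frames):
    """Average a burst of already-aligned short-exposure frames.

    Random noise varies between frames and largely cancels out,
    while genuine scene content is the same in every frame and
    therefore survives the averaging.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Demo: a constant "scene" with random noise added to each frame.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 100.0)                      # true brightness
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(16)]

merged = average_frames(frames)
# The merged frame sits much closer to the true scene than any
# single noisy frame does.
```

In the demo, each individual frame deviates from the true scene by around 10 brightness units per pixel, while the 16-frame average deviates by only a few, which is why the merged shot looks so much cleaner.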
To get around the issue of warm pixels, the Google Pixel 4 has another clever software trick up its sleeve.
While warm pixels are an unavoidable consequence of the structure of the silicon substrate used to make camera sensors, they can be addressed.
Firstly, warm pixels can be identified by comparing each pixel with those around it. Once an outlier has been identified, the values of its neighbours are averaged and then used to replace the previous inaccurate value.
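The compare-and-replace step described above can be sketched as follows. This is a naive, illustrative implementation: the 3x3 neighbourhood and the outlier threshold are assumptions of mine, and Google's actual pipeline operates on raw sensor data with its own statistical outlier detection.

```python
import numpy as np

def fix_warm_pixels(image, threshold=50.0):
    """Replace outlier pixels with the average of their neighbours.

    A pixel whose value differs from the mean of its 8 surrounding
    pixels by more than `threshold` is treated as a warm pixel, and
    its value is replaced by that neighbourhood mean. Threshold and
    neighbourhood size here are illustrative choices.
    """
    img = image.astype(np.float64)
    fixed = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = img[y - 1:y + 2, x - 1:x + 2].copy()
            neighbours[1, 1] = np.nan            # exclude the pixel itself
            mean = np.nanmean(neighbours)
            if abs(img[y, x] - mean) > threshold:
                fixed[y, x] = mean               # outlier: use neighbour average
    return fixed

# Demo: a flat dark frame with a single warm pixel firing in the dark.
frame = np.full((5, 5), 10.0)
frame[2, 2] = 200.0                              # the warm pixel
clean = fix_warm_pixels(frame)
# clean[2, 2] is replaced by the average of its neighbours (10.0),
# while every well-behaved pixel is left untouched.
```

Because a warm pixel stands out sharply against a dark background, even this crude neighbour comparison catches it, while ordinary pixels, whose values sit close to their neighbours', pass through unchanged.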
Using this method, the Pixel 4’s camera software can accurately reconstruct a photo as it would look in an ideal world without interference.
Another innovative feature is the ‘post-shutter viewfinder’ which, instead of displaying a live feed from the camera, displays each long-exposure frame as it is captured. This gives users a much more accurate preview of what the finished photo will look like while it is being taken.
Although the term ‘software-oriented’ may turn some potential buyers away from the Google Pixel 4, the photos speak for themselves.