Google unveiled its hotly anticipated Pixel 4 yesterday, and it’s an amateur photographer’s dream. The only problem is it doesn’t have an ultra-wide-angle lens. Does that really matter?
The fourth iteration of the Google Pixel was revealed at a special hardware release event in New York.
As expected, Google’s new flagship will come in two sizes. The standard Google Pixel 4 comes with a 5.7-inch display, while the XL gets a larger 6.3-inch display.
The Google Pixel 4 line comes with a number of intriguing new features, including radar-based gesture control and a significantly upgraded Google Assistant.
Unfortunately, it is this radar gesture control which means the Pixel 4 will not be available in India.
But the Pixel 4’s main selling point comes in the form of some distinctive photographic capabilities.
Google has taken an interesting approach to the Pixel 4’s camera. As news.com.au reported today, some people feel Google has missed a trick with the Pixel 4 as a result.
Why are some people disappointed with the Pixel 4’s camera offering?
If there’s one philosophy which sums up the current era of smartphones, it’s “the more cameras, the better”.
In 2018 Samsung released the first smartphone with four rear lenses – the Samsung Galaxy A9 (2018).
Thankfully, the smartphone camera arms race seems to have slowed down a bit since then. Apple’s iPhone 11 Pro and Samsung’s Galaxy S10 both come with three rear camera lenses.
Meanwhile, the iPhone 11 and Google’s new Pixel 4 both come with just two. For many people, two rear camera lenses is more than enough.
But Google’s decision not to include an ultra-wide-angle lens on the Pixel 4 has drawn a lot of criticism from tech commentators and potential customers alike.
For a phone that sells itself on its photographic capabilities, the omission of an ultra-wide-angle lens strikes many as a real drawback.
But Google has included features which should more than make up for the lack of an ultra-wide-angle lens.
Software over hardware
Google has taken a different approach to the Pixel’s photographic capabilities. Instead of adding more lenses, Google has prioritized software in its attempt to create a smartphone which really excels at photography.
Speaking at the New York hardware release event yesterday, Marc Levoy, head of camera technology research at Google Research, said Google has opted for a “software-defined camera”.
While this may sound like marketing jargon, Google really has focused on the Pixel 4’s software to improve its photographic offering.
For example, the Pixel 4 will feature all-new machine-learning-based white balancing for better photos.
It also comes with Live HDR+, which shows the user how HDR processing will be applied before taking a photo.
Another feature which has caused considerable excitement is the improved image processing included in Night Sight mode on the Pixel 4.
Specifically, there will be a new astrophotography mode, which promises stunning photos of the night sky.
Instead of choosing an ultra-wide-angle second lens like Apple has with the iPhone 11, Google has opted for a 2x telephoto lens to go alongside the Pixel 4’s standard lens.
Unfortunately, however, Google appears to have done away with unlimited, full-resolution photo backups with the Pixel 4.
All things considered, the Google Pixel 4 seems to more than make up for the lack of an ultra-wide-angle lens with a host of other software-based features.