How AI Powers A Lot Of Features On The Pixel 4

Aadhya Khatri - Oct 23, 2019


Google’s flagship smartphones have long been powered by AI, and the new Pixel 4 and 4 XL are no exception.

Google’s flagship smartphones have long been powered by AI, and the new Pixel 4 and 4 XL are no exception. The Pixel 4 comes equipped with the Neural Core, a TPU-based chip, along with features we have already seen on the Pixel 3. This year, however, brings significant changes to Google Assistant, together with new speech recognition, facial recognition, and camera functions.

Camera

Cameras are now the biggest selling point, which is why Google put heavy emphasis on the Pixel 4’s cameras at last week’s event. AI powers Night Sight for low-light photography and depth prediction for images taken in Portrait Mode.

Pixel 4’s Portrait Mode is sharp:

Picture taken by the Pixel 4's Portrait Mode

Depth estimation in this mode is stronger than on any previous Pixel phone. Night Sight, which merges a burst of frames to brighten low-light shots, also improves: put the phone on a tripod and it can use longer exposures for an even better result.

Night Sight on the Pixel 4 can deliver incredible astrophotography shots
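
Google has not published the exact pipeline, but the core idea behind burst-based low-light modes like Night Sight is that random sensor noise averages out across frames while real scene detail stays. A toy Kotlin sketch of that idea, using plain grayscale float arrays and skipping the frame alignment and tone mapping a real pipeline needs, could look like this:

// Toy illustration of multi-frame merging: random sensor noise averages out
// across frames while real scene detail stays, which is the basic idea behind
// burst-based low-light modes. A real pipeline also aligns frames and applies
// tone mapping; this sketch does neither.
fun mergeFrames(frames: List<FloatArray>): FloatArray {
    require(frames.isNotEmpty()) { "need at least one frame" }
    val merged = FloatArray(frames[0].size)
    for (frame in frames) {
        for (i in merged.indices) {
            merged[i] += frame[i] / frames.size   // running per-pixel average
        }
    }
    return merged
}

Averaging N frames cuts random noise by roughly the square root of N, which is why a tripod, allowing more and longer exposures, pushes Night Sight into astrophotography territory.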

Another addition is the ability to record 4K video. On previous Pixel phones, holding the shutter button captured a burst of pictures; now, doing so records a video. If you want more adjustments, swipe down and the phone will reveal extended controls.

Machine-learning-based white balance was first introduced on the Pixel 3, and it returns on the Pixel 4 to produce photos with more accurate color temperature.

Another highlight of the phone’s cameras is Super Res Zoom, which offers up to 8x zoom in combination with the telephoto lens.

A picture taken by the phone's Super Res Zoom

Facial Recognition

Backed by the Soli radar chip, Google claims its Face Unlock beats Apple’s Face ID. With the new feature, Pixel 4 users can verify a transaction with a simple scan of their face. However, as with any pioneering feature, some serious flaws in the system have been pointed out.
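
For developers, transaction verification like this goes through Android’s standard BiometricPrompt API rather than anything Pixel-specific. A minimal Kotlin sketch of a payment confirmation flow, assuming the androidx.biometric library, with the activity and payment helper names invented for illustration:

import androidx.appcompat.app.AppCompatActivity
import androidx.biometric.BiometricPrompt
import androidx.core.content.ContextCompat

class CheckoutActivity : AppCompatActivity() {

    // Ask the user to confirm a purchase with biometrics; on the Pixel 4 this is a face scan.
    fun confirmPayment() {
        val executor = ContextCompat.getMainExecutor(this)

        val callback = object : BiometricPrompt.AuthenticationCallback() {
            override fun onAuthenticationSucceeded(result: BiometricPrompt.AuthenticationResult) {
                sendPaymentToServer()   // hypothetical helper, not a real API
            }

            override fun onAuthenticationError(errorCode: Int, errString: CharSequence) {
                // The user cancelled or the scan failed too many times; abort the purchase.
            }
        }

        val promptInfo = BiometricPrompt.PromptInfo.Builder()
            .setTitle("Confirm payment")
            .setSubtitle("Scan your face to approve this transaction")
            .setConfirmationRequired(true)   // require an explicit tap after the passive face scan
            .setNegativeButtonText("Cancel")
            .build()

        BiometricPrompt(this, executor, callback).authenticate(promptInfo)
    }

    private fun sendPaymentToServer() { /* placeholder for the app's own logic */ }
}

Setting setConfirmationRequired(true) matters here because face unlock is passive: without it, a sensitive action could complete the moment the phone sees your face.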

Last week, the BBC found that the Pixel 4’s facial recognition also works when the user’s eyes are closed. The risk becomes clearer when you consider that the phone will unlock whenever it detects its owner’s face, even if the owner is asleep, unconscious, or dead.

On Sunday, in response to the concerns over this flaw, Google said it would address the issue with a software update. Another, more minor, flaw is that Google’s face detection does not work as well for people with darker skin. Interested buyers can expect the system’s performance to improve through updates.

Next-Gen Google Assistant

Google Assistant can hold a multi-turn dialogue thanks to Continued Conversation. After you say a wake phrase like “Okay, Google,” the assistant carries out what you ask and then keeps listening for follow-ups such as “thank you” or “stop.”

You can ask Google something and then keep diving into that particular subject, interacting with a website or an app while Google Assistant still runs in the background.

There are disadvantages, though. If you ask the assistant to share something with a friend, it simply takes a screenshot and sends the picture over. That works for something like a weather forecast, but not for content such as an email or a website, where sharing a link would be more natural and handy.

Google Assistant uses the Neural Core and an on-device language model to speed up responses, but that does not mean you will never see lag: requests that still need the cloud will stall if your data or Wi-Fi connection is unstable.

There is one odd limitation, though: the next-gen Google Assistant, a tool meant to boost efficiency, does not work with G Suite accounts.

The virtual assistant does a better job with Google Photos and with controlling apps, but you will still notice the lack of contextual awareness. For example, if you say “find the nearest drugstore” while in Maps, the assistant exits Maps and turns to Google Search for the result.

The assistant also looks different on the Pixel 4: now all you see is a glow of Google’s signature colors along the bottom of the screen.

Speech Recognition

For most people, speaking is faster than typing on a keyboard, and conversational AI turns that speech into text. Speech-to-text exists in many places, but with the Pixel 4 you also get automatic transcription of what people say in videos.

With Live Caption, you get transcriptions for audio messages, videos, and podcasts. The feature is not perfect, but it can be handy when you want to follow the content but, for some reason, cannot listen.

If you want to move the caption box, just tap and hold it; a simple double-tap gives you an expanded view.

Transcription is also built into the Recorder app, which lets you search recordings for specific words and even export transcripts.

The app sometimes makes mistakes, but it converts speech to text in real time. One of the more serious downsides is that it does not label speakers well, so words from different people sometimes blend together.
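
Google has not said how the Recorder app implements its transcription, but ordinary Android apps can ask the platform’s SpeechRecognizer to prefer on-device recognition. A rough Kotlin sketch, which only logs results and leaves most callbacks empty (the caller is assumed to create the recognizer with SpeechRecognizer.createSpeechRecognizer):

import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer
import android.util.Log

// Sketch of on-device-preferred speech-to-text; not the Recorder app's actual pipeline.
fun startTranscription(recognizer: SpeechRecognizer) {
    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onPartialResults(partialResults: Bundle?) {
            // Partial hypotheses stream in while the user is still speaking.
            val words = partialResults?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
            Log.d("Transcriber", "partial: ${words?.firstOrNull()}")
        }

        override fun onResults(results: Bundle?) {
            val text = results?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)?.firstOrNull()
            Log.d("Transcriber", "final: $text")
        }

        // The remaining callbacks are required by the interface but unused in this sketch.
        override fun onReadyForSpeech(params: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(rmsdB: Float) {}
        override fun onBufferReceived(buffer: ByteArray?) {}
        override fun onEndOfSpeech() {}
        override fun onError(error: Int) {}
        override fun onEvent(eventType: Int, params: Bundle?) {}
    })

    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
        putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
        putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)   // hint: run recognition on-device if supported
    }
    recognizer.startListening(intent)
}

EXTRA_PREFER_OFFLINE is only a hint, and whether recognition truly stays on-device depends on the speech service installed on the phone; the Pixel 4’s new on-device models are what let features like Recorder and Live Caption work without a connection at all.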

All of these functions have been available to Gboard users, for dictating messages or writing Google Docs, for a few years now.
