Google I/O 2018: Here are the new machine learning features in Android P

At its annual Google I/O event, Google showed off more of the features that'll be included in Android P. Speaking on stage, Android VP of Engineering Dave Burke said that the upcoming version is the first step towards making artificial intelligence the core of the operating system; he went on to say that the main themes for the release were intelligence, simplicity, and digital wellbeing.

The first feature mentioned that makes Android more intelligent is Adaptive Battery. The new feature uses on-device machine learning – meaning no cloud interaction is needed – to deliver a 30% reduction in CPU app wakeups, which in turn lessens the impact on battery life.
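
For developers, Adaptive Battery's decisions surface through Android P's App Standby Buckets, which govern how often an app may wake the CPU in the background. As a rough illustration – this is a minimal sketch, not code Google showed on stage – an app running on Android P can ask which bucket the system has placed it in:

```kotlin
import android.app.usage.UsageStatsManager
import android.content.Context
import android.util.Log

// Minimal sketch: query which App Standby Bucket the system has put
// this app in on Android P (API 28). Adaptive Battery's on-device
// model feeds these buckets, which limit background CPU wakeups.
fun logStandbyBucket(context: Context) {
    val usageStats = context.getSystemService(UsageStatsManager::class.java)
    val label = when (usageStats.appStandbyBucket) {
        UsageStatsManager.STANDBY_BUCKET_ACTIVE -> "active"
        UsageStatsManager.STANDBY_BUCKET_WORKING_SET -> "working set"
        UsageStatsManager.STANDBY_BUCKET_FREQUENT -> "frequent"
        UsageStatsManager.STANDBY_BUCKET_RARE -> "rare"
        else -> "unknown"
    }
    Log.d("AdaptiveBattery", "This app is in the \"$label\" bucket")
}
```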

Another new feature making its debut in Android P is Adaptive Brightness. Unlike auto brightness, Adaptive Brightness takes into account your personal preferences for brightness levels as well as your environment. The on-board machine learning observes how you like to set the brightness under your current lighting conditions and then adjusts it automatically, in a power-efficient way. Burke said that almost half of users in testing are making fewer brightness adjustments than they did on previous versions of Android as a result of Adaptive Brightness.

The third feature mentioned at the event was App Actions. In the launcher, users will be shown a top row of apps that machine learning has determined they're likely to use; under this row will be actions a user can take. In the demo, Burke showed a button for making a call to one of his contacts and another for starting his Strava workout. These options will adapt based on what you do with your device – plugging in headphones, for example, will give you the option to resume listening to an album. App Actions can be supported by all third-party apps and will surface throughout the operating system, including in text selection, search, the Play Store, and the Google Assistant.
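
On the developer side, an App Action is typically fulfilled by launching the app through a deep link, with any parameters carried in the link itself. The sketch below shows a hypothetical receiving Activity – the WorkoutActivity name, the "type" parameter, and the example URL are illustrative assumptions, not code from Google's demo:

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

// Hypothetical Activity fulfilling a "start workout" App Action.
// When the launcher or Assistant surfaces the action and the user
// taps it, Android opens the deep link the app has declared, e.g.
// https://example.com/workout?type=run
class WorkoutActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Parameters arrive as query parameters on the deep-link URI.
        val workoutType = intent?.data?.getQueryParameter("type") ?: "run"
        startWorkout(workoutType)
    }

    private fun startWorkout(type: String) {
        // App-specific logic to begin tracking the chosen workout.
    }
}
```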

The last major addition is Slices. As you can probably guess, this will allow developers to surface some functions of their applications around the operating system. Burke said Android P lays the groundwork by showing Slices primarily in search. In the demo, Burke typed “Lyft” into Google search and was shown a Slice from the Lyft app. Tapping the result gave him options to get a ride to his home or to his work; tapping one of these opens the app with the ride ready and waiting to order. In another example, he searched for Hawaii and was presented with photos from the Photos app related to Hawaii.
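
Under the hood, a Slice is served by a SliceProvider from the new androidx.slice Jetpack libraries: the app returns templated rows for a given URI, and surfaces such as search render them natively. Below is a minimal, hypothetical sketch of such a provider – the RideActivity class, the icon resource, and the strings are placeholders, not Lyft's actual implementation:

```kotlin
import android.app.PendingIntent
import android.content.Intent
import android.net.Uri
import androidx.core.graphics.drawable.IconCompat
import androidx.slice.Slice
import androidx.slice.SliceProvider
import androidx.slice.builders.ListBuilder
import androidx.slice.builders.SliceAction

// Hypothetical provider exposing a single "Ride to work" Slice.
// RideActivity, the icon resource, and all strings are placeholders.
class RideSliceProvider : SliceProvider() {

    override fun onCreateSliceProvider(): Boolean = true

    override fun onBindSlice(sliceUri: Uri): Slice? {
        if (sliceUri.path != "/ride") return null
        val ctx = context ?: return null

        // Tapping the Slice opens the app, mirroring the Lyft demo flow.
        val openApp = SliceAction.create(
            PendingIntent.getActivity(
                ctx, 0, Intent(ctx, RideActivity::class.java),
                PendingIntent.FLAG_UPDATE_CURRENT
            ),
            IconCompat.createWithResource(ctx, R.drawable.ic_car),
            ListBuilder.ICON_IMAGE,
            "Ride to work"
        )

        // Slices are built from templated rows the OS knows how to render.
        return ListBuilder(ctx, sliceUri, ListBuilder.INFINITY)
            .addRow(
                ListBuilder.RowBuilder()
                    .setTitle("Ride to work")
                    .setSubtitle("Pickup in 4 min")
                    .setPrimaryAction(openApp)
            )
            .build()
    }
}
```

The provider also has to be registered in the app's manifest so the system knows which URIs it serves.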

Developers can gain early access to App Actions and Slices beginning next month, giving them time to update their applications over the summer and early autumn before Android P finally starts making its way to devices.
