It’s such an exciting time for mobile app development. And after this week’s Google I/O 2018, we’re even more excited about what’s next for mobile apps. But when following events like I/O 2018 it can be hard to cut through all the noise. We’ve heard the hype about Android P, Artificial Intelligence (AI), Machine Learning (ML), Augmented Reality (AR) and Chrome. But what do they really mean for the future of app development?

To find out, we asked our developers what they think of the news from I/O 2018. Because what they don’t know about Android isn’t worth knowing! Read on to find out exactly what all this means for developing mobile apps. First up, let’s take a look at Android P.

App Development for Android P

Popsicle? Peppermint? Pancake? We still don’t know. And from the sounds of it, we won’t until later on this year. But we do know what to expect from the update. The developer preview of Android P has been out for a few months, and the Beta is now open to the public. You can find some of the key Android P features here. But going beyond just a list of new features, what does the update mean for app development?

Enhanced Push Notifications

We know notifications are getting much, much smarter with Android P. The newly-designed previews simplify conversations and even display images. With the update, it’s clear Google is keen to bring more context to the notification panel. Which means your users will expect to do a lot more from outside your app.

These ‘rich’ push notifications increase retention by up to 56 percent. And combining your push notifications with a more conversational in-app message centre doubles their read rate. So, with Android P, you can improve your user communications and increase engagement across the board.

But to take advantage of push notifications, you need to break up the in-app tasks your users are carrying out into quick actions, responses or confirmations. These should be easy-to-understand, straight to the point and instantly actionable. If there’s any extra context that would bring value to your users, then make full use of extra text, content or multimedia in the expanded Android notification view. Let your users do more with less!
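As a rough sketch of what an instantly actionable notification can look like, here is one way to add a quick reply action using `NotificationCompat`. The channel id, receiver class and message text here are illustrative, not from any particular app, and the notification channel and `ReplyReceiver` broadcast receiver are assumed to be registered elsewhere:

```kotlin
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import androidx.core.app.NotificationCompat
import androidx.core.app.NotificationManagerCompat
import androidx.core.app.RemoteInput

// Hypothetical receiver that handles the reply text; assumed to exist elsewhere.
class ReplyReceiver

fun showQuickReplyNotification(context: Context) {
    // Free-form text input attached directly to the notification
    val remoteInput = RemoteInput.Builder("key_text_reply")
        .setLabel("Reply")
        .build()

    val replyPendingIntent = PendingIntent.getBroadcast(
        context, 0, Intent(context, ReplyReceiver::class.java),
        PendingIntent.FLAG_UPDATE_CURRENT
    )

    // A quick action the user can complete without ever opening the app
    val replyAction = NotificationCompat.Action.Builder(
        android.R.drawable.sym_action_chat, "Reply", replyPendingIntent
    ).addRemoteInput(remoteInput).build()

    val notification = NotificationCompat.Builder(context, "messages") // channel id is illustrative
        .setSmallIcon(android.R.drawable.sym_action_chat)
        .setContentTitle("New message")
        .setContentText("Your order is ready for collection")
        .addAction(replyAction)
        .build()

    NotificationManagerCompat.from(context).notify(1, notification)
}
```

The key idea is that the whole interaction — read, respond, confirm — happens in the notification shade, which is exactly the "do more with less" behaviour Android P encourages.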

AI-Powered UX

As part of this update, Google is continuing its push to use Artificial Intelligence (AI) to power the Android User Experience (UX). Android P takes advantage of machine learning developed with DeepMind, Google’s AI research lab, to improve in-app interactions as well as streamline processes. So, for example, your phone will be able to use machine learning (ML) to inform how it manages your battery. We asked our mobile developer Athan why this matters to users:

“Google is now taking advantage of ML to make each and every Android device more intelligent. The OS decides which resources are most important to each user according to their past habits and actions. Based on this, the phone spends most of its battery on these apps instead of the other apps and services that run in the background.” – Athan (Mobile Developer at Sonin)

Similarly, at I/O 2018, Google gave us a glimpse at its new Android P features, ‘Actions’ and ‘Slices’, which offer up new ways for apps to interact with the Android OS. Actions are shortcuts found in the search bar, the Play Store and in Assistant. With them, you can quickly call for a ride home or re-order your favourite takeaway.


Slices, on the other hand, use AI and ML to improve your users’ search results. Which means that when they search for your app, they get a personalised result screen. One that actually includes quick links to specific in-app tasks.

From your users’ point of view, actions and slices make their lives easier and improve their UX. But for you, both offer up a huge opportunity! By taking advantage of these new Android P features effectively, you can encourage engagement, drive up app usage and create a loyal user base for your app.

Giving Users More Control

To add to this, Android P offers users much more control over their mobile devices, their personal data as well as their digital behaviour. With a whole host of new privacy and ‘Digital Wellbeing’ features. Making for a much more transparent operating system. At the centre of this is the all-new Android Dashboard. So, we asked our mobile developer Athan what kind of control this will give Android users:

“Google has also revealed a more sensitive side with a focus on protecting Android users from meaningless device interactions with the Dashboard. This feature gives you a detailed overview of the amount of time you’ve spent on your phone, breaking down per-app usage, the number of times you’ve unlocked your phone, the number of notifications received etc.” – Athan (Mobile Developer at Sonin)

The best apps require the fewest interactions. So, this may mean re-thinking how you measure the success of your app. In-app time, for example, matters less if your app can do more with less. But for enterprise and business apps especially, the less time your users spend in-app, the more time they have to spend on higher-value tasks. More empowered employees mean better business performance for you.

Easy AI Integration with ML Kit

All of the above is really exciting: smarter apps offering better UX, better engagement and more return on your mobile investment. But knowing where to start can be difficult. Especially since a lot of Android P features rely on artificial intelligence.

The good news is that this week, Google introduced ML Kit, a new software development kit (SDK) that includes ready-made ML models for both Android and iOS. We’re already pros at artificial intelligence for mobile apps. But this makes it much easier for us to integrate AI into your app. Which means quicker work and less development time needed.

We asked our mobile developer Martha what she thinks ML Kit means for mobile app development:

“The most important thing from I/O 18, I believe, is the new ML Kit. It’s a software development kit which allows the developers to integrate pre-built, Google-provided machine learning models into apps. Models like text recognition, face detection, barcode scanning, image labelling and landmark recognition. This will help us to make smarter apps that bring more value to our users.” – Martha (Mobile Developer at Sonin)
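To give a flavour of how little glue code those pre-built models need, here is a rough sketch of on-device text recognition using ML Kit as it shipped at I/O 2018 (the Firebase `ml-vision` library). It assumes the dependency is added to the project and a `Bitmap` is already available from the camera or gallery:

```kotlin
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

fun recognizeText(bitmap: Bitmap) {
    // Wrap the bitmap for the vision API
    val image = FirebaseVisionImage.fromBitmap(bitmap)

    // On-device model: works offline, no per-call cost
    val recognizer = FirebaseVision.getInstance().onDeviceTextRecognizer

    recognizer.processImage(image)
        .addOnSuccessListener { result ->
            // Each block is a paragraph-like region of detected text
            for (block in result.textBlocks) {
                println(block.text)
            }
        }
        .addOnFailureListener { e ->
            // Inference failed; log and fall back gracefully
            e.printStackTrace()
        }
}
```

The same pattern — build an image, pick a detector, attach success and failure listeners — applies to the face detection, barcode scanning and image labelling models Martha mentions.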

Improved User Interface & Material Design

With Android P, the User Interface (UI) has been made simpler and more approachable to new users. Say goodbye to the traditional bottom-of-the-screen buttons: home, back and multitasking. There’s now a small swipe button that opens up an all-new multitasking screen. From here, users can easily swipe between their most recent tabs. Sound familiar? We asked our designer Sarah what she thinks about the new design:

“The UI changes in Android P are bringing Android closer to iOS. For example, the all-new gesture controls are very similar to the iPhone X. This creates an easier transition between the two platforms for users. The interface on a whole is cleaner and simpler. It may throw veteran Android users at first but the more intuitive layout should make moving through your phone easier.” – Sarah (Designer at Sonin)


Gesture control is a great way of maximising screen space without giving it up to elements like buttons. And while gestures may seem like a novelty at first, the developers at Sonin who use the iPhone X say they’d be hard-pressed to go back.


Satisfaction cycles are so short these days. And users get used to new features and interactions very quickly. So, you need to think about how user expectations will change. Both for how they interact with your app as well as their mobile devices.

And finally, we saw a long list of changes for Material Design, Google’s card-based design language. To combat complaints that it isn’t flexible or diverse enough, Google released hundreds of new example layouts, while stressing that these are ‘blueprints not rules.’ It also released a bunch of new design tools and developer components, previously used internally but now available to everyone.

ML + IoT = Smarter World

There are two big trends right now in the digital space: the rapid rise of IoT and the rapid rise of AI/ML. So, it’s no surprise that Google used I/O 18 to launch Android Things 1.0, a variant of the Android OS built specifically for smart, connected devices.

This opens up a huge opportunity for the consumer, retail and industrial sectors. Particularly because one of Google’s main focuses with Android Things is being able to run it on low-powered devices. Or, as they put it: simple sensors, smart solutions.

This means that it’s now easier than ever for businesses and brands to start reaping the benefits of AI. Whether it’s retailers using the IoT to improve their customer experience or employers empowering their employees.

Augmented Reality Enhancements

And last but not least, Google also announced a major update to ARCore, a platform that’s now compatible with over 100 million devices worldwide. One of the big focuses of the new ARCore update was shared augmented reality experiences. This is made possible by a new capability called ‘Cloud Anchors’, which will let us create more collaborative experiences in AR across both Android and iOS.

Google’s ARCore now also includes ‘Vertical Plane Detection’ and ‘Augmented Images.’ And with this, users can interact with their environment more. Using the new feature, images can trigger AR content and immersive experiences. So, for example, posters could turn into interactive videos and the lids of boxes could instantly display their contents.
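To make that poster example concrete, here is a rough sketch of how an image gets registered with ARCore’s Augmented Images so it can trigger AR content when the camera spots it. It assumes an ARCore `Session` has already been created and a `Bitmap` of the poster has been loaded; the image name is illustrative:

```kotlin
import android.graphics.Bitmap
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Config
import com.google.ar.core.Session

fun enablePosterTrigger(session: Session, posterBitmap: Bitmap) {
    // Register the reference image ARCore should watch for in the camera feed
    val imageDatabase = AugmentedImageDatabase(session).apply {
        addImage("poster", posterBitmap) // name is illustrative
    }

    // Attach the database to the session configuration
    val config = Config(session).apply {
        augmentedImageDatabase = imageDatabase
    }
    session.configure(config)
}
```

Once configured, each frame’s tracked `AugmentedImage` instances report the poster’s position and size, which is where the app would anchor its video or 3D content.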


Using these new features, businesses and brands can now create even more immersive experiences with a much more social element. Making interactions more meaningful and more impactful. Whether it’s allowing employees to collaborate remotely with mobile, providing unforgettable customer experiences or empowering staff with extra insight.

What Does Google I/O 2018 Mean for App Development?

So much of what we heard at Google I/O 2018 revolved around smarter apps that can do more with less. By using artificial intelligence and machine learning, Android P and Google Assistant are streamlining the mobile user experience.

But to make the most of these new features, you need an in-depth understanding of your users and what they want to achieve. No-one opens an app for the sake of it. Your users want to complete a task or solve a specific problem. Android P and Assistant want to make that easier.

So, stop thinking about your app as a destination. Start thinking about it as a set of services to help your users achieve more. Break down your customer journey or employee workflow into specific actions.

How can you use Android P push notifications to make these actions easier? Could you use AI or Assistant to automate them completely? Because looking forward, your users will come to expect one thing: convenience. Even if that means they’re interacting with your app less.


Are you interested in developing a consumer or business app for Android or iOS? Do you want to know how you can make the most of something you’ve seen at Google I/O? Then we’d love to hear from you.

We’ve helped many of our clients to take advantage of the latest code and tech. Engaging their employees, connecting with customers and boosting business performance. Give us a call on +44 (0)1737 45 77 88 or get in contact with us to find out more.