Today we no longer have just a single computer in the living room. We instead have several devices, all connected to the Internet, and we take them everywhere with us. However, sometimes these devices act just like that single computer: siloed and without any awareness of where we are or what we're doing.
If developers want to create smarter apps, ones that can predict what the user wants to do, they have to do everything themselves. This often takes the form of an always-running background service, which can bog down your phone and drain the battery, especially when several apps are doing it at once.
Google announced the Awareness API for Android at Google I/O, and now it's rolling out in the latest version of Google Play Services. There are two types of requests: snapshots, which let an app ask for the user's current context on demand, and fences, which let the app be called back in the background when certain conditions are met.
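To make the snapshot side concrete, here is a rough sketch of what requesting the user's current activity looked like when the API launched, using the GoogleApiClient style of Play Services at the time. The context variable, permissions setup, and logging tag are assumed; treat this as an illustration, not a complete app.

```java
// Sketch: a Snapshot request for the user's current activity.
// Assumes a Context, the Awareness.API dependency, and that any
// required permissions have already been granted.
GoogleApiClient client = new GoogleApiClient.Builder(context)
        .addApi(Awareness.API)
        .build();
client.connect();

Awareness.SnapshotApi.getDetectedActivity(client)
        .setResultCallback(new ResultCallback<DetectedActivityResult>() {
            @Override
            public void onResult(@NonNull DetectedActivityResult result) {
                if (!result.getStatus().isSuccess()) {
                    return; // snapshot unavailable right now
                }
                ActivityRecognitionResult ar =
                        result.getActivityRecognitionResult();
                // e.g. DetectedActivity.RUNNING, IN_VEHICLE, STILL, ...
                DetectedActivity probable = ar.getMostProbableActivity();
                Log.d("Snapshot", "Most probable activity: " + probable);
            }
        });
```

The key point is that the heavy lifting (sensor fusion, batching, battery management) happens inside Play Services, not in a service your app has to keep alive.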
One example of this API in action is SuperPlayer, a music app that suggests music based on what kind of activity you're doing: driving, running, or working out at the gym.
This is a powerful set of APIs with real implications for TV. What if an app detected when a user returned home and automatically cast something to the TV, or simply turned it on? Maybe it loads a fireplace app to give your home a classy feel. Google Home is part of this ecosystem, with the goal of letting all your devices talk to each other and work together instead of in silos.
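That "returned home" scenario maps naturally onto the fence side of the API. Below is a hedged sketch of registering a fence that wakes the app when the user arrives home in the evening; the coordinates, fence key, intent action, and the idea of chaining it to a cast session are all illustrative assumptions, not part of the announcement.

```java
// Sketch: a fence that fires when the user enters a "home" radius
// during evening hours, with no background service required.
// HOME_LAT / HOME_LNG and the intent action are placeholders.
AwarenessFence arrivedHome = AwarenessFence.and(
        LocationFence.entering(HOME_LAT, HOME_LNG, 50 /* meters */),
        TimeFence.inDailyInterval(TimeZone.getDefault(),
                17 * 60 * 60 * 1000L,   // 5:00 PM
                23 * 60 * 60 * 1000L)); // 11:00 PM

PendingIntent pi = PendingIntent.getBroadcast(context, 0,
        new Intent("com.example.FENCE_ARRIVED_HOME"), 0);

Awareness.FenceApi.updateFences(client,
        new FenceUpdateRequest.Builder()
                .addFence("arrivedHome", arrivedHome, pi)
                .build());
// A BroadcastReceiver registered for that intent could then kick off
// a cast session or launch the fireplace app.
```

Because Play Services evaluates the fence, the app itself stays asleep until the condition is actually true.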
TVs may be able to react based on the weather and suggest content fit for the current day. Raining? Here are some good movies to watch. Cold? Watch a Christmas movie! Tornado? Here is a video demonstration of how to stay safe. (I don’t think this last one is available.)
As the IoT progresses and our number of devices increases dramatically, Google is continuing to think about how to make all these devices work together for a good user experience.