There are many things Google and Apple do well, but one of the most under-appreciated is how they blend assistant features into their mobile operating systems. Google's approach in particular makes it easy to deal with your data on the go, and that makes it far easier for people to actually use the Google Assistant. The Assistant built into Android is almost mandatory on my phone — it's the fastest way to field the latest message from my mom. Even my dad, who lives in a tiny town in upstate New York on a satellite connection, reaches for the Google Assistant before he reaches for a video call.
Despite the Assistant's humble beginnings, the results are fantastic. With a simple tap, I can do everything from sending a text to finding a place to grab food in Manhattan, all in under a minute. It's arguably the coolest feature of Android — and Google says more than 500 million people use it on the phones they already own.
And because the Assistant is a built-in helper rather than a separate app, it feels personal. When your phone is in your pocket, you don't want to dig it out just to open something else. Google smartly leans into that: say your phone's wake word and the Assistant responds using the accounts and preferences you've already set up. It's a great example of a natural user interface for Android.
Because the Google Assistant is with you the whole time you're using your phone, it can surface the things you'd otherwise have to stop and dig through your phone to find.
One thing I love about using the Assistant is the fluidity with which I can do things.
I'm away from a desk for probably half the time I'm connected to the internet — in the bathroom, on the train, at my parents' house or wherever else I'm supposed to be. That's when I can say things like "Hey Google, pull up what I found on Amazon" or "Hey Google, show me what I was writing." It's natural, it's conversational and it's genuinely fun.
And because it's always within reach, the Assistant works especially well when you're in transit: it can feed you your flight times or your schedule for the day or the week. It can also pull up information such as restaurant reservations, so your plans are never more than a question away. (I'm a die-hard control freak, so I keep a very detailed list of my favorite restaurants on my phone.) You don't have to stop what you're doing to look any of it up.
On an iPhone, where the Google Assistant is just another app rather than part of the system, you can't get at most of that information nearly as easily — a side effect of Apple keeping that level of integration to itself. Siri can find you some of what you need when you walk out of a store or restaurant, but the Assistant's reach across Google's services is much harder to match. The same is true when you're out at a concert, trying to get a restaurant reservation and figuring out where to go next. (It's also part of why I don't use FaceTime — I want to see how far this technology can really go, for god's sake!)
One example I mentioned in my initial review of Google's Pixel phone, which carries the Pixel Visual Core chip, is video: the phone can handle 4K footage at 60 frames per second. At that frame rate, motion looks smooth enough that your eyes stop noticing the individual frames, and you just end up watching the whole thing. The Pixel Visual Core also promises to deliver that same video quality at very low latency.