Hey everyone! This week, I’m going to cover how Google Now works and some of the incredible features that it has, as well as how it can help you.
From a privacy standpoint, this is also a feature you can opt out of as much as you want, right up to turning it off altogether. For a week or so, a coworker of mine let Google take complete hold of his life. He handed his calendar and email over to Gmail, along with his location and daily routine.
After about a week, Google learned his habits, such as when he goes to work, when he picks his kids up from school, and where he usually eats dinner. After that one week of learning, he started getting traffic updates telling him to leave earlier or later depending on conditions. Google even told him when to take a different route to work based on construction in the area.
This is all a part of Google’s initiative to give you useful information before you ask for it. It can pull your appointments from Google Calendar and give you directions, contact information, and anything else you might need.
Overall, it looks like an amazing overhaul of the Google Now functionality that most Android phones already have. Even if you have another phone, such as a Windows Phone or an iPhone, there is always the Google app, which has many of the same features as the full-fledged Now on Tap.
Google Now will also have context about where you are and what you’re looking at if you point your phone in a general direction. One famous example appears in a popular Android commercial, where the user points her camera at the Eiffel Tower and asks, “How tall is it?”
To do this, Google takes your location and the direction you’re pointing your phone, and then determines that you are probably talking about the Eiffel Tower.
Contrary to popular belief, this feature doesn’t directly understand that it’s looking at the Eiffel Tower. That’s something Google is working on, but it’s much, much harder to teach a computer to understand images and their context. Nonetheless, it’s still an awesome feature that I’m sure will have many practical applications in the future.
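To give a rough sense of how that location-plus-direction trick might work, here’s a minimal Python sketch. The landmark list, coordinates, and matching logic are all my own illustrative assumptions, not Google’s actual implementation: it just computes the compass bearing from the user to each known landmark and picks the one closest to where the phone is pointing.

```python
import math

# Hypothetical landmark database: name -> (latitude, longitude)
LANDMARKS = {
    "Eiffel Tower": (48.8584, 2.2945),
    "Arc de Triomphe": (48.8738, 2.2950),
    "Notre-Dame": (48.8530, 2.3499),
}

def bearing(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def guess_landmark(user_lat, user_lon, compass_heading):
    """Return the landmark whose bearing best matches the phone's heading."""
    def angle_diff(name):
        lat, lon = LANDMARKS[name]
        diff = abs(bearing(user_lat, user_lon, lat, lon) - compass_heading)
        return min(diff, 360 - diff)  # handle wrap-around at 0/360 degrees
    return min(LANDMARKS, key=angle_diff)

# Standing on the Champ de Mars, phone pointed roughly northwest:
print(guess_landmark(48.8556, 2.2986, 310))  # → Eiffel Tower
```

The real system surely weighs in far more signals (distance, search history, what’s actually visible), but the core idea of fusing GPS with the compass sensor is the same.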
For example, say you’re in an airport. You could simply ask Google many of the questions you would normally ask an attendant, such as “Where can I pick up my bags?” “What gate do I need to go to?” and “Where can I get a cup of coffee?” With the ability to determine where you are and what you’re looking for, it can pull from maps, reviews, and any number of other online sources to get you the information you need.
Developers will also have the option of integrating information in their pages so that Google can parse a page or an app more accurately, and there may also be an option for Google to “always ignore” information that might be sensitive, such as medical information or banking information.
Come this fall, when Google releases Android M, I’m sure there will be some new features in the operating system, as well as other aspects of Now on Tap that have not yet been announced. Be sure to check back for any updates!
Well, that’s it for today. Be sure to check out all my earlier episodes at techtalker.quickanddirtytips.com. And if you have further questions about this podcast or want to make a suggestion for a future episode, post them on Facebook.com/QDTtechtalker.
Until next time, I’m the Tech Talker, keeping technology simple!