Google Maps is testing an AR camera feature for easier navigation | Rickey J. White, Jr. | RJW™

Google Maps is testing an AR camera feature for easier navigation


Sometimes it’s hard to associate locations on a 2D Google Map on your phone with what you see around you in the real world. That’s why Google’s Maps team is testing a new mode that uses augmented reality to add real-world directions and place labels to your view of the world through your phone’s camera.

I got a chance to use an early version of the new Maps feature—which Google teased at its I/O conference last year—in San Francisco on Monday morning. When you’re getting directions in Google Maps, you can hold up your phone and see a series of arrows positioned over the route you’re supposed to take, along with labels on the streets you’re approaching. Then you can glance at the 2D map at the bottom of the screen and easily associate the real-world street with the line on the map. And you can see where the little blue dot (that’s you) is moving in relation to your surroundings.

If you want to go back to the regular full-screen 2D map, you just lower the phone. In fact, for safety reasons, Google wants you to hold the phone in front of your face for only a few seconds at a time: if you look through the screen too long, you’ll see a message telling you to lower the phone. The challenge, a spokesperson explained to me, is putting only the most immediately helpful information in the AR view, just enough to orient you and keep you moving along the route.

For the AR labels and directions to be helpful, they have to be positioned correctly over the real-world places within the camera view. GPS alone isn’t always able to place the phone precisely within an environment: the signal requires a line-of-sight view of the satellites above, which can be blocked by trees or tall buildings (in downtown San Francisco, for example). But Google has precise latitude and longitude data associated with the millions of images it’s captured for its Street View feature. It uses machine learning in the cloud to match those Street View images with the real-world places seen through the phone’s camera. The latitude and longitude data for those places are then combined with GPS and cell-tower data to pinpoint the user’s location. With that done, the AR images can be placed correctly.
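The idea of combining a coarse GPS fix with a more precise visual match can be sketched as a simple inverse-variance weighting of the two position estimates, where the less uncertain source dominates. This is a hypothetical illustration of the general principle, not Google’s actual implementation; all numbers and names below are made up.

```python
# Illustrative sketch (hypothetical, not Google's code): fusing a noisy GPS
# position estimate with a more precise visual-localization estimate along
# one axis, weighting each by the inverse of its variance.

def fuse_estimates(gps_pos, gps_var, visual_pos, visual_var):
    """Combine two noisy position estimates by inverse-variance weighting.

    The source with the smaller variance (less uncertainty) gets the larger
    weight, so the fused estimate leans toward it. Returns the fused position
    and its variance, which is always smaller than either input variance.
    """
    w_gps = 1.0 / gps_var
    w_vis = 1.0 / visual_var
    fused = (gps_pos * w_gps + visual_pos * w_vis) / (w_gps + w_vis)
    fused_var = 1.0 / (w_gps + w_vis)
    return fused, fused_var

# Example: downtown, GPS is off by tens of meters (high variance), while a
# Street View image match is good to a couple of meters (low variance).
pos, var = fuse_estimates(gps_pos=50.0, gps_var=400.0,    # ~20 m std dev
                          visual_pos=12.0, visual_var=4.0)  # ~2 m std dev
print(round(pos, 2), round(var, 2))
```

With those hypothetical numbers, the fused position lands close to the visual estimate, which is the behavior the paragraph above describes: when GPS is unreliable, the Street View match effectively takes over.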

Google is now testing the AR feature with a select group of Google Maps enthusiasts, who will give feedback on its usefulness and suggest ideas for improvements. If the feature makes it through testing, it could become generally available within Google Maps. Based on my experience with it in San Francisco, I suspect it will be.


Source: Fast Company
