If you’re like most people, you probably use your phone’s camera to snap vacation photos, brag about your latest kitchen creations, and obsess over your pets and kids. How cute. As far as Google is concerned, though, that’s like using a $3,000 commercial gas range to boil water for mac and cheese.

At Google I/O 2021, the company’s annual developer conference, the software behemoth showed off a range of new apps and tools that treat your smartphone camera less as a humble memory maker, and more as a supercharged tool for interacting with the world around you.


Take three quick shots of those weird red bumps that just appeared on your arm, for instance, and a new Google tool will search for matching dermatological conditions. It uses machine learning and a few cues from you (your age, how long you’ve had it) to identify the likely culprit in seconds.

Google Lens’ steadily improving object recognition has let Google identify what’s in a photo for years, but advances in machine learning now let you do much more than simply ask, “What is this?” Taking a photo of your hiking boots and asking Google, “Can I hike Mt. Fuji in these?” is now a quick way to answer a search query that doesn’t exactly lend itself to text. And naturally, Google would be happy to help you buy just about anything you can point a camera at.


Augmented reality is also moving beyond its cutesy phase of adding hats to video chats and transforming into a must-have tool. Hold your phone up when you’re on the hunt for a restaurant, for instance, and Google Maps Live View will superimpose directions on top of your camera’s viewfinder. Google introduced this feature to Maps in 2019, but expanded it to more than 100 countries this year, turning it from a proof of concept into a legitimate navigational aid. And now it will work indoors, for when you need to find your airport gate or an urgently needed bubble tea at the mall.


In Google’s camera-centric new world, you can even use a photo to transform the entire look of your phone’s operating system. A new feature in Android 12 will analyze the colors in your lock-screen image to generate a custom color palette for everything from backgrounds to buttons in your apps.
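For the curious, the open-source AndroidX Palette library gives a rough sense of how this kind of wallpaper-driven theming can work. The sketch below is purely illustrative and leans on assumptions: the function name, the color roles, and the fallback color are made up for the example, and this is not Google’s actual Material You engine, just the same extract-colors-and-assign-roles idea in miniature.

```kotlin
import android.graphics.Bitmap
import android.graphics.Color
import androidx.palette.graphics.Palette

// Illustrative sketch only: pull a few prominent colors out of a wallpaper
// bitmap with AndroidX Palette and map them to made-up theme roles.
// This is NOT Google's Material You implementation.
fun themeColorsFrom(wallpaper: Bitmap): Map<String, Int> {
    val palette = Palette.from(wallpaper).generate()  // synchronous color extraction
    val fallback = Color.GRAY                         // used when a swatch isn't found
    return mapOf(
        "accent" to palette.getVibrantColor(fallback),         // e.g. buttons, toggles
        "background" to palette.getLightMutedColor(fallback),  // e.g. app backgrounds
        "surface" to palette.getDominantColor(fallback)        // e.g. cards, widgets
    )
}
```

Android 12’s actual system is far more sophisticated, but the underlying flow of analyzing an image and assigning its colors to interface roles is the same basic idea.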

Not all of these features are new, but the widening scope of what is possible shows Google’s clear trajectory: A world where your camera is every bit as much an input device as the dusty mouse and keyboard next to your desktop. Just as voice assistants have made it second nature to ask Alexa aloud for calculations, cooking tips, and alarms, Google wants you to reach for your phone camera not just when you want to capture a memory, but when you have a question.

Privacy advocates may wince at giving Google yet another pipeline of precious data to poke and prod, slice and dice. And Google even admitted that some of these shopping technologies aim to “turn the world into your personal showroom.” But the next time your flight leaves in seven minutes and you can’t find your gate, you might not care as you whip open Google Maps to get there.
