Wednesday, October 18, 2017

Google's new Pixel 2 feature that identifies things you photograph isn't ready for primetime just yet

So far, Lens doesn't work quite as well as I'd hoped.

Google isn't billing Lens as 100% ready for primetime, so I have to cut it some slack for now. But my expectations were so high because it's Google. Lens has the power of Search behind it, along with Google's excellent smart assistant and an overall knack for having products that seamlessly work together.

Plus, since Lens lives inside Google Photos rather than the camera app, it's frustrating that Google Photos users on other devices don't have access to it.

But there's good news for Lens: As people buy the Pixel 2 and start using the feature, it's only going to get better from here.

It can identify brands — sometimes.

Selfishly, I was hoping Lens might work for one of my more common issues: Seeing something I like out in the world but not knowing who makes it or where I can get it.

Here, Lens worked ... fine. It couldn't identify my Vans sneakers, despite the fact that the Vans logo is one of the more easily recognizable ones. But it worked great with my Daniel Wellington watch, since it's so good at reading printed words. I didn't find the results Lens served me particularly helpful, but at least they were accurate.

It's not so great at reading handwriting.

When it comes to handwritten numbers, Lens falls flat. The feature didn't work at all, despite the fact that I think I wrote the numbers pretty neatly. I had hoped it would recognize the phone number and offer to dial it without me having to type it in manually, but no such luck.

Lens is great at identifying addresses.

I quickly realized that Lens is exceptionally good at reading typed letters and numbers. Here, Lens read the address (as well as the serial number below it) in a matter of seconds and even recognized some additional information about it. Lens could tell this was the address of a corporation rather than a private home, and recognized it was in Canada, not the US.

The coolest part — and I think the best use case of Lens — is that it offered to pull up Google Maps and direct me to the address.

But Lens can accurately read labels, at least.

All Lens did here was identify that this was a jar of Jif peanut butter and pull up the company's Wikipedia page. Sure, it's accurate, but is this helpful? Not really. Lens accomplished nothing here that my actual eyes can't already do.

All Lens could glean from this photo was that it's a still life.

Now, Lens isn't wrong here. This is a still life photo, and Lens showed me images that were visually similar. But this wasn't exactly what I was expecting to see when I used the feature. I thought it would recognize that this was a seashell, and maybe even tell me what type of seashell I was looking at.


Source: Business Insider India