Lens, one of three visual discovery features Pinterest introduced last month, has rolled out to all of the social network's U.S. iPhone and Android users with some new tweaks, although the feature remains in beta.
Steven Ramkumar, engineering manager for visual search and discovery engineering, said in a blog post that Lens now allows users to tap the Lens icon and swipe up to access new idea Lenses (pictured above) to try out.
Pinterest users can also now tag objects that they take photos of, which will enhance the data being used to populate Lens.
Users should update their applications to version 6.20 for iPhone or version 6.10 for Android in order to access Lens.
Just point Lens at a pair of shoes, then tap to see related styles or even ideas for what else to wear them with. Or try it on a table to find similar designs, and even other furniture from the same era. You can also use Lens with food. Just point it at broccoli or a pomegranate to see what recipes come up. Patterns and colors can also lead you in fun, interesting or even just plain weird new directions.
For now, Lens works best for finding home-décor ideas, things to wear and food to eat. As more and more people give it a try and we continue making improvements to our technology, results will get even better, and the range of objects Lens recognizes will get increasingly wider.
Eric Sung, product manager for search, said in a blog post Friday announcing the U.S. rollout:
To get to Lens, update your app, tap the search bar and tap the red camera icon. Then just point Lens at an object—shoes, recipe ingredients, art—to see what ideas it turns up.
You can also use photos on your camera roll to do a search. We’ve even added some new things to Lens that you may not have thought of yet. Just swipe up to find new Lenses to try, from turntables to travel ideas.
Lens is still in beta, which means it isn’t perfect just yet. It is pretty good with recipe ingredients and outfit ideas. And if you Lens a throw pillow, chair or piece of art over at your friend’s house, you should turn up some great ideas.
But Lens is still learning, and doesn’t always recognize exactly what you’re looking for.
Lens will stay in beta as it gets even better at recognizing all the things. And that’s where you come in.
If you get results that feel a little meh, tap the new + button to add feedback and help Lens get better at finding ideas inspired by whatever you just Lensed. As more and more people help teach Lens about more and more objects, soon it will earn its way out of the beta zone.
And Ramkumar added in his blog post:
As part of today’s update, we’re rolling out a new visual model that is better optimized for user-generated camera images. Ultimately, Lens is constantly improving as more and more people use it.
The beta launch of Lens is really just the start. We’re continuing to improve our visual technologies to better understand images and objects, as we face challenges where the image is the only available signal and we have to understand a pinner’s intent. This is especially difficult in the case of real-world camera images as people take photos in a variety of lighting conditions with inconsistent image quality and various orientations.
We’re excited by the possibilities that objects and visual search together can bring and are continuing to explore new ways of using our massive scale of objects and images to build new discovery products for Pinners around the world.