
Will ASOS' visual search tool revolutionize the retail industry?

The visual search ecosystem is growing rapidly, with innovations from Google, Amazon, eBay, AliExpress and Wayfair. One year on from the launch of ASOS' Style Match tool, Andrew Charlton reviews how retailers can prepare for a visual-search future.

In 2017, ASOS introduced visual search on iOS as a way of making product discovery easier and more interesting for its users. A year on, the Style Match tool, which trawls an inventory of more than 85,000 products to match items to the look in a user's image, has been rolled out worldwide across both iOS and Android.

ASOS has placed a big bet on visual search, and for good reason. Pinterest has reported a 100% year-on-year increase in Pinterest Lens users, with 600 million visual searches made every month.

The visual search ecosystem is also growing rapidly, with innovations from Google, Amazon, eBay, AliExpress and Wayfair in just the last year.

Could this proliferation of visual search cause a snowball effect in the retail industry as other retailers look to jump on the trend? More interestingly for us as search engine marketers, how will Google's own visual search engine evolve to compete with retailers like ASOS, which could quickly become the default visual search engine for mobile shoppers?

A brief history of visual search

Image source: Distilled.net

To understand how Google's visual search engine could develop in the future, we need to step back and look at the progress it has made so far. In the last five years, Google has come a long way from serving images that are only loosely related to a search query. The algorithm can now understand the context behind a query and serve image results that better satisfy user intent. This has changed the retail landscape: where image search was once just a source of inspiration, Google now enables rich product results that also attract users looking to buy through images.

Using Schema markup, Google has effectively taken the product page and bundled it inside an image. Product images now feature attributes such as pricing, availability, reviews and item descriptions, as well as badges that encourage more users to click through to products from results.

Making the leap to lens

Following Pinterest’s launch of their visual discovery tool Lens, Google introduced their own ‘Lens’ equivalent at the I/O developer conference in 2017. It was later rolled out onto all Pixel phones as part of the Google Assistant, and more recently made available to all Android and iOS devices through the Google Photos app.

Right now, Google Lens isn’t geared towards retail. The main features are described as ‘saving information from business cards’, ‘recognizing landmarks’ and ‘looking up products by barcodes’. While this all sounds rather uninspiring for retailers, exciting times could soon be around the corner.

The launch of ASOS' Style Match tool could be pivotal here. ASOS is the first big UK retailer to invest significantly in this technology. As other retailers follow suit, Google will undoubtedly commercialize its own visual search offering to capitalize on a growing number of mobile users who are inspired by images.

Preparing for an image-first future

Visual search is in its infancy, but we should prepare for a future where a search engine user's first thought is an image. Research by Moz in 2017 found that image search now makes up nearly 30% of all searches on Google, with image blocks appearing in an estimated 11% of search listings.

Searching by text will remain the easiest way to access information for most people. For retail, however, it doesn't seem unrealistic to imagine a time when searching by image is more convenient than text. Visual search will be favored by those who want to discover similar products tailored to their own preferences.

If you’re inspired by an outfit on Instagram, the Google of the not-too-distant future might accurately display individual similar items from the outfit, with image results personalized by brand preference, availability, and budget.

Personalization is a challenge the ASOS visual tool intrinsically solves. Its users are already invested in ASOS: they have downloaded the app and purchased its products. This means that whenever a user searches by image, they're going to see ASOS products, and will probably like them. Personalization is one of the biggest challenges facing Google, and ultimately why retailers like ASOS will influence the way personalized visual search evolves.

How to prepare?

To prepare for an image-first future, retailers need to adapt their websites to changing search behaviors. Here are four things retailers can do right now to get started:

  • Go mobile

Visual search is inherently mobile – we don’t take pictures on a desktop, so mobile should be at the forefront of a search strategy. Optimizing for image search in its current form should also prepare retailers for what’s to come.

Mobile is big news in 2018. Google looks set to roll out the mobile-first index imminently, so retailers could face some challenges if they are not mobile-responsive. In the context of visual search, retailers need to be mobile-friendly because that's where visual searches happen. Images hosted on pages that are not mobile-friendly are less likely to show up in mobile image results.
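As a starting point, a mobile-friendly product page needs a responsive viewport and images that scale to the screen rather than overflowing it. The snippet below is a minimal sketch; the class name, file path and alt text are illustrative placeholders, not taken from any particular retailer's site.

    <!-- Minimal sketch of a mobile-friendly page: responsive viewport
         plus a product image that scales down on small screens. -->
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      .product-image {
        max-width: 100%; /* never overflow a narrow screen */
        height: auto;    /* preserve the aspect ratio */
      }
    </style>
    <img class="product-image"
         src="/images/denim-jacket.jpg"
         alt="Oversized denim jacket in mid-wash blue">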

  • Use structured data

Structured data is essential if rich product results are to appear on images, and this will likely still be the case as visual search expands on Google, so optimize now. Four attributes are required for a ‘product’ badge to appear in image results (a minimal markup sketch follows the list below). These are:

  • The currency of the item (in three-letter ISO 4217 format)
  • The price of the item (as a number)
  • The URL of the product
  • The name of the product
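As an illustration, the markup below is a minimal sketch of schema.org Product data in JSON-LD covering those four required attributes, plus the availability field mentioned earlier; the product name, URL, price and image are placeholder values.

    <!-- Minimal sketch of schema.org Product markup; all values are placeholders. -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org/",
      "@type": "Product",
      "name": "Oversized Denim Jacket",
      "image": "https://www.example.com/images/denim-jacket.jpg",
      "offers": {
        "@type": "Offer",
        "url": "https://www.example.com/products/denim-jacket",
        "priceCurrency": "GBP",
        "price": "45.00",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>

Markup along these lines can be checked with Google's Structured Data Testing Tool before being rolled out across a product catalogue.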
  • Think page speed

The rankings of an image's host URL tend to correlate with how well the image itself ranks. With mobile page speed becoming a ranking factor as of July 2018, reducing image file size is the easiest way to improve load time and satisfy that facet of the algorithm.

Rankings aside, improving page speed is also incredibly important for user experience. Google found that as page load time goes from one second to 10 seconds, the probability of a mobile user bouncing increases by 123%!
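One practical way to cut image payload on mobile is to serve appropriately sized image variants and declare their dimensions up front. The snippet below is an illustrative sketch that assumes pre-generated 400px, 800px and 1200px versions of a product image; the file names, widths and breakpoint are placeholders rather than recommendations from the original piece.

    <!-- Sketch of a responsive product image: smaller variants for smaller
         screens, explicit dimensions so the browser can reserve space
         while the image loads. -->
    <img src="/images/denim-jacket-800.jpg"
         srcset="/images/denim-jacket-400.jpg 400w,
                 /images/denim-jacket-800.jpg 800w,
                 /images/denim-jacket-1200.jpg 1200w"
         sizes="(max-width: 600px) 100vw, 800px"
         width="800" height="1000"
         alt="Oversized denim jacket in mid-wash blue">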

  • Make product images functional

Give Google the best possible chance of understanding the contents of the product image. Images should be beautiful but functional: light backgrounds, the product center stage, and only one product per image.

Image resolution and dimensions matter too. Google tends to exclude extremely large images and images with unusual dimensions. If images are too wide and not tall enough, or too tall and not wide enough, they are likely to run into issues in image results.

Conclusion

As the above suggests, it is never too early to start planning your digital strategy around visual search – especially in retail. Start by making your site mobile-responsive to give your existing users the best experience, then review, refine and optimize your images to unlock the visual search market's vast potential.

 

Andrew Charlton is a Search Marketing Consultant at Silverbean.
