Published on June 5, 2019
Visual search has been one of the biggest digital trends in the last few years.
Its beginnings go back to 2001, when Google launched its image search feature with over 250 million images in its database.
The reason this feature was created was none other than Jennifer Lopez and the green Versace dress she wore at the 2000 Grammy Awards ceremony.
The demand for pictures of this legendary dress was so huge that the tech giant had to add a new feature to handle it.
In 2007, Google added more information to the images in its database, such as resolution and URL, and since 2009, users have been able to use the feature to find similar images.
However, the real revolution began in 2011, when Google added the Search by Image feature.
This allows users to submit an image instead of typing in keywords, resulting in a list of similar images, pages that contain the image, and different sizes of the image.
Since then, brands have started adopting visual search and using images as a major traffic driver to their e-commerce stores and online catalogs.
Visual search is a feature that allows customers to search by images instead of by keywords.
They can use images from the Internet, photos they’ve taken or screenshots to find their desired items.
The visual search technology of today uses AI (Artificial Intelligence) to find the results that are the closest to the context of the image.
Thanks to visual search, online retailers can provide better results for their website visitors than they could by using keyword search alone.
Customers also find it simpler. In fact, according to Visenze, 62% of millennials prefer visual search over any other new technology.
Marketers are listening to what their customers have to say, so 35% of them are planning to optimize for visual search by the end of 2020.
This has led to the growth of the image recognition market, which is expected to reach $25.65 billion by the end of 2019.
Suggested read: The Essential Guide to Visual Search in Fashion Ecommerce
Pinterest, Google, Amazon, and Bing have developed their own visual search engines that are leading the revolution.
Many retailers, such as ASOS, IKEA, Argos, eBay, Neiman Marcus, and Walmart, have built their own visual search engines.
For example, ASOS launched its Style Match tool in August 2017, allowing shoppers to take a photo through the ASOS app, adjust the focus on the desired item, and search ASOS’s catalog for products that match the picture.
So, if you liked the dress your favorite influencer was wearing in their latest Instagram photo, you can simply upload it to the ASOS app and ASOS will find a similar product for you.
eBay’s visual search feature allows users to search eBay for an item when they come across a picture of it on the Internet.
When they see the desired item on another website, they can simply click the “Find it on eBay” button, and eBay’s engine will search for relevant items in its online store.
The company uses a convolutional neural network, a deep learning model that goes through eBay’s items and looks for similarities based on visual connections.
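To make the general technique a bit more concrete, here is a minimal sketch in Python: a pretrained convolutional network turns each image into an embedding vector, and items are ranked by how close their embeddings are to the query. This is only an illustration of the idea, not eBay’s actual system; the model choice, the dummy images, and the item names are assumptions.

```python
# Illustrative sketch only - not eBay's system. A pretrained CNN embeds images,
# and catalog items are ranked by cosine similarity to the query embedding.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Pretrained ResNet-50 with its classification head removed, so it outputs
# a 2048-dimensional feature vector (embedding) per image.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(img: Image.Image) -> torch.Tensor:
    """Return an L2-normalized embedding for one image."""
    with torch.no_grad():
        vec = backbone(preprocess(img).unsqueeze(0)).squeeze(0)
    return vec / vec.norm()

# Solid-color dummy images stand in for real product photos so the sketch runs.
query_img = Image.new("RGB", (400, 400), (200, 30, 30))
catalog = {
    "red_dress": Image.new("RGB", (400, 400), (190, 40, 40)),
    "blue_jeans": Image.new("RGB", (400, 400), (30, 30, 200)),
}

query_vec = embed(query_img)
scores = {name: float(query_vec @ embed(img)) for name, img in catalog.items()}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```

In a real store, the catalog embeddings would be computed once, stored, and indexed rather than recomputed for every query.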
Marks & Spencer’s AI-powered visual search tool allows its mobile app users to upload an image of any outfit and find similar items in less than ten seconds.
The Style Finder was designed to help the retailer become digital-first, as over three-quarters of their online traffic now comes from their mobile app.
Their goal is to increase the number of online sales to one-third of all purchases by 2022.
Suggested read: How to Prepare for Visual Search: A Clean, Structured and up-to-Date Image Database
It all comes down to the characteristics of the human brain.
The human brain is made to remember visuals - it can identify an image within 13 milliseconds.
Moreover, 90% of the information received by the human brain is visual (Source: MIT).
When they look at a picture, people remember the shapes and the patterns of the objects shown in it.
However, AI doesn’t naturally work like the human brain - it starts from low-level features such as points, edges, and lines.
Visual search teaches AI to work like the human brain and focus on shapes and patterns instead.
When an image is submitted to a visual search engine, the engine detects the objects in the image and looks for other images that contain similar objects.
For example, given a photo of a woman in a dress, it identifies the dress and shows other images with similar dresses.
The machine learning-based neural networks that visual search leverages allow the engine to keep learning and improving its accuracy.
This is especially important for online stores with large image bases that contain many similar objects.
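For a rough idea of what searching a large image base looks like once embeddings exist, here is a sketch in Python with randomly generated stand-in data: it ranks precomputed, normalized catalog embeddings by cosine similarity to a query and returns the top matches. Real engines typically use approximate nearest-neighbor indexes for speed, but the principle is the same.

```python
# Minimal sketch with made-up data: rank precomputed, L2-normalized catalog
# embeddings by cosine similarity to a query and return the top-k matches.
import numpy as np

def top_k_similar(query_vec: np.ndarray, catalog_vecs: np.ndarray, k: int = 5):
    """catalog_vecs: (N, D) normalized embeddings; query_vec: (D,) normalized."""
    scores = catalog_vecs @ query_vec      # dot product == cosine similarity here
    best = np.argsort(-scores)[:k]         # indices of the k highest scores
    return list(zip(best.tolist(), scores[best].tolist()))

# Hypothetical catalog: 10,000 items with 2048-dimensional embeddings.
rng = np.random.default_rng(0)
catalog = rng.normal(size=(10_000, 2048)).astype(np.float32)
catalog /= np.linalg.norm(catalog, axis=1, keepdims=True)

# A query that is a slightly perturbed copy of item 42.
query = catalog[42] + 0.01 * rng.normal(size=2048).astype(np.float32)
query /= np.linalg.norm(query)

print(top_k_similar(query, catalog))       # item 42 should come out on top
```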
Brands can get a lot of benefits from using visual search.
We live in a world that’s dominated by visuals, so it’s no wonder visual search is changing the way we search.
By using this technology, we’ll allow our brains to do what’s only natural for them - follow the visual.
A brand that uses visual search is a brand that’s looking ahead, into a future that’s here to stay.