Contributed Blog: By Mark Elfenbein, CEO, Slyce Inc.
The very first search engine launched 25 years ago. Archie ("archive" without the "v") debuted in 1990 as an FTP site with an index of downloadable directory listings. To find what you were looking for, exact wording was required; search engines didn't recognize partial phrases or understand context the way they do today. It was the dawn of the internet, and the foundation of our current ability to instantly learn the population of China, or the 'world's greatest' pancake recipe, with just a few taps of a keyboard.
Flash forward two and a half decades, and search titan Google can perform sophisticated crawls through what has become a vast ocean of information in fractions of a second, returning prioritized, relevant articles, videos, maps, and images. Google's search algorithm is constantly evolving, but so is the way we search: this year, for the first time in internet history, more web searches were performed on mobile devices than on desktops.
Smartphone use in 2015 is ubiquitous. 64% of Americans now own one, up from 35% in 2011. Users are spending more time in-app than ever, and searching ranks as the top in-app activity. Thanks to smartphones, we are evolving past the use of a single search engine. We are bombarded with so much data and information (over 800 billion photos were uploaded in 2014 alone) that sorting through it all to find what is relevant has become an activity we must engage in everywhere we spend our online hours.
But while search engines are becoming more sophisticated, a user's ability to articulate a precise search in text alone remains a barrier to navigating this overwhelming amount of data efficiently. Although we have moved beyond the era of exact phraseology, textual language still stands in our way: some of the things we want to find are too complex or too subtle to sum up in a few words. We can struggle to describe a particular shade of colour, an unusual shape, or an object that is familiar but seemingly 'nameless'. Do you know what "quatrefoil" looks like? Do a quick search, and you'll find a very familiar décor pattern.
Visual search removes this hurdle. We experience the world visually, not through text, and the brain processes images far faster than words; 60,000 times faster, by one oft-cited estimate. And we aren't simply short on language; we are also short on time. Snapping a photo in-app, or making an audio query through an intelligent personal assistant like Siri, can be immeasurably quicker, easier, and more fun than typing a text query. With the voice recognition and image recognition markets forecast to be worth multiple billions within the next 5-10 years, and wearable technology becoming part of the norm (Diane von Furstenberg is developing a handbag that wirelessly charges your smartphone), early adopters are already preparing for the next wave of search evolution. The preeminent luxury retailer Neiman Marcus, for example, has already implemented both 3D and 2D visual search within its top-ranking app.
Online search has come a long way in the last 25 years. It has grown exponentially as technology has boomed, and it now stands at another pivotal moment. In much the same way that we find it hard to imagine a time when people flicked through an encyclopaedia to find their answers, within the next decade children will likely roll their eyes upon hearing that their parents once typed out search queries, rather than simply speaking into a mobile device or snapping a quick photo.