Think about the last time you were inspired to try a new look – from a cute tie-dye top you saw on Instagram, to a celebrity sporting a couture dress in the latest issue of “Vogue.” Later, when trying to come up with the best words to describe the look, you discover that you are not a poet. You struggle to find the right words to explain the shape of a neckline, or the spacing of a polka dot pattern, and when you attempt your text-based search, the results are far from the trend you were after.
At the re:MARS 2019 conference today, Amazon announced a feature that solves this very problem.
StyleSnap, an AI-powered feature, helps you shop – all you need to do is take a photograph or screenshot of a look that you like. Announcing the feature during the re:MARS keynote, Amazon's CEO of Worldwide Consumer, Jeff Wilke, said, “The simplicity of the customer experience belies the complexity of the technology behind it.”
To get started, all you have to do is tap the camera icon in the upper right-hand corner of the Amazon app and select the “StyleSnap” option; then simply upload a photograph or screenshot of a fashion look that you like. StyleSnap will present you with recommendations for similar items on Amazon that match the look in the photo. When providing recommendations, StyleSnap considers a variety of factors such as brand, price range, and customer reviews.
While StyleSnap presents a seamless experience for customers, building this feature was no easy feat. Lifestyle images and influencer posts are unpredictable, with poses as varied as the locations – from an influencer enjoying a croissant in an indoor café, to a celebrity enjoying a mojito on a sunny beach under the shade of an umbrella.
StyleSnap uses computer vision and deep learning to identify apparel items in a photo, regardless of setting. Deep learning technology also helps classify the apparel items in the image into categories like “fit-and-flare dresses” or “flannel shirts.”
Deep learning powering fashion discovery
Deep learning refers to a class of machine learning techniques based on artificial neural networks, which are inspired by the workings of the human brain. A neural network is made up of millions of artificial neurons connected to each other, and can be “trained” to detect images of outfits by feeding it a series of labeled images. For example, if we feed a network thousands of images of maxi and accordion skirts, it will eventually be able to tell the difference between the two styles. If, however, we present it with a Scottish kilt, it may be confused and predict an incorrect class until it is shown enough examples to learn otherwise.
To have neural networks identify a greater number of classes, we can stack more layers on top of each other. The first few layers typically learn low-level concepts such as edges and colors, while the middle layers identify patterns such as “floral” or “denim.” After the image has passed through all of the layers, the network can accurately identify concepts like fit and outfit style.
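The stacking described above can be sketched in a few lines of NumPy. This is a minimal illustration of a stacked feed-forward network, not StyleSnap's actual architecture; the layer sizes, random weights, and four hypothetical output classes are all assumptions for the sake of the example. In a real system the weights would be learned from labeled images rather than drawn at random.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Standard non-linearity: pass positives through, zero out negatives.
    return np.maximum(0.0, x)

def make_layer(n_in, n_out):
    # Small random weights stand in for learned ones in this sketch.
    return rng.normal(0.0, 0.1, (n_in, n_out)), np.zeros(n_out)

# Stack several layers: in a trained network, early layers respond to
# edges and colors, deeper layers to patterns like "floral" or "denim".
layers = [make_layer(64, 32), make_layer(32, 16), make_layer(16, 4)]

def forward(x, layers):
    for W, b in layers:
        x = relu(x @ W + b)
    return x

x = rng.normal(size=64)       # stand-in for extracted image features
scores = forward(x, layers)   # one score per hypothetical clothing class
print(scores.shape)
```

Each layer transforms the output of the layer before it, which is what lets deeper stacks represent progressively more abstract concepts.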
We must go one step further, however – feed-forward neural networks stall and eventually degrade in accuracy after a certain number of layers have been added. This is known as the vanishing gradient problem: the gradient carrying the training signal shrinks with each additional layer it passes back through, until it is lost entirely.
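The shrinking effect is easy to demonstrate numerically. The sketch below is illustrative only: it applies the chain rule through a stack of sigmoid layers, whose derivative is at most 0.25, with a hypothetical weight of 0.5 per layer. Each layer multiplies the gradient by another small factor, so it decays geometrically with depth.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

grad = 1.0
w = 0.5   # hypothetical per-layer weight for illustration
for layer in range(30):
    z = 0.0  # pre-activation at this layer (sigmoid'(0) = 0.25, its maximum)
    grad *= w * sigmoid(z) * (1.0 - sigmoid(z))  # chain rule, one factor per layer

print(grad)  # roughly (0.5 * 0.25)**30 -- vanishingly small
```

After just 30 layers the gradient is around 10⁻²⁷, far too small to update the early layers meaningfully.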
Amazon uses residual networks to overcome this problem, as they use shortcuts to allow the training signal to skip over some of the layers in the network. This helps the network learn basic features like “edges” and “patterns” first, and then focus on complex concepts. A unique method developed by Amazon researchers allows the network to learn new concepts while also remembering things it has learned in the past – this is critical for enabling StyleSnap to work through large volumes of data effectively.
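A residual block in the style described above can be sketched as follows. This is a generic illustration of the skip-connection idea, not Amazon's implementation; the dimensions and weight scales are assumptions. The key line is the `x +` shortcut, which gives the training signal an identity path around each block's layers.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    # F(x): two small layers. Adding x back in ("the shortcut") lets the
    # gradient skip over them, sidestepping the vanishing gradient problem.
    h = relu(x @ W1)
    return relu(x + h @ W2)

d = 16
x = rng.normal(size=d)
for _ in range(50):  # stack many blocks; the identity path keeps signal alive
    W1 = rng.normal(0.0, 0.05, (d, d))
    W2 = rng.normal(0.0, 0.05, (d, d))
    x = residual_block(x, W1, W2)

print(x.shape)
```

A plain 50-layer feed-forward stack would be very hard to train; with the shortcut, each block only has to learn a small correction on top of its input.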
While StyleSnap allows customers to discover inspiring fashion finds by simply taking screenshots of the looks they like, it also helps fashion influencers expand their communities. In addition, fashion influencers who participate in the Amazon Influencer Program are also eligible to receive commissions for purchases they inspire.
Similar to shopping in the Amazon online store, at Amazon Go, or Whole Foods, StyleSnap is the latest example of how Amazon leverages artificial intelligence to make a real-world difference in the lives of customers.
“We are highly innovative and customer-obsessed, and we will continue to create new experiences for customers to discover the products they want and love. We are incredibly excited about StyleSnap and how it enables our customers to shop visually for Fashion on Amazon,” said Jeff Wilke.
*StyleSnap currently works only for dresses, tops, and bottoms.