Introducing AI-Powered Virtual Try-Ons for Realistic Online Clothes Shopping Experience

Shopping in The Age of AI: Google’s Virtual Try-Ons

Let’s talk about the future of online shopping, shall we? Google has decided to make our lives a tad easier by introducing virtual try-on capabilities for women’s tops in the U.S., starting today. But hold your horses, it’s not just another digital avatar situation like other brands have done. No, Google is taking it up a notch with generative AI, producing highly detailed portrayals of clothing on real models with different body shapes and sizes.

Real Models, Real Sizes: Embracing Diversity

Google’s new generative AI model is all about inclusivity. It takes just one clothing image and accurately reflects how it would drape, fold, cling, stretch, and form wrinkles and shadows on a diverse set of real models in various poses. We’re talking sizes XXS-4XL, different skin tones, body shapes, ethnicities, and hair types, as stated by Lilian Rincon, senior director of product management at Google.

Out with the Old, In with the AI

Now, most virtual try-on tools out there rely on geometric warping to dress up avatars, which, let’s be honest, isn’t always perfect. Unnecessary folds, anyone? To address this, Google has developed a new diffusion-based AI model. Training works by gradually adding noise to an image until it’s unrecognizable, then teaching the model to reverse (or denoise) the process step by step until the original image is recovered. Once trained, the model can start from pure random noise and generate new, high-quality images.
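To make the "noise it, then learn to reverse it" idea concrete, here is a minimal toy sketch of the *forward* half of diffusion, the part that destroys an image. The step count, the `beta` schedule, and the 8×8 "image" are illustrative assumptions, not Google's actual setup (the real model is far larger and the denoiser is a trained network).

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_noise(image, num_steps, beta=0.02):
    """Gradually mix Gaussian noise into an image (the 'forward' diffusion).
    Each step shrinks the signal slightly and adds a little fresh noise."""
    x = image.copy()
    for _ in range(num_steps):
        x = np.sqrt(1 - beta) * x + np.sqrt(beta) * rng.standard_normal(x.shape)
    return x

# A toy 8x8 "image": after enough steps it is statistically indistinguishable
# from pure noise -- this is the state the trained model learns to reverse.
image = np.linspace(0, 1, 64).reshape(8, 8)
noised = forward_noise(image, num_steps=1000)
```

After 1000 of these steps the original signal is scaled down to near zero, so `noised` is effectively pure Gaussian noise; generation runs this process backwards with a learned denoiser.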

Training the AI with Shopping Graph

To create a model that caters to different body shapes and sizes, Google tapped into its Shopping Graph, which is essentially a comprehensive dataset of products and sellers. They trained the model on millions of image pairs, each showing a different person wearing an outfit in two different poses. By using this data and the diffusion technique, the model learned to render outfits on different people standing in different poses.
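The structure of that training data, a sketch only: each example pairs two photos of the same person in the same outfit, and the model learns to map one pose to the other. The class names, fields, and the idea of using each pair in both directions are all hypothetical here; the article only says the pairs show one person wearing an outfit in two poses.

```python
from dataclasses import dataclass

@dataclass
class TrainingPair:
    person_id: str
    outfit_id: str
    pose_a: str  # stand-in for a photo of the outfit in one pose
    pose_b: str  # the same outfit on the same person in a second pose

pairs = [
    TrainingPair("p001", "top_42", "p001_front.jpg", "p001_side.jpg"),
    TrainingPair("p002", "top_17", "p002_front.jpg", "p002_side.jpg"),
]

def training_examples(pairs):
    """Yield (conditioning, target) tuples: the model sees the outfit in one
    pose and learns to denoise its way to the same outfit in the other pose."""
    for p in pairs:
        yield (p.pose_a, p.pose_b)
        yield (p.pose_b, p.pose_a)  # assumption: pairs usable in both directions
```

Scaled to millions of such pairs across the Shopping Graph, this is what lets the model render one garment convincingly on many different bodies and poses.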

How it Works

When a user exploring an outfit on Google Search hits the try-on button, they can select a model with a similar body shape and size to see how the outfit would fit them. The chosen garment and model image act as the input data. Each image is sent to its own neural network, and the two networks share information in a process called “cross-attention” to generate the output: a photorealistic image of the person wearing the garment, as explained by Ira Kemelmacher-Shlizerman, senior staff research scientist at Google.
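The cross-attention step above can be sketched in a few lines of NumPy. This is the generic scaled dot-product form, assuming small made-up feature matrices; Google's networks and feature dimensions are of course not public, so treat the shapes and names here as illustrative.

```python
import numpy as np

def cross_attention(person_feats, garment_feats):
    """Scaled dot-product cross-attention: features from the person image
    'query' features from the garment image.
    person_feats: (n, d), garment_feats: (m, d) -> output (n, d)."""
    d = person_feats.shape[-1]
    scores = person_feats @ garment_feats.T / np.sqrt(d)  # (n, m) similarities
    scores -= scores.max(axis=-1, keepdims=True)          # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over garment tokens
    return weights @ garment_feats                        # garment info routed to the person

rng = np.random.default_rng(1)
person = rng.standard_normal((4, 8))    # 4 "person" feature tokens, 8-dim each
garment = rng.standard_normal((6, 8))   # 6 "garment" feature tokens
out = cross_attention(person, garment)  # (4, 8): garment-aware person features
```

Each output row is a weighted blend of garment features, with the weights set by how strongly that part of the person image "attends" to each part of the garment, which is how information flows between the two networks.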

A Work in Progress

Currently, the try-on feature is only available for women’s tops from brands across Google. But as the training data grows and the model expands, it will cover more brands and items. And for the gentlemen out there, don’t worry – Google plans to launch virtual try-on for men later this year.

So, what are you waiting for? It’s time to embrace the future of online shopping with Google’s virtual try-ons, and finally get rid of those pesky fitting errors that have haunted us for years.

Source: venturebeat.com