Google AI improves shopping experience
- June 9, 2025
- Steve Rogerson

Google is using AI to improve the shopping experience, including a virtual try-on feature that lets shoppers see how they would look in clothes before they buy them.
The internet giant is using its recently developed AI Mode technology to provide shoppers with visuals, smart guidance and reliable product data. It is even giving shoppers a virtual dressing room and an agentic checkout experience so they can act quickly to make a purchase when the price is right.
“Our new AI Mode experience is built for every part of shopping, from finding inspiration to buying at the right moment,” said Lilian Rincon, Google vice president. “Plus, our virtual try-on tool now works with your own photos.”
The AI Mode (blog.google/products/search/ai-mode-search) shopping experience brings together Gemini capabilities with Google’s Shopping Graph to help users browse for inspiration, think through considerations and narrow down products. The Shopping Graph has more than fifty billion product listings, from global retailers to local mom-and-pop shops, each with details such as reviews, prices, colour options and availability. Every hour, more than two billion of those product listings are refreshed on Google.
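To make that concrete, here is a rough, hypothetical Python sketch of the kind of record a Shopping Graph listing is described as carrying (reviews, price, colour options, availability, plus a refresh timestamp, since listings are refreshed continuously). The `ProductListing` class and its field names are illustrative assumptions, not Google's actual schema.

```python
# Hypothetical sketch (not Google's actual schema) of a product listing record
# with the attributes described above: reviews, price, colour options,
# availability and a refresh timestamp.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProductListing:
    product_id: str
    title: str
    merchant: str
    price: float
    currency: str = "USD"
    colour_options: list[str] = field(default_factory=list)
    in_stock: bool = True
    review_count: int = 0
    average_rating: float = 0.0
    last_refreshed: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Example entry from a global retailer or a local shop.
bag = ProductListing(
    product_id="sku-12345",
    title="Waterproof travel bag",
    merchant="Example Outfitters",
    price=89.99,
    colour_options=["olive", "black"],
    review_count=412,
    average_rating=4.6,
)
print(bag.title, bag.price, bag.colour_options)
```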
“Say you tell AI Mode you’re looking for a cute travel bag,” said Rincon. “It understands that you’re looking for visual inspiration and so it will show you a beautiful, browsable panel of images and product listings personalised to your tastes. If you want to narrow your options down to bags suitable for a trip to Portland, Oregon, in May, AI Mode will start a query fan-out, which means it runs several simultaneous searches to figure out what makes a bag good for rainy weather and long journeys, and then use those criteria to suggest waterproof options with easy access to pockets.”
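The query fan-out Rincon describes can be illustrated with a short Python sketch: several related sub-queries run simultaneously, the criteria they surface are merged, and candidate products are filtered against them. The `search_web` helper, its canned results and the product list are hypothetical stand-ins, not Google's actual search API.

```python
# A minimal sketch of the "query fan-out" idea: issue several sub-queries in
# parallel, collect the criteria they surface, then filter candidates.
from concurrent.futures import ThreadPoolExecutor

def search_web(query: str) -> set[str]:
    # Stand-in for a real search call; returns attributes the query suggests.
    canned = {
        "travel bags for rainy weather": {"waterproof"},
        "best bags for long journeys": {"lightweight", "easy-access pockets"},
        "packing for Portland in May": {"waterproof", "lightweight"},
    }
    return canned.get(query, set())

def fan_out(sub_queries: list[str]) -> set[str]:
    criteria: set[str] = set()
    # Run all sub-queries simultaneously and merge the criteria they return.
    with ThreadPoolExecutor() as pool:
        for result in pool.map(search_web, sub_queries):
            criteria |= result
    return criteria

candidates = [
    {"name": "City tote", "features": {"stylish"}},
    {"name": "Trail duffel",
     "features": {"waterproof", "easy-access pockets", "lightweight"}},
]

criteria = fan_out([
    "travel bags for rainy weather",
    "best bags for long journeys",
    "packing for Portland in May",
])
# Keep only bags that satisfy every discovered criterion.
matches = [c["name"] for c in candidates if criteria <= c["features"]]
print(criteria, matches)
```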
The new right-hand panel dynamically updates with relevant products and images as the shopper progresses, helping them pinpoint exactly what they are looking for and discover new brands. These shopping features are coming to AI Mode in the USA in the coming months.
Once a shopper has decided what to buy, the agentic checkout helps them buy at a price that fits their budget. They can tap “track price” on any product listing, set the right size, colour or other options, and specify the amount they want to spend; Google will then notify them when the price drops. When they are ready to buy, they confirm the purchase details and tap “buy for me”.
Behind the scenes, Google will add the item to their cart on the merchant’s site and securely complete the checkout on their behalf with Google Pay. This agentic checkout feature will be rolling out in the coming months to product listings in the USA.
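A minimal Python sketch of that flow, assuming hypothetical helpers (`notify`, `checkout_on_merchant_site`), might look like this; it illustrates the track-price-then-buy-for-me sequence described above, not Google's implementation.

```python
# Hypothetical sketch of agentic price tracking: watch a tracked product's
# price against a budget, notify on a drop, and only complete checkout once
# the shopper has confirmed "buy for me".
from dataclasses import dataclass

@dataclass
class TrackedItem:
    product_id: str
    size: str
    colour: str
    budget: float

def notify(message: str) -> None:
    print("NOTIFY:", message)

def checkout_on_merchant_site(item: TrackedItem, price: float) -> None:
    # Placeholder for adding the item to the merchant's cart and completing
    # payment (e.g. via a stored payment method such as Google Pay).
    print(f"Purchased {item.product_id} ({item.colour}, {item.size}) for ${price:.2f}")

def on_price_update(item: TrackedItem, new_price: float, buy_confirmed: bool) -> None:
    if new_price > item.budget:
        return  # still above budget; keep waiting
    notify(f"Price dropped to ${new_price:.2f}, within your ${item.budget:.2f} budget")
    if buy_confirmed:  # shopper tapped "buy for me"
        checkout_on_merchant_site(item, new_price)

item = TrackedItem("sku-12345", size="M", colour="olive", budget=75.00)
on_price_update(item, 89.99, buy_confirmed=True)   # no action: above budget
on_price_update(item, 69.99, buy_confirmed=True)   # notifies, then checks out
```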
When shopping online, it can be hard to envision how new styles and trends will look when actually worn, especially if the shopper wants to step outside their comfort zone. Virtual try-on technology (blog.google/products/shopping/ai-virtual-try-on-google-shopping) helps shoppers imagine how clothes look on different body types. Now, users can virtually try billions of apparel listings on themselves, just by uploading a photo.
This technology is powered by a custom image generation model for fashion, which understands the human body and nuances of clothing, such as how different materials fold, stretch and drape on different bodies. It preserves these subtleties when applied to poses in the shopper’s photos.
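The overall shape of that pipeline, a person photo and a garment image in, a composited preview out, might be sketched as below. The simple blend is only a placeholder for Google's custom image generation model, which is what actually handles pose, fold and drape; the function names and dummy images are assumptions for illustration.

```python
# Hypothetical sketch of the try-on pipeline shape: person photo + garment
# image -> composited preview. The blend is a stand-in for the real
# fashion-specific image generation model described above.
from PIL import Image

def virtual_try_on(person: Image.Image, garment: Image.Image) -> Image.Image:
    # Real system: a generative model conditions on the person photo and the
    # garment. Stand-in: resize the garment image and blend the two images.
    garment = garment.resize(person.size)
    return Image.blend(person.convert("RGB"), garment.convert("RGB"), alpha=0.5)

# Dummy inputs so the sketch runs without real photos.
person_photo = Image.new("RGB", (512, 768), color=(200, 180, 160))
garment_photo = Image.new("RGB", (512, 512), color=(40, 60, 120))

result = virtual_try_on(person_photo, garment_photo)
result.save("try_on_preview.png")
```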
The try-on experiment is rolling out in Search Labs (labs.google.com/search) in the USA.
“When you’re shopping for shirts, pants, skirts and dresses on Google, simply tap the try-it-on icon on product listings,” said Rincon. “From there you can upload a full-length photo of yourself, and within moments you’ll see how that wedding-season maxi dress or playful shirt for your next vacation looks on you. Not quite ready to commit and need a second opinion? You can easily save the looks or share with friends.”