Google makes it easy to try on virtual clothes with AI
Google seems determined to exploit the full potential of generative AI. Following the announcement and release of the Search Generative Experience beta, the company has launched a new feature. The aim? To make virtual try-on of clothes more effective and realistic than ever. It responds to a problem well known to online shoppers: knowing what a garment will look like on them before they buy it.
The system is based on a new artificial intelligence model developed by Google. Given an image of a garment, it can show how that item would look worn by a variety of different people. The feature is designed to be as realistic as possible: potential buyers can see the folds, shading and draping of items across several sizes and poses. Sizes range from XXS to 4XL, and Google has built representativeness into its approach: users can choose the skin colour, ethnicity and body shape that best match their own.
Here’s a sneak preview of the feature:
For the time being, it is launching in the United States only, for retailers such as Anthropologie, Everlane, H&M and LOFT. Others will follow, ahead of a potential roll-out to other regions.