AI FOR MORE SUSTAINABLE FASHION

Auto-tagging images

Turning visual content into a powerful tool

As environmental awareness becomes unavoidable, the second-hand model is poised to go mainstream.

If you have a browse, you’ll find many platforms offering different ways to shop sustainably: rent, buy new, or buy good-as-new. At Wide Eyes we want to help make these ways of shopping as smart and flexible as this generation is.

However, for second-hand stores and marketplaces, creating an experience tailored to the needs of their target consumers is not always easy: their product catalogues span thousands of brands and a global customer base, so an intelligent taxonomy is essential for building “credibility at scale”.

In fashion, no two people describe the same image with exactly the same tags. Customers may use many different words for the same piece, so comprehensive, specific attributes help recommendation engines make associations between related garments, as the sketch below illustrates. Auto-tagging also offers clear advantages over manual tagging, starting with a substantial saving in time and effort.
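As a rough illustration, a recommendation engine can treat each garment’s tag set as a lightweight fingerprint and rank other items by how much their tags overlap. This is a minimal sketch: the tags and the Jaccard scoring are illustrative examples, not Wide Eyes’ actual method.

```python
# Illustrative only: scoring garment similarity from attribute tags
# using Jaccard overlap. Tag names are made-up examples.

def jaccard(tags_a: set[str], tags_b: set[str]) -> float:
    """Share of tags the two garments have in common."""
    if not tags_a and not tags_b:
        return 0.0
    return len(tags_a & tags_b) / len(tags_a | tags_b)

query = {"dress", "midi", "floral", "v-neck", "short-sleeve", "cotton"}
candidates = {
    "floral midi skirt": {"skirt", "midi", "floral", "cotton"},
    "plain maxi dress":  {"dress", "maxi", "solid", "sleeveless"},
}

# Rank candidates by tag overlap with the query garment.
for name, tags in sorted(candidates.items(),
                         key=lambda kv: jaccard(query, kv[1]),
                         reverse=True):
    print(f"{name}: {jaccard(query, tags):.2f}")
```

The more consistent and specific the tags, the sharper this kind of overlap signal becomes, which is exactly why comprehensive attributes matter.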

Wide Eyes’ auto-tagging technology uses sophisticated AI algorithms to extract and organize over 750 data points per product-catalogue image in less than 1 ms.
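In practice, integration typically means sending each catalogue image to the tagging service and writing the returned attributes back to the product record. The sketch below assumes a hypothetical REST endpoint, auth header, and response shape; the real field names live in the actual API documentation.

```python
# Hypothetical integration sketch: the endpoint URL, auth scheme and
# response fields below are illustrative assumptions, not the real API.
import requests

API_URL = "https://api.example.com/v1/autotag"  # placeholder endpoint

def autotag_product(image_url: str, api_key: str) -> dict:
    """Send one catalogue image and return its predicted attributes."""
    response = requests.post(
        API_URL,
        json={"image_url": image_url},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    response.raise_for_status()
    # Assumed response shape: {"tags": {"category": "dress", ...}}
    return response.json()["tags"]
```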


AI can help second-hand e-commerce stores compete with their traditional industry counterparts. Adding innovative tools, dynamic analytics, and new data sources makes them more flexible and competitive, allowing them to respond to trends and events in real time.

Our auto-tagging builds an efficient product-tagging system even when the database consists solely of visual information about the products. Wide Eyes’ automated solution applies homogeneous tags across the whole catalogue. For these reasons, smart organizations are using AI-powered visual recognition technology to automatically extract product attributes from fashion images.
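One way to picture “homogeneous tags across the whole catalogue” is a normalization step that folds each seller’s free-form labels into a single canonical vocabulary. The synonym table below is a hand-rolled example for illustration, not Wide Eyes’ taxonomy.

```python
# Illustrative taxonomy normalization: map free-form seller tags onto
# one canonical vocabulary so the whole catalogue is searchable the
# same way. The synonym table is a made-up example.

CANONICAL = {
    "tee": "t-shirt",
    "tshirt": "t-shirt",
    "t shirt": "t-shirt",
    "denim jacket": "jacket",
    "jeans jacket": "jacket",
}

def normalize_tags(raw_tags: list[str]) -> list[str]:
    """Lower-case each tag and fold synonyms into canonical terms."""
    seen: list[str] = []
    for tag in raw_tags:
        cleaned = tag.strip().lower()
        canonical = CANONICAL.get(cleaned, cleaned)
        if canonical not in seen:  # keep order, drop duplicates
            seen.append(canonical)
    return seen

print(normalize_tags(["Tee", "denim jacket", "T Shirt"]))
# -> ['t-shirt', 'jacket']
```

With every product mapped to the same vocabulary, filters, search facets, and recommendations behave consistently no matter which seller listed the item.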
