Fashion is a multi-billion-dollar industry with social and economic implications worldwide. This year’s Melbourne Spring Fashion Week (MSFW) gave IBM Research an opportunity to showcase how cognitive technology can transform how fashion designers engage and connect with their customers.
This is the third year IBM has partnered with the City of Melbourne as MSFW’s official Innovation and Technology Partner, but the first in which IBM’s research teams in Australia and India produced a series of firsts for the international event:
- IBM designed the MSFW iPhone app, showcasing IBM Interactive Experience’s (iX) design-led approach to harnessing social, mobile, and analytics expertise while delivering an enriched user experience: exclusive content during MSFW, plus photos and news about other events and designers.
- During the event, IBM also featured its Marketing Cloud, a cloud-based digital marketing platform that helped MSFW tailor individual, relevant, and timely campaigns to improve engagement with its audience and local users.
- IBM also designed Social Media Analytics for MSFW, providing real-time social media analysis to identify trends, sentiment, and key messages, and to infer user location.
And in a fashion-technology first, IBM worked with renowned Australian couture designer Jason Grech to help him design the Jason Grech + IBM Watson: Cognitive Collection. Grech used Watson Cognitive APIs and fashion data in a fusion of technology and creativity to inspire the creation of 12 couture dresses.
Vikas Raykar, the lead researcher on the project, based out of IBM Research-India’s lab in Bangalore, talks about how IBM Watson was able to capture a decade of fashion runway images from magazines and social media, as well as architectural design, to determine future trends and styles for Grech.
What data did these tools examine, and how did they help produce usable results?
Vikas Raykar: Fashion is highly visual, which is why from the beginning we decided to focus on fashion images. We looked at images from two sources:
- About 500,000 historical fashion runway images from 2006 to 2016, drawn from fashion archives and covering nearly 600 designers.
- And about 100,000 fashion-related images from social media platforms which Jason follows and derives inspiration from.
The first source is representative of high-end couture; the second is more indicative of the current zeitgeist: what people are wearing and posting on social media right now.
As part of the broader Cognitive Fashion project at our lab, we are developing capabilities that can understand a fashion image, and we used these tools to analyze current fashion trends. We analyzed the collection of images to identify the popular and trending colors for every season over the past decade, and then predicted the trending colors for the next season.
We presented this analysis to Jason as an interactive web application where he could explore the color trends for the next season, and derive inspiration for his new collection.
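The interview doesn’t describe IBM’s actual pipeline, but the season-by-season trend analysis it outlines can be sketched in a few lines of Python. Everything here is illustrative: the season and color labels are made up, and in practice each (season, color) pair would come from a vision model classifying hundreds of thousands of runway images.

```python
from collections import Counter, defaultdict

# Hypothetical toy data: (season, dominant_color) labels such as an image
# classifier might produce for each runway image.
observations = [
    ("SS15", "coral"), ("SS15", "coral"), ("SS15", "navy"),
    ("SS16", "coral"), ("SS16", "blush"), ("SS16", "blush"),
    ("SS17", "blush"), ("SS17", "blush"), ("SS17", "blush"),
]

def colors_by_season(obs):
    """Aggregate per-season color counts from (season, color) pairs."""
    seasons = defaultdict(Counter)
    for season, color in obs:
        seasons[season][color] += 1
    return seasons

def trending(seasons, latest, previous):
    """Rank colors by their growth from the previous to the latest season."""
    growth = {c: seasons[latest][c] - seasons[previous][c]
              for c in set(seasons[latest]) | set(seasons[previous])}
    return sorted(growth, key=growth.get, reverse=True)

seasons = colors_by_season(observations)
print(trending(seasons, "SS17", "SS16"))  # colors ranked by season-on-season growth
```

A real system would weight by image counts per show, smooth across several seasons, and fit a forecasting model rather than a one-step difference, but the aggregation step is essentially this.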
How did you decide that ‘color’ was how Watson could help Grech?
VR: The story of fashion is the story of color. Initially, we decided to break the problem down into three phases: color, print, and silhouette. Since the color palette plays such a crucial role in a designer’s mood board, we decided to focus first on color.
How did your team work with Grech? How did it help the cognitive system?
VR: We worked in close collaboration with Jason, incorporating his feedback and analyzing images from hashtags he follows as his source of inspiration.
Architectural images were another source of inspiration for Jason, and for which we built a visual discovery tool for him. The tool provided images of architectural shapes and structures he was inspired by, and we used deep image understanding to match architectural structural details to patterns and silhouettes in historic fashion runway images, providing the inspiration for new designs.
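A common way to build this kind of visual discovery tool is to embed every image into a feature vector with a vision model, then retrieve the runway images whose vectors lie closest to a query image. The source doesn’t specify IBM’s method, so the following is a minimal sketch of nearest-neighbor retrieval over toy 3-dimensional embeddings with invented file names (real embeddings would have hundreds of dimensions).

```python
import math

# Toy catalog: assume a vision model has already embedded each runway image.
runway_embeddings = {
    "pleated_gown.jpg":   [0.9, 0.1, 0.2],
    "column_dress.jpg":   [0.2, 0.9, 0.1],
    "lattice_bodice.jpg": [0.1, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nearest(query, catalog, k=2):
    """Return the k catalog images most similar to the query embedding."""
    ranked = sorted(catalog, key=lambda name: cosine(query, catalog[name]),
                    reverse=True)
    return ranked[:k]

# An architectural photo whose embedding happens to sit near the lattice bodice.
arch_query = [0.15, 0.25, 0.85]
print(nearest(arch_query, runway_embeddings))
```

At the scale of 500,000 images, an approximate nearest-neighbor index would replace the brute-force sort, but the matching idea is the same.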
We understand that cognitive systems understand natural language, and now even have visual recognition — what was understandable (to the systems) about fashion? What made fashion unique (and challenging)?
VR: The big challenge has been adapting these cognitive computing technologies to the domain of fashion. Fashion is highly visual and has its own nuances. For example, when understanding natural images, we found that humans use only around 10-20 color terms; once we moved to the domain of fashion, there were thousands. The standard yellow (the “Y” in “CMYK,” most readily recognized as the color of a lemon peel) is visually very different from Naples yellow’s earthy tone (also known as antimony yellow). A shift dress is very different from an A-line dress, and the visual recognition systems have to be trained to understand and discriminate these nuances.
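One simple way to ground a large color vocabulary, which the source hints at but does not spell out, is to map each observed pixel value to the closest entry in a named palette. This sketch uses a four-entry toy palette with approximate RGB values I supplied for illustration; a fashion system would hold thousands of entries and likely compare in a perceptual color space rather than raw RGB.

```python
# Toy palette: a tiny sample of named colors with approximate RGB values.
palette = {
    "process yellow": (255, 237, 0),
    "naples yellow":  (250, 218, 94),
    "coral":          (255, 127, 80),
    "navy":           (0, 0, 128),
}

def nearest_color(rgb, palette):
    """Map an arbitrary RGB value to the closest named color (squared Euclidean)."""
    def dist(name):
        r, g, b = palette[name]
        return (rgb[0] - r) ** 2 + (rgb[1] - g) ** 2 + (rgb[2] - b) ** 2
    return min(palette, key=dist)

# A muted yellow sits much closer to Naples yellow than to process yellow.
print(nearest_color((248, 220, 100), palette))
```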
What is your and your team’s experience in fashion? What did you and the team have to learn in order to help the system learn?
VR: Our team came from a deep learning and image understanding background. To start with, we had a very minimal fashion vocabulary; now we can talk about fashion in highly detailed terms.
The team had to pore over fashion magazines to build a fashion catalog, understand what’s happening on the current trending fashion sites, and even learn to pronounce designer names.
How will cognitive couture help the average person in the future? What will that experience look like?
VR: Typically, fashion designers start conceptualizing and designing their collection 12 months ahead of the launch date. However, the emergence of so-called fast-fashion retailers has dramatically shrunk lead times.
Understanding customer insights is more crucial than ever before. A designer’s ability to understand previous and current trends is now key to shaping and mapping success in the fashion industry.
What we are seeing now is trends arising from a multitude of sources. For example, if animal prints are the trend, it could mean that high-end fashion designers have started showing animal print designs on the runway. This starts a chain reaction: apparel retailers introduce the trend in their online catalogs and stores; celebrities are spotted wearing animal prints; and then fashion magazines, websites, blogs and social media sites start recognizing the trend, resulting in fashion-forward consumers wearing animal prints on the street. Fashion trends can come from anywhere, and with our tools these emerging trends can be spotted and widely adopted.
Fashion is not just about runway shows or fashion magazines, it is also about what people are wearing. We also analyzed public images from the street and social networks during the Melbourne Spring Fashion Week 2016 to understand the dominant color trends. This data was then turned into Twitter posts to provide a zeitgeist of the fashion trends on the city’s streets.
What other aspects of IBM’s research, or the research process, does this work involve?
VR: Our team in India is working on a project called Cognitive Fashion where we are building a suite of modern cognitive computing technologies that use artificial intelligence, machine learning, dialog, computer vision and natural language understanding to empower online fashion retailers. We believe that the next generation fashion portals will be one-stop sites for all fashion needs (from buying, browsing, search, style advice, to fashion recommendations and following trends).
IBM Cognitive Fashion technologies can increase a portal’s stickiness and ensure that customers can find the right product without spending hours searching, leading to higher conversion. This can be of transformative value to online retailers, especially those focusing on fashion apparel, jewelry, or even furniture.