The Business of Fashion
In March 2021, before mainstream excitement around generative artificial intelligence exploded, a pair of researchers published a paper on the way biases can turn up in images created by AI.
With one AI tool, they made five male-appearing and five female-appearing faces. Next, they fed those faces into another AI tool that completed the images with bodies. For the female faces, 52.5 percent of the images the AI returned featured a “bikini or low-cut top,” they wrote. For the male faces, 42.5 percent were completed with “suits or other career-specific attire.”
Bias in AI — or rather in the data these models are trained on — is a well-known problem. There’s even a mantra: garbage in, garbage out. The idea is that if you input flawed data, the output will reflect those flaws. Because the generative-AI tools available have typically been trained on giant volumes of data scraped off the internet, they’re likely to reflect the internet’s biases, which can include all the conscious and unconscious biases of society. The researchers guessed their output resulted from “the sexualised portrayal of people, especially women, in internet images.”
Fashion should pay close attention. As it begins using generative AI for everything from producing campaign imagery to powering online shopping assistants, it risks repeating the discrimination based on race, age, body type and disability that it has spent the past several years loudly claiming it wants to move past.
For example, when I entered the prompt “model in a black sweater” in DreamStudio, a commercial interface for the AI image generator Stable Diffusion, the results depicted thin, white models. That was the case for most, if not all, of the models every time I tried it. In the hive mind of the internet, this is still what a model looks like.
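Anyone curious can rerun a version of this test themselves. The snippet below is a minimal sketch using the open-source Stable Diffusion weights through Hugging Face’s diffusers library and the runwayml/stable-diffusion-v1-5 checkpoint; it approximates, rather than reproduces, the hosted DreamStudio setup described above.

```python
# Minimal sketch: batch-generate images from the same prompt and inspect who
# the model depicts by default. Assumes the diffusers library and a GPU; the
# checkpoint named here is an open Stable Diffusion release, not DreamStudio itself.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

prompt = "model in a black sweater"
images = pipe([prompt] * 4, num_inference_steps=30).images  # four images from one prompt

for i, img in enumerate(images):
    img.save(f"black_sweater_model_{i}.png")  # review the saved images for who appears
```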
Ravieshwar Singh, a digital fashion designer who has been trying to raise awareness of the issue, even staging a minor protest at the recent AI Fashion Week, said the current moment is especially important for combating these problems.
“What we’re seeing now is the construction of these norms in real-time with AI,” he said.
Except now brands won’t be able to fall back on the justifications they’ve used in the past for not casting certain types of models or failing to represent different groups. Where they might have previously claimed they couldn’t find the right curvy model, now they’re able to generate whatever look they want, Singh pointed out. While they might have claimed in the past that producing a range of samples to fit a range of bodies was prohibitively complicated or expensive, now there’s no major added cost or complexity. (It does raise the related issue of whether brands should be using AI instead of hiring human models, but the reality is ignoring the technology won’t make it disappear.)
“So then the question to me becomes, ‘Why are we making these choices in the first place?’” Singh said.
There are factors beyond the technology at play. Brands are often trying to present an aspirational image that parrots what society more broadly deems desirable. At the same time, fashion is more influential than most other industries in defining what “desirable” looks like.
For the industry to deviate from, and ultimately shift, its paradigms would require extra thought and effort. It will be up to individual brands and creatives to introduce more diversity, and there’s no guarantee that will happen. Fashion has tended to resist even small changes in the past, and if change means more work, some simply won’t invest the effort, leaving fashion to go on reinforcing the same patterns.
The tech industry is still struggling with its own issues around bias in AI. There are numerous well-documented examples of AI treating white men as the default, with consequences like voice recognition not working well for women or image recognition mislabeling Black men. Generative AI adds its own risks, like perpetuating negative stereotypes or erasing different groups just by not including them. One issue with some image generators is that they can default to a white man for just about any prompt, positive or negative.
Tech experts and researchers believe one possible way to deal with the problem is reinforcement learning from human feedback, a technique that, true to its name, has humans rate or rank a model’s outputs so its behavior can be steered in a desired direction, without anyone having to specify every desired outcome explicitly.
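To make that concrete, the toy sketch below shows the heart of the technique: a reward model trained so that outputs a human preferred score higher than ones they rejected, with that learned reward later used to steer the generative model during fine-tuning. It is an illustration under my own simplifying assumptions, not OpenAI’s or anyone else’s actual code.

```python
# Toy reward-model step of RLHF: learn to score outputs so human-preferred ones rank higher.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RewardModel(nn.Module):
    def __init__(self, feature_dim: int):
        super().__init__()
        self.score = nn.Linear(feature_dim, 1)  # features of an output -> scalar reward

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.score(features).squeeze(-1)

def preference_loss(chosen_reward, rejected_reward):
    # Bradley-Terry-style objective: the preferred output should out-score the rejected one.
    return -F.logsigmoid(chosen_reward - rejected_reward).mean()

# Stand-in data: feature vectors for output pairs where a human picked one over the other.
dim = 16
chosen, rejected = torch.randn(256, dim), torch.randn(256, dim)

model = RewardModel(dim)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    loss = preference_loss(model(chosen), model(rejected))
    opt.zero_grad()
    loss.backward()
    opt.step()
# In full RLHF, the generative model is then fine-tuned (e.g. with PPO) to maximize this learned reward.
```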
“I’m optimistic that we will get to a world where these models can be a force to reduce bias in society, not reinforce it,” Sam Altman, chief executive of OpenAI, the company behind ChatGPT and the DALL-E image generator, told Rest of World, a global tech news site, in a recent interview.
Singh believes AI could have a positive influence on fashion, too. If someone creates an AI campaign with a South Asian model, or includes someone with a body type that hasn’t been fashion’s standard in the past, a casting director might see it and get the idea to do the same in a physical casting.
First, though, fashion companies using generative AI need to think beyond the default decisions history and the technology are making for them.