Ben Zhao, a Neubauer professor of computer science and director of graduate studies in computer science at the University, and a team of computer science Ph.D. students have created a system to protect human artists from style mimicry by AI models. The system, named Glaze, works by making subtle changes to artwork that are imperceptible to human eyes but dramatically alter how AI models perceive it. This prevents AI models from learning an artist's style from the altered work, even if it is scraped into training data.
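The general idea behind such "cloaking" can be illustrated with a minimal sketch: nudge an image within a tiny, imperceptible budget so that a model's feature extractor sees something very different. This is not Glaze's actual algorithm, and the linear "extractor," `target_style`, and all parameters below are stand-ins invented for illustration.

```python
import numpy as np

# Stand-in "style feature extractor": a fixed random linear map.
# (A real system would target an actual image encoder; this is a toy.)
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 64))            # 64-pixel "image" -> 8-dim feature

def features(x):
    return W @ x

artwork = rng.uniform(0, 1, size=64)     # flattened grayscale "artwork"
target_style = rng.normal(size=8)        # feature vector of a decoy style

eps = 0.03                               # imperceptibility budget (L-infinity)
x = artwork.copy()
for _ in range(50):
    # Gradient of squared feature distance to the decoy style
    grad = 2 * W.T @ (features(x) - target_style)
    x = x - 0.001 * grad
    # Project back into the small, "imperceptible" box around the original
    x = np.clip(x, artwork - eps, artwork + eps)

cloaked = x
# Pixels barely move, yet the feature representation shifts toward the decoy.
assert np.max(np.abs(cloaked - artwork)) <= eps + 1e-9
```

The design point is the asymmetry: the perturbation is capped at a few percent per pixel, which humans largely cannot see, while the optimization deliberately moves the image's machine-visible features toward a different style.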
Free to download and use, Glaze offers a new layer of art protection that is difficult to reverse engineer. A free web version, WebGlaze, makes the tool accessible to artists without powerful computers; it is invite-only, restricting access to artists who do not use generative AI in their work. Alongside Glaze, Zhao's team also developed Nightshade, a tool designed to counterbalance the power asymmetry between generative AI models' trainers and the artists whose work they scrape.
“While Glaze serves as a defense mechanism against style mimicry, Nightshade functions as an offensive deterrent against models that scrape images without consent,” Zhao said. “Together, they offer a comprehensive arsenal for artists to safeguard their creations and combat the exploitation of their intellectual property.”
Since its release, Glaze has earned a Special Mention in TIME's Best Inventions of 2023, the 2023 Chicago Innovation Award, and the 2023 USENIX Internet Defense Prize, among other honors.
Zhao met with the Maroon to share Glaze and Nightshade’s origins, objectives, and potential implications.
“Our overarching goal is to furnish artists with a robust defense mechanism against AI mimicry, thereby ensuring the enduring vitality of artistic expression,” Zhao said. “Through fostering collaboration and dialogue, we aspire to harness the potential of AI for constructive purposes while mitigating its potential hazards.”
Zhao said the idea for Glaze came when his team attended an open forum where artists raised concerns about AI and intellectual property infringement. He also stressed the importance of staying attuned to artists' needs while creating and refining the project.
“I think [what] really colored how we approached this project was how heavily we’ve been involved with real artists from the get-go. A lot of people, including ourselves in the past, would have basically picked up this technical problem and said, ‘Let’s address it with a technical solution; and that’s the end of that,’” Zhao said. “So getting involved with over 1,000 projects [with professional artists] for the first paper that involved Glaze… has been absolutely enlightening, because it’s things that you just cannot imagine if you don’t have real people telling you about what their experiences are.”
“We realized the imperative to confront the ethical implications of AI in art, particularly the insidious threat of style mimicry, which poses a grave risk to artists’ livelihoods,” said Shawn Shan, lead Ph.D. student for the Glaze project. “By disrupting the phenomenon of AI mimicry, Glaze empowers artists, fortifying the preservation of their unique identities and creative integrity. It embodies a critical stride towards safeguarding artistic expression in an increasingly digitized realm.”
As technology and art continue to converge, Shan explained what sets Glaze apart in an era of AI uncertainty.
“One argument could be to pass a perfect law to protect human rights globally. But that’s still not enough, because there’s so many people outside of those jurisdictions that still get impacted by generative AI. They are the ones who really need some tools to protect them. So that’s how we see Glaze,” said Shan.
However, the journey of Glaze has not been without its challenges.
“Maintaining algorithm security and privacy while building trust with artists has been paramount,” Zhao acknowledged. “We’ve faced resistance from AI enthusiasts who spread misinformation about Glaze, presenting challenges in gaining widespread acceptance.”
Reflecting on the broader implications of Glaze, Zhao offered a nuanced perspective on its significance. “We designed these tools for individual artists to help individual creatives…. But it’s becoming [clearer] that it’s not just individual creatives who have things at stake here,” Zhao said. “Now, increasingly, whether they be movie studios, companies in the music industry, or gaming developers, all are realizing that this is a risk to them.”
Zhao said the artistic community is particularly vulnerable to the economic impacts of AI mimicking their work.
“There is sort of a misguided perception that some artists are elitist, [and] sit all day in galleries [while] you know, playing with their jewelry. But the reality is quite different. Most artists are barely making ends meet, because they’re doing it for the passion, not for money. But having a different perspective and dimension from [larger] companies that are entering this process and being concerned, I think, will expand the conversation, I think it will change how people feel about this overall struggle,” said Zhao.
As Glaze continues to evolve and adapt, the researchers anticipate a profound impact on the future of technology and art.
“Glaze serves as a beacon of hope for artists grappling with the specter of AI-generated mimicry,” Shan concluded. “Despite the challenges that may lie ahead, the prospect of safeguarding the artistic community’s integrity renders the journey unequivocally worthwhile.”