A tool that cloaks visual artworks, so that their style cannot be mimicked by generative AI models

(Glazed image: "Musa" by Karla Ortiz)

This tool lets artists introduce minor perturbations into their artwork before posting the images online. The algorithm developed by SAND Lab uses existing style transfer algorithms, normally employed to recreate images in particular modes (such as cubism or watercolor), to identify the style-specific features of an image, and then alters the image slightly and exclusively in those areas. Depending on the work, these changes are more or less invisible to the human eye, but they make forgery attempts that aim to generate new images in the style of the original artwork far less successful. The tool was released for free in 2023; artists can run it independently and locally, with control over the strength of the cloaking.
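The idea behind this kind of cloaking can be sketched as a small optimization problem: find a perturbation, bounded by a visibility budget, that pushes an image's style features toward those of a decoy style. The following is a minimal toy sketch of that idea, not the actual Glaze implementation: the 1-D "image", the `style_features` extractor (here just adjacent-value differences standing in for texture features), and all parameter names are illustrative assumptions.

```python
import random

BUDGET = 0.05   # max per-element perturbation (hypothetical cloaking strength)
STEP = 0.01     # gradient step size
STEPS = 200

def style_features(image):
    """Toy stand-in for a style-specific feature extractor:
    adjacent-value differences, crudely capturing 'texture'."""
    return [image[i + 1] - image[i] for i in range(len(image) - 1)]

def feature_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def cloak(image, decoy, budget=BUDGET, step=STEP, steps=STEPS):
    """Descend on a perturbation delta so the cloaked image's style
    features approach the decoy's, clamping delta to keep the change small."""
    target = style_features(decoy)
    delta = [0.0] * len(image)
    eps = 1e-4
    for _ in range(steps):
        # numeric gradient of the feature distance w.r.t. each delta entry
        base = feature_distance(
            style_features([p + d for p, d in zip(image, delta)]), target)
        grad = []
        for i in range(len(delta)):
            delta[i] += eps
            bumped = feature_distance(
                style_features([p + d for p, d in zip(image, delta)]), target)
            delta[i] -= eps
            grad.append((bumped - base) / eps)
        # take a descent step, then project back onto the perturbation budget
        delta = [max(-budget, min(budget, d - step * g))
                 for d, g in zip(delta, grad)]
    return [p + d for p, d in zip(image, delta)]

# toy "image" with random texture, and a smooth decoy in a different "style"
random.seed(0)
image = [random.random() for _ in range(16)]
decoy = [i / 15 for i in range(16)]

cloaked = cloak(image, decoy)
before = feature_distance(style_features(image), style_features(decoy))
after = feature_distance(style_features(cloaked), style_features(decoy))
```

After running, `after` is smaller than `before` (the cloaked image reads as closer to the decoy style in feature space), while no value moves by more than the budget, which is what keeps the change hard to see. The real system operates on deep feature extractors and 2-D images, but the budget-constrained feature-matching structure is the same.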

SAND Lab, a research lab at the University of Chicago, had previously released a similar cloaking algorithm for personal photographs in 2020, making them unusable in training sets for facial recognition models. When image generation with GANs (generative adversarial networks) began to gain popularity, multiple visual artists reached out to the lab to ask whether a similar algorithm could be developed to protect their artworks.

The rise of image generation algorithms in 2022 led to a widespread outcry among artists, and a group of illustrators subsequently filed a class action lawsuit against Midjourney Inc, DeviantArt Inc, and Stability AI Ltd (Stable Diffusion). The plaintiffs claim that the tools violate the rights of millions of artists whose artworks were used in the training sets without consent or compensation, infringing copyright and substantially harming the plaintiffs' work. Not only are AI-generated images used as cheaper alternatives to commissions, but the styles of specific artists have also been mimicked, enabling users to create new images, effectively forgeries, in the style of those artists.
