What is Glaze? How to use it to protect my art from AI scraping?

Friendly Neighborhood Artist
10 Jul 202310:43

TLDR: The video discusses Glaze, a tool designed to protect artists' work from being used in AI models, specifically from Stable Diffusion fine-tuning methods that replicate an artist's style. The artist explains that while Glaze cannot protect existing art that has already been incorporated into AI systems, it can shield new pieces. The process applies a cloak to the artwork that is invisible to the human eye but disruptive to AI models, preventing them from training on the image effectively. The artist also shares their experience with Glaze, noting that it can take time to find settings that avoid visibly distorting the artwork. While Glaze is not a perfect solution, it offers a meaningful level of protection against AI scraping, and artists may need to experiment with different settings for each piece of art.


  • 🎨 Glaze is a tool designed to protect new artwork from being used in AI models like Stable Diffusion, specifically preventing styles from being fine-tuned without consent.
  • 🔍 The technology behind Glaze embeds invisible alterations to the image that are detectable only by AI, not affecting human visual perception.
  • ⚖️ Glaze cannot retroactively protect art that has already been included in AI models, but it can safeguard new creations.
  • 🚫 Despite efforts, developers have not yet succeeded in circumventing Glaze's protective measures to train AI on glazed images.
  • 🖼️ At lower settings, glazed and unglazed versions of an artwork can be nearly indistinguishable to the naked eye, helping preserve the work's visual integrity.
  • 🛠️ Adjustments in Glaze's settings can lead to more visible changes in the artwork, with higher settings providing stronger protection but potentially altering the artwork more noticeably.
  • ⏲️ Applying Glaze can be time-consuming, especially on higher settings, impacting the user's workflow and system performance.
  • 🔄 Repeated glazing with different settings can result in varied effects on the artwork, such as unwanted color distortions or visible artifacts.
  • 📊 The effectiveness of Glaze in different scenarios, like image-to-image AI tasks, varies, and it may still be possible for AI to manipulate glazed images to a degree.
  • 🔗 Glaze is part of an evolving field of tools aimed at protecting artists' rights against unauthorized use of their work in AI applications.

Q & A

  • What is Glaze and what does it do?

    -Glaze is a tool designed to protect artwork from being used in AI models, specifically in Stable Diffusion fine-tuning methods that replicate an artist's style. It prevents new pieces of art from being utilized in this way, but it cannot protect art that has already been included in an AI's training data.

  • How does Glaze protect art from AI?

    -Glaze works by adding a cloak to the artwork that is invisible to the human eye but alters what AI models perceive. This cloak prevents an AI from learning the artist's style from the image, thus protecting the artist's work.

  • Is Glaze a permanent solution to protect art from AI?

    -No, Glaze is not a permanent solution. It is the current best method available to protect new artwork from being used by AI, but it cannot retroactively protect art that has already been exposed to AI systems.

  • What is the process of applying Glaze to an artwork?

    -The process involves using the Glaze tool to apply a cloak to the artwork. The user can adjust settings such as the intensity of the alterations, which affects both how visible the changes are and how strong the protection is.

  • Does applying Glaze affect the visual quality of the artwork?

    -Applying Glaze can introduce artifacts and distortions to the artwork, which may affect its visual quality. The degree of distortion depends on the settings used, with higher settings providing stronger protection but also more visible changes.

  • How long does it take to apply Glaze to an artwork?

    -The time it takes to apply Glaze can vary depending on the settings chosen. Higher quality settings may take longer, with one example in the transcript mentioning a wait time of 33 minutes.

  • Can Glaze be reversed or removed from an artwork?

    -The transcript does not explicitly mention whether Glaze can be reversed or removed once applied. However, it suggests that the changes made by Glaze are not easily undone, as they are cloaked within the image in a way that only AI can detect.

  • Is there a way to counteract the effects of Glaze?

    -As of the information provided in the transcript, there are countermeasures being developed, but so far, they have not been successful in overcoming the protection provided by Glaze.

  • Who created the Glaze tool?

    -Glaze is an open-source tool created by students from the U.S. and China.

  • How does Glaze compare to other methods of protecting art from AI?

    -In the video, Glaze is described as roughly five times stronger than comparable methods. It is also said to disrupt image-to-image attacks, in which an AI model alters an existing image based on a given prompt.

  • What are the potential drawbacks of using Glaze?

    -While Glaze offers protection, it can also distort the artwork, potentially reducing its aesthetic quality. Artists may need to find a balance between the level of protection and the visual integrity of their work.

  • How can artists keep their audience informed about the use of Glaze?

    -Artists should consider informing their audience that they are using Glaze to protect their artwork. This can help manage expectations about the potential visual changes to their art due to the application of Glaze.



🎨 Introduction to Glaze for Art Protection

The first paragraph introduces Glaze, a tool designed to safeguard an artist's work from being used in LoRA training, a method for creating fine-tuned styles. The artist discusses their research into Stable Diffusion and their plans to create a video about it. They explain that Glaze helps protect new pieces of art from being used in LoRA training, though it cannot help art already incorporated into such systems. The paragraph also touches on the limitations of Glaze, its effectiveness against countermeasures, and the subtle visual differences between original and glazed artworks, which are detectable only by AI models.


🛠️ Glazing Process and Its Impact on Artwork

In the second paragraph, the artist describes the process of applying Glaze to their artwork. They discuss the different settings available in the tool, such as the intensity of the changes, which affects both the visibility of the alterations and the strength of the protection. The artist shares their experience with various settings, noting that higher settings can visibly alter the artwork in undesirable ways. They mention claims that the tool can be five times stronger than other methods, but also that it may distort the art, which could be a concern for some artists. The artist concludes by suggesting that settings should be fine-tuned for each piece of artwork and that the tool may be most useful for those particularly concerned about their art being used to train AI models.


📢 Conclusion and Community Sharing

The final paragraph wraps up the video script by thanking the viewers for their time and announcing that the artist will post the glazed versions of their artwork on the community tab for closer inspection. The artist acknowledges the complexity of finding a satisfactory setting for the glazing process and suggests that it may require multiple attempts. They also express hope for other solutions and the potential for running the glazing process through different settings to achieve the desired level of protection without compromising the artwork's aesthetic quality.




💡Glaze

Glaze is a digital tool designed to protect visual art from being scraped and reused by AI algorithms without permission. In the script, it is explained as a method to prevent art from being used in fine-tuning techniques like LoRA that replicate specific artistic styles. The artist explores different settings of Glaze to determine the balance between effective protection and the visual integrity of the artwork.


💡LoRA

LoRA is mentioned in the script as a fine-tuning technique for AI models like Stable Diffusion, used for example to replicate the style of the artist 'Ross draws'. It represents a way in which AI can reproduce an artist's style based on prior training with their artwork. Glaze aims to prevent such unauthorized use.

💡Stable Diffusion

Stable Diffusion is an AI model used for generating or altering images based on textual descriptions. In the script, it is the platform on which artists' styles are replicated to create new artworks. The script discusses how Glaze prevents art from being misused in such models.

💡AI scraping

AI scraping refers to the process where AI technologies harvest large amounts of data, such as images from artists, to train machine learning models. The script addresses concerns over AI scraping and how Glaze helps protect against this by masking the art in ways that disrupt AI training processes.

💡Image cloaks

Image cloaks are modifications made to images that are imperceptible to the human eye but can prevent AI models from using the images effectively in their training. The script explains that Glaze applies such cloaks to artworks to safeguard them from being scraped by AI.
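As a rough illustration of the bounded-perturbation idea behind image cloaks, the sketch below adds a small random change to every pixel, clipped to a tiny "imperceptibility budget". This is only a toy: Glaze's actual algorithm optimizes the perturbation against a style-extraction model rather than using random noise, and the epsilon value here is an assumption for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def cloak(image: np.ndarray, epsilon: float = 2.0) -> np.ndarray:
    """Toy "cloak": add a perturbation bounded by +/- epsilon (0-255 scale).

    Unlike Glaze, the perturbation here is random, not optimized to
    mislead an AI model - this only shows the bounded-change idea.
    """
    perturbation = rng.uniform(-epsilon, epsilon, size=image.shape)
    cloaked = np.clip(image.astype(float) + perturbation, 0, 255)
    return cloaked.astype(np.uint8)

art = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in artwork
glazed = cloak(art, epsilon=2.0)

# Every pixel stays within the epsilon budget, far below what a human
# viewer would notice on a real painting.
print(np.abs(glazed.astype(int) - art.astype(int)).max())
```

The key property is that the change per pixel is tiny and bounded, so the image looks the same to people; a real cloak shapes those tiny changes so that an AI model's internal representation of the style shifts substantially.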

💡ControlNet

ControlNet is mentioned in the script as a tool or method that some developers are exploring to bypass protections like Glaze. It illustrates the ongoing contest between advancing AI capabilities and efforts to control or restrict their impact on copyrighted or personal content.

💡Adversarial noise

In the context of the script, adversarial noise refers to the distortions that appear in an image when attempts are made to bypass protections like Glaze. It is a factor in judging the effectiveness of filters and other protective or counter-protective measures.

💡Anisotropic filter

An anisotropic filter is a processing technique used to enhance image quality by reducing distortion and preserving detail along multiple orientations. The script discusses its potential use as a way to handle images protected by Glaze, suggesting ongoing research into counteracting Glaze's effects.

💡Settings of Glaze

The settings of Glaze refer to the adjustable parameters that determine how much the artwork is altered to protect it from AI use. The script details the artist's experimentation with different settings to find a suitable balance that minimizes visual changes while maximizing protection.
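To make the trade-off concrete, here is a hypothetical sketch in which each intensity setting maps to a larger per-pixel perturbation budget: stronger settings change the image more (better protection, more visible artifacts). The setting names and budget values are illustrative assumptions, not Glaze's real internal parameters.

```python
import numpy as np

# Hypothetical mapping from an intensity setting to a per-pixel
# perturbation budget (0-255 scale). These values are assumptions
# for illustration, not Glaze's actual internals.
budgets = {"low": 1.0, "medium": 3.0, "high": 6.0}

rng = np.random.default_rng(1)
art = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)  # stand-in artwork

results = {}
for setting, eps in budgets.items():
    noise = rng.uniform(-eps, eps, size=art.shape)
    glazed = np.clip(art.astype(float) + noise, 0, 255)
    peak = float(np.abs(glazed - art).max())   # worst-case pixel change
    mean = float(np.abs(glazed - art).mean())  # average pixel change
    results[setting] = (peak, mean)
    print(f"{setting:>6}: peak change {peak:.2f}, mean change {mean:.2f}")
```

Running this shows the average change growing with the setting, which mirrors the artist's observation that higher settings protect more strongly but alter the artwork more noticeably.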

💡Art training

Art training in the script refers to the process by which AI models learn to recognize and replicate specific artistic styles by analyzing a large number of images. The discussion around Glaze focuses on preventing new artwork from being included in such AI training datasets.


Introduction to Glaze and its purpose in art protection.

Explanation of how Glaze prevents artwork from being used in machine learning models like Stable Diffusion.

Importance of Glaze in maintaining the uniqueness of artists' styles against AI training.

Discussion on limitations of Glaze in protecting previously existing artworks.

Mention of the inability of AI developers to bypass Glaze's protection so far.

Comparison of original and glazed artworks showing little to no visible difference to the human eye.

Description of the technical aspects of Glaze that make it invisible to humans but detectable by AI.

Examination of alternative methods for adding protection to artwork and their inefficacy.

Introduction to other tools, such as Mist, that claim to disrupt image-to-image AI attacks.

Discussion on the necessity of repeated application of protective measures for ongoing effectiveness.

Personal experience with setting adjustments in Glaze and their effects on art quality.

Issues with image quality degradation at different settings of Glaze.

Explanation of choosing the right settings in Glaze based on the type of artwork.

Reflection on the time-consuming nature of applying Glaze and its impact on computing resources.

Summary of the commitment needed to achieve satisfactory protection with Glaze.