Can artists protect their work from AI? – BBC News
TLDR
The BBC News report explores the escalating conflict between artists and AI-driven art creation. Highlighting a case in which artworks were used without consent for AI training, it introduces 'Glaze,' a new tool that aims to protect artistic integrity by making minor visual modifications that confuse AI models but are imperceptible to humans. Amid legal battles and ethical debates over AI 'inspiration' versus 'theft,' the piece underscores the urgent need for regulatory measures and artist involvement in AI development.
Takeaways
- 🖼️ AI art has gained traction recently, with some pieces selling for hundreds of thousands of dollars, but the data used to generate these works often comes from artists who did not give their consent.
- 🤖 AI image generators, such as DALL·E and Stable Diffusion, create art from vast datasets compiled from billions of images scraped from the internet, without artists' permission.
- ⚖️ Carla Ortiz, a concept artist from San Francisco, discovered that her artwork had been scraped into an AI image dataset without her consent, prompting her to join a class action lawsuit against AI companies.
- 🚫 Artists like Carla Ortiz have taken steps to remove their work from the internet to prevent further unauthorized use in AI image datasets.
- 🔒 A potential solution to unauthorized AI use of art is 'Glaze,' a software developed by researchers at the University of Chicago. It applies subtle changes to an image that are imperceptible to humans but cause AI models to misinterpret the style.
- 🕵️‍♂️ Glaze alters visual elements, such as brush strokes, to mislead AI models into generating incorrect styles when attempting to mimic a specific artist.
- 🧐 Supporters of the technology argue that AI image generators draw inspiration from other artworks much as human artists do, while the companies being sued maintain that their models do not create exact copies.
- 👩‍⚖️ Stability AI, one of the companies facing a lawsuit, says its future models will include an 'opt-out' option to address artists' concerns, but many artists believe the process should be 'opt-in' rather than 'opt-out.'
- 🛡️ While Glaze offers some protection for artists, it's not foolproof, as people are already attempting to bypass its security measures.
- ⚡ The ongoing debate highlights the need for regulation and public awareness to ensure AI art develops alongside the interests of human artists who create original works.
Q & A
What is the significance of AI art in the context of the article?
-AI art has gained significant attention due to its ability to mimic styles of specific artists through a training process involving millions of images. This has raised concerns about artists' rights and the use of their work without consent in AI image generators.
How does an AI image generator work?
-AI image generators work by ingesting millions or even billions of images sourced from the web, combined with text descriptions. They use this data set to create images from simple text prompts, effectively mimicking styles of various artists.
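As a concrete illustration of that pipeline, here is a minimal sketch using the open-source Hugging Face diffusers library to generate an image from a text prompt. The checkpoint name, prompt, and settings are assumptions chosen for the example, not details taken from the report.

```python
# Sketch: text-to-image generation with an off-the-shelf Stable Diffusion
# pipeline. Requires: pip install diffusers transformers torch
import torch
from diffusers import StableDiffusionPipeline

# Assumed public checkpoint; any compatible text-to-image model would work.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# The model behind this call was trained on billions of image-caption
# pairs scraped from the web -- the practice the artists object to.
prompt = "a misty mountain village at dawn, oil painting"
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("generated.png")
```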
Why did Carla Ortiz and other artists file a class action lawsuit against AI image generators?
-Carla Ortiz and other artists filed a lawsuit because their artwork was used without their consent in AI image generators' data sets. They felt this was a form of art theft and an invasion of their rights as creators.
What is the 'glaze' technology developed by Professor Ben Zhao's lab?
-Glaze is a solution that leverages the differences in how humans and machine learning models perceive visual images. It makes subtle changes to artwork that are almost imperceptible to humans but significantly alter how a machine interprets the image, preventing AI from accurately mimicking the artist's style.
How does the 'glaze' technology help artists protect their work online?
-By applying glaze to their artwork, artists can publish it online without the fear of it being scraped and used by AI models to generate new art. Glaze alters the machine's perception of the art, causing any AI attempting to mimic the style to fail.
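The Glaze implementation itself is research software from Professor Ben Zhao's lab and is not reproduced here; the sketch below only illustrates the general idea the answers above describe: optimize a small, bounded perturbation so that a machine-learning feature extractor 'reads' a different style from the image while a human viewer sees essentially the same picture. The choice of VGG16 as a stand-in feature extractor, the file names, and the perturbation budget are all assumptions made for the illustration.

```python
# Illustrative sketch only -- not the actual Glaze algorithm.
# Idea: nudge the pixels so a feature extractor "sees" a different style
# while the change stays too small for a human to notice.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen feature extractor standing in for an AI model's "eyes".
extractor = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.to(device).eval()
for p in extractor.parameters():
    p.requires_grad_(False)

prep = T.Compose([T.Resize((512, 512)), T.ToTensor()])
artwork = prep(Image.open("my_artwork.png").convert("RGB")).unsqueeze(0).to(device)
decoy   = prep(Image.open("decoy_style.png").convert("RGB")).unsqueeze(0).to(device)

with torch.no_grad():
    decoy_features = extractor(decoy)

delta = torch.zeros_like(artwork, requires_grad=True)  # the near-invisible perturbation
optimizer = torch.optim.Adam([delta], lr=0.01)
budget = 0.03  # assumed cap on per-pixel change, keeping it imperceptible

for step in range(200):
    cloaked = (artwork + delta).clamp(0, 1)
    # Pull the cloaked image's features toward the decoy style's features.
    loss = torch.nn.functional.mse_loss(extractor(cloaked), decoy_features)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        delta.clamp_(-budget, budget)  # enforce the imperceptibility budget

T.ToPILImage()((artwork + delta).clamp(0, 1).squeeze(0).cpu()).save("cloaked.png")
```

Under these assumptions, a model that later trains on 'cloaked.png' would associate the artist's style with the decoy rather than the real work, which is the effect the answers above attribute to Glaze.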
What is the controversy surrounding AI art generators and the use of artists' work?
-The controversy stems from the fact that AI art generators use artists' work without their consent. Critics argue that this is a form of theft, while others claim that AI is simply taking inspiration, similar to how humans learn from studying other pieces.
What is the stance of Stability AI and Adobe regarding the use of artists' work in their AI generators?
-Stability AI has stated that their new generators will be opt-out, meaning artists can choose not to have their work included. Adobe, on the other hand, has trained its new image generator, Firefly, only on images from its stock library, implying a more controlled approach to using artists' work.
Why did Carla Ortiz decide to remove her work from the internet?
-Carla Ortiz removed her work from the internet as a preventative measure to avoid her art being scraped into an AI image data set without her consent.
What are the potential legal implications of using artists' work without their consent in AI image generators?
-The use of artists' work without consent could lead to legal actions such as class action lawsuits, as seen with Carla Ortiz and other artists against AI image generators. This raises questions about intellectual property rights and the ethical use of AI technology.
How do some artists feel about the opt-in process for AI image generators?
-Some artists are open to the idea of their work being used with AI image generators, but they advocate for an opt-in process, giving them control over whether their art is included in the training data for AI models.
What role do regulators and public opinion play in the future of AI art?
-Regulators and public opinion are crucial in shaping the ethical and legal standards for AI art. They can influence how AI tools are developed and used, ensuring that the rights and consent of artists are respected.
What is the current status of efforts to 'break' the glaze technology?
-The article mentions that people are already attempting to bypass glaze, indicating that there is an ongoing challenge between the development of protective technologies like glaze and the efforts to circumvent them.
Outlines
🖼️ AI Art Creation and Legal Challenges
AI-generated art has surged in popularity, with tools like DALL·E and Stable Diffusion enabling users to create artwork quickly by mimicking various styles, including those of specific artists, using models trained on vast datasets. However, this raises significant legal and ethical issues, as many artists, like Carla Ortiz, have not consented to their work being used in such datasets. Ortiz, a concept artist whose work includes designs for 'Magic: The Gathering' and 'Doctor Strange,' discovered her artwork had been used without her permission, leading her to join a class action lawsuit against AI image generator companies. Meanwhile, innovations like 'Glaze,' from Professor Ben Zhao's lab, offer a technical countermeasure by altering how artwork appears to machines without affecting human perception, in the hope of protecting artists' styles from unauthorized imitation.
🛡️ Developing Solutions and Future Challenges in AI Art
The introduction of 'Glaze' is a stopgap measure: it protects artists' work from unauthorized use in AI training by subtly altering images to mislead AI models without affecting how humans see them. The tool is part of a broader effort to manage the challenges AI poses in the arts, aiming to buy artists time to adapt while regulations evolve. AI's role in art appears here to stay, which makes regulatory pressure, artist input, and public awareness all the more important if these technologies are to develop in a way that respects creators' rights and contributions.
Keywords
💡AI art
💡Image generators
💡Training
💡Art theft
💡Carla Ortiz
💡Class action lawsuit
💡Glaze
💡Opt-in and Opt-out
💡Regulation
💡Public awareness
💡Machine learning models
Highlights
A piece of AI art sold for over $400,000 at a Christie's auction in 2018.
AI models use a process called training to mimic styles, even those of specific artists, by ingesting millions of images.
Many artists never consented to their artwork being used in AI image generators.
Carla Ortiz, a concept artist, discovered her art was scraped into an AI image dataset without her permission.
Carla Ortiz and other artists filed a class action lawsuit against AI image generator companies.
Professor Ben Zhao's lab at the University of Chicago developed 'Glaze,' a method to protect art from AI scraping.
Glaze alters images in subtle ways that mislead AI models but are nearly imperceptible to humans.
The companies behind AI art generators argue that their models merely take inspiration from existing art rather than copying it.
Artists advocate for an opt-in approach to the use of their work in AI training, opposing the opt-out methods currently proposed.
Adobe’s Firefly generator uses images from its own stock library, addressing consent issues.
Internet users are already finding ways to circumvent Glaze's protections.
There is a growing need for regulation and public awareness to ensure ethical use of AI in art.
Glaze aims to buy time for artists until more permanent solutions and regulations are established.
The lawsuit and technological solutions like Glaze highlight the complex ethics of AI-generated art.
The future of AI in art depends on collaboration between regulators, artists, and technologists.