
Artists can now opt out of the next version of Stable Diffusion


Artists will have the chance to opt out of the next version of one of the world’s most popular text-to-image AI generators, Stable Diffusion, the company behind it has announced.

Stability.AI will work with Spawning, an organization founded by the artist couple Mat Dryhurst and Holly Herndon, which has built a website called HaveIBeenTrained that allows artists to search for their works in the data set used to train Stable Diffusion. Artists will be able to select which works they want to exclude from the training data.

The decision follows a heated public debate between artists and tech companies over how text-to-image AI models should be trained. Stable Diffusion is based on the open-source LAION-5B data set, which was built by scraping images from the internet, including copyrighted works by artists. Some artists’ names and styles became popular prompts for would-be AI artists.

Dryhurst told MIT Technology Review that artists have “around a couple of weeks” to opt out before Stability.AI starts training its next model, Stable Diffusion 3.

The hope, Dryhurst says, is that until there are clear industry standards or regulations around AI art and intellectual property, Spawning’s opt-out service will help compensate for the absence of legislation. In the future, he says, artists will also be able to opt in to having their works included in data sets.

A spokesperson for Stability.AI did not respond to a request for comment. In a tweet, Stability.AI’s founder Emad Mostaque said the company is not doing this for “ethical or legal reasons.”

“We are doing this as no reason to particularly not do it and be more inclusive,” Mostaque said on Twitter. “We think different model datasets will be interesting,” he added in another tweet.

But Karla Ortiz, an artist and a board member of the Concept Art Association, an advocacy organization for artists working in entertainment, says she doesn’t think Stability.AI is going far enough.

The fact that artists have to opt out means “that every single artist in the world is automatically opted in and our choice is taken away,” she says.

“The only thing that Stability.AI can do is algorithmic disgorgement, where they completely destroy their database and they completely destroy all models that have all of our data in it,” she says. 

The Concept Art Association is raising $270,000 to hire a full-time lobbyist in Washington, D.C., to press lawmakers to change US copyright, data privacy, and labor laws so that artists’ intellectual property and jobs are protected. The group wants to update intellectual property and data privacy laws to address new AI technologies, and to require AI companies to adhere to a strict code of ethics and to work with creative labor unions and creative industry groups.

“It just truly does feel like we artists are the canary in the coal mine right now,” says Ortiz.

Ortiz says the group is sounding the alarm to all creative industries that AI tools are coming for creative professions “really fast, and the way that it’s being done is extremely exploitative.” 
