Summary:
University of Chicago computer scientists have developed Nightshade, a software tool that subtly alters digitized artwork to “poison” generative AI models that train on the images. Built on the open-source machine learning framework PyTorch, Nightshade modifies images so that an AI model perceives them drastically differently while the change remains invisible to human eyes. The tool is meant to help artists prevent their artwork from being used without consent to train synthetic image engines, and to push AI developers toward licensing agreements with artists instead.
Introduction:
University of Chicago computer scientists have released Nightshade, a software tool that subtly alters digitized artwork to “poison” generative AI models that train on the images. Nightshade uses the PyTorch framework to perturb an image so that an AI model’s perception of it changes drastically while the image itself looks unchanged to a person. The release lands in the middle of an ongoing debate over data scraping and the unauthorized use of artists’ works in AI model training.
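The article does not describe Nightshade’s internals beyond its use of PyTorch, but the general idea it describes, optimizing a small, pixel-bounded perturbation so that a model’s feature extractor sees a different concept, can be illustrated with a minimal sketch. Everything here is a hypothetical stand-in: `feature_extractor`, `target_features`, and all hyperparameters are assumptions for illustration, not Nightshade’s actual method.

```python
# Hypothetical sketch of the "poisoning" idea described in the article:
# nudge an image so a model's feature extractor sees something different,
# while clamping the pixel change so it stays visually subtle.
# This is NOT Nightshade's actual code.
import torch

def poison_image(image, target_features, feature_extractor,
                 steps=200, lr=0.01, epsilon=0.05):
    """Perturb `image` so its features approach `target_features`,
    keeping per-pixel changes within +/- epsilon."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed = (image + delta).clamp(0.0, 1.0)
        # Pull the perturbed image's features toward the target concept.
        loss = torch.nn.functional.mse_loss(
            feature_extractor(perturbed), target_features)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # Keep the perturbation small so the image looks unchanged.
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)
    return (image + delta).detach().clamp(0.0, 1.0)
```

Under these assumptions, a model trained on many such images would associate the artist’s visual style with the wrong concept, while human viewers see nothing amiss.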
Main Points:
– Nightshade uses PyTorch to subtly alter images, leaving them largely unchanged to human eyes but drastically different to AI models.
– The perturbations are resilient to common image transformations, surviving cropping, compression, and other alterations (see the sketch after this list).
– The tool aims to prevent unauthorized use of artists’ works in AI model training and encourages AI developers to seek licensing agreements with artists.
– Nightshade is part of ongoing efforts to address concerns about data scraping and the threat it poses to artists’ livelihoods.
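Robustness to cropping and compression is typically achieved by optimizing the perturbation against random transformations of the image rather than a single fixed view. A hypothetical sketch of such a loss, reusing the assumed `feature_extractor` from above and with illustrative transform choices, might look like this:

```python
# Hypothetical sketch: average the feature-matching loss over random
# views so the optimized perturbation does not depend on exact pixels.
# The transform set here is an illustrative assumption.
import torch
import torchvision.transforms as T

random_views = T.Compose([
    T.RandomResizedCrop(224, scale=(0.7, 1.0)),  # stands in for cropping
    T.GaussianBlur(kernel_size=3),               # stands in for compression noise
])

def robust_loss(image, delta, target_features, feature_extractor, samples=4):
    """Average the feature-matching loss over several randomly
    transformed views of the perturbed image."""
    total = torch.tensor(0.0)
    for _ in range(samples):
        view = random_views((image + delta).clamp(0.0, 1.0))
        total = total + torch.nn.functional.mse_loss(
            feature_extractor(view), target_features)
    return total / samples
```

Because the loss is averaged over many random views, the resulting perturbation cannot rely on any exact pixel layout, which is why casual edits such as cropping or recompressing the image would not strip it out.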
Conclusion:
In short, Nightshade, developed by University of Chicago computer scientists, lets artists subtly alter their digitized artwork so that it “poisons” generative AI models trained on it: using PyTorch, it changes how AI models perceive an image without visibly altering it. By raising the cost of training on scraped images, the tool aims to deter unauthorized use of artists’ works and to encourage AI developers to seek licensing agreements instead, as part of broader efforts to address data scraping and its impact on artists’ livelihoods.