Artists can “poison” AI to distort results

by time news

2023-10-27 04:45:00


The relationship between artificial intelligence (AI) and copyright remains an unregulated area and a source of clashes between developers and artists. Now, a new tool called Nightshade allows users to “poison” text-to-image AI systems so that they stop generating satisfactory results.

Poison

Nightshade works simply: users attach it to their submitted images, and it corrupts the models trained on them. For example, it can subtly alter pixels in an image or make a model read an object as something completely different, producing unexpected results. According to The Verge, this can disrupt software such as DALL-E, Stable Diffusion and Midjourney and, in the long term, limit their ability to generate images.
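The idea of pixel-level poisoning can be illustrated with a toy sketch. This is not Nightshade’s actual algorithm (which targets specific concepts during model training); it only shows the general principle that a perturbation can be kept small enough to be nearly invisible to humans while still altering the data a model learns from. The function name and the per-pixel “budget” here are illustrative assumptions.

```python
import numpy as np

def poison_image(image: np.ndarray, perturbation: np.ndarray,
                 budget: float = 8.0) -> np.ndarray:
    """Add a perturbation to an image, clipping it so that no pixel
    value shifts by more than `budget` (on a 0-255 scale)."""
    delta = np.clip(perturbation, -budget, budget)
    return np.clip(image + delta, 0.0, 255.0)

# Toy example: a random 4x4 RGB "image" plus random noise.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(4, 4, 3)).astype(float)
noise = rng.normal(0.0, 20.0, size=original.shape)
poisoned = poison_image(original, noise)

# The per-pixel change stays within the small, hard-to-see budget,
# even though every pixel of the training sample has been altered.
print(np.max(np.abs(poisoned - original)))  # never exceeds 8.0
```

A real poisoning attack would choose the perturbation adversarially (to push a model toward mislabeling, say, a dog as a cat) rather than using random noise, but the constraint of keeping changes visually imperceptible is the same.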

Image: Login (Shutterstock)

Artists vs. AI

Ben Zhao, a professor at the University of Chicago and one of the creators of Nightshade, told MIT Technology Review that the “poison” is intended to tip the scales away from AI companies that have used copyrighted works to train their models. The Nightshade research article shows how text-to-image software is vulnerable to such attacks, which can destabilize programs and degrade their generation capacity. To intensify the effect, Nightshade is integrated with Glaze, a tool from the same creators that masks artists’ styles so that AI cannot reproduce them. The creators propose in the article that these tools could ultimately serve as artists’ defense against AI companies that take advantage of their work.
