University of Chicago researchers seek to “poison” AI art generators with Nightshade
On Friday, a team of researchers at the University of Chicago released a research paper outlining "Nightshade," a data poisoning technique aimed at disrupting the training process for AI models, as MIT Technology Review and VentureBeat report. The goal...
Source: Ars Technica