Microsoft accused of selling AI tool that spews violent, sexual images to kids

March 06, 2024

Microsoft's AI text-to-image generator Copilot Designer appears to be heavily filtering outputs after a Microsoft engineer, Shane Jones, said the company had ignored his warnings that the tool randomly creates violent and sexual imagery, CNBC report... [3388 chars]

Source: Ars Technica