Lawmakers propose anti-nonconsensual AI porn bill after Taylor Swift controversy


Taylor Swift walks off the field following the AFC Championship between the Kansas City Chiefs and the Baltimore Ravens
Photo by Kathryn Riley/Getty Images

US lawmakers have proposed letting people sue over faked pornographic images of themselves, following the spread of AI-generated explicit photos of Taylor Swift. The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate "digital forgeries" depicting an identifiable person without their consent, letting victims collect financial damages from anyone who "knowingly produced or possessed" the image with the intent to spread it.

The bill was introduced by Senate Majority Whip Dick Durbin (D-IL), joined by Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO). It builds on a provision in the Violence Against Women Act Reauthorization Act of 2022, which…