Tuesday, May 20, 2025

Trump signs federal law banning non-consensual AI-generated, intimate images

President Donald Trump has officially enacted a new federal law prohibiting the distribution of so-called “revenge porn,” including sexually explicit images created using artificial intelligence technology.

The legislation, known as the Take It Down Act, was signed into law on Monday. It criminalises the publication of intimate photos or videos without the subject’s consent and mandates that social media platforms take down such content within 48 hours once a victim makes a request.

This landmark bill, which earlier received overwhelming bipartisan support in the U.S. Congress, extends to both authentic images and hyper-realistic AI-generated visuals, commonly referred to as “deepfakes.”

“With the rise of AI image generation, countless women have been harassed with deepfakes and other explicit images distributed against their will.

“This is … wrong … just so horribly wrong,” President Trump said during a signing ceremony held at the White House in Washington, D.C. “It’s a very abusive situation … And today we’re making it totally illegal.”

First Lady Melania Trump, who championed the bill after her husband’s return to office, hailed the new law as a “powerful step forward in our efforts to ensure that every American, especially young people, can feel better protected from their image or identity being abused.”

Despite receiving strong support across party lines and endorsements from anti-sexual abuse organisations, the legislation has drawn criticism from some digital rights advocates concerned about its implications for online freedom.

The Electronic Frontier Foundation (EFF), a prominent privacy advocacy group, raised concerns that the law’s enforcement mechanisms could infringe on civil liberties.

“Lawful content – including satire, journalism, and political speech – could be wrongly censored,” the group said in a statement issued in February.

They also cautioned that the tight 48-hour removal window could force platforms, especially smaller ones, to rely on automated content filters that often misidentify legal content.

“Online service providers will have to comply so quickly to avoid legal risk that they won’t be able to verify claims,” EFF noted.

“Instead, automated filters will be used to catch duplicates, but these systems are infamous for flagging legal content, from fair-use commentary to news reporting.”
