Trump signs bill cracking down on explicit deepfakes - The bipartisan Take It Down Act, which passed both chambers of Congress overwhelmingly, is one of the few pieces of legislation Trump has signed into law in his second term.

WASHINGTON — President Donald Trump signed legislation Monday that bans the nonconsensual online publication of sexually explicit images and videos, whether authentic or computer-generated.

The Take It Down Act makes publishing such content illegal, subjecting violators to mandatory restitution and criminal penalties such as prison, fines or both. The bill also establishes criminal penalties for people who make threats to publish the intimate visual depictions, some of which are created using artificial intelligence.


The measure, enforced by the Federal Trade Commission, requires websites to remove such imagery within 48 hours of receiving a request from a victim and to make efforts to take down copies as well.


"With the rise of AI image generation, countless women have been harassed with deepfakes and other explicit images distributed against their will. This is ... wrong, and it’s just so horribly wrong," Trump said at an afternoon signing ceremony in the White House Rose Garden. "It’s a very abusive situation like, in some cases, people have never seen before. And today we’re making it totally illegal."

First lady Melania Trump, who championed the legislation, attended the event.

"This legislation is a powerful step forward in our efforts to ensure that every American, especially young people, can feel better protected from their image or identity being abused through nonconsensual, intimate imagery," she said at the ceremony. "Artificial Intelligence and social media are the digital candy for the next generation — sweet, addictive and engineered to have an impact on the cognitive development of our children, but unlike sugar, these new technologies can be weaponized, shape beliefs and, sadly, affect emotions and even be deadly."

It is only the sixth bill Trump has signed into law in his second term. By his 100th day in office, he had signed only five bills — fewer than any other president in the first 100 days of an administration since at least Dwight D. Eisenhower in the 1950s, according to an NBC News analysis of data in the Congressional Record.

The Senate approved the measure by unanimous consent and the House overwhelmingly passed it in a 409-2 vote last month. Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., sponsored the bill in the Senate, while Rep. Maria Elvira Salazar, R-Fla., introduced its companion in the House along with several other members, including Democrats.


According to the bill's sponsors, while many states have laws explicitly banning sexual deepfakes, those laws vary in how they classify the crime and in the penalties they impose.

Trump highlighted the bill in early March, joking that it would apply to him. “I’m going to use that bill for myself, because nobody gets treated worse than I do online," he said.

The first lady also held an event on Capitol Hill that month touting the proposal. "It’s heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content like deepfakes," she said.

"This toxic environment can be severely damaging," Melania Trump continued. "We must prioritize their well-being by equipping them with support and tools necessary to navigate this hostile digital landscape. Every young person deserves a safe online space to express themselves free without the looming threat of exploitation or harm."


 
I read through this bill and it's a doozy for the farms.
- it's not just deepfakes, it's any intimate imagery.
- the platform must take down the content in question, no questions asked. there is no process to contest this... no proof of ownership, nothing. the kicker is that the service has 48 hours to respond or remove the content, or else they lose their protections.

it doesn't apply to:
- original, non-AI content posted publicly
- original, non-AI content posted commercially (OnlyFans)
- anything posted with consent.

this means that those videos of Chris Chan are likely illegal to post now, as he sent them directly to someone he thought was his online gf. go figure
It's DMCA 2.0. Religiously, I'm on board. Porn is bad, and revenge porn/deepfake porn especially so. But law-wise? This is bad law. Everyone is well aware of how shit the DMCA is. It is an overbearing gorilla with a Tommy gun. This doesn't even have defenses like fair use that you could possibly fight it with. This is a gorilla with an M2 Browning spraying .50 cal. The internet is fucked unless courts intervene.
 
Strong avatar comment synergy.

I support the intent of this, I think. But I have little idea how you can make this workable in a way that doesn't overreach massively.
The massive overreach is the real intention of the bill. They're hiding it behind the "think of the revenge porn victims, children and public figures."

Not just women, but weak-minded individuals. People who have targeted the kiwifarms in the past will absolutely try to use this to further chip away at free speech.
Everyone here on the farms who supports these kinds of bills ultimately doesn't give a fuck if Kiwifarms and the 1st Amendment are taken down, as long as it means they can fuck over everyone they perceive as gooners.
 
The point of this is to:

A: Protect politicians, so that they can't be mocked with deepfake porn.
B: Protect the people who run blackmail rackets to gatekeep positions of power. The capacity for someone to go rogue and say 'it was just deepfaked, you can't prove I really did x with y at z place!' threatens to blow up the entire point of operations run by folks like Sean Combs and Jeffrey Epstein.

So yeah, if you think this is good, it's because you're myopic and self-centered.
 
This will certainly die in the Supreme Court. It's asking too much of social media companies in an unreasonable time frame, not to mention the glaring free speech issue. It's super vague, too, for a federal law. Even state laws have more specifics than this. How the hell are we supposed to determine what's 'nonconsensual' in 48 hours? Cracking down on Discord and Roblox groomer dens would have a greater effect than this. It virtually kills the porn industry on X.

Their best avenue is to restrict AI from generating these types of things if they want tax breaks or whatnot. Teach kids and parents not to share their entire lives online if they want to reduce the risk: stop whoring yourself out on social media.

I like Trump but supporting and enacting this was retarded. Ditto Melania. To quote the left: "Trump and Melania are on the wrong side of history."

Edit: This is directly from the FBI's website on sextortion:
- Never send compromising images of yourself to anyone, no matter who they are—or who they say they are.
- Do not open attachments from people you do not know.
- Turn off your electronic devices and web cameras when you are not using them.
Link | Archive
 
This will certainly die in the Supreme Court. It's asking too much of social media companies in an unreasonable time frame, not to mention the glaring free speech issue. It's super vague, too, for a federal law. Even state laws have more specifics than this. How the hell are we supposed to determine what's 'nonconsensual' in 48 hours?
The only SC judge confirmed to have a constitutional spine and not have a vagina is Thomas.

Their best avenue is to restrict AI from generating these types of things if they want tax breaks or whatnot. Teach kids and parents not to share their entire lives online if they want to reduce the risk: stop whoring yourself out on social media.
You'd have to change how public schools teach and indoctrinate children, from telling the world everything about themselves to not doing so.
 