In a significant legal move underscoring growing concern over digital privacy and safety, U.S. President Donald Trump has signed the TAKE IT DOWN Act into law. The landmark legislation, signed on May 19, 2025, targets the rise of nonconsensual AI-generated deepfake pornography, marking a pivotal moment in the digital rights discourse. The new law requires platforms to remove offending images within 48 hours, setting a strict window for compliance.
A New Era of Accountability
The TAKE IT DOWN Act, an acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks, isn’t just another piece of bureaucracy—it’s a direct assault on the misuse of technology in the realm of personal privacy. Endorsed by First Lady Melania Trump, who has been a vocal advocate, the bill criminalizes the publication or threat of nonconsensual intimate images, including deepfakes, with punitive measures ranging from fines to imprisonment. “Artificial Intelligence and social media are the digital candy of the next generation—sweet, addictive, and engineered to have an impact on the cognitive development of our children,” Melania Trump noted in a statement, highlighting the broader implications of technological misuse.
The law’s passage has been a collaborative effort, steered by Senators Ted Cruz and Amy Klobuchar, who introduced it in June 2024. Their bipartisan effort culminated in a resounding approval in April 2025, reflecting a shared concern over digital exploitation. As Melania Trump proclaimed, this legislative victory represents a national triumph in safeguarding the public from the darker corners of AI innovation.
Global Context and Implications
The United States joins a growing list of nations, including the UK, in criminalizing explicit deepfakes. The UK’s Online Safety Act of 2023 has already set a precedent, indicating a shift towards more stringent global measures against digital manipulation. According to a 2023 report from Security Hero, deepfakes are predominantly pornographic, with an overwhelming 99% of victims being women—a sobering statistic that underscores the gendered dimension of this issue.
The implications for the cryptocurrency world are intriguing. As digital assets become increasingly woven into the fabric of online interactions, the question of security becomes paramount. Heightened scrutiny and regulatory action in digital spaces could ripple through crypto markets, where privacy and decentralization are core tenets. However, the push for accountability might also foster more robust security frameworks, benefiting the community at large. This is reminiscent of the developments described in "AI-Powered Court System Is Coming to Crypto With GenLayer," where AI is being integrated to enhance security and accountability in crypto transactions.
The Road Ahead
In the wake of this legislative milestone, platforms face the formidable task of aligning with the new requirements. The 48-hour window for content removal is a clear signal from lawmakers that the era of lax digital oversight is coming to an end. Yet, the practicalities of implementing these measures remain to be seen, especially for smaller platforms with limited resources.
The specter of AI misuse looms large, raising questions about the balance between technological advancement and ethical governance. As the digital landscape continues to evolve at a breakneck pace, the need for comprehensive, adaptive policies becomes evident. This new law sets a precedent, but it's only the beginning. Future discussions will likely pivot toward refining these regulations and exploring the complex interplay between innovation and security. This aligns with the ongoing exploration of AI's role in finance, as detailed in "AI Crypto Agents Are Ushering in a New Era of 'DeFAI,'" where AI agents are reshaping decentralized finance.
As we navigate the uncharted waters of digital transformation, the TAKE IT DOWN Act serves as a beacon of progress. However, the journey towards a safer, more equitable digital world is far from over. The challenge now lies in ensuring that this momentum is not only sustained but expanded to address the multifaceted threats posed by emerging technologies.
Source
This article is based on: Trump signs bill criminalizing nonconsensual AI deepfake porn
Further Reading
Deepen your understanding with these related articles:
- Multi-wallet usage up 16%, but AI may address crypto fragmentation gap
- Coinbase Leaps Into Supreme Court Case in Defense of User Data Going to IRS
- Sam Altman’s World Crypto Project Launches in US With Eye-Scanning Orbs in 6 Cities

Steve Gregory is a lawyer in the United States who specializes in licensing for cryptocurrency companies and products. Steve began his career as an attorney in 2015 but switched to working in cryptocurrency full time shortly after joining the original team at Gemini Trust Company, an early cryptocurrency exchange based in New York City. Steve then joined CEX.io, where he helped launch its regulated US-based cryptocurrency business. He went on to become CEO of currency.com, where he served for four years and led the company to being fully acquired in 2025.