FTC Warns Tech Platforms of May 19 Deadline for Take It Down Act

WASHINGTON—The Federal Trade Commission (FTC) has issued warning letters to popular online platforms, urging them to prepare for compliance with the Take It Down Act, which takes effect May 19.

According to a press release from the FTC, the warning letters were sent to companies including Amazon, Alphabet, Apple, Automattic, Bumble, Discord, Match Group, Meta, Microsoft, Pinterest, Reddit, SmugMug, Snapchat, TikTok and X.

The act, passed and signed into law nearly one year ago, creates a federal criminal prohibition on the publication of non-consensual intimate imagery (NCII), including non-consensual deepfakes generated by artificial intelligence models. 

The sweeping law establishes a strict notice-and-removal process requiring platforms to take down offending content within 48 hours of initial notice. That requirement begins May 19 for all platforms subject to U.S. jurisdiction.

"We stand ready to monitor compliance, investigate violations, and enforce the Take It Down Act," said Andrew Ferguson, chair of the FTC. "Protecting the vulnerable—especially children—from this harmful abuse is a top priority for this agency and this administration."

"Under the law, 'covered platforms' include various websites, apps and online services, such as social media, messaging, image or video sharing and gaming platforms," the FTC states. Unlawful sharing of covered content could result in criminal prosecution under federal law. Violations of the act's notice-and-removal requirements are additionally treated as violations of FTC rules, each carrying a civil penalty of $53,088.

Attorney Corey Silverstein of Silverstein Legal urges adult industry companies to prepare for compliance by consulting counsel. He wrote in a recent blog post, "For platforms that host user-generated content, creator content, private messaging, image or video uploads, live chat, AI-generated media, or adult content, this is not just a policy issue.

"It is an operational issue," he continued. "A compliant policy is not enough if the platform cannot receive, review, track, remove, and prevent reposting of covered content within the required timeframe."

Adult industry trade group the Free Speech Coalition (FSC) registered similar concerns in late April.

Liability for violations of the Take It Down Act applies to "any person who knowingly publishes [non-consensual] content using an interactive computer service," a statement from FSC reads. "This targets the individual uploader/publisher, not the platform. ... Platforms must post a clear, conspicuous, plain-language notice of their removal process and how to submit a request. ... Failure to comply with the notice-and-removal obligations is treated as an unfair or deceptive act or practice under the FTC Act, enforced by the Federal Trade Commission."