
Across the internet, we’re seeing a concerning growth of so-called ‘nudify’ apps, which use AI to create fake non-consensual nude or sexually explicit images. Meta has longstanding rules against non-consensual intimate imagery, and over a year ago we updated these policies to make it even clearer that we don’t allow the promotion of nudify apps or similar services. We remove ads, Facebook Pages and Instagram accounts promoting these services when we become aware of them, block links to websites hosting them so they can’t be accessed from Meta platforms, and restrict search terms like ‘nudify’, ‘undress’ and ‘delete clothing’ on Facebook and Instagram so they don’t show results.
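To make the search restriction concrete, here is a minimal sketch of how a restricted-term check could work in principle. The term list, character substitutions, and function names are illustrative assumptions, not Meta's actual systems:

```python
# Illustrative sketch of restricting search terms such as 'nudify' or
# 'undress'. The blocklist, normalization rules, and function names are
# hypothetical -- this is not Meta's actual implementation.
import re
import unicodedata

RESTRICTED_TERMS = {"nudify", "undress", "delete clothing"}  # example terms from this post

# Undo common character substitutions used to evade naive string matching.
SUBSTITUTIONS = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "@": "a", "$": "s"})

def normalize(query: str) -> str:
    """Lowercase, strip accents, reverse leetspeak swaps, collapse whitespace."""
    query = unicodedata.normalize("NFKD", query)
    query = "".join(c for c in query if not unicodedata.combining(c))
    query = query.lower().translate(SUBSTITUTIONS)
    return re.sub(r"\s+", " ", query).strip()

def is_restricted(query: str) -> bool:
    """Return True if any restricted term appears in the normalized query."""
    normalized = normalize(query)
    return any(term in normalized for term in RESTRICTED_TERMS)

assert is_restricted("Nud1fy app free")   # caught despite the digit swap
assert not is_restricted("weather today")
```

A real system would go well beyond substring checks, but the normalization step illustrates why simple evasions like digit substitutions don't defeat term restrictions.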
Today, we’re sharing updates to our approach to tackling these apps, including new steps to remove them from our platforms, help other companies do the same, and hold the people behind them accountable.
Taking Legal Action
We’re suing Joy Timeline HK Limited, the entity behind CrushAI apps, which allow people to create AI-generated nude or sexually explicit images of individuals without their consent. We’ve filed a lawsuit in Hong Kong, where Joy Timeline HK Limited is based, to prevent them from advertising CrushAI apps on Meta platforms.
This follows multiple attempts by Joy Timeline HK Limited to circumvent Meta’s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules.
This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it. We’ll continue to take the necessary steps – which could include legal action – against those who abuse our platforms like this.
Fighting Nudify Apps Across the Internet
With nudify apps being advertised across the internet – and available in App Stores themselves – removing them from one platform alone isn’t enough. Now, when we remove ads, accounts or content promoting these services, we’ll share information – starting with URLs to violating apps and websites – with other tech companies through the Tech Coalition’s Lantern program, so they can investigate and take action too. Since we started sharing this information at the end of March, we’ve provided more than 3,800 unique URLs to participating tech companies. We already share signals about violating child safety activity, including sextortion, with other companies, and this is an important continuation of that work.
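For illustration, one simple way to prepare URL signals for cross-industry sharing is to canonicalize and deduplicate them first, so partners receive a single clean entry per violating destination. The sketch below is an assumption about how such a step could look; the record format, category label, and function names are hypothetical and not part of the Lantern program’s actual specification:

```python
# Hypothetical sketch of preparing violating URLs for cross-industry
# sharing (e.g., through a program like the Tech Coalition's Lantern).
# The normalization rules and record format are illustrative assumptions.
from urllib.parse import urlsplit, urlunsplit
from datetime import datetime, timezone

def normalize_url(url: str) -> str:
    """Canonicalize a URL so duplicates collapse to one shared entry:
    lowercase the scheme and host, drop the query string and fragment
    (which often carry per-user tracking parameters)."""
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(), path, "", ""))

def build_share_records(violating_urls):
    """Deduplicate URLs and attach minimal metadata for partner review."""
    seen, records = set(), []
    for url in violating_urls:
        canonical = normalize_url(url)
        if canonical in seen:
            continue
        seen.add(canonical)
        records.append({
            "url": canonical,
            "category": "nudify_app",  # illustrative label only
            "shared_at": datetime.now(timezone.utc).isoformat(),
        })
    return records

# Two superficially different links to the same app collapse to one record.
print(build_share_records([
    "https://Example-Nudify.app/download?utm_source=ad1",
    "https://example-nudify.app/download/",
]))
```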
Strengthening Our Enforcement Against Adversarial Advertisers
As with other types of online harm, this is an adversarial space in which the people behind it – who are primarily financially motivated – continue to evolve their tactics to avoid detection. For example, some use benign imagery in their ads to avoid being caught by our nudity detection technology, while others quickly create new domain names to replace the websites we block.
That’s why we’re also evolving our enforcement methods. For example, we’ve developed new technology specifically designed to identify these types of ads – even when the ads themselves don’t include nudity – and use matching technology to help us find and remove copycat ads more quickly. We’ve worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect within these ads.
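As a rough illustration of what matching technology of this kind can do, the sketch below uses a simple perceptual “difference hash” to flag near-duplicate ad creatives. Production systems are far more robust; the grid size, distance threshold, and helper names here are hypothetical assumptions, not a description of Meta’s actual tooling:

```python
# Minimal sketch of matching near-duplicate ad creatives with a
# perceptual difference hash (dHash). The 9x8 grid, 64-bit hash, and
# threshold are illustrative assumptions for demonstration only.

def dhash(pixels):
    """Compute a 64-bit difference hash from a 9x8 grayscale grid
    (8 rows of 9 intensity values, 0-255). Each bit records whether a
    pixel is brighter than its right-hand neighbor, so the hash captures
    image structure rather than exact pixel values."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_copycat(hash_a: int, hash_b: int, threshold: int = 8) -> bool:
    """Treat two creatives as near-duplicates if their hashes differ
    in at most `threshold` of 64 bits."""
    return hamming(hash_a, hash_b) <= threshold

# A known violating creative and a lightly brightened copy still match,
# because uniform edits rarely change which neighbor is brighter.
original = [[(x * 30 + y * 7) % 256 for x in range(9)] for y in range(8)]
copycat = [[min(255, v + 2) for v in row] for row in original]
print(is_copycat(dhash(original), dhash(copycat)))  # True for this example
```

The design choice worth noting is that perceptual hashes tolerate small edits (cropping, recoloring, compression), which is what makes them useful against copycats that would evade exact-match detection.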
We’ve also applied the tactics we use to disrupt networks of coordinated inauthentic activity to find and remove networks of accounts operating these ads. Since the start of the year, our expert teams have run in-depth investigations to expose and disrupt four separate networks of accounts that were attempting to run ads promoting these services.
Supporting Effective Legislation
We welcome legislation that helps fight intimate image abuse across the internet, whether it’s real or AI-generated, and that complements our longstanding efforts to help prevent this content from spreading online through tools like StopNCII.org and NCMEC’s Take It Down. That’s why we championed and are working to implement the new U.S. TAKE IT DOWN Act, an important bipartisan step forward in fighting this kind of abuse across the internet and supporting those affected.
We also continue to support legislation that empowers parents to oversee and approve their teens’ app downloads. As well as sparing parents the burden of repeated approvals and age verification across the countless apps their teens use, this would allow parents to see if their teen is attempting to download a nudify app from an App Store, and to prevent them from doing so.