Help by reporting now!

Let’s Stop Nudification Apps Together!

Digital violence has many faces and AI feeds the beast. Using AI to create sexualized deepfakes of people, against their consent, is one horrifying part of this. AlgorithmWatch is trying to locate such non-consensual sexualization tools (NSTs, as we call them) − with the help of partners and supporters like you − and contain their spread on social media and app stores.

To do this, we are using the EU’s Digital Services Act (DSA). This law requires large platforms to proactively mitigate systemic risks like non-consensual sexual deepfakes and to give researchers like us data to help mitigate such risks. While some important platforms give us access to this data, X is currently refusing and blocking research.

You can help us to search for NSTs! If you see websites or apps which create non-consensual sexualized deepfakes of real people, or posts promoting such websites or apps, tell us about them in our form.

You don't have anything to report but want to support research, advocacy, and campaigns that lead to change? Donate to help fight NSTs and promote the responsible use of AI − making the digital space safer for everyone.

What are non-consensual sexualization tools?

Non-consensual sexualization tools (NSTs) are apps and websites which use AI to create sexualized images of real people (“deepfakes”), irrespective of whether they have given consent.

NSTs are often called “nudify apps.” But while some tools allow full nudity, they can also include sexualization via (for example) images in underwear or swimsuits − as seen recently on the X chatbot Grok. We are also looking for “undressing” tools.

Some of these tools have had substantial usage and impact, with targets ranging from celebrities to even schoolchildren. AlgorithmWatch has previously reported on cases in Spanish schools, and Spiegel reports: “At more than 500 schools and universities in South Korea, naked images of young women and girls were created using AI in 2024. The deepfakes were shared in large chat groups on Telegram and used in some cases to blackmail the victim.”

These tools are hosted and circulated through a range of networks, sites, and online services (the Guardian, Bellingcat), including Telegram, Discord channels (404 Media, Wired), and very large platforms and app stores (Graphika, Context).

What are we doing to make the platforms act?

  • We are investigating five platforms which can be used to share NSTs amongst very large audiences: X, Facebook, Instagram, the Google Play Store, and the Apple App Store.
  • We are collecting reports of NSTs (for example on social media) and building a tool to detect content and accounts which are helping to advertise and spread NSTs. You can help us! Fill in our form.
  • By detecting NSTs, we can report them to platforms using their DSA reporting forms to get the NSTs taken down, and also build an evidence base of how they spread online.

How are we using the DSA?

  • The DSA is an EU regulation which places requirements on online services, with particular requirements falling on services with over 45 million monthly users in the EU, so-called Very Large Online Platforms and Search Engines (VLOPs and VLOSEs). These include the five platforms we are investigating: X, Facebook, Instagram, the Google Play Store, and the Apple App Store.
  • The DSA clearly states (Articles 34 and 35) that large online platforms and search engines must mitigate systemic risks in the EU. One of the listed systemic risks is “actual or foreseeable negative effects in relation to gender-based violence.”

    This means that VLOPs and VLOSEs must take proactive and effective measures to mitigate the circulation of NSTs, encouraged if need be by the threat of DSA enforcement by the EU Commission.

  • The DSA also states (Article 40.12) that these platforms and search engines must provide access to public data to research organizations that are researching systemic risks − such as AlgorithmWatch.

We try to research − but X refuses to provide data

  • In June we requested publicly accessible data under Article 40.12 of the DSA. We now have access to data from Apple and Google. We are still discussing with Meta about data for Facebook and Instagram. But X has refused to provide data.
  • X’s refusal just says that “your application fails to demonstrate that your proposed use of X data is related to the specified systemic risks in the EU as described by Art. 34 of the Digital Services Act.” They have used this exact text to refuse many other requests (as found by the DSA Data Access Collaboratory) − so it seems to be their default excuse.
  • We complained to X about this via their online form on 23 July 2025, but have received no answer.
  • We are contacting the European Commission about this issue − proper enforcement of the DSA against X, which we have been awaiting for months, needs to happen!

Part of a bigger problem of digital violence

We are aware that NSTs are part of a bigger problem. Digital violence is very complex and has many facets. We want to use this research as a starting point, and will continue to learn, research, create awareness, and engage for change!

What are we doing now?

🗯 We are going to keep building our detection system, and use it to track down and report NSTs. You can help us! If you see websites or apps which create non-consensual sexualized deepfakes of real people, tell us about them in our form.

🗯 We aim to ensure that platforms comply with the DSA − and in particular that X gives us data. If they don’t, we will pressure the European Commission to properly enforce the DSA against them.

🗯 We will also use our collected evidence to support us and our partners in advocating for other measures against digital violence − such as the law against digital violence promised in the German Coalition Agreement (p.91).

If you have more questions about this project or want to talk about it please contact:

Oliver Marsh
Head of Tech Research

You don't have anything to report but would like to support research, advocacy, and campaigns like this one that lead to change? Then support these causes with your donation:

Your support makes a difference! It helps set boundaries for greedy tech companies that circumvent regulations and spread harm.