# 🌀 Miasma


Web crawlers getting stuck in a cloud of poison miasma.

AI companies continually scrape the internet at an enormous scale, swallowing up all of its contents to use as training data for their next models. If you have a public website, they are already stealing your work.

Miasma is here to help you fight back! Spin up the server and point any malicious traffic towards it. Miasma will send poisoned training data from the poison fountain alongside multiple self-referential links. It's an endless buffet of slop for the slop machines.

Miasma is very fast and has a minimal memory footprint - you should not have to waste compute resources fending off the internet's leeches.

> [!CAUTION]
> There is inherent risk in deploying this software. Please fully read the Configuration and Disclaimer sections before use.

*Sample response from Miasma.*

## Installation

Install with cargo (recommended):

```shell
cargo install miasma
```

Or, download a pre-built binary from releases.

## Quick Start

Start Miasma with default configuration:

```shell
miasma
```

View all available configuration options:

```shell
miasma --help
```

## How to Trap Malicious Scrapers

Let's walk through an example of setting up a server to trap scrapers with Miasma. We'll pick `/naughty-bots` as the path for directing scraper traffic. We'll use Nginx as our reverse proxy, but the same result can be achieved with many different setups.

When we're done, scrapers will be trapped like so:

*Flow chart depicting the cycle of trapped scrapers.*

### Embedding Hidden Links

Within our site, we'll include a few hidden links leading to `/naughty-bots`.

```html
<a href="/naughty-bots" style="display: none;" aria-hidden="true" tabindex="-1">
  Amazing high quality data here!
</a>
```

The `style="display: none;"`, `aria-hidden="true"`, and `tabindex="-1"` attributes ensure the links are completely invisible to human visitors and ignored by screen readers and keyboard navigation. Only scrapers will see them.
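Hidden or not, a scraper that parses raw HTML for `href` attributes will still collect the link, since CSS and ARIA attributes only affect rendering. A minimal sketch using Python's stdlib parser (the page snippet and `LinkCollector` class are hypothetical, standing in for a naive scraper):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href, the way a naive scraper would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

page = """
<a href="/about">About</a>
<a href="/naughty-bots" style="display: none;" aria-hidden="true" tabindex="-1">
  Amazing high quality data here!
</a>
"""
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/about', '/naughty-bots']; the CSS hiding is ignored
```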

### Configuring our Nginx Proxy

Since our hidden links point to `/naughty-bots`, we'll configure this path to proxy to Miasma. Let's assume we're running Miasma on port 9855.

We'll also set up aggressive rate limiting based on the scraper's user agent to help ensure we don't accidentally DDoS ourselves.

```nginx
http {
  # Reserve 8MB of memory for tracking user agents
  limit_req_zone $http_user_agent zone=miasma:8m rate=1r/s;

  server {
    location ~ ^/naughty-bots($|/.*)$ {
      # Rate limit via the 'miasma' zone; reject excess requests
      # immediately (nodelay) with a 429 status
      limit_req_status 429;
      limit_req zone=miasma burst=5 nodelay;

      # Proxy requests to Miasma
      proxy_pass http://localhost:9855;
    }
  }
}
```

This matches every variation of the `/naughty-bots` path: `/naughty-bots`, `/naughty-bots/`, `/naughty-bots/12345`, etc.
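As a sanity check, the location regex can be exercised outside Nginx. Nginx's `~` modifier uses PCRE, and for a pattern this simple Python's `re` behaves identically, so a quick sketch can confirm which paths fall into the trap:

```python
import re

# Same pattern as the Nginx location block above
pattern = re.compile(r"^/naughty-bots($|/.*)$")

trapped = ["/naughty-bots", "/naughty-bots/", "/naughty-bots/12345"]
ignored = ["/naughty-botsextra", "/naughty", "/other"]

assert all(pattern.match(path) for path in trapped)
assert not any(pattern.match(path) for path in ignored)
```

Note that `/naughty-botsextra` is correctly excluded: after the literal prefix, the pattern only accepts end-of-string or a `/`.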

### Run Miasma

Lastly, we'll start Miasma and specify `/naughty-bots` as the link prefix. This instructs Miasma to start links with `/naughty-bots/`, ensuring scrapers are properly routed through our Nginx proxy back to Miasma.

Let's cap the number of in-flight connections at 50. At 50 connections, we can expect 50-60 MB of peak memory usage. Note that any requests exceeding this limit immediately receive a 429 response rather than being queued.

We'll also force Miasma to gzip all responses regardless of the scrapers' `Accept-Encoding` header. Since gzipped responses are significantly smaller, this will help us cut down on egress costs.

While we could keep scrapers trapped forever, we'll use the link count and max depth options to let scrapers go after they consume ~100K poisoned pages. With this setup, Miasma will send around 250MB in total per scraper.
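One way to arrive at the ~100K figure is the geometry of the link tree: each page carries `link-count` links, and (assuming pages at the final depth carry no further links, which is an assumption about how `max-depth` counts levels) a scraper that exhaustively follows every link sees pages at depths 1 through 7:

```python
link_count, max_depth = 5, 8

# Pages at depth d form a 5-ary tree level of size link_count**d;
# summing depths 1..(max_depth - 1) gives the total unique pages.
pages = sum(link_count**d for d in range(1, max_depth))
print(pages)  # 97655, i.e. roughly 100K pages

# At ~250MB of total egress, that averages out to roughly
# 2.5KB per gzipped page.
print(round(250_000_000 / pages))
```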

```shell
miasma --link-prefix '/naughty-bots' -p 9855 -c 50 --force-gzip --link-count 5 --max-depth 8
```

### Enjoy!

Let's deploy and watch as misbehaving bots greedily eat from our endless slop machine!

### robots.txt

Be sure to protect well-behaved bots and search engines from Miasma via your `robots.txt`!

```txt
User-agent: *
Disallow: /naughty-bots
```

## Configuration

Miasma can be configured via its CLI options:

| Option | Default | Description |
| --- | --- | --- |
| `port` | `9999` | The port the server should bind to. |
| `host` | `localhost` | The host address the server should bind to. |
| `unix-socket` | (none) | Bind to a Unix domain socket rather than a TCP address. Only available on Unix-like systems. |
| `max-in-flight` | `500` | Maximum number of allowable in-flight requests. Requests received while this limit is exceeded receive a 429 response. Miasma's memory usage scales directly with the number of in-flight requests, so set this to a lower value if memory usage is a concern. |
| `link-prefix` | `/` | Prefix for self-directing links. This should be the path where you host Miasma, e.g. `/naughty-bots`. |
| `link-count` | `5` | Number of self-directing links to include in each response page. |
| `max-depth` | (none) | Stop generating links once the scraper reaches the specified depth. This allows you to cut off scrapers after serving a desired amount of poison. Use it in tandem with `link-count` to keep the number of active scrapers at a manageable level. |
| `force-gzip` | `false` | Always gzip responses regardless of the client's `Accept-Encoding` header. Forcing compression can help reduce egress costs. |
| `unsafe-allow-html` | `false` | Don't escape HTML characters in the poison source's responses. Escaping is enabled by default to prevent unintended client-side JavaScript execution. Use this option with care. |
| `poison-source` | `https://rnsaffn.com/poison2/` | Proxy source for poisoned training data. |

## Development

Contributions are welcome! Please open an issue for bug reports or feature requests. Primarily AI-generated contributions will be automatically rejected.

## Disclaimer

Miasma is not affiliated with the poison fountain. We have no control over its responses and cannot guarantee the safety of its contents. You should never direct users towards your Miasma location.

Miasma is not responsible for any retaliation from operators of affected scrapers. It is your responsibility to comply with applicable laws and hosting provider policies. See LICENSE (GPL-v3) for full warranty & limitation of liability details.


Cover art by @delphoxlover334
