Virtual Newsletter about Software Freedom
November 4, 2025

No, just rate limit!
--------------------

There are a lot of crawlers on the "internet", and most of them feed
LLMs. People don't like LLM crawlers, so they have written software to
keep them out. All of this software is hard to use, and it generally
means stacking yet more reverse proxies in front of your site. Anubis
is one example.

But I always wonder: why don't people just use a simple rate limit? It
is built into most HTTP servers, it works well, and it is very simple
to set up. (A minimal sketch of what I mean is included after the
letter.)

There is also software like `iocaine`[1] which, instead of doing a
proof-of-work challenge like Anubis, detects crawlers and sends them
into a "digital maze", poisoning the LLM. That is different from a
plain anti-bot solution, because it actively makes the LLM less
useful. This is good! But there is no Apache configuration available
for it, and even if there were, I don't want a difficult setup.

That's all.

- NexusSfan

[1] https://iocaine.madhouse-project.org

---
Copyright (c) 2025 NexusSfan
Except where otherwise noted, this work is licensed under the Creative
Commons Attribution-NoDerivatives 4.0 International License.
A copy of the license is included at
https://creativecommons.org/licenses/by-nd/4.0
---
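
A minimal rate-limit sketch
---------------------------

This is roughly what I mean by "just rate limit", assuming nginx as
the HTTP server. The zone name, example.org, the backend address and
the numbers are all placeholders; pick values that fit your traffic.

    # Shared state keyed by client IP: 10 MB of counters,
    # allowing 5 requests per second per address.
    limit_req_zone $binary_remote_addr zone=perip:10m rate=5r/s;

    server {
        listen 80;
        server_name example.org;

        location / {
            # Apply the limit; allow short bursts of 10 requests,
            # reject the rest with 429 instead of the default 503.
            limit_req zone=perip burst=10 nodelay;
            limit_req_status 429;

            proxy_pass http://127.0.0.1:8080;
        }
    }

One zone definition and two directives: no extra proxy, no proof of
work, and a misbehaving crawler just starts seeing 429s.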