Dark Visitors

With this Kirby plugin, you can keep AI crawlers away from your site.

Dark Visitors are the AI crawlers that roam our websites, scraping text, images, and more to train language models and similar technologies. Not everyone is happy about that, which is why darkvisitors.com was created: a site that lists the user agents these crawlers commonly use.

Using these user agents, you can build a robots.txt file: a text file that crawlers consult and that tells them which parts of a website they may access, at least in theory. Whether they all stick to it is questionable, but it's at least an attempt to keep crawlers in check.
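For example, a robots.txt that turns away two well-known AI crawlers, OpenAI's GPTBot and Common Crawl's CCBot (both of which appear on the darkvisitors.com list), could look like this:

    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /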

Now, you can either put together such a robots.txt yourself, or use my new Kirby plugin. It queries the Dark Visitors API once a day to fetch the latest rules, so you stay up to date without having to think about it. On top of that, you can add your own entries, for example your sitemap or additional rules.
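Just to illustrate the idea, a generated file could end up looking roughly like this; the Sitemap line and the /intern rule stand in for your own entries, while the rest would come from the Dark Visitors API (the plugin's actual output may differ):

    # Your own entries
    Sitemap: https://example.com/sitemap.xml

    User-agent: *
    Disallow: /intern

    # Rules fetched from the Dark Visitors API
    User-agent: GPTBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /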

The plugin is free and can be installed via the usual methods.
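For a Kirby plugin, those usually mean copying a downloaded release into site/plugins, adding it as a Git submodule, or installing it via Composer. Roughly like this; the repository URL and package name below are placeholders, not the plugin's actual ones:

    # Manual: download the release and copy it to site/plugins/darkvisitors

    # Git submodule (replace the URL with the plugin's repository)
    git submodule add https://github.com/example/kirby-darkvisitors.git site/plugins/darkvisitors

    # Composer (replace with the plugin's actual package name)
    composer require example/kirby-darkvisitors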

What you could do now

If you (don't) like this post, you can comment, write about it elsewhere, or share it. If you want to read more posts like this, you can follow me via RSS or ActivityPub, or you can view similar posts.