Dark Visitors

With this Kirby plugin, you can tell AI crawlers to stay away from your site.

Dark Visitors are all the AI crawlers that roam our websites, scraping text, graphics, and more to train language models and similar technologies. Not everyone likes that. That's why darkvisitors.com was created, a site that lists common user agents used by these crawlers.

Using these user agents, one can create a robots.txt file: a text file that crawlers consult and that specifies which resources of a website, if any, they are allowed to access, at least in theory. Whether they all adhere to it is questionable, but it's at least an attempt to keep crawlers in check.
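
For illustration, a minimal robots.txt that asks two well-known AI crawlers to keep out might look like this. GPTBot is OpenAI's crawler and CCBot is Common Crawl's; they are just two entries from lists like the one on darkvisitors.com, which is much longer:

```
# Disallow OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Disallow Common Crawl's crawler
User-agent: CCBot
Disallow: /

# All other crawlers may access everything
User-agent: *
Disallow:
```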

Now, you can either compile such a robots.txt yourself or use my new Kirby plugin. It queries the Dark Visitors API once a day to retrieve the latest rules, so you stay up to date without having to think about it. You can also add your own entries, such as your sitemap and additional rules.
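
If you would rather script this yourself instead of using the plugin, the same mechanism can be reproduced with a small daily job against the Dark Visitors API. The sketch below is only an illustration, not the plugin's code: the endpoint, request fields, and response format reflect my reading of the public API documentation and may have changed, and the access token, sitemap URL, and file path are placeholders.

```python
# Sketch: fetch a generated robots.txt from the Dark Visitors API
# (e.g. once a day via cron) and write it to the web root.
# Endpoint, body fields, and header names are assumptions based on the
# public API docs; the token and paths are placeholders.
import requests

API_URL = "https://api.darkvisitors.com/robots-txts"  # assumed endpoint
ACCESS_TOKEN = "YOUR_DARK_VISITORS_TOKEN"              # placeholder

response = requests.post(
    API_URL,
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    json={
        # Which categories of crawlers to block, and which paths to disallow
        "agent_types": ["AI Data Scraper", "Undocumented AI Agent"],
        "disallow": "/",
    },
    timeout=30,
)
response.raise_for_status()

# The response is assumed to be the generated robots.txt as plain text.
# Append your own entries (e.g. a sitemap) before writing the file.
own_rules = "\nSitemap: https://example.com/sitemap.xml\n"
with open("public/robots.txt", "w", encoding="utf-8") as fh:
    fh.write(response.text + own_rules)
```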

The plugin is free and can be installed via the usual methods:

https://maurice-renck.de/projects/dark-visitors

Like, Share, Reply

I'd love to hear from you! Did you enjoy this post? Leave a comment, link your blog post, or react on Mastodon and Bluesky.