This Kirby plugin uses the Dark Visitors API to generate a robots.txt file that (theoretically) prevents common AI crawlers from scraping your website. Sitemaps and custom rules can be added as well. Installation works via the usual methods, and the plugin is free.
Dark Visitors
A Kirby plugin to block AI crawlers
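To illustrate what the plugin does behind the scenes, here is a minimal sketch of fetching a generated robots.txt from the Dark Visitors API and appending a sitemap line. It is written in Python rather than PHP for brevity, and the endpoint URL, request fields ("agent_types", "disallow"), and token handling are assumptions based on my reading of the Dark Visitors documentation, not the plugin's actual code or options:

```python
import json
import urllib.request

# Assumed endpoint and payload shape for the Dark Visitors robots.txt API;
# verify against the official docs and the plugin's own configuration.
API_URL = "https://api.darkvisitors.com/robots-txts"
ACCESS_TOKEN = "your-project-access-token"  # hypothetical placeholder

payload = json.dumps({
    "agent_types": ["AI Data Scraper", "AI Assistant", "AI Search Crawler"],
    "disallow": "/",
}).encode("utf-8")

request = urllib.request.Request(
    API_URL,
    data=payload,
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Fetch the generated robots.txt rules for the requested crawler categories.
with urllib.request.urlopen(request) as response:
    robots_txt = response.read().decode("utf-8")

# Add custom rules, e.g. a sitemap reference, before serving or writing the file.
robots_txt += "\nSitemap: https://example.com/sitemap.xml\n"

with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(robots_txt)
```

In the plugin itself, this kind of response would presumably be cached and served directly as the site's robots.txt rather than written to disk by hand.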