Hexo Generator Robotstxt


Basic robots.txt generator plugin for Hexo

Overview

The hexo-generator-robotstxt plugin is a straightforward tool for Hexo 3 that automatically generates a robots.txt file for your site. The robots.txt file tells web crawlers which parts of your website they may access, making it an essential part of SEO. With this plugin, you can customize crawling permissions for different user agents, ensuring that your content is indexed appropriately.

Whether you are a seasoned developer or just starting with Hexo, this plugin simplifies the process of creating and managing your robots.txt file, offering a hassle-free way to enhance your website's spider-friendly settings.
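For context, a generated robots.txt typically looks like the following. The exact directives depend on your configuration; the paths and domain shown here are placeholders, not defaults produced by the plugin:

```text
User-agent: *
Disallow: /admin/
Allow: /
Sitemap: https://example.com/sitemap.xml
```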

Features

  • User-Agent Control: Easily set the User-Agent to define which crawlers your settings apply to, with a default that covers all agents (*).
  • Disallow Settings: Specify particular files or folders that you want to block from being crawled by designated User-Agents, keeping sensitive content private.
  • Allow Directives: Allow certain files or folders for specific User-Agents, ensuring that important content is visible to crawlers.
  • Sitemap Specification: Set the path to your sitemap within the robots.txt file, giving crawlers a clear map of your site's structure.
  • Easy Configuration: Install the plugin and configure it by adding a few lines to your _config.yml, making quick updates easy.
  • Open Source: Licensed under MIT, giving users the flexibility and freedom to utilize and modify the plugin as needed.
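As a sketch of the configuration described above: the plugin is installed with npm and then configured under a robotstxt key in your site's _config.yml. The key names shown here (useragent, allow, disallow, sitemap) are an assumption based on the feature list above; verify them against the plugin's README for your installed version. The paths are placeholders:

```shell
npm install hexo-generator-robotstxt --save
```

```yaml
# _config.yml (site root) -- key names assumed, paths are placeholders
robotstxt:
  useragent: "*"
  allow:
    - /
  disallow:
    - /private/
  sitemap: /sitemap.xml
```

After editing _config.yml, regenerate the site (hexo generate) and the robots.txt file should appear in the output directory alongside your other generated files.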
Hexo

Hexo is a static site generator built with Node.js that lets developers create fast, efficient websites using Markdown for content and template engines such as EJS and Stylus for layout and styling. It offers fast static generation, plugin support, and easy deployment to hosting services like GitHub Pages and Netlify.