Gatsby Plugin Robots Txt

[Screenshot of Gatsby Plugin Robots Txt]

Gatsby plugin that automatically creates robots.txt for your site

Overview:

gatsby-plugin-robots-txt generates a robots.txt file for a Gatsby site during the build process, giving you a simple way to control how crawlers access your site.
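A minimal setup adds the plugin to `gatsby-config.js`; the `host` and `sitemap` URLs below are placeholders for your own site:

```javascript
// gatsby-config.js — minimal sketch; replace the example.com URLs with your site's
module.exports = {
  siteMetadata: {
    siteUrl: 'https://www.example.com'
  },
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        host: 'https://www.example.com',
        sitemap: 'https://www.example.com/sitemap.xml',
        // Allow all crawlers access to the whole site
        policy: [{ userAgent: '*', allow: '/' }]
      }
    }
  ]
};
```

With this configuration, the build emits a robots.txt at the site root containing the host, sitemap, and policy rules.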

Features:

  • Create robots.txt File: Easily generate a robots.txt file for a Gatsby site during the build.
  • Custom Host and Sitemap: Options to set the host of the site, sitemap paths, and policy rules in the robots.txt file.
  • External Configurations: Ability to provide an external configuration file for more advanced settings.
  • Environment Handling: Support for different environments and configuration based on environment variables.
  • Netlify Integration: Easily disable crawlers for deploy-previews on Netlify deployments.
  • Query Option: Use a GraphQL query to provide a different site URL for the robots.txt file.
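The environment handling and Netlify features above can be combined to block crawlers on deploy previews while keeping production open. The sketch below follows the plugin's Netlify recipe; the `URL`, `DEPLOY_PRIME_URL`, and `CONTEXT` variables are ones Netlify sets at build time, and the example.com fallback is a placeholder:

```javascript
// gatsby-config.js — environment-based policies for Netlify deploys (sketch)
const {
  NODE_ENV,
  URL: NETLIFY_SITE_URL = 'https://www.example.com',
  DEPLOY_PRIME_URL: NETLIFY_DEPLOY_URL = NETLIFY_SITE_URL,
  CONTEXT: NETLIFY_ENV = NODE_ENV
} = process.env;

const isNetlifyProduction = NETLIFY_ENV === 'production';
const siteUrl = isNetlifyProduction ? NETLIFY_SITE_URL : NETLIFY_DEPLOY_URL;

module.exports = {
  siteMetadata: { siteUrl },
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        // Pick the policy set based on Netlify's build context
        resolveEnv: () => NETLIFY_ENV,
        env: {
          production: {
            policy: [{ userAgent: '*' }]
          },
          'branch-deploy': {
            policy: [{ userAgent: '*', disallow: ['/'] }],
            sitemap: null,
            host: null
          },
          'deploy-preview': {
            policy: [{ userAgent: '*', disallow: ['/'] }],
            sitemap: null,
            host: null
          }
        }
      }
    }
  ]
};
```

Here production builds get a permissive robots.txt, while branch deploys and deploy previews disallow everything so preview URLs are not indexed.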
Gatsby

GatsbyJS is a free and open-source static site generator based on React. It uses a modern development stack including Webpack, GraphQL, and modern JavaScript and CSS frameworks. It also provides a rich set of plugins, starters, and themes.

ESLint

ESLint is a linter for JavaScript that analyzes code to detect and report potential problems and errors, and to enforce consistent code style and best practices, helping developers write cleaner, more maintainable code.