generate-robotstxt

Awesome robots.txt generator.


Keywords
robotstxt, robots.txt, generate, robots-txt, user-agent, policy, allow, disallow, crawl-delay, robot, robots, cli, generator-robots, robots-generator
License
MIT
Install
npm install generate-robotstxt@8.0.3

Documentation

generate-robotstxt


Awesome robots.txt generator.

Installation

npm install --save-dev generate-robotstxt

Usage

import robotstxt from "generate-robotstxt";

robotstxt({
  policy: [
    {
      userAgent: "Googlebot",
      allow: "/",
      disallow: "/search",
      crawlDelay: 2,
    },
    {
      userAgent: "OtherBot",
      allow: ["/allow-for-all-bots", "/allow-only-for-other-bot"],
      disallow: ["/admin", "/login"],
      crawlDelay: 2,
    },
    {
      userAgent: "*",
      allow: "/",
      disallow: "/search",
      crawlDelay: 10,
      cleanParam: "ref /articles/",
    },
  ],
  sitemap: "http://example.com/sitemap.xml",
  host: "http://example.com",
})
  .then((content) => {
    console.log(content);

    return content;
  })
  .catch((error) => {
    throw error;
  });
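
The promise resolves with the generated robots.txt content as a string, so the usual follow-up is to write it to disk yourself. Below is a minimal sketch; the node:fs/promises import and the public/robots.txt destination path are assumptions for illustration, not part of this package:

import { writeFile } from "node:fs/promises";
import robotstxt from "generate-robotstxt";

robotstxt({
  policy: [{ userAgent: "*", allow: "/" }],
  sitemap: "http://example.com/sitemap.xml",
  host: "http://example.com",
})
  // Write the generated content next to the other static assets.
  // The destination path is only an example.
  .then((content) => writeFile("public/robots.txt", content))
  .catch((error) => {
    console.error(error);
    process.exitCode = 1;
  });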

File-based configuration

robots-txt.config.js

module.exports = {
  policy: [
    {
      userAgent: "Googlebot",
      allow: "/",
      disallow: ["/search"],
      crawlDelay: 2,
    },
    {
      userAgent: "OtherBot",
      allow: ["/allow-for-all-bots", "/allow-only-for-other-bot"],
      disallow: ["/admin", "/login"],
      crawlDelay: 2,
    },
    {
      userAgent: "*",
      allow: "/",
      disallow: "/search",
      crawlDelay: 10,
      cleanParam: "ref /articles/",
    },
  ],
  sitemap: "http://example.com/sitemap.xml",
  host: "http://example.com",
};
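
For orientation, a configuration like the one above maps onto robots.txt directives roughly as sketched below. The exact ordering, grouping, and host normalization are decided by the library, so treat this as an illustration of the directive names rather than the literal output:

User-agent: Googlebot
Allow: /
Disallow: /search
Crawl-delay: 2

User-agent: OtherBot
Allow: /allow-for-all-bots
Allow: /allow-only-for-other-bot
Disallow: /admin
Disallow: /login
Crawl-delay: 2

User-agent: *
Allow: /
Disallow: /search
Crawl-delay: 10
Clean-param: ref /articles/

Sitemap: http://example.com/sitemap.xml
Host: example.com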

CLI

Awesome robots.txt generator.

  Usage: generate-robotstxt [options] <dest>

  Options:
     --config  Path to a specific configuration file.
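
For example, to generate the file from the configuration shown above into a public directory (both paths are only examples):

generate-robotstxt --config robots-txt.config.js public/robots.txt

A convenient place for this command is an npm script in package.json (e.g. "build:robotstxt": "generate-robotstxt --config robots-txt.config.js public/robots.txt"), so the file is regenerated as part of the build.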

Contribution

Contributions are welcome. By submitting code, you agree that it will be published under the MIT license.

Changelog

License

MIT