# metalsmith-robots


A metalsmith plugin for generating a robots.txt file


This plugin generates a robots.txt file. It accepts global options, and it can also be triggered from a file's frontmatter with the `public` and `private` keywords. It works well with metalsmith-mapsite, which likewise supports marking a page as private from the frontmatter.

For support questions please use stack overflow or the metalsmith slack channel.

## Installation

```bash
$ npm install metalsmith-robots
```

## Example

Configuration in `metalsmith.json`:

```json
{
  "plugins": {
    "metalsmith-robots": {
      "useragent": "googlebot",
      "allow": ["index.html", "about.html"],
      "disallow": ["404.html"],
      "sitemap": "https://www.site.com/sitemap.xml"
    }
  }
}
```

This will generate the following robots.txt:

```
User-agent: googlebot
Allow: index.html
Allow: about.html
Disallow: 404.html
Sitemap: https://www.site.com/sitemap.xml
```
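Conceptually, the mapping from options to output is a straightforward line-by-line assembly. The sketch below is a simplified re-implementation for illustration only (the function name `buildRobotsTxt` is hypothetical, not part of the plugin's API):

```javascript
// Simplified sketch of how the options above map to robots.txt lines.
// Not the plugin's actual source; for illustration only.
function buildRobotsTxt({ useragent = '*', allow = [], disallow = [], sitemap } = {}) {
  const lines = [`User-agent: ${useragent}`];
  for (const url of allow) lines.push(`Allow: ${url}`);
  for (const url of disallow) lines.push(`Disallow: ${url}`);
  if (sitemap) lines.push(`Sitemap: ${sitemap}`);
  return lines.join('\n') + '\n';
}
```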

## Options

You can pass options to metalsmith-robots with the JavaScript API or the CLI. The options are:

- `useragent`: the user agent to address - String, default: `*`
- `allow`: an array of the URL(s) to allow - Array of Strings
- `disallow`: an array of the URL(s) to disallow - Array of Strings
- `sitemap`: the sitemap URL - String
- `urlMangle`: a function to mangle paths in `allow` and `disallow` - Function

Besides these options, setting `public: true` or `private: true` in a file's frontmatter will add that page to the `allow` or `disallow` option respectively. metalsmith-robots expects at least one of `allow`, `disallow`, or `sitemap`; without them it will not generate a robots.txt.
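For example, a page can exclude itself via its frontmatter (the file name and title here are hypothetical):

```yaml
---
title: Secret page
private: true
---
```

With this frontmatter, the page's path is appended to the `disallow` list when the robots.txt is generated.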

### urlMangle

To make sure paths start with a `/`, you can mangle URLs that are provided via `allow` and `disallow`:

```js
.use(robots({
  urlMangle: (filepath) => {
    return (filepath.slice(0, 1) !== '/') ? `/${filepath}` : filepath;
  }
}))
```
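The mangle is an ordinary function, so it can be exercised on its own. A minimal check of the behavior, assuming the same function body as above:

```javascript
// Same mangle as in the plugin configuration, extracted for illustration.
const urlMangle = (filepath) =>
  filepath.slice(0, 1) !== '/' ? `/${filepath}` : filepath;

console.log(urlMangle('index.html'));  // → '/index.html' (prefix added)
console.log(urlMangle('/about.html')); // → '/about.html' (left unchanged)
```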

## License

MIT