This component for Hugo (GoHugo) adds a customizable `robots.txt` to your website. It overrides Hugo's internal `robots.txt` generation and lets you configure what the `robots.txt` in your public folder will contain. It also offers a meta robots tag for your head section.
This component can be used as a drop-in without much configuration. However, `robots.txt` generation must be enabled in your main configuration, for instance:
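Hugo only renders a `robots.txt` when `enableRobotsTXT` is set, so add it to your main configuration file:

```toml
# hugo.toml (or config.toml in older setups)
enableRobotsTXT = true
```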
You can add configuration parameters per content page in its frontmatter:
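A minimal sketch of such a frontmatter, assuming a boolean disallow parameter; the parameter name below is illustrative, so check the module's documentation for the actual one:

```toml
+++
title = "Members only"
# hypothetical parameter name, for illustration only
robotsdisallow = true
+++
```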
This will add a `Disallow` line for the current URL. Note that with clean URLs this will disallow bot access to all sub-folders and sub-URLs of the current item.
## Adding global (dis)allows
You can add global additions to your `robots.txt` via your site parameters, for instance in `config/_default/params.toml`:
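A sketch of what such global additions could look like; the `[params.robots]` table and its keys are assumptions for illustration, not the module's confirmed API:

```toml
[params.robots]
  # hypothetical keys; consult the module docs for the exact names
  disallow = ["/drafts/", "/internal/"]
  allow = ["/assets/"]
```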
## Configure meta-robots tags
Configure the robots tag with the following individual configuration parameters in your frontmatter:
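A sketch of a per-page override, assuming two boolean parameters; the names `robotsindex` and `robotsfollow` are illustrative, so check the module's documentation for the real ones:

```toml
+++
title = "Landing page experiment"
# hypothetical parameter names, for illustration only
robotsindex = false
robotsfollow = true
+++
```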
Add or edit global defaults in the `[params]` section of `config.toml` or in `config/_default/params.toml`.
Without any configuration, the default for both parameters is `true`.
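As a sketch, the same two parameters set globally in `config/_default/params.toml`; again, the exact names are assumptions:

```toml
[params]
  # hypothetical global defaults, for illustration only
  robotsindex = true
  robotsfollow = true
```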
If you are using `davidsneighbour/hugo-head`, the robots meta tag is automatically added to your head section. If not, you need to add a call to the meta tag partial:
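For example, in your own head partial or `baseof` template; the partial path below is an assumption, so use the path this module actually ships:

```go-html-template
{{/* hypothetical partial path, for illustration only */}}
{{ partial "robots-meta.html" . }}
```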
You can cache this partial, but do so on a per-page level:
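Hugo's built-in `partialCached` accepts extra variant arguments that key the cache; passing the page's permalink keeps one cached copy per page (the partial path is again an assumption):

```go-html-template
{{/* .RelPermalink as variant key caches the partial per page */}}
{{ partialCached "robots-meta.html" . .RelPermalink }}
```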
## Are you into ASCII-art?
If you want to do your robots.txt proud (if you catch my drift), you can use the following configuration parameters in `config/_default/params.toml` to add some flourish to your `robots.txt`:
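A sketch of such flourish parameters; the key name is an assumption, and note that every art line starts with `#` so crawlers treat it as a comment:

```toml
[params.robots]
  # hypothetical key, for illustration only; every art line is commented out
  header = """
# +----------------------+
# |   Welcome, robots!   |
# +----------------------+
"""
```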
Be careful to properly comment out these parts.
## Remove dnb-org notices
The plugin adds some branding notes to your `robots.txt`. It's a free plugin. If you need to remove these mentions, feel free to set the `disableBranding` option in your `config/_default/params.toml` to `true`:
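For instance (whether the option lives directly under `[params]` or in a module-scoped table may differ, so check the module docs):

```toml
[params]
  disableBranding = true
```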