robots.txt template
To generate a robots.txt file from a template, change the site configuration:
hugo.yaml
enableRobotsTXT: true

hugo.toml
enableRobotsTXT = true

hugo.json
{
  "enableRobotsTXT": true
}
By default, Hugo generates robots.txt using an embedded template:
User-agent: *
Search engines that honor the Robots Exclusion Protocol will interpret this as permission to crawl everything on the site.
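After a build, Hugo writes the generated file to the root of the publish directory. With the default publishDir, the result is:

public/robots.txt
User-agent: *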
robots.txt template lookup order
You may override the internal template with a custom template. Hugo selects the template using this lookup order:
1. /layouts/robots.txt
2. /themes/<THEME>/layouts/robots.txt
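As a sketch of a typical custom template, the following hypothetical layouts/robots.txt allows all crawling and advertises the sitemap; it assumes the default sitemap filename, sitemap.xml:

layouts/robots.txt
# Allow all crawling and point crawlers to the sitemap (assumes the default sitemap.xml)
User-agent: *
Disallow:
Sitemap: {{ "sitemap.xml" | absURL }}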
robots.txt template example
layouts/robots.txt
User-agent: *
{{ range .Pages }}
Disallow: {{ .RelPermalink }}
{{ end }}
This template creates a robots.txt file with a Disallow directive for each page on the site. Search engines that honor the Robots Exclusion Protocol will not crawl any page on the site.
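For a hypothetical site with two regular pages at /about/ and /contact/, the rendered public/robots.txt would contain something like this (the exact paths and whitespace depend on the site):

User-agent: *
Disallow: /about/
Disallow: /contact/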