SEO and analytics configuration
- Add analytics scripts, ensuring they only run on the production sites (review apps should never run analytics; local sites should not run them by default, but it must be possible to run them).
  - Bonus points if we can add a test for this. A simple test could be to grep the `public` directory of the compiled site for strings that would only be present in the analytics snippets (e.g. "googletagmanager", "bizible", "marketo"). (This could totally just be a follow-up though.)
  - It would be nice if we could disable analytics using an environment variable (`ANALYTICS_ENABLED`). That would allow turning them off entirely, without a code change, in the event of a problem with them.
- Ensure `sitemap.xml` is generated correctly.
- Run a Lighthouse check to look for SEO issues.
- Update the analytics documentation as needed (`doc/analytics.md`).
- Set up `robots.txt` to allow crawlers on production only.
- Adjust the robots meta tags to allow crawlers on production only.
- Add a tag to specify each page's canonical URL. The canonical URL should always be the "latest" doc. For example, https://archives.docs.gitlab.com/16.11/charts/development/clickhouse.html has this tag: `<link rel="canonical" href="https://docs.gitlab.com/charts/development/clickhouse.html">`.
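For the `ANALYTICS_ENABLED` toggle, one possible shape (a sketch under assumptions — the partial name and the opt-out semantics are not decided) is a Hugo partial that checks both the build environment and the variable before emitting anything:

```go-html-template
{{/* layouts/partials/analytics.html — hypothetical partial name */}}
{{/* Only emit analytics on production builds, and allow an explicit
     opt-out via the ANALYTICS_ENABLED environment variable. */}}
{{ if and hugo.IsProduction (ne (getenv "ANALYTICS_ENABLED") "false") }}
  {{/* analytics snippets (Google Tag Manager, Bizible, Marketo, …) go here */}}
{{ end }}
```

With this shape, review apps and local builds stay clean by default (they are not production builds), and production can still be switched off with `ANALYTICS_ENABLED=false` if something goes wrong.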
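The grep-based test mentioned above could be sketched roughly like this. This is a hypothetical script, not the final implementation: the function name, the string list, and the assumption that the compiled site lives in `public` would all need to match the real setup.

```shell
# Hypothetical smoke test: fail if analytics snippets appear in a build
# that should not contain them (e.g. a review app or a local build).
check_no_analytics() {
  site_dir="$1" # directory of the compiled site, e.g. public
  for pattern in googletagmanager bizible marketo; do
    # -r: search recursively; -i: case-insensitive match
    if grep -ri "$pattern" "$site_dir" >/dev/null 2>&1; then
      echo "FAIL: found '$pattern' in $site_dir"
      return 1
    fi
  done
  echo "OK: no analytics snippets found in $site_dir"
}

# Example usage against a non-production build:
# check_no_analytics public
```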
References:
- https://gohugo.io/getting-started/configuration/#configuration-directory
- https://gohugo.io/getting-started/configuration/#enablerobotstxt
Let's see if we can close this: gitlab-org/gitlab-docs#1132 (maybe we can change this config option via an env var at build time?)
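On the env-var idea: Hugo can map `HUGO_`-prefixed environment variables onto configuration settings, so flipping an option like `enableRobotsTXT` per environment at build time might look like this (untested sketch; the exact variable spelling should be checked against the Hugo configuration docs linked above):

```
# Hypothetical build invocations — Hugo reads HUGO_-prefixed
# environment variables as configuration overrides.
HUGO_ENABLEROBOTSTXT=true hugo --minify   # production build
HUGO_ENABLEROBOTSTXT=false hugo           # review app / local build
```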