Statement
Robots.txt generation shall be supported.
Rationale
Generating robots.txt from configuration allows crawl rules to be defined deterministically rather than maintained by hand. Implementation: a src/pages/robots.txt route driven by config in site.ts, plus per-page X-Robots-Tag header support for SSR deployments. The robots frontmatter object (removed during schema cleanup) should be re-introduced once this system is wired end-to-end.
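A minimal sketch of the config-driven generation step. The SITE object, RobotsRule shape, and renderRobotsTxt name are illustrative assumptions, not the documented site.ts schema:

```typescript
// Hypothetical config shape; the real site.ts schema may differ.
interface RobotsRule {
  userAgent: string;
  allow?: string[];
  disallow?: string[];
}

const SITE = {
  url: "https://example.com", // placeholder origin
  robots: [
    { userAgent: "*", allow: ["/"], disallow: ["/admin/"] },
  ] as RobotsRule[],
};

// Deterministically render the config into robots.txt text:
// one block per rule, then a Sitemap line.
function renderRobotsTxt(site: typeof SITE): string {
  const blocks = site.robots.map((rule) => {
    const lines = [`User-agent: ${rule.userAgent}`];
    for (const path of rule.allow ?? []) lines.push(`Allow: ${path}`);
    for (const path of rule.disallow ?? []) lines.push(`Disallow: ${path}`);
    return lines.join("\n");
  });
  blocks.push(`Sitemap: ${site.url}/sitemap.xml`);
  return blocks.join("\n\n") + "\n";
}
```

In an Astro-style setup (an assumption here), this function could back the src/pages/robots.txt route, with its return value served as text/plain; for SSR pages, the same per-rule data could instead drive an X-Robots-Tag response header.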
Topics
Owner: seo-metadata
Related: operational
Applies To
- SEO and Analytics DOC-00022