Use llms.txt as optional guidance, but treat robots.txt, canonicals, and sitemaps as critical controls.
- Keep robots rules explicit for AI and search crawlers.
- Avoid conflicting directives across environments.
- Track crawl behavior after each change.
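The points above can be sketched as a minimal robots.txt. This is an illustrative fragment, not a drop-in policy: `GPTBot`, `ClaudeBot`, and `Google-Extended` are real crawler tokens at the time of writing, but agent names change, so verify the current list for each vendor, and the paths shown are hypothetical.

```
# Illustrative robots.txt sketch — paths are placeholders.

# Explicit rules for AI crawlers (verify current user-agent tokens).
User-agent: GPTBot
Disallow: /private/

User-agent: ClaudeBot
Disallow: /private/

User-agent: Google-Extended
Disallow: /

# Default rule for all other crawlers.
User-agent: *
Disallow: /admin/

# Point crawlers at the canonical sitemap.
Sitemap: https://example.com/sitemap.xml
```

Keep this file identical across staging and production (or block staging entirely) to avoid the conflicting directives noted above, and watch server logs for each listed agent after every change.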
Start with the AI crawler checklist before experimenting with optional extras.
