Robots.txt & Sitemap

SEO & Websites · Intermediate · Works with: ChatGPT, Claude
You are a technical SEO specialist. Write a robots.txt file and XML sitemap strategy for [WEBSITE].

Website type: [E-COMMERCE / BLOG / SaaS / CORPORATE / MEDIA SITE]
Platform: [CMS OR FRAMEWORK]
Pages to BLOCK from crawling: [ADMIN PAGES / DUPLICATE CONTENT / STAGING AREAS / SEARCH RESULT PAGES]
Pages to ALLOW: [MAIN CONTENT / PRODUCT PAGES / BLOG]
Subdomains: [YES — list them / NO]
Sitemap location: [WHERE IT WILL LIVE — e.g. /sitemap.xml]

Deliver:
1. robots.txt file (complete, ready to upload)
2. Explanation of every directive used
3. User-agent-specific blocking rules (Googlebot vs. other crawlers)
4. Sitemap strategy:
   - Sitemap types needed (index / URL / image / video)
   - What to include vs. exclude
   - Priority and changefreq recommendations (and why they barely matter)
   - Where to submit the sitemap
5. One common robots.txt mistake that accidentally blocks important pages
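For reference, the robots.txt the prompt asks for might look something like the sketch below. The domain, paths, and the GPTBot rule are placeholders; swap in your own blocked sections:

```txt
# Applies to all crawlers
User-agent: *
Disallow: /admin/        # back-office pages
Disallow: /search        # internal search result pages
Disallow: /*?sort=       # parameter-based duplicate content

# Example of a per-bot rule (placeholder)
User-agent: GPTBot
Disallow: /

# Sitemap location (must be an absolute URL)
Sitemap: https://example.com/sitemap.xml
```

This also illustrates the classic mistake from deliverable 5: a bare `Disallow: /` under `User-agent: *` (instead of under a specific bot) blocks the entire site.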

How to use this prompt

1. Click Copy Prompt above
2. Open ChatGPT, Claude, or Gemini
3. Paste the prompt, replacing all [BRACKETED] text with your details
4. Send it and refine the output as needed
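Before uploading the robots.txt the model produces, you can sanity-check it locally with Python's standard-library `urllib.robotparser`. The file contents, domain, and paths below are placeholders for illustration:

```python
from urllib.robotparser import RobotFileParser

# Paste the generated robots.txt here (placeholder rules shown)
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A blocked section should be disallowed for all crawlers
print(parser.can_fetch("*", "https://example.com/admin/login"))    # False

# Main content should remain crawlable
print(parser.can_fetch("*", "https://example.com/products/shoes")) # True
```

If a URL you expect to rank returns `False` here, the file is blocking an important page and needs fixing before deployment.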
Want a custom version?
Use the Prompt Builder: fill in a form and we'll assemble a prompt tailored to your exact situation.