When to use robots.txt for SEO
Robots.txt is best for blocking infinite crawl spaces (such as faceted navigation or internal search results), admin back‑ends, or low‑value duplicate pages so crawlers can focus on important content instead of wasting crawl budget.[web:767][web:764]
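As a minimal sketch (the paths and patterns below are hypothetical placeholders, not taken from any real site), a robots.txt that keeps crawlers out of an admin back‑end and a parameter‑driven infinite space could look like this:

```
# Applies to all crawlers
User-agent: *
# Admin back-end (placeholder path)
Disallow: /admin/
# Faceted/filtered URLs that create an infinite crawl space (placeholder pattern)
Disallow: /*?filter=
# Internal site-search results (placeholder path)
Disallow: /search
```

Anything not matched by a Disallow rule stays crawlable by default, so a short, targeted file is usually enough.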
Common mistakes to avoid
- Blocking the entire site with `Disallow: /` when you meant to restrict only a folder (see the contrast sketched after this list).[web:762][web:770]
- Trying to deindex pages with robots.txt alone instead of using `noindex` tags or removing them from sitemaps (see the example after this list).[web:758][web:765]
- Forgetting to update robots.txt after site migrations or major URL structure changes.[web:761][web:770]
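To make the first mistake concrete, the difference between blocking the whole site and restricting one section is a single path (the folder name below is a placeholder):

```
# File A: blocks every URL on the site, almost never what you want
User-agent: *
Disallow: /

# File B: blocks only one folder and leaves the rest of the site crawlable
User-agent: *
Disallow: /private-folder/
```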
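For the second mistake, remember that robots.txt controls crawling, not indexing; a blocked URL can still show up in results if other pages link to it. To actually deindex a page, leave it crawlable and give it a noindex directive, for example a meta robots tag (a minimal sketch):

```
<!-- In the <head> of the page that should drop out of the index -->
<meta name="robots" content="noindex">
```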