# https://www.codeant.ai/robots.txt
# Clean, search-friendly rules

# Major search bots: full access
User-agent: Googlebot
User-agent: Googlebot-Image
User-agent: Mediapartners-Google
User-agent: bingbot
User-agent: BingPreview
User-agent: Slurp
User-agent: DuckDuckBot
User-agent: Applebot
User-agent: YandexBot
User-agent: Baiduspider
User-agent: PetalBot
Allow: /
# Note: a crawler obeys only its most specific matching group, so the optional
# Disallow rules below apply only to bots NOT listed above. Copy them into this
# group if they should also bind the major bots.

# Default rule for everything else
User-agent: *
Allow: /
# (Optional) keep query-string dupes out of the crawl.
# Remove this line if you rely on query-string pages for indexable content.
# (The pattern /*? already matches every URL with a query string, so a
# separate /*?* rule would be redundant.)
Disallow: /*?
# (Optional) block non-marketing areas IF they exist on this host.
# If these paths don't exist, leaving them here does no harm.
# Blank lines are avoided inside this group: strict REP parsers treat a blank
# line as a record separator, which would orphan the rules below.
Disallow: /admin/
Disallow: /dashboard/
Disallow: /login
Disallow: /api/private/
Disallow: /cgi-bin/
Disallow: /lp/

# Sitemap
Sitemap: https://www.codeant.ai/sitemap.xml
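As a quick sanity check, the grouping behavior can be exercised with Python's standard-library `urllib.robotparser` (a minimal sketch using a trimmed, hypothetical copy of the rules; note this parser applies the *first* matching rule rather than Google's longest-match, so `Disallow` lines are listed before `Allow` in the sample):

```python
import urllib.robotparser

# Trimmed sample of the rules (hypothetical bot names for testing).
# Disallow precedes Allow because urllib.robotparser is first-match;
# for Google's longest-match semantics the order would not matter.
RULES = """\
User-agent: Googlebot
User-agent: bingbot
Allow: /

User-agent: *
Disallow: /admin/
Disallow: /login
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# Named group binds Googlebot, so the * group's Disallow does not apply to it.
print(rp.can_fetch("Googlebot", "https://www.codeant.ai/admin/"))
# Unlisted bots fall through to the * group.
print(rp.can_fetch("SomeBot", "https://www.codeant.ai/admin/"))
print(rp.can_fetch("SomeBot", "https://www.codeant.ai/pricing"))
```

This also demonstrates the caveat noted in the comments above: because groups are independent, a bot matched by the named group never sees the `*` group's `Disallow` rules.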