
Commit 35b952b

docs: update robots.txt to match phcode.dev conventions

Add core.ai legal header, per-bot crawl directives, and references to all three sitemaps (docs, phcode.dev, sitemaps.phcode.io).

1 parent 36e304b · commit 35b952b

1 file changed: static/robots.txt (32 additions, 1 deletion)
@@ -1,5 +1,36 @@
+# The use of robots or other automated means to access the sites managed by core.ai
+# without the express permission of core.ai is strictly prohibited.
+# Notwithstanding the foregoing, core.ai may permit automated access to
+# access certain pages but solely for the limited purpose of
+# including content in publicly available search engines. Any other
+# use of robots or failure to obey the robots exclusion standards set
+# forth at http://www.robotstxt.org/ is strictly prohibited.
+
+# Details about Googlebot available at: http://www.google.com/bot.html
+# The Google search engine can see everything
+User-agent: gsa-crawler-www
+Disallow: /assets/
+
+# The Omniture search engine can see everything
+User-agent: Atomz/1.0
+Disallow: /assets/
+
 User-agent: *
-Allow: /
 Disallow: /assets/
 
+User-agent: AdsBot-Google
+Disallow: /assets/
+
+User-agent: AdsBot-Google-Mobile
+Disallow: /assets/
+
+User-agent: SearchmetricsBot
+Disallow: /assets/
+
+User-agent: Googlebot
+Disallow: /assets/
+
+# XML sitemaps
 Sitemap: https://docs.phcode.dev/sitemap.xml
+Sitemap: https://phcode.dev/sitemap-phcode.xml
+Sitemap: https://sitemaps.phcode.io/sitemap-phcode.xml
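The per-bot rules added here can be sanity-checked locally with Python's stdlib robots.txt parser. This is a minimal sketch, not part of the commit: it feeds a trimmed excerpt of the new rules to `urllib.robotparser` and checks that a named bot is blocked from /assets/ but allowed elsewhere (the phcode.dev URLs below are illustrative).

```python
from urllib.robotparser import RobotFileParser

# Excerpt of the rules introduced by this commit (not the full file).
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /assets/

User-agent: *
Disallow: /assets/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot matches its own group: /assets/ is blocked, other paths are not.
print(rp.can_fetch("Googlebot", "https://phcode.dev/assets/app.js"))    # False
print(rp.can_fetch("Googlebot", "https://phcode.dev/docs/index.html"))  # True
# Unlisted bots fall through to the "*" group with the same /assets/ rule.
print(rp.can_fetch("SomeOtherBot", "https://phcode.dev/assets/app.js")) # False
```

Note that per robots.txt matching rules, a crawler obeys only the most specific matching `User-agent` group, so each bot listed in the diff needs its own `Disallow: /assets/` line even though the `*` group already carries it.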
