Example of a simple robots.txt file, indicating that a user-agent called "Mallorybot" is not allowed to crawl any of the website's pages, and that all other user-agents may not crawl more than one page every 20 seconds and may not crawl the "secret" folder. robots.txt is the filename used for implementing the Robots Exclusion Protocol, a standard used by websites to indicate to visiting web crawlers and other web robots which portions of the site they are allowed to visit.
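A file implementing the rules described above could look like the sketch below. Note that `Crawl-delay` is a widely recognized but non-standard directive (it is not part of the formal Robots Exclusion Protocol in RFC 9309), and its interpretation as "seconds between requests" varies by crawler:

```
# Block the user-agent "Mallorybot" from the entire site
User-agent: Mallorybot
Disallow: /

# All other crawlers: wait 20 seconds between requests
# and stay out of the "secret" folder
User-agent: *
Crawl-delay: 20
Disallow: /secret/
```

Compliance is voluntary: well-behaved crawlers fetch `/robots.txt` and honor these rules, but nothing technically prevents a crawler from ignoring them.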
![robots.txt - Wikipedia](https://upload.wikimedia.org/wikipedia/commons/thumb/1/16/Robots_txt.svg/1200px-Robots_txt.svg.png)