FastOS 工具

Robots.txt Generator

Generate a robots.txt file for your website. Configure access rules for search engine crawlers.


Overview

The Robots.txt Generator tool helps you create a correct robots.txt file to control how search engine crawlers access your site. It lets you quickly configure which sections crawlers may access, which are blocked, and where your sitemap is located.

When should you use this tool?

Use this tool whenever you set up a new site, migrate to a new domain, or need to fine-tune crawler access, for example to keep admin paths, staging areas, or duplicate content out of search indexes.

How to use

1. Specify allowed and disallowed paths

List which URL paths should be accessible to crawlers and which should be blocked, such as /admin or /private.
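For instance, a minimal rule set for this step might look like the following (the paths are illustrative):

```
User-agent: *
Allow: /
Disallow: /admin
Disallow: /private
```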

2. Configure user-agents

Optionally set rules for specific crawlers (like Googlebot or Bingbot) or use a wildcard to define global rules.
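A per-crawler configuration can be sketched like this (the crawler names are real user-agent tokens; the paths are placeholders):

```
# Rules that apply only to Google's crawler
User-agent: Googlebot
Disallow: /search

# Fallback rules for all other crawlers (wildcard)
User-agent: *
Disallow: /admin
```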

3. Add sitemap references

Include the full URL of your sitemap.xml so search engines can more easily discover your important pages.
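The sitemap reference is a single directive with an absolute URL on its own line (example.com is a placeholder for your domain):

```
Sitemap: https://example.com/sitemap.xml
```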

4. Generate and deploy robots.txt

Generate the robots.txt content, then upload it to the root of your domain so it is accessible at /robots.txt.
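Before deploying, you can sanity-check the generated rules locally. This sketch uses Python's standard `urllib.robotparser`; the rule content and URLs are illustrative:

```python
from urllib import robotparser

# Illustrative robots.txt content; the paths and domain are placeholders.
RULES = """\
User-agent: *
Disallow: /admin
Disallow: /private
"""

parser = robotparser.RobotFileParser()
parser.parse(RULES.splitlines())

# Blocked path: matches the Disallow: /admin rule
print(parser.can_fetch("*", "https://example.com/admin/login"))
# Public path: no rule matches, so access is allowed by default
print(parser.can_fetch("*", "https://example.com/blog/post"))
```

`can_fetch` answers the same question a crawler would ask, so it is a quick way to confirm the file blocks exactly what you intended.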

Usage examples

Protecting a staging environment

Block all crawlers from indexing a staging site by disallowing all paths, preventing duplicate content from appearing in search results.
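For a staging environment, the generated file would use the standard block-all pattern:

```
User-agent: *
Disallow: /
```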

Blocking internal admin URLs

Disallow /admin, /backend, or other private tools so they are not indexed or shown in search results.
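The corresponding rules might look like this (paths are illustrative):

```
User-agent: *
Disallow: /admin
Disallow: /backend
```

Note that robots.txt is advisory only; it keeps pages out of search results but does not replace authentication for private tools.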
