diff --git a/docs/content/administration/search-engines-indexation.en-us.md b/docs/content/administration/search-engines-indexation.en-us.md
index 7898e8146..664940697 100644
--- a/docs/content/administration/search-engines-indexation.en-us.md
+++ b/docs/content/administration/search-engines-indexation.en-us.md
@@ -23,7 +23,7 @@ If you don't want your repository to be visible for search engines read further.
 ## Block search engines indexation using robots.txt
 
 To make Gitea serve a custom `robots.txt` (default: empty 404) for top level installations,
-create a file called `robots.txt` in the [`custom` folder or `CustomPath`](administration/customizing-gitea.md)
+create a file at the path `public/robots.txt` inside the [`custom` folder or `CustomPath`](administration/customizing-gitea.md).
 
 Examples on how to configure the `robots.txt` can be found at [https://moz.com/learn/seo/robotstxt](https://moz.com/learn/seo/robotstxt).
 
diff --git a/docs/content/administration/search-engines-indexation.zh-cn.md b/docs/content/administration/search-engines-indexation.zh-cn.md
index 77ad5eca2..904e6de11 100644
--- a/docs/content/administration/search-engines-indexation.zh-cn.md
+++ b/docs/content/administration/search-engines-indexation.zh-cn.md
@@ -22,7 +22,7 @@ menu:
 
 ## Blocking search engine indexing with robots.txt
 
-To make Gitea serve a custom `robots.txt` (default: an empty 404) for top-level installations, create a file named `robots.txt` in the [`custom` folder or `CustomPath`](administration/customizing-gitea.md).
+To make Gitea serve a custom `robots.txt` (default: an empty 404) for top-level installations, create a file at the path `public/robots.txt` inside the [`custom` folder or `CustomPath`](administration/customizing-gitea.md).
 
 Examples of how to configure `robots.txt` can be found at [https://moz.com/learn/seo/robotstxt](https://moz.com/learn/seo/robotstxt).