Search Engine Indexing

Control how search engines index your Forgejo installation

By default, your Forgejo installation will be indexed by search engines. If you don't want your repositories to be visible to search engines, read on.

Block search engine indexing using robots.txt

To make Forgejo serve a custom robots.txt (by default, an empty 404 is returned) for top-level installations, create a file named robots.txt in the root of the Custom File Root Path as displayed on the /admin/config page.

Examples of how to configure the robots.txt can be found at https://moz.com/learn/seo/robotstxt. For example, to disallow crawling of the entire site:

User-agent: *
Disallow: /
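A minimal sketch of creating that file from the command line. The CUSTOM_ROOT value below is a placeholder for illustration only; substitute the actual Custom File Root Path shown on your /admin/config page.

```shell
# CUSTOM_ROOT stands in for the Custom File Root Path from /admin/config;
# "./custom" is a placeholder used here for illustration.
CUSTOM_ROOT="./custom"
mkdir -p "$CUSTOM_ROOT"

# Write a robots.txt that disallows all crawlers from the whole site.
cat > "$CUSTOM_ROOT/robots.txt" <<'EOF'
User-agent: *
Disallow: /
EOF
```

After restarting is not required: Forgejo serves the file from the custom directory on the next request to /robots.txt.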

If you installed Forgejo in a subdirectory, you will need to create or edit the robots.txt in the top-level directory of the site, adjusting the path accordingly:

User-agent: *
Disallow: /forgejo/

Disallow crawling archives to save disk space

If archive files are crawled, they are generated on the fly and kept around, which can consume a lot of disk space. To prevent that from happening, add the following to the robots.txt file:

User-agent: *
Disallow: /*/*/archive/
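One way to add this rule without overwriting existing entries is to append it to the file. The ./custom/robots.txt path below is again a placeholder for the robots.txt under your Custom File Root Path.

```shell
# Path is a placeholder for the robots.txt under your Custom File Root Path.
ROBOTS="./custom/robots.txt"
mkdir -p "$(dirname "$ROBOTS")"

# Append the archive rule only if it is not already present
# (grep -x matches whole lines, -F treats the pattern literally).
grep -qxF 'Disallow: /*/*/archive/' "$ROBOTS" 2>/dev/null || \
  printf '%s\n' 'User-agent: *' 'Disallow: /*/*/archive/' >> "$ROBOTS"
```

Running the snippet twice leaves a single copy of the rule in place.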

See also a more complete example at Codeberg.