SEO-related files should reside in the root folder of your site; search engine spiders read them while crawling your site.
1. The 'sitemap.xml' file. This file describes the page structure of your site and speeds up search engine indexing of all its pages. We recommend regenerating and resubmitting the sitemap to search engines whenever you make major changes to your site.
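For reference, a minimal sitemap follows the sitemaps.org protocol. The URL and date below are placeholders; replace them with your own pages:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```

Add one <url> entry per page you want indexed; <lastmod> is optional but helps crawlers prioritize recently changed pages.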
2. The 'robots.txt' file. This file tells crawlers which directories may or may not be crawled. Uploading it can also serve as an invitation for search spiders to index every page of your site.
How to add a 'robots.txt' file to your site:
2.1. Open a text editor, such as Notepad, and write the following lines:
- If all pages should be crawled:
- If you don't want some of your pages to be crawled, exclude them by adding their paths:
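For example, the two cases above correspond to the following robots.txt contents (use one or the other; the excluded paths shown are placeholders for your own directories and pages). To allow all pages to be crawled:

```
User-agent: *
Disallow:
```

To exclude specific directories or pages from crawling:

```
User-agent: *
Disallow: /private/
Disallow: /temp/page1.html
```

An empty 'Disallow:' line permits everything, while each 'Disallow:' entry with a path blocks that path (and everything under it, for directories) for all crawlers matched by 'User-agent: *'.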
2.2. Save the file as 'robots.txt'.
2.3. Upload it as a static file in Control Panel > Site Properties > Special Features > Show Options > Site Static Files.
Note: The Static File feature is only available for certain plans. Please check your Site Properties to ensure that your plan includes the static file upload feature.