What is robots.txt?
A robots.txt file tells search engine crawlers which parts of your site you don't want them accessing. The file uses a small set of directives that control access to your site by section and by specific kinds of web crawlers (such as mobile crawlers versus desktop crawlers). In practice, a robots.txt file states whether these crawlers can or cannot crawl parts of your marketplace: crawl instructions are written as "Disallow" and "Allow" rules applied to certain (or all) crawlers. To learn more, Google has an in-depth guide on robots.txt files.
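Here is a minimal sketch of the format, grouping rules under the crawler ("User-agent") they apply to. The paths are purely illustrative, not paths your marketplace actually uses:

    # Applies to every crawler
    User-agent: *
    Disallow: /private/

    # Googlebot follows only its own group: still blocked from /private/,
    # but one illustrative page is allowed as an exception
    User-agent: Googlebot
    Disallow: /private/
    Allow: /private/public-page.html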
Do I need to configure my robots.txt file?
By default, your Arcadier marketplace pages and item listings are fully accessible to search engine crawlers. This is what most websites want, since having as many pages as possible picked up by search engines supports their SEO efforts. However, there are situations where admins would want to restrict crawling, for example:
- You run a private marketplace and want to block all content from search engines (see the sample rule after this list)
- You are still developing your marketplace and do not want search engines to index it until you go live
- You are using paid links or advertisements that need special instructions for robots
- You need to follow certain Google guidelines that call for robots.txt rules in specific situations
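For the first two cases, a single pair of lines blocks every crawler from the entire site; remember to remove it once you go live:

    # Block all crawlers from all pages (private or pre-launch marketplace)
    User-agent: *
    Disallow: /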
How do I configure my robots.txt file?
Simply click on Settings > Analytics/SEO > Robots, then key in the relevant rules into the text area provided and save.
The rules you save will be written to the robots.txt file served on your marketplace's domain.
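As an illustration, the rules below keep all crawlers out of a hypothetical /checkout/ section while leaving the rest of the marketplace crawlable; the path is an assumption for the example, not one Arcadier necessarily uses:

    User-agent: *
    # /checkout/ is a hypothetical path; everything else stays crawlable
    Disallow: /checkout/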
For more information on robots.txt files and how to use them, refer to this guide by Google.