You can now easily allow or disallow user agents in Global Settings
Robots.txt can now be updated from our Layout & Design tool: you can easily allow or disallow user agents directly in the tool's Global Settings.
What Is a Robots.txt File?
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is mainly used to avoid overloading your site with requests. It's not a mechanism for keeping a web page off of Google. To keep a web page off of Google, you should use noindex directives or protect your page with a password.
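For example, a simple robots.txt file that blocks one crawler from part of a site while allowing everything else might look like the sketch below (the /private/ path and sitemap URL are placeholders):

    # Block Googlebot from the /private/ directory (placeholder path)
    User-agent: Googlebot
    Disallow: /private/

    # Allow all other crawlers to access the whole site
    User-agent: *
    Allow: /

    # Point crawlers at the sitemap (placeholder URL)
    Sitemap: https://www.example.com/sitemap.xml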
What Is Robots.txt Used For?
Robots.txt is primarily used to manage crawler traffic to your site and, depending on the file type, to keep certain files (such as images or videos) off of Google. As noted above, it won't keep a web page itself out of search results — use a noindex directive for that.
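For instance, the illustrative rule below tells Googlebot-Image (Google's image crawler) not to fetch anything under a placeholder /images/ directory, which keeps those files out of Google Images without affecting other crawlers:

    # Keep files under /images/ out of Google Images (placeholder path)
    User-agent: Googlebot-Image
    Disallow: /images/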
If you have any questions about this feature, email support@rebelmouse.com.