Smart Robots.txt Tools
Smart Robots.txt Tools allows you to improve and customize the ‘robots.txt’ file used by search engines and other types of bots when crawling and indexing your website. Through this file you can disallow access to some areas of the website for all bots or only for specific ones.
This plugin lets you use an improved set of basic rules in the robots.txt file with a few extra predefined rules, or you can create your own custom rules that can include any number of bot user agents and directives to allow or disallow access.
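As an illustration, a custom rule set combining multiple user agents with Allow and Disallow directives might look like this (the bot name and paths are hypothetical examples, not rules the plugin ships with):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

User-agent: ExampleBot
Disallow: /private/
```

Each `User-agent` group applies its directives only to the matching bot; the `*` group covers all bots without a more specific match.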
When the plugin is installed, it adopts the WordPress search engine visibility setting: if your website is set to be hidden from search engines (WordPress uses the robots file for this), the plugin will be set to Deny All. You can later change the robots.txt content to anything through the plugin settings; as a convenience, the WordPress setting is used on installation only.
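For reference, a Deny All robots.txt is the standard form that asks every compliant bot to stay out of the entire site, matching what WordPress itself emits when search engine visibility is turned off:

```
User-agent: *
Disallow: /
```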