A virtual robots.txt file is essential for telling search engines which parts of your site to crawl and index. Essentially, it contains the instructions given to search engine robots about where to go and what to index or skip.
To enable robots.txt, first make sure the SEO service is enabled from the 10Web dashboard; once enabled, it appears in your WP dashboard. The robots.txt tab becomes active after you have authenticated Google Search Analytics. Once that is done:

1. In your WordPress dashboard, go to SEO by 10Web, click Settings, then open the robots.txt tab.
2. Check the Enable robots.txt virtual file checkbox.
3. In the text field, add rules to block certain robots from crawling and indexing your website, or to allow them to do so.
4. If needed, click the Flush Permalinks button to clear the previous permalink structure.
5. When done, click Save.
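As an illustration, here is the kind of rule set you might paste into that field. The bot name and paths below are examples only, not 10Web defaults; adjust them to the crawlers and directories relevant to your site.

```
# Block one specific crawler entirely (example bot name)
User-agent: BadBot
Disallow: /

# Allow all other crawlers, but keep them out of the admin area;
# admin-ajax.php stays reachable since some themes/plugins rely on it
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Each `User-agent` group applies to the named crawler, with `Disallow` and `Allow` rules matched against URL paths on your site.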