Now, create a 'robots.txt' file in your root directory, then copy the text above and paste it into the file.
A Robots.txt Generator produces a file that is essentially the opposite of a sitemap: where a sitemap lists the pages to be included, robots.txt lists what should be excluded, so its syntax is of great importance for any website. Whenever a search engine crawls a website, it first looks for the robots.txt file located at the domain root. Once found, the crawler reads the file and identifies the files and directories that are blocked.
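As an illustration, a minimal robots.txt might look like the following sketch; the directory names and the sitemap URL are placeholders, not values from this article:

```
# Example robots.txt placed at the domain root, e.g. https://example.com/robots.txt
# Applies to all crawlers
User-agent: *
# Block private directories (hypothetical paths for illustration)
Disallow: /admin/
Disallow: /tmp/
# Explicitly allow everything else
Allow: /
# Point crawlers at the sitemap, which lists pages to include
Sitemap: https://example.com/sitemap.xml
```

Directives are matched per crawler: a crawler uses the `User-agent` group that best matches its name, then applies the `Disallow` and `Allow` rules in that group.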