Robots.txt Generator For WordPress | Best robots.txt generator



About Robots.txt Generator

Worried that content you never wanted indexed is showing up in search engines? A robots.txt generator is a handy tool for exactly this situation. It is great when search engines visit and index your website, but in some cases the wrong data gets indexed, data you don't want people to see.


Suppose you have created special content for people who have subscribed to your site, but because of some error that content is accessible to regular visitors as well. Sometimes confidential data you don't want anyone to see ends up visible to everyone. To overcome this you have to tell search engines which specific files and folders should be kept out of their indexes, for example with a robots meta tag. But since not every crawler reads meta tags, the safer option is to use a robots.txt file as well.


Robots.txt is a plain text file that tells search robots which pages should be kept out of their indexes and away from casual visitors arriving through search. It is a text file, not an HTML page, and it is sometimes misunderstood as a firewall or a password protection feature, which it is not: it simply asks crawlers to stay away from the data the site owner wants to keep out of search results. One of the most frequently asked questions about robots.txt is how to create such a file for SEO, and that is the question this article answers.

Robots.txt file example (the basic format):

A robots.txt file has a proper format that must be followed; if the format contains mistakes, the search robots will not act on it. The basic format is:

User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

Just keep in mind that the file must be saved as plain text.
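For instance, here is a minimal sketch that asks Google's crawler to stay out of a hypothetical /private/ directory (the path is only an illustration):

User-agent: Googlebot
Disallow: /private/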

Robots.txt generator: what is it and how do you use it?

A custom robots.txt generator (for Blogger, WordPress, or any other platform) is a tool that helps webmasters keep their websites' confidential data from being indexed by search engines. In other words, it generates the robots.txt file for you, so site owners don't have to write the whole file by hand. You can create the file with the steps below:

  • First, choose whether you want all robots or only some robots to have access to your files.
  • Second, choose how much crawl delay you require; you can pick anywhere from 5 seconds up to 120 seconds.
  • Paste your sitemap URL into the generator if you have one.
  • Select which bots you want to allow to crawl your site and which bots you want to block.
  • Lastly, restrict the directories. The path must be relative to root and contain a trailing slash "/".

    By following these steps you can easily create a robots.txt file for your website.
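As a sketch of what the generator produces, suppose you allow all robots, ask for a 10-second crawl delay, supply a sitemap, and restrict a hypothetical /cgi-bin/ directory (all of these values are placeholders):

User-agent: *
Disallow: /cgi-bin/
Crawl-delay: 10
Sitemap: https://www.yoursite.com/sitemap.xml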

How to optimize your Robots.txt file for better SEO?

If you already have a robots.txt file, then to keep your files properly protected you need to make sure it is optimized and free of errors, so examine it carefully. For a robots.txt file to be optimized for search engines you have to decide clearly what belongs under an Allow rule and what belongs under a Disallow rule. Folders such as your image folder and content folder should be allowed if you want that data to be accessible to search engines and visitors, while folders such as duplicate web pages, duplicate content, duplicate folders, and archive folders should be disallowed.
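As an illustration only, an optimized file following that logic might look like the sketch below; every folder name is hypothetical and should be replaced with your site's real structure:

User-agent: *
Allow: /images/
Allow: /content/
Disallow: /archives/
Disallow: /duplicates/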

How to use robots.txt file generator for WordPress?

Although WordPress does not require a robots.txt file, creating one helps you achieve better SEO and keeps the standards maintained. You can easily create a WordPress robots.txt file that disallows search engines from accessing some of your data by following the steps below:

  1. First, log in to your hosting dashboard, for example Cloudways, which can serve as a robots.txt generator for WordPress.
  2. After logging in to the dashboard, select the "Servers" tab located at the top right of the screen.
  3. Then open "FileZilla", an FTP client used to access the WordPress documents, and connect FileZilla to the server using the "Master Credentials".
  4. After connecting to the server, go to the "Applications" tab.
  5. Return to Cloudways and, from the top left, go to the "Applications" tab.
  6. Select WordPress from the applications.
  7. After logging in to the WordPress panel, select "File manager" from the left tab.

  8. Return to FileZilla and browse to "/applications/[Your Folder Name]/public_html".
  9. Create a new text file and name it "robots.txt".
  10. Open that file in any text editor such as Notepad or Notepad++; since Notepad is built in, you can simply use it.
  11. The following is a sample robots.txt file for a Cloudways-hosted WordPress site:

User-agent: *
Disallow: /admin/
Disallow: /admin/*?*
Disallow: /admin/*?
Disallow: /blog/*?*
Disallow: /blog/*?

      If you have a sitemap, add its URL on its own line, for example:

Sitemap: http://www.yoursite.com/sitemap.xml
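Put together, the finished file for this sample would look like the following (the domain is a placeholder):

User-agent: *
Disallow: /admin/
Disallow: /admin/*?*
Disallow: /admin/*?
Disallow: /blog/*?*
Disallow: /blog/*?
Sitemap: http://www.yoursite.com/sitemap.xml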

How to enable Robots.txt on the Blogger dashboard?

Since Blogger already has a robots.txt file built into its system, you don't have to worry much about it. But some of its defaults are not enough, so you can easily alter the robots.txt file in Blogger to suit your needs by following the steps below:

  1. First, visit your Blogger blog.
  2. Go to the settings and click "Search preferences".
  3. From the Search preferences tab, click "Crawlers and indexing".
  4. Go to the "Custom robots.txt" option, click "Edit", and then click "Yes".
  5. Paste your robots.txt file there to add more restrictions to the blog. You can also use a custom robots.txt generator for Blogger.
  6. Save the settings and you are done.

Robots.txt file example for blogger:

Following are some robots.txt templates:

  1. Allow everything:

User-agent: *
Disallow:

OR

User-agent: *
Allow: /

  2. Disallow everything:

User-agent: *
Disallow: /

  3. Disallow a specific folder:

User-agent: *
Disallow: /folder/
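One more template, added here as an extra sketch, blocks a single crawler while leaving every other robot unrestricted; Baiduspider (Baidu's crawler) is used purely as an example:

User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow: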

How to create the robots.txt file using SEO Magnifier?

There is no rocket science behind the SEO Magnifier robots.txt creation tool. Simply follow these steps to generate your file:

  • Select the "allow all" or "block all" robots option.
  • Choose the "crawl delay time".
  • Enter your website "sitemap" address, e.g. https://yoursite.com/sitemap.xml.
  • Choose the search engine robots that you want to "allow" or "block" individually.
  • Add any directory that you want to restrict, e.g. /admin, /uploads, etc.
  • After adding all the info, simply click "create robots.txt" or "create and save as robots.txt", and you are done; then upload the file to your "website root directory".
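For example, if you allowed all robots, chose a 20-second crawl delay, restricted /admin and /uploads, and entered a sitemap address (all of these are placeholder values), the saved file would look roughly like this sketch:

User-agent: *
Disallow: /admin/
Disallow: /uploads/
Crawl-delay: 20
Sitemap: https://yoursite.com/sitemap.xml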