Add a Blogger custom robots.txt for better SEO

The robots.txt file is an important factor in better SEO for any website. It tells search engines whether or not they may crawl your site, so it has terrific importance for every blogger. You should set it up right after creating your blog. In our previous post about Blogger dashboard settings, we described the robots.txt file briefly; now let's look at it in detail. Don't skip this if you want a successful blogging career.

What is robots.txt?

Robots.txt is a plain .txt file containing a few directives, and it is very important for a site's search performance. It instructs search engine robots how to crawl an individual site. By adding rules to this file you can control how your site is searched: you can tell search engines which pages or content they may crawl and which they may not. In Blogger this file is called 'Custom robots.txt'. To be crawled and indexed properly by search engines, you should add the robots.txt in Blogger the right way. It is a sensitive setting for search engine performance: if you configure the file the wrong way, you may face a search engine penalty, and it can even ruin your blogging career.

An ideal, SEO-friendly Blogger custom robots.txt

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.bloggermaking.com/sitemap.xml

Explanations:
1) User-agent: Mediapartners-Google

This is the Google AdSense robot. If you use Google AdSense, keep this rule to get better ad impressions on your site; it helps you earn the highest revenue from AdSense. If you do not use AdSense, simply leave it out.

2) User-agent: *

The asterisk (*) above stands for all robots that crawl your site. In Blogger's default setting, label pages are not crawled or indexed by any robot, and that is a good default for better SEO. Add the code below to keep robots from indexing label searches on your Blogger blog.

Disallow: /search

The above line tells robots not to index any page whose URL has 'search' right after the blog's domain name. Blogger label links generally look like http://www.bloggermaking.com/search/label; because the word 'search' comes right after the domain name, those links will not be indexed by robots. In the same way, if you want to keep any other link from being indexed, you can do it with a Disallow line. For example, to block an About Us page whose URL is http://www.bloggermaking.com/about-us, add the line below to the Blogger custom robots.txt file.

Disallow: /about-us
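Before publishing these rules, you can sanity-check how they apply to your URLs. The sketch below is a minimal example in Python, using only the standard library's urllib.robotparser, and it is fed the example rules from above together with the optional Disallow: /about-us line; the three URLs in the list are hypothetical and only illustrate which paths the 'User-agent: *' block would allow or block.

from urllib.robotparser import RobotFileParser

# The example rules from above, plus the optional Disallow: /about-us line.
rules = """\
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Disallow: /about-us
Allow: /
Sitemap: http://www.bloggermaking.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Hypothetical URLs, used only to show the effect of the rules.
for url in [
    "http://www.bloggermaking.com/2015/07/some-post.html",  # a normal post
    "http://www.bloggermaking.com/search/label/SEO",         # a label page
    "http://www.bloggermaking.com/about-us",                 # a static page
]:
    print(url, "->", "allowed" if parser.can_fetch("*", url) else "blocked")

Running it prints 'allowed' for the post and 'blocked' for both the label page and the About Us page, which is exactly what the Disallow lines are meant to do.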

3) Allow: /

This line tells robots that they may crawl and index the rest of the blog. You should also let robots find your blog's sitemap, which is very important for SEO. Just allow the blog and declare the sitemap with the lines below.

Allow: /
Sitemap: http://www.bloggermaking.com/sitemap.xml
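If you want to confirm that the sitemap URL you declare actually resolves, a short script can fetch it and count its entries. The sketch below is only an illustration using the Python standard library; replace the URL with your own blog's sitemap, and note that a Blogger sitemap may be either a plain list of page URLs or an index pointing to sub-sitemaps.

import urllib.request
import xml.etree.ElementTree as ET

# Replace this with your own blog's sitemap URL.
SITEMAP_URL = "http://www.bloggermaking.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# Every entry, whether a page URL or a sub-sitemap, carries a <loc> element.
locs = [el.text for el in tree.iter() if el.tag.endswith("loc")]
print(f"{len(locs)} entries found in {SITEMAP_URL}")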

Add Blogger custom robots.txt:

1) Log in to your Blogger dashboard, go to the Settings options, and open 'Search preferences'. There you will see the 'Crawlers and indexing' section, where you have to add the Blogger custom robots.txt. Click on the 'Edit' option.

[Screenshot: Adding Blogger custom robots.txt]
2) After clicking 'Edit', select 'Yes' to enable the custom robots.txt in Blogger.

[Screenshot: How to add Blogger custom robots.txt]

3) Now paste the robots.txt code above into the box (replace http://www.bloggermaking.com with your own site's URL), then save it and you are done.

[Screenshot: How to add Blogger custom robots.txt, step 2]

How to check the robots.txt setting of any site?

It is very simple to see the robots.txt settings of any website. Just add "/robots.txt" after the home page URL or domain name in the address bar and open the link.

Example : http://www.bloggermaking.com/robots.txt
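The same check can also be scripted. The minimal sketch below (Python standard library only) downloads and prints the robots.txt of whatever domain you pass it; the domain shown is just the example used throughout this post.

import urllib.request

def show_robots_txt(domain: str) -> None:
    """Fetch and print the robots.txt file of the given domain."""
    url = f"http://{domain}/robots.txt"
    with urllib.request.urlopen(url) as response:
        print(response.read().decode("utf-8", errors="replace"))

show_robots_txt("www.bloggermaking.com")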
