Custom robots.txt and custom robots header tags are among the most important SEO features in Blogger. Header tags are a relatively recent and vital addition to Google's BlogSpot, and they are the easiest way to control crawler behaviour on your blog. By launching this feature, Google has made Blogger more SEO friendly and better suited to blogging. Let's see how to set it up.
Step 1:
First of all, log in to your Blogger account and go to the dashboard. If you have more than one blog in your account, select the one for which you want to enable custom robots header tags.
Step 2:
From the left menu, click "Settings > Search preferences". On the right side you will see the "Custom robots header tags" option, right after "Custom robots.txt". Click the "Edit" link as shown below.
Step 3:
At this point, enable the feature by selecting the first radio button (Yes). Take a look at the image below if anything is unclear.
Optimum Settings for Blogger Custom Robots Header Tags
Step 4:
This is the most important part of the entire process. Here we instruct the robots how to behave on our blog using a set of checkboxes. Take a look at the image below.
Step 5:
I recommend following the image above for the optimum Blogger custom robots header tag settings. It includes three sections: Homepage, Archive and Search pages, and Default for Posts and Pages. Have you checked all the boxes as shown above? If so, click the "Save changes" button and you are done.
Let's see what each of these three sections means
The Homepage Settings Explanation:
The homepage is the most important page of your blog, so we should not block anything on it from search engine crawlers. You will sometimes see advice to use "noimageindex" on the homepage, but in my opinion it is an unnecessary step, because Google already understands the structure of a blog and the role of the homepage.
The Archive and Search page Settings Explanation:
Archive and search pages are auto-generated pages that list parts of your blog posts, and they are useful because they help your readers navigate your blog more efficiently.
However, if you allow search engines to index these pages, the same content is exposed under different URLs, which can cause duplicate content issues for the blog and clutter the overall index status. For this reason, we should keep these pages out of the index.
The Default for Posts and Pages Settings Explanation:
This section controls crawler behaviour for the most important part of the blog: the posts and pages themselves. The whole point of blogging and creating great content is to let people discover our hard work through search engines, so we should not prevent search engines from indexing our posts and pages.
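If you want to confirm that these settings are actually being applied, you can fetch a few of your blog's pages and look at the robots directives each one declares. Below is a minimal sketch using only Python's standard library; the blog and post URLs are placeholders, and it assumes the directives show up as a `<meta name="robots">` tag in the page source and/or an `X-Robots-Tag` response header.

```python
# Minimal verification sketch. The URLs are placeholders -- replace them with
# pages from your own blog. Assumes robots directives appear either as a
# <meta name="robots"> tag in the HTML or as an X-Robots-Tag response header.
import re
import urllib.request

BLOG = "https://yourblog.blogspot.com"  # placeholder address

PAGES = {
    "homepage": f"{BLOG}/",
    "search":   f"{BLOG}/search?q=test",             # should carry noindex
    "post":     f"{BLOG}/2024/01/sample-post.html",  # placeholder post URL, should be indexable
}

def robots_directives(url):
    """Return the X-Robots-Tag header and any robots meta tags for a URL."""
    with urllib.request.urlopen(url) as resp:
        header = resp.headers.get("X-Robots-Tag")
        html = resp.read().decode("utf-8", errors="replace")
    metas = [tag for tag in re.findall(r"<meta[^>]+>", html, flags=re.I)
             if re.search(r"name=['\"]robots['\"]", tag, flags=re.I)]
    return header, metas

for label, url in PAGES.items():
    header, metas = robots_directives(url)
    print(f"{label}: header={header} meta={metas or '(none found)'}")
```

If the settings saved correctly, the search page should report a noindex directive while the homepage and post pages should not.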
List of Custom Robots Header Tags of Blogger
There are 10 custom robots header tags in Blogger. Let's see what each of them does; a short sketch of how these directives end up in your pages follows the list.
1. all:
This header tag allows search engine crawlers to visit and discover every single element of your blog. If you check and enable it, you give the crawler complete freedom to crawl and index everything.
2. noindex:
This option is mainly for private blogs. If you do not want to share a blog publicly, this is the one to use: enabling it prevents search engine crawlers from crawling and indexing the blog, so nobody will be able to discover that blog via search engines.
3. nofollow:
You may have heard the terms nofollow and dofollow before. They are a critical, complex, and often confusing SEO factor, because improper use of these tags can raise or lower your blog's search engine rankings. Blogger provides this option to let you mark all of the outbound links on your blog as nofollow if you wish.
4. none:
If you wish to apply both noindex and nofollow together, enable this option. It prevents search engine crawlers from indexing your pages and from treating your outbound links as dofollow.
5. noarchive:
This option controls the search engine cache permission. Search engines normally keep a cached version of your web pages and show a link to it on the search engine results page (SERP); the cache is a regularly updated copy of your pages that search engines can serve while your website is down. Enabling noarchive tells them not to offer that cached copy.
6. nosnippet:
Almost every search result includes a small text snippet on the SERP, which helps people get an idea of what the page is about. Enabling this option prevents search engines from showing that snippet for your pages.
7. noodp:
ODP stands for the "Open Directory Project" (DMOZ). Enabling this option tells search engines not to use your site's ODP/DMOZ listing information, such as its title and description, in search results.
8. notranslate:
This option lets you disable translation of your blog into other languages. Although machine translation is not one hundred percent readable, we should let our readers translate the page if they want to.
9. noimageindex:
Enabling this option prevents search engines from indexing the images on your blog. In my opinion it should be used with caution, because images are a major part of blogging and can bring organic traffic to your blog.
10. unavailable_after:
If you want pages to be removed from search results after a specific date and time, you can check this option and supply that date.
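For reference, the values you tick here are standard robots directives, and Blogger writes them into the pages it serves. The sketch below is only an illustration: the exact markup Blogger generates may differ, and the directive strings shown assume one common combination that matches the explanations above (everything indexable plus noodp on the homepage and posts, noindex plus noodp on archive and search pages). It prints the two usual forms a directive can take, an HTML meta tag and an X-Robots-Tag HTTP header.

```python
# Illustration only: how checkbox settings translate into robots directives.
# The exact output Blogger emits may differ from these examples.

# One common combination of settings, consistent with the explanations above.
RECOMMENDED = {
    "homepage":        "all, noodp",      # index everything, skip ODP descriptions
    "archive_search":  "noindex, noodp",  # keep duplicate-prone pages out of the index
    "posts_and_pages": "all, noodp",      # let posts and pages rank normally
}

def meta_tag(directives: str) -> str:
    """HTML meta-tag form of a robots directive string."""
    return f"<meta name='robots' content='{directives}'/>"

def http_header(directives: str) -> str:
    """HTTP response-header form of the same directives."""
    return f"X-Robots-Tag: {directives}"

for page, directives in RECOMMENDED.items():
    print(page)
    print(" ", meta_tag(directives))
    print(" ", http_header(directives))

# unavailable_after takes a date in a standard format such as ISO 8601,
# e.g. a page that should drop out of results after the end of 2025:
print(meta_tag("unavailable_after: 2025-12-31"))
```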
That's all for the blog-wide settings. Now let's check how it works on individual posts.
Step 1:
Select "Posts Section" And Click "New post".
Step 2:
Now look at the right-hand "Post settings" panel, find "Custom Robots Tags", and click on it.
You have successfully enabled custom robots header tags for your Blogger blog.
If you liked this article, don't forget to share it.