It is important to configure the proper SEO settings for Blogspot blogs. In this post, we are covering BlogSpot Advanced SEO: custom robots.txt file settings, and it is very helpful to follow this tutorial if you are using the Blogger platform.

Compared to WordPress, Blogger also offers some features for SEO, but we need to enable the advanced SEO settings ourselves. One of them is the custom robots.txt in Blogger, so let's learn how to enable custom robots.txt and robots header tags in Blogger.

How To Enable Advanced SEO Settings In Blogger


Enabling these settings in Blogger is a very simple process. First, log in to your Blogger dashboard, then click on Settings as shown in the image below. Click on Search Preferences in order to enable the SEO settings.

Blogger >> Settings >> Search Preferences
[Image: custom robots settings in Search Preferences]

Then look for the Custom robots.txt section. Here you need to enable that option; as soon as you do, a box appears in that section. See the image below for a better understanding.

[Image: my blog's robots.txt file]

Before beginning this tutorial, we need to understand the details of the robots.txt file, because that knowledge is crucial for customizing the file to our needs.

What Is Robots.txt?


Robots.txt is a text file that contains a few simple lines of code. It is saved on a website or blog and gives instructions to web crawlers about how to crawl and index the blog in search results. This means we are able to restrict any page on our blog, such as demo pages or label pages, from web crawlers.

Every blog hosted on Blogger has a default robots.txt file like the one below.

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.yourdomain.com/atom.xml?redirect=false&start-index=1&max-results=500

Explanation of the Above Code

Basically, we can divide the robots.txt code into three parts; here is a simple explanation of each.

User-agent: Mediapartners-Google


This directive is for Google AdSense, so it can serve better ads on your website or blog. If you are not using AdSense, simply remove these two lines.
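
For example, if you remove the AdSense block, the rest of the default file stays the same. This is only a sketch; keep your own blog address in the Sitemap line:

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.yourdomain.com/atom.xml?redirect=false&start-index=1&max-results=500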

User-agent: *


This rule applies to all robots, which is what the asterisk (*) means. With Blogger's default settings, our blog's label pages are restricted from being indexed by search engines. This happens because of the code below.

Disallow: /search

If you remove the line above, web crawlers will crawl our entire blog and all of its content.
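
As a sketch, a custom robots.txt that lets crawlers index everything, including label pages, could look like the following. Keep the AdSense block only if you need it, and use your own domain in the Sitemap line:

User-agent: *
Allow: /
Sitemap: https://www.yourdomain.com/atom.xml?redirect=false&start-index=1&max-results=500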

If you want to prevent crawlers from indexing a particular post, you can use the code below.

Disallow: /yyyy/mm/post-url.html
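
For instance, to block a hypothetical post published in March 2017 and a hypothetical static page (Blogger static pages live under /p/), the lines would look like the ones below. These URLs are placeholders, not real pages:

Disallow: /2017/03/my-demo-post.html
Disallow: /p/demo-page.html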

Sitemap: https://www.yourdomain.com/atom.xml?redirect=false&start-index=1&max-results=500


This refers to the sitemap of your blog, that is, the file which lists our blog post links. Web crawlers can easily find our posts by using this line in the robots.txt file.
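
The default feed URL above returns at most 500 posts (the max-results parameter). If your blog also serves a regular sitemap at /sitemap.xml, which recent Blogger blogs generally do, a simpler Sitemap line is possible; treat this as an option to verify on your own blog:

Sitemap: https://www.yourdomain.com/sitemap.xml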

How To Check Your Robots.txt File


Simply use the link format below to check your blog's robots.txt file; in other words, add /robots.txt to the end of your blog address.

https://www.yourdomain.com/robots.txt

The next part is setting the robots header tags; check the image above to enable the robots header tags in the Blogger dashboard. In our next post, we will cover the custom robots header tags settings for Blogger.
