Thursday, March 21, 2019

How to Add Custom Robots.txt File in Blogger

How do you add a custom robots.txt file in Blogger? Welcome! In this article I will show you how to add a custom robots.txt file to your Blogger-hosted website.


A custom robots.txt file helps your website get indexed. Its main purpose is to set which pages of your website should be crawled and indexed. When spiders and bots arrive at your Blogger site, they first look for this robots.txt file, which tells them which pages they may crawl and index.

Also Read: 
How to put ads in between posts

This lets the pages you choose get indexed from your site. Crawling and indexing happen only after you submit your sitemap to Google Webmaster Tools (now Google Search Console), Bing Webmaster Tools, or another webmaster tool.
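For context, a sitemap is just an XML list of your post URLs that crawlers read to discover your pages. A minimal sketch of how a crawler might extract those URLs, using a made-up sample document (the blog name and post URLs below are placeholders, not real addresses):

```python
import xml.etree.ElementTree as ET

# A small sample of the XML a sitemap serves (placeholder URLs for illustration).
sample_sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://my-blog-name.blogspot.com/2019/03/first-post.html</loc></url>
  <url><loc>https://my-blog-name.blogspot.com/2019/03/second-post.html</loc></url>
</urlset>"""

# The sitemap XML uses the sitemaps.org namespace, so we must register it for findall().
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sample_sitemap)

# Collect every <loc> entry: these are the post URLs a crawler will visit.
post_urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(post_urls)
```

This is why submitting the sitemap matters: without it, crawlers have to discover your posts one link at a time.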

So, let's get started on how to create a custom robots.txt file.

Step 1:

Copy the code below:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://my-blog-name.blogspot.com/sitemap.xml

Here, change my-blog-name to your own blog's name.

Now, go to Search Preferences in the Settings section.

In the Crawlers and Indexing section, enable Custom robots.txt and paste the code above.


#User-agent: Mediapartners-Google :- this record is for Google AdSense, so Google can crawl your blog to serve better-targeted ads (the empty Disallow blocks nothing for it).
#User-agent: * :- the rules that follow apply to all other search engine crawlers, so your blog is visible to all search engines.
#Disallow: /search :- this line tells search engines not to crawl search-result and label pages (any URL beginning with /search).
#Allow: / :- this line tells search engines they may crawl the homepage and every page that is not disallowed.
#Sitemap :- this line points crawlers to our Blogger blog's sitemap so they can find all our posts.
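You can check that these rules behave as described using Python's built-in urllib.robotparser module. This is just a sketch assuming the standard Blogger rules from this article; my-blog-name is a placeholder, not a real blog:

```python
import urllib.robotparser

# The robots.txt rules from this article (my-blog-name is a placeholder).
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://my-blog-name.blogspot.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Search/label pages are blocked for regular crawlers...
print(parser.can_fetch("*", "https://my-blog-name.blogspot.com/search/label/news"))

# ...but ordinary posts and the homepage are allowed.
print(parser.can_fetch("*", "https://my-blog-name.blogspot.com/2019/03/some-post.html"))

# The AdSense crawler (Mediapartners-Google) is allowed everywhere.
print(parser.can_fetch("Mediapartners-Google", "https://my-blog-name.blogspot.com/search/label/news"))
```

If the first check prints False and the other two print True, the rules are doing exactly what the explanations above describe.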
