Custom robots.txt Setting | Blogger Setting

Hi friends, today I am going to tell you how to set up the Custom robots.txt setting in Blogger.
This is one of the most important settings in Blogger, because how search engines crawl and index your blog largely depends on your custom robots.txt setting.


What is Robots.txt?


Robots.txt is a text file which contains a few lines of simple code.

It is saved on your website or blog's server, and it instructs the search crawlers how to index and crawl your blog in the search results.

That means you can restrict any web page on your blog from search crawlers so that it does not get indexed in search engines, such as your blog's labels page, your demo page, or any other pages that are not important enough to get indexed.

Always remember that search crawlers scan the robots.txt file before crawling any page.
Explanation

This code is divided into three sections. Let's first study each of them, and after that we will learn how to add a custom robots.txt file to Blogspot blogs.
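
For reference, a typical default custom robots.txt in Blogger, which this explanation is based on, looks like the sketch below (example.blogspot.com is only a placeholder for your own blog address):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED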

User-agent: Mediapartners-Google


This code is for the Google AdSense robots, which helps them serve better ads on your blog. Whether you are using Google AdSense on your blog or not, simply leave it as it is.

User-agent: *


This is for all robots, marked with an asterisk (*). In the default settings, our blog's label links are restricted from being indexed by the search crawlers, which means the search engines will not index our labels page links because of the code below.

Disallow: /search


That means the links having the keyword "search" just after the domain name will be ignored. See the example below, which is a link to a label page named SEO.

http://www.spymock.blogspot.com/search/label/SEO

And if we remove Disallow: /search from the above code, then crawlers will access our entire blog to index and crawl all of its content and pages.

Here Allow: / refers to the homepage, which means the search crawlers can crawl and index our blog's homepage.

Disallow a Particular Post


Now suppose we want to exclude a particular post from indexing; then we can add the line below to the code.

Disallow: /yyyy/mm/post-url.html

Here yyyy and mm refer to the publishing year and month of the post respectively. For example, if we published a post in the year 2013 in the month of March, then we have to use the format below.

Disallow: /2013/03/post-url.html

To make this task easy, you can simply copy the post URL and remove the blog address from the beginning.

Disallow a Particular Page


If we need to disallow a particular page, then we can use the same method as above. Simply copy the page URL and remove the blog address from it, which will look something like this:

Disallow: /p/page-url.html
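
As a quick sketch, if we wanted to block both the example post from March 2013 and a particular page at the same time (both URLs here are placeholders, not real pages), the User-agent: * section of the file would look something like this:

User-agent: *
Disallow: /search
Disallow: /2013/03/post-url.html
Disallow: /p/page-url.html
Allow: /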

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED

This code refers to the sitemap of our blog. By adding the sitemap link here, we are simply improving our blog's crawl rate.

It means that whenever the search crawlers scan our robots.txt file, they will find a path to our sitemap, where all the links of our published posts are present.

Search crawlers will find it easy to crawl all of our posts.

Hence, there are better chances that the search crawlers will crawl all of our blog posts without missing a single one.

Note: This sitemap will only tell the search crawlers about the most recent 25 posts. If you want to increase the number of links in your sitemap, then replace the default sitemap with the one below. It will work for the first 500 recent posts.

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

If you have more than 500 published posts on your blog, then you can use two sitemaps like the ones below:

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=1000
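
Putting it all together, a complete custom robots.txt for a blog with more than 500 published posts would look roughly like the sketch below (again, example.blogspot.com is only a placeholder for your own blog address):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=1000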



How to add Custom robots.txt Setting | Blogger Setting


Step 01:- Go to your Blogger dashboard.
Step 02:- Click on Settings.
Step 03:- Click on Search preferences.



Step 04:- Click on Custom robots.txt and click on the Yes button.
Step 05:- Paste the code given below.


User-agent: *
Disallow: /search
Allow: /
Sitemap: https://spymock.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Important: replace the URL in the Sitemap line (spymock.blogspot.com) with your own website URL.

Step 06:- Click on the Save button. 

Now you have successfully finished your custom robots.txt setting.
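
To confirm the change, you can open your blog's robots.txt file directly in a browser and check that it shows the code you just pasted (replace the address below with your own blog's URL):

http://yourblog.blogspot.com/robots.txt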

This was the complete tutorial on how to add a custom robots.txt file in Blogger.
I tried my best to make this tutorial as simple and informative as possible.

But still, if you have any doubt or question, then feel free to ask me in the comment section below.

Make sure not to put any code in your custom robots.txt settings without understanding it. Simply ask me to clear up your doubts.

I'll explain everything to you in detail.
Thanks a lot, guys, for reading this tutorial. If you liked it, then please support me in spreading the word by sharing this post on your social media profiles.
Happy Blogging!


If you like this content, then please share it with your best friends on social media.

I hope you like this...

Thank you for reading this article. Don't forget to subscribe with your email address in the subscription box.

Thank you so much...
