Crawlers & Indexing Settings of Blogger | Enter Robots.txt in Blogger
This blog post will teach you how to set up the Crawlers & Indexing settings in Blogger.
We will divide this section into three parts and set up each of them.
First, go to the Blogger Settings page and find these options:
- Custom Robots.txt
- Custom Robots Header Tags
- Google Search Console
Crawlers and indexing
01. Custom Robots.txt
The most important part of the Crawlers & Indexing settings in Blogger is the custom robots.txt. Robots.txt is a simple text file that tells search engine crawlers and robots which pages, sections, folders, and URLs of a website they are allowed to crawl and which they are not.
Before a web robot or crawler visits a URL, it first reads robots.txt and checks whether it is permitted to crawl that URL.
It is not good SEO practice to let crawlers and web robots unnecessarily crawl admin-side folders and URLs, or public URLs with search labels; it can hurt the crawl efficiency and search performance of your Blogger blog.
Here is a simple and effective robots.txt for a Blogger blog. Copy the code and paste it into the Custom robots.txt field, following the steps below.
In the Sitemap line, replace “example” with your own Blogger domain name, for example “https://techz24web.blogspot.com”.
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://example.blogspot.com/sitemap.xml
- The first two lines, User-agent: Mediapartners-Google and the empty Disallow:, tell Google's AdSense crawler that it may crawl every page, so if the blog is monetized with Google AdSense, ads can appear on all pages of the blog.
- The lines User-agent: * and Disallow: /search tell all other web robots to ignore Blogger URLs containing “/search”, such as “https://www.example.blogspot.com/search/label/Blogger”.
- Allow: / tells web robots that they may visit the homepage and every other page of your blog that is not disallowed in robots.txt.
- The Sitemap line points to an XML file that tells search engines like Google and Bing which pages on the site are available for crawling. (A quick way to test these rules is sketched after this list.)
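To see how a crawler actually interprets these rules, you can test them with Python's built-in urllib.robotparser. This is only a rough sketch: the example.blogspot.com domain and the sample post URL are placeholders, so swap in your own blog address before running it.

from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt of your blog
# ("example" is a placeholder domain used throughout this sketch).
parser = RobotFileParser()
parser.set_url("https://example.blogspot.com/robots.txt")
parser.read()

# A search/label URL should be blocked for ordinary crawlers.
print(parser.can_fetch("*", "https://example.blogspot.com/search/label/Blogger"))  # False

# A normal post URL (hypothetical path) stays crawlable.
print(parser.can_fetch("*", "https://example.blogspot.com/2024/01/sample-post.html"))  # True

# The AdSense crawler, Mediapartners-Google, is allowed everywhere.
print(parser.can_fetch("Mediapartners-Google", "https://example.blogspot.com/search/label/Blogger"))  # True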
Follow these steps to add the above code to the robots.txt file.
- First, turn on Custom robots.txt.
- Click Custom robots.txt.
- Paste the Custom robots.txt code given above into the text area.
- Click SAVE.
- To check your Custom robots.txt, open your blog, add /robots.txt to the end of your blog URL, and press Enter, e.g. https://example.blogspot.com/robots.txt. You will see the same Custom robots.txt code on your browser page.
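If you prefer to check it from a script, the small sketch below fetches the live robots.txt and the sitemap it references, using only Python's standard library. The domain is again a placeholder; also note that Blogger's /sitemap.xml may be an index that points to sub-sitemaps rather than a flat list of posts.

import urllib.request
import xml.etree.ElementTree as ET

blog = "https://example.blogspot.com"  # placeholder: use your own domain

# Print the live robots.txt — the same text you see in the browser.
with urllib.request.urlopen(blog + "/robots.txt") as resp:
    print(resp.read().decode("utf-8"))

# Fetch the sitemap referenced in robots.txt and list its <loc> entries.
# For a Blogger blog this may be an index pointing to sub-sitemaps.
with urllib.request.urlopen(blog + "/sitemap.xml") as resp:
    root = ET.fromstring(resp.read())

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
for loc in root.findall(".//sm:loc", ns):
    print(loc.text)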
02. Custom Robots Header Tags
In this part of the blog post, you will learn how to set up Custom Robots Header Tags.
There are 3 types of Custom Robots Header Tags in Blogger:
- Home page tags
- Archive and search page tags
- Post and page tags
01. Home page tags
Click Home page tags and follow the screenshot to complete your Blogger setting.
First, enable Custom robots header tags.
- Enable the all and noodp buttons.
- Click SAVE
02. Archive and search page tags
- Enable the noindex and noodp buttons.
- Click SAVE
03. Post and page tags
- Enable the all and noodp buttons.
- Click SAVE.
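Once the header tag settings are saved, you can roughly confirm what your blog actually serves. The sketch below uses only Python's standard library and a placeholder domain; it prints the X-Robots-Tag response header (if any) and any robots-related meta tags in the homepage HTML. How Blogger exposes these tags can vary, so treat this as an informal check rather than an official test.

import re
import urllib.request

url = "https://example.blogspot.com/"  # placeholder: use your own blog URL

with urllib.request.urlopen(url) as resp:
    # Header tags may be sent as an X-Robots-Tag HTTP header...
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))
    html = resp.read().decode("utf-8", errors="replace")

# ...and/or appear as robots meta tags inside the page HTML.
for tag in re.findall(r"<meta[^>]*robots[^>]*>", html, flags=re.IGNORECASE):
    print(tag)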