How to Solve the Couldn't Fetch Problem for Sitemaps in Google Search Console? The Correct Way to Use robots.txt in Blogger!



I had this problem on my sites for several weeks, and I finally solved it by changing the robots.txt file for my blog/site. The problem mainly arises from the robots.txt file, so modify it as shown in this post.

Your problem will definitely be solved, but wait 2-3 days after modifying the robots.txt file on your blog/site before submitting your sitemaps in Google Search Console.

This article will help you solve the Couldn't Fetch problem on your blog and provide the correct robots.txt file to add to your site/blog. Using this robots.txt file allows Google to crawl everything on your blog.

Photo: Sample of the Couldn't Fetch problem in Google Search Console.



Step-by-Step Guide to Solve the COULDN'T FETCH Problem in Google Search Console:-


#1. First of all, REMOVE any custom robots.txt file already present on your blog/site. We will add the new one next.





#2. Go to this website ( https://www.labnol.org/blogger/sitemap/ ) to create a sitemap for your blog/site. Enter your site/blog link in the sitemap generator and copy the generated code into Notepad.

Copy only the part shown below from the sitemap generator. The code looks like this:-

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.your-site-name/atom.xml?redirect=false&start-index=1&max-results=500





#3. From the copied code above, REMOVE the line ( Disallow: /search ). The modified robots.txt file is shown below:-

User-agent: *
Allow: /
Sitemap: https://www.your-site-name/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://www.your-site-name/atom.xml?redirect=false&start-index=501&max-results=1000


This is the correct robots.txt file; it allows Googlebot to crawl everything on your site/blog so that Google Search Console can fetch your sitemaps.
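For reference, here is a minimal sketch (not part of the original guide) of how those Sitemap: lines are built. The Blogger Atom feed used here returns posts in blocks, so a blog with more than 500 posts needs one Sitemap: line per block of 500, stepped by start-index. The domain and post count below are placeholders, and the exact max-results value on later lines may differ from what the generator emits:

# Sketch in Python: build one "Sitemap:" line per block of 500 posts.
# "https://www.your-site-name" and the post count are placeholders.
def sitemap_lines(blog_url, total_posts, page_size=500):
    lines = []
    start = 1
    while start <= total_posts:
        lines.append(
            f"Sitemap: {blog_url}/atom.xml"
            f"?redirect=false&start-index={start}&max-results={page_size}"
        )
        start += page_size
    return lines

# Example: a blog with roughly 800 posts needs two Sitemap: lines.
for line in sitemap_lines("https://www.your-site-name", 800):
    print(line)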


#4. Copy the modified robots.txt code for your blog/site; it has to be pasted into your blog's custom robots.txt setting.

The Couldn't Fetch problem mainly arises from this line of the robots.txt: ( Disallow: /search )
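To see the effect of that line, here is a small optional check, a sketch using Python's standard urllib.robotparser; the label URL below is a placeholder for any page of your blog under /search:

# Sketch: parse the old and new robots.txt rules with Python's standard
# library and see which URLs Googlebot may fetch.
# "https://www.your-site-name" is a placeholder for your blog's address.
from urllib.robotparser import RobotFileParser

OLD_RULES = """User-agent: *
Disallow: /search
Allow: /"""

NEW_RULES = """User-agent: *
Allow: /"""

def can_fetch(rules, url):
    parser = RobotFileParser()
    parser.parse(rules.splitlines())
    return parser.can_fetch("Googlebot", url)

label_page = "https://www.your-site-name/search/label/SEO"
print(can_fetch(OLD_RULES, label_page))   # False -> blocked by "Disallow: /search"
print(can_fetch(NEW_RULES, label_page))   # True  -> allowed after removing that line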


#5. Now paste the modified robots.txt into your blog and go to Google Search Console to add the sitemaps.

Remove the previously added sitemaps that showed the Couldn't Fetch problem.
Add the new sitemaps only 2-3 days after modifying the robots.txt.
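Before adding the sitemap in Search Console, you can optionally confirm that the sitemap URL itself is reachable. This is a small sketch using only Python's standard library; the URL is a placeholder for your own blog's sitemap line:

# Sketch: confirm the sitemap URL responds with HTTP 200 and returns an
# Atom feed before (re)submitting it in Google Search Console.
from urllib.request import Request, urlopen

sitemap_url = ("https://www.your-site-name/atom.xml"
               "?redirect=false&start-index=1&max-results=500")

request = Request(sitemap_url, headers={"User-Agent": "sitemap-check"})
with urlopen(request, timeout=10) as response:
    body = response.read(2048).decode("utf-8", errors="replace")
    print("HTTP status:", response.status)          # expect 200
    print("Looks like an Atom feed:", "<feed" in body)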




#6. After 2-3 days, Google Search Console will show Success for your sitemaps. You can also try submitting immediately; if it is not solved right away, try again after 2-3 days. This will definitely solve the Couldn't Fetch problem for your sitemaps in Google Search Console.

Remember, you have to remove the previously added Sitemap link before adding the new one.
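If Search Console still shows Couldn't Fetch after the wait, first confirm that the modified robots.txt is actually live on your blog. Below is a quick optional check, a Python sketch with a placeholder domain:

# Sketch: fetch your blog's live robots.txt and confirm the
# "Disallow: /search" line is gone. Replace the placeholder domain with yours.
from urllib.request import urlopen

with urlopen("https://www.your-site-name/robots.txt", timeout=10) as response:
    robots_txt = response.read().decode("utf-8")

print(robots_txt)
print("Still blocking /search:", "Disallow: /search" in robots_txt)  # should be False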




If the problem is still not solved after 2-3 days, mail me at helpforag@gmail.com and I will help you further by mail.


Also, watch this step-by-step guide video for details:-


