One of the first things you should do after creating a blog on Google’s Blogger platform is to add an XML sitemap file to it. This file helps search engines like Google and Bing find content pages on your blog that the automatic crawling bots, which visit websites on a schedule, may otherwise miss.
In general, an XML sitemap is a list, or directory, of your blog’s pages along with basic data about each one, such as its URL and the date it was posted or last updated. For blogs created and hosted on the Blogger platform, however, the default sitemap lists only your 26 most recent posts.
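To give a sense of what search engines read from this file, here is a minimal sitemap fragment following the standard sitemaps.org protocol; the blog address and post URL are placeholders, not taken from any real blog:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per blog post; a default Blogger sitemap
       holds at most 26 of these entries. -->
  <url>
    <loc>https://example.blogspot.com/2013/05/sample-post.html</loc>
    <lastmod>2013-05-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry tells the crawler where a page lives (`<loc>`) and when it last changed (`<lastmod>`), so the search engine knows whether it needs to re-crawl that page.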
If you publish only two or three posts a week, everything should be fine: the Google bot will crawl your website and index all the pages. Things get complicated if you publish 10 or more posts per day, because some of them will very likely fall outside the default XML sitemap’s 26-post limit and never get indexed.
Here is how you can add a complete XML sitemap to your blog so that search engines can always find and index all of your content pages. Don’t worry if you use a paid custom domain: this trick works for both third-party domains and blogspot.com-hosted blogs, as long as the paid one still runs on the Blogger platform.
Go to the Sitemap Generator, paste in the address of your blog, and click the Create Sitemap button. This very neat tool, created by Amit Agarwal, a popular blogger from India, reads your blog and generates the information required for a complete sitemap.
Next, log in to your Blogger dashboard and open the Search Preferences option found in the Settings menu. There, enable Custom robots.txt in the Crawling and Indexing section. Once that is done, go back to the Sitemap Generator, copy the information it generated, and paste it into the custom robots.txt field.
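For reference, the text you paste into the custom robots.txt field typically looks something like the sketch below. The exact Sitemap lines come from the generator itself, and `example.blogspot.com` here is only a placeholder for your own blog address:

```
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
```

The key part is the Sitemap line, which points crawlers at a feed covering up to 500 posts; for larger blogs the generator adds further Sitemap lines with higher start-index values so that every post is covered.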
The last step is to save the custom robots.txt so that Google can discover your complete XML sitemap and index all the pages on your blog.