Easily Access Your robots.txt And sitemap.xml Through Amazon S3

We are pleased to announce that your store’s robots.txt and sitemap.xml files have been moved to the cloud and are now available on Amazon S3, so you can access them easily. Earlier, they were stored on the KartRocket server. This change will benefit you in the following ways:

1) Less Load On Your Server. Since these files are now stored on S3, there is less load on the KartRocket server.

2) Faster Loading Time. Amazon S3 is an advanced object storage service that lets you retrieve any amount of data from anywhere with ease.

3) Send Files Anywhere Easily. Want to send your robots.txt file or XML sitemap to your SEO analyst for consultation? All you need to do is copy the URL and send it across. It’s quick and easy.

How To Access These Files On The Cloud?

Follow these steps to access your sitemap file:

1) Log in to your KartRocket store. Go to SEO and click on Manage Sitemap.xml.


2) The following screen will appear. You can either click on “Existing Sitemap” or choose one of the uploaded files and click the down-arrow action button.


3) Your robots.txt file or XML sitemap will open in a new tab. You can also copy the link to share it with anyone easily.

Why Do I Need These Files On My Website?

If you are wondering what these files do and why your online store needs them, here is a brief overview.


Whenever Google or any other search engine wants to rank your website, it crawls every page of your website available online, indexes them, and then ranks your site accordingly. The robots.txt file tells these search engines which pages to index and which ones to skip. A skipped page will not be shown on search engine results pages, so users will not be able to find it there. This file is essential for both your website’s privacy and its SEO.

Why would anyone want to keep pages out of their users’ sight? Well, there are reasons for that:

• If you have a page with duplicate content, you don’t want search engines to index it, as it will hurt your SEO.

• There are pages, like a Thank You page, which you don’t want users to access unless a specific action has been taken.

• There are many private files which you would like to protect.
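To illustrate, here is a minimal robots.txt covering the cases above. The paths and domain are hypothetical placeholders, not your store’s actual configuration:

```txt
# Apply these rules to all crawlers
User-agent: *
# Keep post-action and private pages out of the index
Disallow: /checkout/thank-you
Disallow: /private/
# Point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Search engines read this file from the root of your domain before crawling the rest of the site.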


If you want to rank easily on popular search engines, you will need an XML sitemap. It is a document that helps Google and other search engines understand your website. It lists all the URLs you want Google to index, ensuring that search engines do not miss any important information while indexing, which could hurt your SEO. Besides the URLs, it also provides information such as the date of the last website update and the frequency of changes to the website.
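For reference, a minimal sitemap.xml with a single URL entry looks like this (the URL, date, and frequency values are placeholder examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/shoes</loc>
    <lastmod>2017-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each `<url>` entry carries the page address plus the optional last-modified date, change frequency, and priority hints mentioned above.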

The difference between robots.txt and sitemap.xml is that the former tells search engines what not to index, while the latter tells them what to index.
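As a quick sketch of how crawlers actually apply robots.txt rules, the snippet below uses Python’s standard-library `urllib.robotparser` to parse a sample rule set and check two URLs. The rules and URLs are hypothetical examples, not your store’s real configuration:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt: block the checkout pages, allow everything else
rules = """\
User-agent: *
Disallow: /checkout/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A regular product page is allowed for any crawler ("*")
print(parser.can_fetch("*", "https://example.com/products/shoes"))
# A checkout page matches the Disallow rule and is blocked
print(parser.can_fetch("*", "https://example.com/checkout/thank-you"))
```

This is the same logic a well-behaved crawler follows before fetching a page.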

