Easily Access Your robots.txt and sitemap.xml Through Amazon S3

We are pleased to announce that your store’s robots.txt and sitemap.xml files have been moved to the cloud and are now available on Amazon S3, making them easy to access. Previously, they were stored on the KartRocket server. This change benefits you in the following ways:

1) Less Load On Your Server. Since these files are now stored on S3, the load on the KartRocket server is reduced.

2) Faster Loading Time. Amazon S3 is a high-performance object storage service that lets you retrieve any amount of data from anywhere, quickly.

3) Send Files Anywhere Easily. Want to send your robots.txt file or XML sitemap to your SEO analyst for a consultation? Just copy the URL and send it across. It’s quick and easy.

How To Access These Files On The Cloud

Follow these steps to access your robots.txt and sitemap.xml files:

1) Log in to your KartRocket store. Go to SEO and click on Manage Sitemap.xml.


2) The following screen will appear. Either click on “Existing Sitemap,” or choose one of the uploaded files and click the down-arrow action button.


3) Your robots.txt file or XML sitemap will open in a new tab. You can also copy the link to share it with anyone easily.

Why Do I Need These Files On My Website?

If you are wondering what these files do and why your online store needs them, here is a brief overview.


Whenever Google or any other search engine wants to rank your website, it crawls every publicly available page of your site, indexes them, and then ranks your site accordingly. The robots.txt file tells these search engines which pages to index and which ones to skip. A skipped page will not appear on search engine results pages, so users will not be able to find it there. This file is essential both for your website’s privacy and for its SEO.

Why would anyone want to keep pages out of their users’ sight? There are a few good reasons:

• If you have a page that has duplicate content, you don’t want search engines to index it as it will hurt your SEO.

• Some pages, like a Thank You page, should not be reachable unless a specific action has been taken.

• There are many private files that you would like to protect.
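The rules above are easy to express in a robots.txt file. The sketch below, using Python’s standard-library `urllib.robotparser`, shows a minimal robots.txt for a hypothetical store at `example-store.com` (the domain, paths, and rules are illustrative assumptions, not your store’s actual file) and checks which pages a crawler is allowed to fetch:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt for a hypothetical store. The Disallow rules hide
# the thank-you page and the admin area from all crawlers ("User-agent: *").
rules = """\
User-agent: *
Disallow: /thank-you
Disallow: /admin/
Sitemap: https://example-store.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Crawlers may fetch the product page, but not the disallowed ones.
print(parser.can_fetch("*", "https://example-store.com/products/shoes"))  # True
print(parser.can_fetch("*", "https://example-store.com/thank-you"))       # False
```

This is the same parser that well-behaved crawlers apply to your live file, so it is a handy way to sanity-check rules before publishing them.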


If you want to rank well on popular search engines, you will need an XML sitemap. It is a document that helps Google and other search engines understand your website. It lists all the URLs you want Google to index, ensuring that search engines do not miss any important pages while crawling, which could hurt your SEO. Besides the URLs, it can also carry metadata such as the date each page was last updated and how frequently it changes.
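A sitemap follows the standard sitemaps.org XML schema. The sketch below builds a minimal one with Python’s standard-library `xml.etree.ElementTree`; the store URL, dates, and change frequencies are hypothetical placeholders:

```python
import xml.etree.ElementTree as ET

# The sitemaps.org namespace required on the root <urlset> element.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

# Hypothetical pages: (URL, last-modified date, expected change frequency).
pages = [
    ("https://example-store.com/", "2024-01-15", "daily"),
    ("https://example-store.com/products/shoes", "2024-01-10", "weekly"),
]
for loc, lastmod, changefreq in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc          # required: the page address
    ET.SubElement(url, "lastmod").text = lastmod  # optional metadata
    ET.SubElement(url, "changefreq").text = changefreq

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

Each `<url>` entry carries one page plus its optional metadata; the resulting document is what you would serve at `/sitemap.xml` (KartRocket generates this for you, so the sketch is only to show the structure).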

The difference between robots.txt and sitemap.xml is that the former tells search engines what not to index, while the latter tells them what to index.


Saahil Goel

Founder & CEO at Shiprocket

Saahil Goel is Co-Founder and CEO of Shiprocket, a data-driven logistics aggregation platform that drives efficiency in India’s eCommerce logistics by connecting online retailers with logistics providers.

