Multiple Sitemaps
-
Hello everyone!
I am in the process of updating the sitemap of an ecommerce website, and I was thinking of uploading three different sitemaps for different parts of the site (general, categories and subcategories, product groups and products) in order to keep them easy to update in the future.
Am I allowed to do so? Would that be a good idea?
Open to suggestions.
-
Right! I think I now have the complete picture and I can crack on with it!
Thank you very much indeed!
Best Regards
Oscar
-
If you are talking about the sitemap for visitors to your website: if you think the newly added pages will be helpful to them, you can update your visitor sitemap accordingly. The sitemap.xml file, however, is a supplemental indexing tool meant to help search engines find the pages on your website easily; it needs to be updated and resubmitted to the search engines through your webmaster tools accounts whenever new pages are added to your website.
Hope that helps.
Best,
Devanur Rafi
-
Thanks a lot guys!
I really appreciate your help, although all this information made me realize I have tons of work to do updating the sitemaps, and I have to start creating new ones.
Just one more question: after I create the new sitemaps, I will also have to update the sitemap on the website, is that right?
-
They should be added to the end of your robots.txt file, each preceded by 'Sitemap:', like:
Sitemap: http://www.example.com/sitemap1.xml
Sitemap: http://www.example.com/sitemap2.xml
-
No problem, my friend. You are most welcome. Yes, you just need to give the location of your sitemap.xml file as given below:
Sitemap: http://example.com/sitemap_location.xml
Here you go for more: https://support.google.com/webmasters/answer/183669?hl=en
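To illustrate, a complete robots.txt referencing multiple sitemaps might look like the sketch below (the Disallow rule and sitemap file names are just placeholders, not from this thread). Sitemap lines are independent of the User-agent groups, so they can simply be appended at the end of the file:

```
User-agent: *
Disallow: /cart/

Sitemap: http://example.com/sitemap-general.xml
Sitemap: http://example.com/sitemap-categories.xml
Sitemap: http://example.com/sitemap-products.xml
```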
-
Oh I see, thank you very much for your help; I haven't got much experience dealing with sitemaps.
So, in order to put them in the robots.txt file, I just have to put the links in it without anything else, is that right?
-
Hi there, the robots.txt file is one of the first things search engine spiders look at when they visit your website, and a reference to the sitemap.xml file in there will help the spiders quickly reach the important URLs on your website then and there.
Best,
Devanur Rafi
-
Why should I put the sitemaps in the robots.txt file?
I've been looking around, and some sites do while others don't. What's the reason for it?
-
Thanks for the response, my friend. The problem without an index sitemap file is that when you have to resubmit multiple sitemap.xml files in your webmaster tools account, you have to resubmit each of them one at a time. With an index sitemap file, you just need to submit the index file and it takes care of the job.
Here you go for more: https://support.google.com/webmasters/answer/71453?hl=en
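If the index file is regenerated whenever the individual sitemaps change, resubmitting stays a one-step job. A rough sketch of such a generator in Python, using only the standard library (the domain and sitemap file names here are placeholders, not from this thread):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Build a sitemap index document listing the given sitemap URLs."""
    # Register the sitemap namespace as the default prefix so tags
    # serialize without a prefix (e.g. <sitemapindex xmlns="...">).
    ET.register_namespace("", SITEMAP_NS)
    root = ET.Element("{%s}sitemapindex" % SITEMAP_NS)
    for url in sitemap_urls:
        entry = ET.SubElement(root, "{%s}sitemap" % SITEMAP_NS)
        ET.SubElement(entry, "{%s}loc" % SITEMAP_NS).text = url
    body = ET.tostring(root, encoding="unicode")
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + body

index_xml = build_sitemap_index([
    "http://example.com/sitemap-general.xml",
    "http://example.com/sitemap-categories.xml",
    "http://example.com/sitemap-products.xml",
])
print(index_xml)
```

You would then write the output to something like /sitemap_index.xml and submit only that one URL in webmaster tools.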
Best,
Devanur Rafi
-
You don't actually need to use a sitemap index file to use multiple sitemaps. You can list them in your robots.txt file and submit them separately in Google Webmaster Tools.
-
Yes, this is fine. From Google:
"Whether you list all URLs in a single Sitemap or in multiple Sitemaps (in the same directory or different directories) is simply based on what's easiest for you to maintain. We treat the URLs equally for each of these methods of organization." More info can be found here: multiple sitemaps in the same directory
-
Hi there, though a single sitemap.xml file can accommodate up to 50K URLs, it is not uncommon to go for multiple sitemap.xml files for many purposes, even with only a few hundred URLs in each.
You would end up with a total of 4 sitemap files: your 3 sitemap.xml files, plus an index sitemap that lists the URLs of the other 3.
Here you go for more: http://www.sitemaps.org/protocol.html
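The index sitemap itself is just a small XML file listing the other sitemaps. A minimal sketch following the sitemaps.org protocol (the file names and dates here are only placeholders; the lastmod element is optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap-general.xml</loc>
    <lastmod>2014-05-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemap-products.xml</loc>
  </sitemap>
</sitemapindex>
```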
Best,
Devanur Rafi