Sitemap.xml - autogenerated by CMS is full of crud
-
Hi all,
hope you can help.
The Magento ecommerce system I'm working with autogenerates sitemap.xml - it's well formed, with priority and frequency parameters.
However, it has generated lots of URLs pointing to broken pages returning fatal errors, duplicate URLs (not canonicals), 404s, etc.
I'm thinking of hand-creating sitemap.xml - the site has around 50 main pages including products and categories, and I can get the main page URLs listed by Screaming Frog or Xenu.
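For what it's worth, a hand-built sitemap for a site that size is a small file - here's a minimal sketch of the format (the URLs and values are placeholders, not your real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical page; lastmod, changefreq and priority are optional -->
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/category/widgets</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```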
Then I'll have to get into hand-editing: adding noindex to the crud pages and canonical tags to the useful duplicates.
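The tags for that cleanup are one-liners in each page's head section - something like this (the canonical URL here is just a placeholder for whichever version you pick as the preferred one):

```html
<!-- On a crud page you want kept out of the index: -->
<meta name="robots" content="noindex, follow">

<!-- On a useful duplicate, pointing at the preferred version: -->
<link rel="canonical" href="https://www.example.com/category/widgets">
```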
Is this the way to go, or is there another solution?
Thanks in advance for any advice.
-
If the cron is working, then I would personally turn to the Magento forums to see if anyone knows a way to rein those messy URLs in and get them under control. I try to avoid manually generating and updating sitemaps whenever I can - it's a hassle even on a small site, and much worse on an ecommerce site.
If your site is going to stay that small, then a manual sitemap might be less of a headache for you than customizing Magento.
I would focus on keeping the sitemap clean. If the search engines learn that you keep a messy sitemap, they will rely on it less and less - 404 and 500 codes especially, but redirects and duplicate content count against you too.
For Further Reading:
Google Sitemaps Ask For Clean URLs - http://www.johnfdoherty.com/google-sitemaps-ask-for-clean-urls/
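One way to keep it clean without hand-checking every URL is a short script that parses the sitemap and flags anything not returning a 200. A rough sketch (the status-fetching function is passed in, so you can plug in urllib, requests, or whatever you use to hit a live site):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> value from a sitemap document, in order."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def audit_sitemap(xml_text, get_status):
    """Return (url, status) pairs for every URL that does not answer 200.

    get_status is any callable mapping a URL to an HTTP status code,
    e.g. lambda u: urllib.request.urlopen(u).getcode() on a live site.
    """
    bad = []
    for url in sitemap_urls(xml_text):
        status = get_status(url)
        if status != 200:
            bad.append((url, status))
    return bad
```

Run it over the generated sitemap before resubmitting, and anything in the "bad" list either gets fixed on the site or stripped from the file.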
-
Hi Kane,
the sitemap is new - it's just that Magento creates lots of duplicate URLs on the fly, and it's not putting the canonical URLs in the sitemap, etc.
I just wondered whether it's worth hand-creating a sitemap.xml containing the content pages (60 or 70 of them) for this relatively small site, or not worrying too much about the sitemap - the site is already pretty well indexed by Google.
I'll head over to the Magento forums again to see if I can find more info
many thanks for your help
-
If it's returning 404 pages, that sounds like a dated sitemap. Have you activated the cron service?
See the "Refreshing Sitemaps at Regular Intervals" section of this page if not:
Magento can be set up to automatically refresh Google Sitemaps at regular intervals. This function is configured in Admin > System > Configuration > Google Sitemap.
To use Magento’s automatic generation of Google Sitemaps, you must activate the Magento Cron service.
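For reference, "activating the Magento Cron service" on a typical Linux host just means scheduling Magento's cron script in crontab - something like the entry below (the install path is an assumption; adjust it to wherever Magento lives on your server):

```
# Run Magento's scheduler every 5 minutes (Magento 1.x ships cron.sh / cron.php in its root)
*/5 * * * * /bin/sh /var/www/magento/cron.sh
```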
If you do have that set up, and you're certain it's working correctly, then I would turn to the forums at MagentoCommerce.com - you'll get a much faster answer there, since everyone is familiar with that exact platform.