Old deleted sitemap still shown in Webmaster Tools
-
Hello
I have redesigned a website with a new URL structure in the CMS.
The old sitemap was not set to 404; it was replaced with new sitemap files, and the new sitemap was named differently from the old one. All redirections were done properly.
Three months later, Google still shows me duplicate title and meta errors comparing the old and new URLs.
I am lost as to what to do now to eliminate this error. How can Google show URLs that are no longer in the sitemap?
Looking forward to any help
Michelle
-
Hi Michelle,
So you're 404'ing the old sitemap URL yet you've placed the new sitemap at the same location...? If you want to private message me your domain, I'd be happy to take a look for you.
There should be no need to 404 anything, just replace the old sitemap and Google will do the rest. Alternatively, just recreate the new sitemap index at a new location such as domain.com/sitemaps/sitemap.xml.
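If you want to double-check what you are actually submitting, a short script can list exactly which URLs the new sitemap file exposes, so you can confirm none of the old URLs are still in it. This is a minimal sketch: the sample sitemap and its URLs are placeholders, and in practice you would feed in your real sitemap contents (fetched with urllib or similar).

```python
# List the <loc> URLs a sitemap file actually exposes.
# Minimal sketch: parses sitemap XML given as a string;
# the sample below uses placeholder URLs.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the <loc> values from a urlset (or sitemap index)."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Tiny inline example (placeholder URLs):
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://domain.com/new-page-1</loc></url>
  <url><loc>https://domain.com/new-page-2</loc></url>
</urlset>"""

print(sitemap_urls(sample))
```

Any old URL that does not appear in this list is no longer being submitted, so if Google still reports it, the report is based on its index rather than your current sitemap.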
Thanks
-
Is the problem that you have old URLs still indexed in Google, or that Google Webmaster Tools is just displaying/accessing your old sitemap?
- Delete your old sitemap from the server.
- Delete your old sitemap from Webmaster Tools.
- Submit your new sitemap to Webmaster Tools.
- Ping your sitemap to Google.
- Check your web analytics to see which old URLs are still being accessed.
- If the old URLs still won't leave Google's index, you can either block them with robots.txt or submit a URL removal request within Webmaster Tools.
- You can also add your new sitemap to your robots.txt so search engines know where they should be looking.
Let me know if none of the above answers your question.
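For the robots.txt steps above, here is a sketch of what the file can look like; the paths and sitemap location are placeholders to adapt to your own structure:

```text
# robots.txt - placeholder paths, adjust to your site
User-agent: *
# Block the retired URL structure if those pages must stay out of reach
Disallow: /old-section/

# Tell crawlers where the new sitemap lives
Sitemap: https://domain.com/sitemaps/sitemap.xml
```

One caveat: a robots.txt block stops Google from recrawling those URLs, so it can't see your 301s or 404s for them. If you want the old URLs out of the index quickly, pair the block with removal requests in Webmaster Tools rather than relying on the block alone.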
-
That's a known issue: Google Webmaster Tools is very slow to de-index old sitemaps, even after they are removed from GWT. I have the same issue with some sites, and it's pretty annoying because it makes it harder to discover the real 404s.
Also refer to this helpful article: http://www.seomoz.org/blog/how-to-fix-crawl-errors-in-google-webmaster-tools