Why extreme drop in number of pages indexed via GWMT sitemaps?
-
Any tips on why our GWMT Sitemaps indexed-page count dropped to 27% of total submitted entries (2,290 pages submitted, 622 indexed)? We've already checked the obvious: tested the sitemap, validated the URLs, etc. We had typically been at 95% of submitted pages getting indexed.
-
Thanks, that covers it!
-
Yes, this is the norm. You will generally have a variety of update frequencies in your XML sitemap. If you look at your sitemap you will usually see a `<priority>` value from 0.1 to 1.0, alongside a `<changefreq>` value that tells crawlers how often the page is updated. Googlebot will generally treat those as hints and only recrawl pages around the times you tell it they are updated. Even if all of your pages are set to the same frequency (which they shouldn't be), Google will generally only crawl a certain amount of data on your site on a given crawl. So a slow increase in indexed pages is the norm.
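For reference, in the sitemap protocol the 0.1-to-1.0 number is the `<priority>` tag, while the update-frequency hint is a separate `<changefreq>` tag. A minimal entry (the URL and values here are hypothetical) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/widget</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Both values are hints, not commands, so Google may still crawl on its own schedule.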
-
Yes, looking back at change logs was helpful. Canonical tags were it! We found a bug: the canonical page tags were being truncated at 8 characters. The number of pages indexed has started to increase rather than decrease, so it appears the issue is resolved. But I would have thought the entire sitemap would get indexed once the issue was resolved, rather than in small increases each day. Does it seem correct to have a slow climb back to normal, rather than getting back to nearly 100% indexed overnight?
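A truncation bug like this is easy to catch with a quick automated check that compares each page's canonical URL against the URL it should match. A minimal sketch (the helper name and the example URLs are hypothetical, not part of the actual site):

```python
# Detect a truncated canonical tag: truncation shows up as the canonical
# URL being a strict prefix of the URL it was supposed to match.
def is_truncated_canonical(expected_url: str, canonical_url: str) -> bool:
    """Return True if the canonical looks cut short (a proper prefix
    of the expected URL)."""
    return canonical_url != expected_url and expected_url.startswith(canonical_url)

# A canonical cut short vs. one that matches exactly.
print(is_truncated_canonical("https://example.com/widgets/blue-widget",
                             "https://example.com/widgets"))  # True
print(is_truncated_canonical("https://example.com/widgets/blue-widget",
                             "https://example.com/widgets/blue-widget"))  # False
```

Running a check like this over every URL in the sitemap after each deploy would surface the bug before Google starts dropping pages.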
-
Do you have the date of the change? Try to pinpoint when the change happened, because we might be able to figure it out that way too.
WMT > sitemaps > webpages tab
Once you find the date you may be able to go through your notes and see if you've done anything around that date or if Google had any sort of update (PageRank just updated).
I have had sites where pages dropped out of the index and then, a few crawls later, got reindexed. I just looked at 20 sites in our WMT and all of our domains look good as far as the percentage of submitted vs. indexed.
The only other things I can think of are to check for duplicate content, canonical tags, noindex tags, and pages with little or no value (thin content). Also (I've done this before): keep your current sitemap structure, but add an additional sitemap with all of your pages and posts in it. Don't break it down; just put it all in one sitemap. I've had that work before for a similar issue, but that was back in 2010. Multiple sitemaps for that site never seemed to work out; having it all in one did the trick. The site was only about 4,000 pages at the time, but I thought I would mention it. I haven't been able to duplicate the error, and no other site has had that problem, but that did do the trick.
Definitely keep an eye on it over the next few crawls. Please let us know what the results are and what you've tried so we can help troubleshoot.
-
We use multiple sitemaps.
Thanks, I had not thought about page load speed, but it turned up okay. Had already considered your other suggestions. Will keep digging. Appreciate your feedback.
-
Not sure why the drop but are you using just one sitemap or do you have multiple ones?
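For context on the single-vs-multiple question: multiple sitemap files are normally tied together with a sitemap index file, which is what gets submitted in WMT. A minimal sketch, with hypothetical file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>
```

Each referenced file then contains its own `<urlset>` of page entries.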
Check the sizes of your pages and the rate at which Google is crawling your site. If Google has an issue with the time it takes to crawl your sitemap, it will start to reduce the number of pages it indexes. You can check your crawl stats by navigating to WMT > Crawl > Crawl Stats, and see whether any delays show up in the numbers.
Also, make sure that your robots.txt isn't blocking anything.
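One quick way to sanity-check robots.txt rules is Python's standard-library parser; the rules and URLs below are hypothetical examples, not taken from the site in question:

```python
# Check whether a robots.txt would block Googlebot from given URLs,
# using the stdlib parser (no network access needed when parsing a string).
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))     # False
```

Running your actual sitemap URLs through a check like this would confirm none of them are accidentally disallowed.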
Have you checked your site with a site: search?
This is pretty basic stuff, but let us know what you've looked into so we can help you more. Thanks.