Discrepancy between # of pages and # of pages indexed
-
Here is some background:
1) The site in question has approximately 10,000 pages, and Google Webmaster Tools shows that 10,000 URLs (pages) were submitted.
2) Only 5,500 pages appear in the Google index.
3) Webmaster Tools shows that approximately 200 pages could not be crawled for various reasons.
4) SEOmoz shows about 1,000 pages that have long URLs or page titles (which we are correcting).
5) No other errors are being reported in either Webmaster Tools or SEOmoz.
6) This is a new site launched six weeks ago. Within two weeks of launching, Google had crawled all 10,000 pages and showed 9,800 in the index, but over the last few weeks the number of indexed pages kept dropping until it reached 5,500, where it has been stable for two weeks.
Any ideas about what the issue might be? Also, is there a way to download all of the pages that are included in that index, as this might help troubleshoot?
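As far as I know there's no direct bulk export of Google's index, but a useful first step is to turn your XML sitemap into a plain URL list so you can diff it against whatever sample of indexed URLs you do manage to collect (e.g. copied from site: results). A minimal sketch, with a hypothetical two-URL sitemap standing in for your real one:

```python
# Sketch: parse an XML sitemap into a plain URL list, then diff it against
# a (hypothetical) sample of indexed URLs to find pages missing from the index.
from xml.etree import ElementTree as ET

def sitemap_urls(xml_text):
    """Extract <loc> values from sitemap XML, ignoring the namespace prefix."""
    root = ET.fromstring(xml_text)
    return [el.text.strip() for el in root.iter() if el.tag.endswith("loc")]

sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/articles/flu-season/</loc></url>
</urlset>"""

submitted = set(sitemap_urls(sitemap))
indexed_sample = {"https://example.com/"}  # hypothetical list you gathered
missing = sorted(submitted - indexed_sample)
print(missing)  # ['https://example.com/articles/flu-season/']
```

The resulting "submitted but not seen in the index" list is what you'd dig into for patterns (page type, depth, duplication).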
-
It's not exactly 3 clicks... if you're a PR 10 website, it will take quite a few clicks before Google gets "tired". Deep links are always a great idea.
-
I have also heard 3 clicks from a page with link juice. So if you have deep links to a page it can help carry pages deeper in. Do you agree?
-
Thank you to all for your advice. Good suggestions.
-
We do have different types of pages but Google is indexing all category pages but not all individual content pages. Based on the replies I have received, I suspect the issue can be helped by flattening the site architecture and links.
As an FYI, the site is a health care content site so no products are sold on the site. Revenue is from ads.
-
Great tip. I have seen this happen too (e.g. forum, blog, archive and content part of the website not indexed equally).
-
Do you have areas of your site that are distinctly different in type, such as category pages and individual item pages, or individual item pages and user-submitted content?
What I'm getting at is trying to find out if there's a certain type of page that Google isn't indexing. If you have distinct types of pages, you can create separate sitemaps (one for each type of content) and see if one type of content is being indexed better than another. It's more of a diagnostic tool than a solution, but I've found it helpful for sites of that size and larger in the past.
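A quick sketch of that per-type sitemap split, assuming you already have a flat URL list and can classify pages by path (the `/category/` rule here is hypothetical; adjust `classify()` for your own URL structure):

```python
# Sketch: split one URL list into per-type sitemaps for indexation diagnostics.
from xml.etree import ElementTree as ET

def classify(url):
    # Hypothetical rule: category pages vs. individual content pages.
    return "categories" if "/category/" in url else "content"

def build_sitemaps(urls):
    """Return {page_type: sitemap XML string}, one sitemap per type."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    groups = {}
    for url in urls:
        groups.setdefault(classify(url), []).append(url)
    sitemaps = {}
    for page_type, group in groups.items():
        root = ET.Element("urlset", xmlns=ns)
        for url in group:
            loc = ET.SubElement(ET.SubElement(root, "url"), "loc")
            loc.text = url
        sitemaps[page_type] = ET.tostring(root, encoding="unicode")
    return sitemaps

urls = [
    "https://example.com/category/heart-health/",
    "https://example.com/articles/aspirin-benefits/",
]
maps = build_sitemaps(urls)
print(sorted(maps))  # one sitemap per page type
```

Submit each generated file separately in Webmaster Tools and compare the submitted-vs-indexed counts per type.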
As other people have said, it's also a new site, so the lack of links could be hindering things as well.
-
Agreed!
-
Oh yes, Google is very big on balancing and allocation of resources. I don't think 10,000 will present a problem, though, as that number is common on ecommerce and content websites.
-
Very good advice in the replies, but everyone seems to have forgotten PageRank. In Google's random-surfer model, it is assumed the user will at some point abandon the website (once PageRank has been exhausted). This means that if your site lacks raw link juice, it may not have enough to go around the whole site structure, leaving some pages dry and unindexed. What can help: the already-mentioned flatter site architecture and unique content, but also direct links to pages not in the index (including via social media), and more and stronger links toward the home page, which should ideally cascade down to the rest.
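A toy illustration of that random-surfer intuition (my own sketch, not anything Google publishes): power-iteration PageRank over a tiny three-page site shows rank thinning out as you go deeper, which is the "pages left dry" effect.

```python
# Sketch: power-iteration PageRank on a toy site graph. The damping factor
# models the surfer abandoning the site; rank thins out with click depth.
def pagerank(links, damping=0.85, iters=50):
    """links: {page: [outlinks]}. Returns {page: rank}; ranks sum to ~1."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank everywhere
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Hypothetical structure: home -> category -> deep page, with nav links home.
site = {
    "home": ["category"],
    "category": ["deep-page", "home"],
    "deep-page": ["home"],
}
ranks = pagerank(site)
```

On this graph the home page ends up with the most rank and the deep page the least, which is why extra links pointed directly at unindexed deep pages (or at the home page, cascading down) can pull them back in.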
-
If you don't have many links to your site yet, I think that could reduce the number of pages that Google keeps in its main index. Google may allocate fewer resources to crawling your site if you have very little link juice, especially if deep pages on your site have no link juice coming into them.
Another possibility is that some of the 10,000 pages are not unique or contain duplicate content. Google could send a lot of your pages to its supplemental index if that is the case.
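A quick way to check the duplicate-content theory across 10,000 pages is to hash each page's extracted body text and group identical hashes. A minimal sketch, with the page bodies hard-coded here but in practice coming from your own crawl:

```python
# Sketch: flag exact-duplicate page bodies by hashing whitespace-normalized text.
import hashlib

def find_duplicates(pages):
    """pages: {url: body text}. Returns groups of URLs with identical bodies."""
    seen = {}
    for url, body in pages.items():
        normalized = " ".join(body.split()).lower()
        digest = hashlib.sha1(normalized.encode()).hexdigest()
        seen.setdefault(digest, []).append(url)
    return [urls for urls in seen.values() if len(urls) > 1]

pages = {
    "/a": "Flu symptoms and treatment.",
    "/b": "Flu   symptoms and treatment.",  # same text, different whitespace
    "/c": "Heart health basics.",
}
print(find_duplicates(pages))  # [['/a', '/b']]
```

This only catches exact duplicates; near-duplicates (boilerplate-heavy pages with thin unique text) need fuzzier comparison, but exact matches are the cheapest thing to rule out first.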
-
If you flatten out your site architecture a bit, so that all pages are no more than 3 clicks deep, and provide a better HTML sitemap, you will definitely see more pages indexed. It won't be all 10k, but it will be an improvement.
-
I appreciate the reply. The HTML sitemap does not show all 10,000 pages, and some pages are likely more than 3 clicks deep. I will try this and see what happens.
-
Google will not index your entire 10k-page site just because you submitted the links in a sitemap. They will crawl your site and index many pages, but most likely you will never have your entire site indexed.
Cleaning up your crawl errors will help in getting your content indexed. A few other things you can do are:
- Provide an HTML sitemap on your website.
- Ensure your site navigation is solid (i.e. all pages are reachable, there are no island pages, the navigation can be seen in HTML, etc.).
- Ensure you do not have deep content. Google will often only go about 3 clicks deep; if you have buried content, it won't be indexed unless it is well linked.
- If there are any particular pages you want to get indexed, link to them from your home page, or ask others to link to those pages from external sites.
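The "about 3 clicks deep" point above is easy to audit with a breadth-first traversal of your internal link graph. A sketch, where `links` is a hypothetical adjacency map you'd build from crawling your own site:

```python
# Sketch: compute click depth of every page from the home page via BFS,
# then flag pages buried more than 3 clicks deep.
from collections import deque

def click_depths(links, start="home"):
    """BFS over the internal link graph; returns {page: clicks from start}."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical crawl data: home -> categories -> articles -> old archive.
links = {
    "home": ["cat-a", "cat-b"],
    "cat-a": ["article-1"],
    "article-1": ["archive-1"],
    "archive-1": ["archive-2"],
}
depths = click_depths(links)
buried = [p for p, d in depths.items() if d > 3]
print(buried)  # pages more than 3 clicks from the home page
```

Anything in the `buried` list is a candidate for a direct link from the home page, a category page, or the HTML sitemap.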