Huge Google index with irrelevant pages
-
Hi,
I run a site about sport matches. Every match has its own page, and the pages are generated automatically from the DB. The pages are not duplicates, but over time some come to look a little similar. After a match finishes, its page has no internal links or sitemap entry, but it's still reachable by direct URL and remains in Google's index. Over time we've accumulated more than 100,000 indexed pages.
Since past matches have no significance, their pages are no longer linked, and a repeated match may look like duplicate content, what do you suggest we do when a match is finished - no longer linked, but still appearing in the index and SERPs?
-
301 redirect the match page to the match category, which sits higher in the hierarchy and is always relevant?
-
Use rel=canonical pointing to the match category?
-
Do nothing?
*A 301 redirect will shrink my index status, and some say a high index status is good...
*Is it safe to 301 redirect 100,000 pages at once - wouldn't it look strange to Google?
*Would rel=canonical remove the past-match pages from the index?
What do you think?
Thanks,
Assaf.
-
-
In terms of what you've written, blocking a page via robots.txt doesn't remove it from the index; it simply prevents the crawlers from reaching the page. If you block a page via robots.txt, the page remains in the index - Google just can't go back to the page and see if anything has changed. So if you were to block the page via robots.txt and also add a noindex tag to it, Google won't be able to see the noindex tag and remove the page from the index, because the page is blocked via robots.txt.
If you moved all of your old content to a different folder and blocked that folder via robots.txt, Google still wouldn't remove those pages from the index. To remove them, you would have to go into Webmaster Tools and use the URL removal tool on that new folder - if Google sees it's blocked via robots.txt, then and only then will they remove the content from the index. The folder has to be blocked via robots.txt first in order to remove the whole folder with the URL removal tool.
I'm not sure, though, how this would work going forward - if you removed a folder from the index and later moved more previously indexed content into it, I'm not sure what would happen to that content. Either way, Google has to come back and recrawl a page to see that it has moved to the new folder before dropping it, so the content will only be removed once Google recrawls the old URLs.
So I still think the better way to remove the content from the index is to add the noindex tag to the old pages. To help the search engines reach these old pages, make sure there is still a path they can take to get to them.
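For reference, the noindex tag is a standard meta robots tag in the page's head - a minimal sketch:

```html
<!-- In the <head> of each finished-match page: keep the page out of the
     index, but let crawlers still follow the links on it -->
<meta name="robots" content="noindex, follow">
```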
Another good idea I saw in a forum post here a while ago: add the noindex tag to the old pages, then create a sitemap containing all of those pages you have indexed and want removed. Using the Webmaster Tools sitemap interface, you can then monitor the progress of deindexation over time - Webmaster Tools reports how many of that sitemap's URLs are currently indexed, so checking the count periodically gives you a good indicator of how deindexation is progressing.
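A sketch of what such a monitoring sitemap might look like (the URLs are illustrative, based on the match-URL pattern mentioned in this thread) - it lists only the old, now-noindexed pages, so the indexed count Webmaster Tools reports for this sitemap tracks exactly that set:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Old match pages carrying the noindex tag; watch this sitemap's
       "indexed" count drop over time to measure deindexation -->
  <url><loc>http://www.domain.com/sport/match/T1vT2</loc></url>
  <url><loc>http://www.domain.com/sport/match/T3vT4</loc></url>
</urlset>
```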
-
Dear Mark,
*I've sent you a private message.
I'm starting to understand that I have a much bigger problem.
*My index status contains 120k pages, while only 2,000 are currently relevant.
Your suggestion is: after a match finishes, programmatically add noindex to the page, and Google will remove it from its index. That could work for relatively new pages, but since very old pages have no links or sitemap entries, they're rarely crawled - if at all - so it could take a very long time to clear the index.
- A more aggressive approach would be to change the site architecture and use robots.txt to restrict the folder that holds all the past, irrelevant pages.
So if today a match URL looks like this: www.domain.com/sport/match/T1vT2
restrict www.domain.com/sport/match/ in robots.txt
and from now on create all new matches in a different folder, like: www.domain.com/sport/new-match-dir/T1vT2
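For example (paths taken from the URLs above), the robots.txt rule would look like this - keeping in mind the earlier caveat in this thread that blocking crawling does not by itself deindex anything:

```text
# robots.txt at www.domain.com/robots.txt
User-agent: *
# Block the folder holding finished matches; new matches live under
# /sport/new-match-dir/ and remain crawlable
Disallow: /sport/match/
```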
-
Is this a good solution?
-
Wouldn't Google penalize me for removing a directory with 100k pages?
-
If it's a good approach, how much time will it take Google to clear all those pages from its index?
I know it's a long one, and I'll really appreciate your response.
Thanks a lot,
Assaf.
-
There are a bunch of articles out there, but each case is different - here are a few:
http://www.searchenginejournal.com/the-holy-grail-of-panda-recovery-a-1-year-case-study/45683/
You can contact me via private message here on the forum, and I can try to take a more in-depth look at your site if you can give me some more detailed info.
-
Yes. When the first Panda update rolled out, I lost 50% of my traffic from Google and haven't really recovered since.
-
Before we talk about a Panda hit - are you sure you actually got hit by Panda?
-
Thanks Mark!
Any good articles about how to recover from Panda?
-
Exactly - I'd build a strategy more around promoting pages that will have long lasting value.
If you use the noindex, follow tag, the page will continue to spread link juice throughout the site; it's just that the individual page carrying the tag won't be included in the search results or be part of the index. For the tag to work, Google first has to crawl the page and see it - so removal doesn't happen instantaneously. If they crawl these deeper pages only once every few weeks, once a month, or even less often, it may take a while for the pages to drop out of the index.
-
Hi Mark
-
These pages are very important while they are relevant (before the match finishes) - they are the source of most of our traffic, which comes from long-tail searches.
-
Some of these pages have inbound links, and it would be a shame to lose all that juice.
-
Would noindex remove the pages from Google's index? How much time would it take? Wouldn't noindexing a huge number of pages also look suspicious?
-
by "evergreen pages" - you mean pages that are always relevant like League page / Sport page etc...?
Thanks,
Assaf.
-
-
Hi Assaf,
(I'm not stalking you, I just think you've raised another interesting question)
In terms of index status/size, you don't want to build a massive index of empty, low-value pages - that's food for Google's Panda algorithm and will not be good for your site in the long run. It'll get a Panda smack if it hasn't already.
To remove these pages from the index, instead of doing hundreds of thousands of 301 redirects (which your server won't like either), I'd recommend adding the noindex meta tag to the pages.
I'd put a rule in your CMS that noindexes those pages after a certain point in time. Make sure you also have evergreen pages on your site that can serve as landing pages for the search engines and won't need to be removed after a short period. These are the pages to focus your outreach and link-building efforts on.
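As a sketch of that CMS rule - assuming a Python backend purely for illustration (your actual stack isn't stated in this thread), with a hypothetical seven-day grace period - the page template could pick the meta robots tag based on when the match ended:

```python
from datetime import datetime, timedelta

# Hypothetical rule: once a match has been finished for longer than the
# grace period, serve the page with a noindex,follow tag so Google drops
# it from the index while still following its links.
GRACE_PERIOD = timedelta(days=7)

def robots_meta_tag(match_end, now=None):
    """Return the meta robots tag to render on a match page.

    match_end: when the match finished, or None if not finished yet.
    """
    now = now or datetime.utcnow()
    if match_end is not None and now - match_end > GRACE_PERIOD:
        # Finished long ago: remove from the index, keep passing link juice
        return '<meta name="robots" content="noindex, follow">'
    # Upcoming, live, or recently finished: stay indexable
    return '<meta name="robots" content="index, follow">'
```

The grace period keeps a just-finished match indexable while it still draws long-tail traffic; only stale pages get the noindex tag.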
Mark