Locating Duplicate Pages
-
Hi,
Our website consists of approximately 15,000 pages; however, according to our Google Webmaster Tools account, Google has around 26,000 pages for us in its index.
I have run through half a dozen sitemap generators and they all only discover the 15,000 pages that we know about. I have also thoroughly gone through the site to attempt to find any sections where we might be inadvertently generating duplicate pages without success.
It has been over six months since we made any structural changes (at which point we put 301s in place to the new locations), so I'd like to think that the majority of these old pages have been removed from the Google index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week.
I'm certain it's nothing to worry about; however, for my own peace of mind, I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index and that we're not generating any duplicate content.
Unfortunately there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, but this only returns the first 1,000 results, which all check out fine.
I was wondering if anybody knew of any methods or tools we could use to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages that haven't fallen out of the index yet and aren't going to cause us a problem?
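One low-tech option, once you've exported whatever URL lists you can get your hands on (the first 1,000 site: results, crawler exports, etc.), is to diff them against the sitemap. A minimal sketch in Python; the URLs here are made-up examples, and in practice you'd read both lists from files:

```python
# Compare the URLs in our sitemap against a list of URLs found in Google's
# index, and report any indexed URLs we don't know about.

def normalize(url):
    """Lower-case the scheme and host and strip trailing slashes so
    trivially different spellings of the same URL compare equal."""
    url = url.strip().rstrip("/")
    scheme, sep, rest = url.partition("://")
    host, slash, path = rest.partition("/")
    return scheme.lower() + sep + host.lower() + slash + path

def unknown_urls(sitemap_urls, indexed_urls):
    """Return indexed URLs that don't appear in the sitemap."""
    known = {normalize(u) for u in sitemap_urls}
    return sorted({normalize(u) for u in indexed_urls} - known)

sitemap = ["http://www.example.com/page-a", "http://www.example.com/page-b"]
indexed = ["http://WWW.example.com/page-a/", "http://www.example.com/old-page"]
print(unknown_urls(sitemap, indexed))  # → ['http://www.example.com/old-page']
```

Anything this flags is a page Google knows about that your sitemap doesn't, which is exactly the 11,000-page gap you're trying to explain.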
Thanks guys!
-
It's cool. Sorry, the point I was making is that irrespective of what you search for, the page that is returned is http://www.refreshcartridges.co.uk/advanced_search_result.php (with nothing after the .php), and as such the search results page couldn't spawn multiple pages which could be indexed by Google.
-
Hmm, I'm not too knowledgeable about php pages. Sorry!
-
Sorry, I'm not sure what happened to that bit.ly address - The actual address of the website is www.refreshcartridges.co.uk.
Ah, I see what you mean about the search results now; however, this hopefully shouldn't be an issue, as for security (our web guy said something about injections) the URL that is returned, irrespective of what is searched for, is http://www.refreshcartridges.co.uk/advanced_search_result.php
Thanks again!
-
I can't get that link to work.
What I said before still applies to physical input (that's what I assumed when I wrote it).
For example, a user inputs the words "snakes and dogs" and clicks search. The new URL is "www.yoursite.com/search?q=snakes and dogs". All these weird URL pages need noindex meta tags, or Google will flag them as duplicate content because, for example, this page and the result for "dogs and snakes" generate almost the same page.
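For reference, the tag in question is the standard robots meta tag in the `<head>` of each search results page; a minimal example:

```html
<!-- On internal search results pages only: keep Google from indexing
     the page, but still let it follow links to the products listed. -->
<meta name="robots" content="noindex, follow">
```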
Does that make sense?
It is in Google's Webmaster Guidelines that you should noindex these pages.
-
Many thanks for your input on this. I have actually looked at this through the HTML improvements section of GWMT; however, I am showing only a few dozen duplicated titles/descriptions, and this is simply due to the product categories being almost identical (for example, HP Deskjet 500 and HP Deskjet 500+).
-
Many thanks for your response. Our site is an eCommerce site that doesn't employ tags as such and our categories are all accounted for in the 15,000 page figure.
-
We did have this at the beginning of the year when we used ?dispmode=grid and ?dispmode=list to change the way our results were displayed. This has since been rectified by completely removing the option; any instance of dispmode present in the URL now forces a 301 to the correct master page. A few hundred instances of dispmode are still present in the Google index, but 99% of them have fallen out now.
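(For anyone hitting the same leftover-parameter problem: this kind of redirect can be done at the web-server level. A sketch assuming Apache and mod_rewrite, and assuming dispmode is the only query parameter on those URLs, otherwise you'd need to preserve the rest of the query string; this isn't our exact rule:)

```apache
# If the query string contains dispmode=..., 301 to the same path
# with the query string dropped entirely (the trailing "?" does that).
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)dispmode= [NC]
RewriteRule ^(.*)$ /$1? [R=301,L]
```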
I have checked and double checked and we don't seem to have any issues like this at present.
-
I'm not certain this is the case, as our search engine requires physical input in order to yield a result. I don't know if it helps, but the URL is http://bit.ly/4Cogchww if you fancy taking a look.
-
Thanks for your reply. Indeed, our website does force www. if someone attempts to navigate to us without the www. prefix.
-
Hi Chris,
Google Webmaster Tools has a feature that helps identify duplicate HTML, and maybe you can use that to see if the 11,000 pages are duplicates. If they are, I assume they should have duplicate title tags, etc., which the tool may discover.
-
Have you checked for instances where a page parameter is being seen as another version of the same page? One of the sites I work on had an issue a few months back where every instance of a product page was being flagged as duplicate content because of an oversight. We had one of our coders write a clause into the page so that every time a page loaded with a parameter such as ?color=72, it would canonicalize to the page minus the parameter. This decreased our duplicate content warnings quickly and effectively.
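The clause described above boils down to: take the requested URL, strip the display-only parameters, and emit the result in a `<link rel="canonical">` tag. A sketch of that logic in Python (the parameter names here are hypothetical, not what the poster's site actually uses):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only change presentation, never content.
DISPLAY_PARAMS = {"color", "dispmode", "sort"}

def canonical_url(url):
    """Return the URL with display-only query parameters removed,
    suitable for use in a <link rel="canonical"> tag."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in DISPLAY_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_url("http://www.example.com/product?id=5&color=72"))
# → http://www.example.com/product?id=5
```

Content-bearing parameters (an id, say) survive, so genuinely different pages still get distinct canonicals.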
-
It could be that your tags and categories are considered individual pages and are therefore creating their own permalinks, e.g. http://www.example.com/keyword, http://www.example.com/tag/keyword and http://www.example.com/category/keyword. Another approach would be to check the sitemaps you have in Webmaster Tools and compare those against each other. Just a suggestion.
-
Does your website force 'www.'?
Both yourdomain.com and www.yourdomain.com are separate sites and can have different pages spidered.
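Forcing one hostname is usually a single server-level 301; a minimal sketch assuming Apache and mod_rewrite (yourdomain.com is a placeholder):

```apache
# 301 any request for the bare domain to the www. version,
# keeping the requested path intact.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]
```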
-
Be sure to try different combinations of 'site:www.domain.com' and 'site:domain.com'. They can yield different results.
Sounds to me like you probably have an internal search engine that is generating search results pages based off the search term, and each different results page is a piece of duplicate content.