Filtered Navigation: Duplicate Content Issue on an Ecommerce Website
-
I have navigation that allows for multiple levels of filtering. What is the best way to prevent search engines from seeing this duplicate content? Is it still a big deal nowadays? I've read many articles and I'm not entirely clear on the solution.
For example, you have a page that lists 12 products out of 100:
companyname.com/productcategory/page1.htm
And then you filter these products:
companyname.com/productcategory/filters/page1.htm
The filtered page may or may not contain items from the original page, but it does contain items that appear on the unfiltered navigation pages. How do you help the search engine determine which page it should crawl and index for these products?
I can't use rel=canonical, because the exact set of products on the filtered page may not appear on any unfiltered page. What about using robots.txt to block all the filtered pages? Will that also stop PageRank from flowing? What about the meta noindex tag on the filtered pages?
I have also considered removing filters entirely, but I'm not sure if sacrificing usability is worth it in order to remove duplicate content. I've read a bunch of blogs and articles, seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
-
Hi Dstrunin,
I would still use the rel=canonical tag, with or without the filter in place. So if you have a list of products displayed unfiltered at companyname.com/productcategory/page1.htm, I would add a rel=canonical pointing at companyname.com/productcategory/page1.htm. For the filtered results at companyname.com/productcategory/filters/page1.htm, the canonical tag would still point to companyname.com/productcategory/page1.htm.
It doesn't hurt to have a canonical tag point to the same page it's on.
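For what it's worth, a minimal sketch of what that looks like in each page's head, using the example URLs from the question:

```html
<!-- On the unfiltered page: companyname.com/productcategory/page1.htm -->
<link rel="canonical" href="http://companyname.com/productcategory/page1.htm" />

<!-- On the filtered page: companyname.com/productcategory/filters/page1.htm
     the canonical still points back to the unfiltered page -->
<link rel="canonical" href="http://companyname.com/productcategory/page1.htm" />
```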
If you can't do that, I would meta noindex those filtered pages and remove the robots.txt rules. Robots.txt doesn't tell Google it can't index a page; it only says it can't crawl it. So Google could still index old pages it crawled before you added the robots.txt rules, or index the blocked URLs with just their titles.
Casey
-
I have been doing that, but robots.txt only does so much. I've implemented the meta noindex tag as well and it doesn't seem to be taking all the pages out of the index.
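One thing worth double-checking in this situation: if robots.txt is still disallowing the filtered URLs, Googlebot can't re-fetch those pages to see the noindex tag at all, so they can linger in the index. Assuming the filtered URLs all sit under a path like the one in the question, the rule to look for and remove would be something like:

```
# robots.txt
# While this block is in place, Googlebot cannot crawl the filtered
# pages, so it never sees the meta noindex on them.
User-agent: *
Disallow: /productcategory/filters/
```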
-
My unprofessional opinion would be to use robots.txt on some areas. I'll also be interested to see what the pros here say.
Related Questions
-
Potential duplicate content issue?
We have a category on our website for PVC rolls to buy as standard 50m rolls (this includes 15 products in the category). We're also releasing PVC rolls to buy per metre (10m roll/25m roll etc...), again with 15 products, which we are adding as a separate category as it makes more sense for our customers and removes the risk of having too many options. Would using the same description be bad practice for SEO? The product is exactly the same just available in different roll sizes, but we definitely do not want to combine categories as it doesn't work for our customers. Any help or suggestions would be appreciated, thanks.
On-Page Optimization | RayflexGroup
-
Would you consider this to be thin content
I always struggle with these pages on my site, going back and forth debating what I want to do with them. On one side Google wants content, yet at the same time it's all about user experience. http://www.freescrabbledictionary.com/word-lists/words-that-start-with/letter/h/ I used to have all my words listed on one page, which could have been well over 10,000. Now I paginate them, as you can see. I debate writing a header of content for these pages, but honestly users just want the words. Get in, get what you need, and get out. What is the recommendation on these pages? Should I write content or not?
On-Page Optimization | cbielich
-
Does hreflang restrain my site from being penalized for duplicated content?
I am currently setting up a travel agency website. This site is going to target both American and Mexican customers. I will be working with an /es subdirectory. Would hreflang, besides showing the matching language version in the SERPs, keep my site's translated content (which is pretty much the same) from being penalized for duplicated content? Do I have to implement rel=canonical? Thank you in advance for any help you can provide.
On-Page Optimization | kpi360
-
Wordpress blog duplicate issue
So after looking at the setup of the blog, I've found this: http://www.trespass.co.uk/blog/ http://www.trespass.co.uk/blog/category/news/ http://www.trespass.co.uk/blog/category/general/ http://www.trespass.co.uk/blog/category/snow/ Content shown on http://www.trespass.co.uk/blog/ can also be found on the other three URLs. The permalink structure is set up as /%category%/%postname%/, which I want to change to just /%postname%/. Obviously I want to make things as SEO-friendly as possible, so any suggestions on doing this right without losing any indexed pages would be appreciated. I have limited access to make changes to plugins, as these need to be done through the development company who manage our site. Cheers, Robert
On-Page Optimization | Trespass
-
How to check duplicate content with other website?
Hello, I guess that my website may have duplicate content with other websites. Is this an important factor in SEO? And how do I check for and fix it? Thanks,
On-Page Optimization | JohnHuynh
-
Posting content from our books to our website
Hello, I am the newly appointed in-house SEO person for a small business. The founders of our company have written several books, which we sell. But book sales are a small part of our business. We are considering posting to our website some or all of the content of the books. This content is directly relevant to the existing content of our website and would be available for free to all visitors.
1. Is it likely that the traffic and links to the new book pages would improve the search engine rankings of our existing pages?
2. We already have PDF versions of each book we could post, which are formatted nicely. Should we convert these to HTML to make them more friendly to search engines?
3. Of course, we would have to split each book into multiple web pages, perhaps one chapter per page. How much content could each new page optimally accommodate?
4. Would it be more valuable from an SEO perspective to post pieces of the books over time in a blog format?
Thank you very much for your thoughts!
On-Page Optimization | nyc-seo
-
What's the best practice for handling duplicate content of product descriptions with a drop-shipper?
We write our own product descriptions for the merchandise we sell on our website. However, we also work with drop-shippers, and some of them simply take our content and post it on their sites (same photos, exact ad copy, etc.). I'm concerned that we'll lose the value of our content because Google will consider it duplicated. We don't want the value of our content undermined. What's the best practice for avoiding any problems with Google? Thanks, Adam
On-Page Optimization | Adam-Perlman
-
Does Google still see masked domains as duplicate content?
Older articles state that domain forwarding or masking will create duplicate content, but Google has evolved quite a bit and I'm wondering if that is still the case. I'm not suggesting that a 301 is not the proper way to redirect something, but my question is: does Google still see masked domains as duplicate content? Is there any viable use for domain masking other than for affiliates?
On-Page Optimization | TracyWeb