Duplicate Content Reports
-
Hi
Duplicate content reports for a new client are showing very high numbers (8,000+). Many of them seem to be for sign-in, register, and login type pages. Is this a scenario where the best course of action is likely to be the parameter handling tool in GWT?
Cheers
Dan
-
Cool - many thanks, Kurt!
All Best
Dan
-
You don't absolutely have to do both, but by doing the parameter handling you are sending another signal to Google about what you want them to do (keep in mind that both canonical tags and parameter settings are only treated as suggestions by Google). It's pretty simple to set up the parameter handling, so if you are really concerned about the duplicate content issues, why not do both?
Also, technically, the canonical tag tells Google which of the URLs it has crawled to give prominence to when they are duplicates, whereas my understanding is that parameter handling (when Google follows your suggestions) actually prevents Google from even crawling URLs with those parameters. In other words, canonical tags tell Google what to do with URLs it has crawled, and parameter handling tells Google which URLs not to crawl at all.
-
Thanks Kurt
And what about the parameter handling tool? If the canonical tag method you mention will deal with this, then is there any need to do anything with the parameter handling tool?
Cheers
Dan
-
I would answer the same as Kurt for the install. You put the noindex tag in the header of the core page, so when all the other pages are generated with the parameters, it will be added to those pages automatically. Once you get the pages out of the index, I would then nofollow links to those pages, or block them in robots.txt, to keep the bots out of them in the first place.
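As a sketch of what that looks like (assuming a plain HTML head on the core sign-in page; the page name is just an example), the noindex tag sits in the head of the core page, so every parameterized version generated from it inherits the tag:

```html
<head>
  <title>Sign In</title>
  <!-- Tells search engines not to index this page, and therefore any
       parameterized variant generated from it (e.g. login.php?visitor=123) -->
  <meta name="robots" content="noindex">
</head>
```

Once the pages have dropped out of the index, a robots.txt rule such as `Disallow: /login.php` would keep bots away from them entirely. The order matters: if you block the URL in robots.txt first, the bots can never recrawl the page to see the noindex tag.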
-
Hi Dan,
I mean both. The canonical tag will help with duplicate content issues and the parameter handling will help with indexing.
Setting up the canonical tag shouldn't be an issue. If the same page content is being displayed and the only difference is that the URL has some parameters in it, then the canonical tag should naturally be included with the rest of the page's code. Since the canonical tag doesn't change, it should work perfectly.
For example, if you have a page, login.php, and that page always has a parameter, ?visitor=### (where ### is a random number), then you simply put the canonical tag in the head of the login.php page (a link rel="canonical" tag pointing at login.php). That canonical tag will always be in the login.php page no matter whether the URL is login.php?visitor=123 or login.php?visitor=56, etc. It will always tell the search engines that the original page is login.php.
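A minimal sketch of what that head section could look like (www.example.com is a placeholder for the actual domain):

```html
<head>
  <title>Login</title>
  <!-- Always points at the clean URL, regardless of any ?visitor=### parameter -->
  <link rel="canonical" href="http://www.example.com/login.php">
</head>
```

Both login.php?visitor=123 and login.php?visitor=56 serve this same head, so the search engines consistently see login.php as the original page.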
-
Thanks Clever PHD
So is there a way of setting a general rule to apply noindex to all of these duplicates? Or do you mean applying it to the main sign-in/login pages themselves, so that it carries over to all the new, session-specific duplicate versions of those pages when they are generated?
Cheers
Dan
-
Hi Kurt
Do you mean both, or one or the other?
Isn't setting up canonical tags on all the possible dynamically generated login, sign-up and registration type pages impossible? Or can you set up some sort of rule that applies to those unpredictable pages (since we don't know what they are until they are generated by a user session, etc.)?
Cheers
Dan
-
You can also noindex those pages to simply take them out of the index and then later nofollow links to them.
-
You can use the parameter handling and set up canonical tags on the pages.