301s vs. rel=canonical for duplicate content across domains
-
Howdy mozzers,
I just took on a telecommunications client who has spent the last few years acquiring smaller communications companies. When they took over these companies, they simply duplicated their site at all the old domains, resulting in a bunch of sites across the web with the exact same content. Obviously I'd like them all 301'd to their main site, but I'm getting push back.
Am I OK to simply plug in rel=canonical tags across the duplicate sites? All the content is literally exactly the same.
Thanks as always
-
Awesome - thanks Matthew.
-
Thanks Ade. This is really helpful. Much appreciated!
-
Hey,
You can use cross-domain canonicals. Google supports them (http://www.youtube.com/watch?v=zI6L2N4A0hA), and the one time I had to use one, it did seem to help. That said, I'm not sure whether Bing supports them or not.
Hope that helps.
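For reference, a cross-domain canonical is just a `<link rel="canonical">` tag in the `<head>` of each duplicate page, pointing at the equivalent URL on the main site. Here's a minimal sketch of what that tag looks like, plus a small parser you could use to audit the duplicate domains and confirm the tag is in place (the domain names are made up for illustration):

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            a = dict(attrs)
            if a.get("rel") == "canonical":
                self.canonical = a.get("href")


def find_canonical(html_text):
    """Return the canonical URL declared in html_text, or None."""
    parser = CanonicalFinder()
    parser.feed(html_text)
    return parser.canonical


# A duplicate page on an acquired domain would carry a tag like this,
# pointing at the matching URL on the client's main site:
page = """<html><head>
<link rel="canonical" href="https://www.main-telco-site.example/plans/" />
</head><body>...</body></html>"""

print(find_canonical(page))  # https://www.main-telco-site.example/plans/
```

Running something like this against each duplicate domain's pages is a quick way to verify the tags actually got deployed everywhere.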
-
Hi James,
Using a rel=canonical will work, but I'm guessing that at some point your client will want to update the content of their website. If you go the rel=canonical route, you'd then have to push the same content updates to all of the other domains as well.
Depending on your domain/website hosting set-up, you may be able to get around this by adding all of the duplicate domains as parked domains on the hosting server, but this can get very messy.
Unless there is a really good reason not to, I would use 301s.
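In practice, a page-to-page 301 just maps every URL on an acquired domain to the same path on the main site, usually via a single rewrite rule on the server. A minimal sketch of that mapping logic, assuming a hypothetical main domain, looks like this:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical main site that all acquired domains should redirect to.
MAIN_HOST = "www.main-telco-site.example"


def redirect_target(old_url):
    """Build the 301 Location header for a URL on an acquired domain:
    keep the path and query string, swap in the main site's host."""
    parts = urlsplit(old_url)
    return urlunsplit(("https", MAIN_HOST, parts.path, parts.query, ""))


print(redirect_target("http://acquired-telco.example/support/faq?topic=billing"))
# https://www.main-telco-site.example/support/faq?topic=billing
```

Because the path is preserved, each old page passes its equity to its direct equivalent on the main site rather than everything being dumped on the homepage.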
If any of the domains have been hit by the Penguin update, though, I wouldn't use a 301 for those.
Cheers.
Ade.