Rel canonical and duplicate subdomains
-
Hi,
I'm working with a site that has multiple subdomains of entirely duplicate content. So, the production-level site that visitors see is (to give a made-up illustrative example) 123abc456.edu.
Then, there are subdomains used by different developers to work on their own changes to the production site before those changes are pushed to production (for example, Moe.123abc456.edu).
Google ends up indexing these duplicate subdomains, which is of course not good.
If we add a canonical tag to the head section of the production page (and therefore all of the duplicate subdomains), will that cause some kind of problem... having a canonical tag on a page pointing to itself? Is it okay to have a canonical tag on a page pointing to that same page?
To complete the example...
In this example, where our production page is 123abc456.edu, our canonical tag on all pages (this page and therefore the duplicate subdomains) would be something like:
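<link rel="canonical" href="http://123abc456.edu/" />
(That's the homepage version for the made-up example domain; in practice each page would carry its own production URL in the href, and the http scheme and trailing slash here are just assumptions for the illustration.)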
Is that going to be okay and fix this without causing some new problem of a canonical tag pointing to the page it's on?
Thanks!
-
Hi Bob,
That's an excellent question that I'll have to look into and confirm. More later. Thanks!
-
Is the subdomain data stored on the server as directories?
So, for example, is the Moe.123abc456.edu data stored in a folder like 123abc456.edu/Moe?
If so, you can simply have one robots.txt on your root domain blocking those directories:
Disallow: /Moe/
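As a complete file, that root robots.txt might look something like this (Moe is the example directory from this thread; any other developer directories would follow the same pattern, and the extra name below is just a hypothetical placeholder):
User-agent: *
# Block each developer's working copy
Disallow: /Moe/
# Hypothetical second developer directory
Disallow: /Larry/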
-
Well, Bob, it looks like you're right! I guess it will for sure see all the pages in 123abc456.edu/Moe as the ones to remove and not the production pages at 123abc456.edu.
Also, how does that robots.txt not get pushed to production as the developer working on that branch completes his work and pushes it to production?
I must confess, it still feels a little like bomb disposal.
-
This should be exactly what you need: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
-
Hi Bob,
Thanks for the suggestion/question. I'm thinking about that, but wouldn't putting a robots "do not crawl" directive on pages that are already indexed be a little like closing the barn door after the horses have left? Do you think it would un-index the already-crawled subdomains? Thanks!
-
Assuming that you do not need the development environments indexed in Google, why not simply block all crawlers on those subdomains?
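For example (sketching with the made-up domain from this thread), each development subdomain such as Moe.123abc456.edu could serve its own robots.txt containing:
User-agent: *
Disallow: /
That blocks all compliant crawlers from the entire subdomain, while the production domain keeps serving its own, unrestricted robots.txt.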