Rel canonical and duplicate subdomains
-
Hi,
I'm working with a site that has multiple subdomains containing entirely duplicate content. The production site that visitors see is (a made-up illustrative example):
Then there are subdomains that different developers use to work on their own changes before those changes are pushed to production:
Google ends up indexing these duplicate sub domains, which is of course not good.
If we add a canonical tag to the head section of the production page (and therefore to all of the duplicate subdomains), will that cause some kind of problem? Is it okay to have a canonical tag on a page pointing to that same page?
To complete the example...
In this example, where our production page is 123abc456.edu, our canonical tag on all pages (this page and therefore the duplicate subdomains) would be:
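Something like this, using the made-up domain (the exact URL is illustrative):

```
<link rel="canonical" href="http://123abc456.edu/" />
```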
Is that going to be okay, and will it fix this without causing some new problem from a canonical tag pointing to the page it's on?
Thanks!
-
Hi Bob,
That's an excellent question that I'll have to look into and confirm. More later. Thanks!
-
Is the subdomain data stored on the server as directories?
For example, is the Moe.123abc456.edu data stored in a folder like 123abc456.edu/Moe?
If so, you can simply have one robots.txt on your root domain blocking those directories:
User-agent: *
Disallow: /Moe/
-
Well, Bob, it looks like you're right! I guess it will for sure see all the pages in
as the ones to remove and not
Also, how do we keep that robots.txt from getting pushed to production when the developer working on that branch completes their work and pushes it live?
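One way this could be handled (directory names here are hypothetical) is to strip the file in the deploy step itself, something like:

```shell
#!/bin/sh
# Illustrative setup: a dev tree containing the crawl-blocking robots.txt
mkdir -p dev-site production-site
printf 'User-agent: *\nDisallow: /\n' > dev-site/robots.txt
echo '<html>page</html>' > dev-site/index.html

# Hypothetical deploy step: copy the dev tree to the production docroot,
# then make sure the dev-only robots.txt never ships.
cp -R dev-site/. production-site/
rm -f production-site/robots.txt
```

That way no developer has to remember to delete the file by hand before a push.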
I must confess, it still feels a little like bomb disposal.
-
This should be exactly what you need: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
-
Hi Bob,
Thanks for the suggestion/question. I'm thinking about that, but wouldn't adding a robots.txt "do not crawl" rule for pages that are already indexed be a little like closing the barn door after the horse has left? Do you think it would un-index the already-crawled subdomain? Thanks!
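For the already-indexed pages, I've also seen a noindex meta tag suggested: unlike robots.txt, it tells Google to drop the page from the index the next time it's recrawled. Hypothetically, in the head of every dev-subdomain page:

```
<meta name="robots" content="noindex">
```

(The page has to stay crawlable for Google to see the tag, so a robots.txt block and noindex shouldn't be combined on the same URL.)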
-
Assuming that you do not need the development environments indexed in Google, why not simply block all crawlers on those subdomains?
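For example, a robots.txt served at the root of each development subdomain (at Moe.123abc456.edu/robots.txt, to use the example names) could be as simple as:

```
User-agent: *
Disallow: /
```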