Rel canonical and duplicate subdomains
-
Hi,
I'm working with a site that has multiple subdomains of entirely duplicate content. The production-level site that visitors see is (as a made-up illustrative example):
Then there are subdomains that different developers use to work on their own changes to the production site before those changes are pushed to production:
Google ends up indexing these duplicate subdomains, which is of course not good.
If we add a canonical tag to the head section of the production page (and therefore to all of the duplicate subdomains), will that cause some kind of problem, given that the tag on the production page would point to itself? Is it okay to have a canonical tag on a page pointing to that same page?
To complete the example...
In this example, where our production page is 123abc456.edu, our canonical tag on all pages (this page and therefore the duplicate subdomains) would be:
Is that going to be okay, and will it fix this without causing a new problem from a canonical tag pointing to the page it's on?
Thanks!
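For reference, a self-referencing canonical along the lines the question describes would look something like this (illustrative only, using the made-up domain from the example; a real tag would use the full production URL of each specific page):

```html
<!-- In the <head> of every page, on production and on the dev
     subdomains alike, pointing at the production copy of that page -->
<link rel="canonical" href="http://123abc456.edu/" />
```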
-
Hi Bob,
That's an excellent question that I'll have to look into and confirm. More later. Thanks!
-
Is the subdomain data stored on the server as directories?
So, for example, is the Moe.123abc456.edu data stored in a folder like 123abc456.edu/Moe?
If so, you can simply have one robots.txt on your root domain blocking those directories:
User-agent: *
Disallow: /Moe/
-
Well, Bob, it looks like you're right! I guess it will for sure see all the pages in
as the ones to remove and not
Also, how does that robots.txt not get pushed to production when the developer working on that branch completes his work and pushes it to production?
I must confess, it still feels a little like bomb disposal.
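One way to keep a branch's dev-only robots.txt out of the production push is to exclude that file in the deploy step itself. Here's a minimal sketch with hypothetical paths (`/tmp/devsite`, `/tmp/prodsite` stand in for real deploy locations); a real pipeline would more likely use something like `rsync --exclude 'robots.txt'` or a deploy-tool ignore list:

```shell
# Set up a fake dev working copy and a fake production web root
# (stand-ins for the real paths in an actual deployment).
mkdir -p /tmp/devsite /tmp/prodsite
printf 'User-agent: *\nDisallow: /\n' > /tmp/devsite/robots.txt   # dev-only: block all crawling
printf 'hello\n' > /tmp/devsite/index.html
printf 'User-agent: *\nDisallow: /Moe/\n' > /tmp/prodsite/robots.txt  # production rules

# "Deploy": copy everything except robots.txt, so the dev-only
# blocking rules never reach production.
for f in /tmp/devsite/*; do
  case "$(basename "$f")" in
    robots.txt) continue ;;
  esac
  cp -R "$f" /tmp/prodsite/
done

cat /tmp/prodsite/robots.txt   # still the production rules
```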
-
This should be exactly what you need: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
-
Hi Bob,
Thanks for the suggestion/question. I'm thinking about that, but wouldn't putting a robots "do not crawl" directive on pages that are already indexed be a little like closing the barn door after the horses have left? Do you think it would un-index the already-crawled subdomain? Thanks!
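It's a fair concern: robots.txt only stops crawling, and by itself it doesn't remove pages that are already in the index (in fact, a blocked crawler can never see a noindex directive). A common approach for already-indexed dev pages is a meta robots noindex in each page's head, left crawlable until the pages drop out, sketched here:

```html
<!-- On every dev-subdomain page, while the page is still crawlable,
     so the crawler can fetch it, see the directive, and drop it -->
<meta name="robots" content="noindex">
```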
-
Assuming that you do not need the development environments indexed in Google, why not simply block all crawlers on those subdomains?
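Because crawlers fetch robots.txt per hostname, blocking the development subdomains means each one serves its own robots.txt that disallows everything, along these lines (subdomain name taken from the made-up example in the question):

```
# Served at http://Moe.123abc456.edu/robots.txt
# (and likewise on each other dev subdomain)
User-agent: *
Disallow: /
```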