Subdomain replaced domain in Google SERP
-
Good morning,
This is my first post. I found many Q&As here that mostly answer my question, but just to be sure we get this right, I'm hoping the community can take a peek at my thinking below:
Problem: We rank #1 for "custom poker chips", for example. We have a development website on a subdomain (http://dev.chiplab.com). On Saturday our live 'chiplab.com' main domain was replaced by 'dev.chiplab.com' in the SERP.
Suspected cause: We did not add NOFOLLOW to the header tag. We also did not DISALLOW the subdomain in robots.txt. We could also have put the 'dev.chiplab.com' subdomain behind a password wall.
Solution: Add a NOFOLLOW header, update robots.txt on the subdomain, and disallow crawling/indexing.
Question: If we remove the subdomain from Google using WMT, will this drop us completely from the SERP? In other words, we would ideally like our root chiplab.com domain to replace the subdomain to get us back to where we were before Saturday. If the removal tool in WMT just removes the link completely, then is the only solution to wait until the site is recrawled and reindexed and hope the root chiplab.com domain ranks in place of the subdomain again?
Thank you for your time,
Chase
-
Hi Chase,
Removing dev via Webmaster Tools should do the trick for now. Since Google won't be able to reach dev anymore, you should be safe.
Adding both noindex and password protection isn't needed. Since the site is password protected, Google won't get to see the noindex on the pages, so you should only do one of the two. No need to change anything now; the password protection is safe.
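For anyone setting this up themselves, a minimal sketch of the password-wall approach on Apache (assuming Apache with mod_auth_basic, and a hypothetical credentials file created beforehand with the `htpasswd` utility):

```apache
# .htaccess at the root of dev.chiplab.com
# Every request requires Basic Auth; Googlebot gets a 401,
# so it can neither crawl nor index anything on the subdomain.
AuthType Basic
AuthName "Development site"
AuthUserFile /var/www/.htpasswd
Require valid-user
```

The exact path to the password file is an assumption here; use wherever your host keeps files outside the web root.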
> As expected 'dev.chiplab.com' was removed from the SERP. Now, I'm a bit worried that the link equity was transferred for good to the subdomain from 'www.chiplab.com'. That's not possible, right?
Correct, that's not possible, so you're good. Link equity only passes when you deliberately put 301 redirects in place, so all good.
-
No worries, that's what this community is here for!
Google views subdomains as separate entities. They have different authority metrics and therefore different ranking power. Removing a URL on one subdomain won't have any effect on its sibling over on a different subdomain (for example, dev. and www.).
Good call keeping the 'Disallow: /' in dev.chiplab.com/robots.txt - I forgot to mention that you should leave it there to keep blocking crawls.
The query you'll want to keep an eye on uses the info: operator (e.g. info:chiplab.com) - it's new and shows you what Google has indexed as your 'canonical' homepage.
-
Hi Logan,
Last follow-up. I swear.
Since I'm pretty new to this, I got scared and cancelled the 'dev.chiplab.com' removal request. I did this because I didn't want to go up to 14 days without any traffic (that's the estimate I found for how long the Google SERP can take to update, even though we "fetched as Googlebot" in GWT). Am I wrong on the SERP update time?
So what I did was add a 301 permanent redirect from 'dev.chiplab.com' to 'www.chiplab.com'. I've kept the NOFOLLOW/NOINDEX header on all 'dev' pages, of course, and kept the DISALLOW in robots.txt for the dev.chiplab.com site specifically. Work on the 'dev' site is on hold for now (I can't test anything with the redirects in place), and then hopefully in 14 days or so the domain name will change gracefully in the Google SERP from dev.chiplab.com to www.chiplab.com. I did all of this because of how many sales we would lose if it took 14 days to start ranking again for this term. Sound good?
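For anyone following along, a minimal sketch of that host-level 301 on Apache (assuming Apache with mod_rewrite enabled; the file sits at the root of the dev vhost):

```apache
# .htaccess on dev.chiplab.com - permanently redirect every URL
# to the same path on the www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.chiplab\.com$ [NC]
RewriteRule ^(.*)$ http://www.chiplab.com/$1 [R=301,L]
```

Note the caveat discussed above: while this redirect is live, bots can never fetch the dev pages themselves, so any noindex tag on them effectively goes unseen.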
Best,
Chase
-
You should be all set! I wouldn't worry about link equity, but it certainly wouldn't hurt to keep an eye on your domain authority over the next few days.
-
Hi Logan,
Thanks for fast reply!
We did the following:
- Added NOINDEX on the entire subdomain
- Temporarily removed 'dev.chiplab.com' using Google Webmaster Tools
- Password protected 'dev.chiplab.com'
As expected 'dev.chiplab.com' was removed from the SERP. Now, I'm a bit worried that the link equity was transferred for good to the subdomain from 'www.chiplab.com'. That's not possible, right? Do we now just wait until GoogleBot crawls 'www.chiplab.com' and hope that it is restored to #1?
Thank you for your time (+Shawn, +Matt, +eyqpaq),
Chase
-
noindex would be the easiest way.
I've seen people with the same issue fix it by adding rel=canonical tags on dev pointing to the main site, and the main site came back step by step with no interruptions...
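A sketch of what that looks like in practice (hypothetical page path; each dev URL points at its equivalent on the live www host):

```html
<!-- In the <head> of http://dev.chiplab.com/custom-poker-chips -->
<link rel="canonical" href="http://www.chiplab.com/custom-poker-chips" />
```

The tag has to go on every dev page, each pointing at its own www counterpart, not all at the homepage.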
Cheers.
-
Just like Chase said, noindex your dev site to let the search engines know that it should not show in search. I do this on my dev sites every time.
-
The most reliable method is to password protect the dev site. What I would do is 301 redirect the dev pages to the corresponding live pages, and then once the SERP refreshes, put the dev site behind a password.
-
Hi Chase,
Removing the subdomain within Search Console (WMT) will not remove the rest of your WWW URLs. Since you have different properties in Search Console for each, they are treated separately. That removal is only temporary though.
The most sure-fire way to ensure you don't get dev. URLs indexed is to put a NOINDEX tag on that entire subdomain. NOFOLLOW simply means that links on whatever page that tag is on won't be followed by bots.
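To illustrate the difference (a minimal sketch using the standard robots meta tag values):

```html
<!-- NOINDEX: keeps this page out of the index - what the dev site needs -->
<meta name="robots" content="noindex">

<!-- NOFOLLOW: only tells bots not to follow links on this page;
     the page itself can still be indexed -->
<meta name="robots" content="nofollow">
```

The two can also be combined as `content="noindex, nofollow"`, but it's the noindex half that actually removes pages from search.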
Remember, crawling and indexing are different things. For example, if on your live www. site you had an absolute link somewhere in the mix that had dev.chiplab.com in it, since you presumably haven't nofollowed your live site, a bot will still access that page. The same situation goes for a robots.txt disallow. That only prevents crawling, not indexing. In theory, a bot can get to a disallowed URL and still index it. See this query for an example.
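For completeness, the robots.txt in question would look like this:

```text
# http://dev.chiplab.com/robots.txt
User-agent: *
Disallow: /
```

As noted above, this only blocks crawling: a URL Google discovers through links can still end up indexed. There's also a trap worth flagging: with the whole subdomain disallowed, bots can never fetch the pages to see a noindex tag on them, so the disallow and the meta tag partly work against each other.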