Getting subdomains unindexed
-
If I turn an application off and display a 503 error, will that get my site unindexed from search engines?
-
Subdomains can be verified as their own site in Google Webmaster Tools (GWT). Verify the subdomain in GWT, then put a robots.txt on that subdomain excluding the entire subdomain, then request removal in GWT of that entire subdomain. I've had to remove staging and dev sites a couple of times myself.
A couple of things I've found useful in this situation are to make the robots.txt files for both the dev and live sites read-only, so you don't accidentally overwrite one with the other when pushing a site live. You can also sign up for a free tool like Pole Position's Code Monitor that will look at the code of a page (including your robots.txt URL) once a day and email you if there are any changes, so you can fix the file and then go hunt down whoever changed it.
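For reference, the subdomain-wide robots.txt described above is just two lines, served at the root of the dev subdomain (the hostname here is illustrative):

```
# robots.txt on dev.example.com only -- never on the live site.
# Blocks all crawlers from the entire subdomain.
User-agent: *
Disallow: /
```

Since the file on the live site says the opposite (or is absent), this is exactly the pair of files the read-only trick above protects from being swapped during deployment.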
-
GWT was the first place I checked; unfortunately, you can only remove directories or pages there. I need entire subdomain sites to be removed (in fact, they shouldn't have been indexed in the first place).
We use subdomains as our development testing environment when creating client sites, and once a site is approved we push it live, replacing the old site. Somehow these testing sites are getting indexed, which creates a risk of duplicate content across different domains. So I am trying to find a solution to get the subdomains (hundreds of them) unindexed.
I understand a 301 redirect is best, but that isn't really applicable since these test sites still need to be reachable by clients.
-
With a robots.txt blocking it, you can then go into Google Webmaster Tools and request removal of that particular page or folder from Google's index.
-
No index tag on it works, and putting up a robots.txt that disallows everyone should work as well.
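For reference, the noindex tag mentioned above goes in the `<head>` of every page on the subdomain:

```html
<!-- In the <head> of each page you want removed from the index -->
<meta name="robots" content="noindex, nofollow">
```

One caveat: the two approaches conflict if combined. If robots.txt already blocks crawlers from the subdomain, Googlebot can never fetch the pages to see the noindex tag. So pick one signal, or pair robots.txt with a GWT removal request instead.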
-
Thanks for the quick reply, I will have to try that. Essentially I am trying to get the site unindexed, but I wasn't sure if a 503 would do the trick.
-
Eventually, but that's the code Google recommends returning when your site is having downtime, so I would expect them to be lenient and not remove things right away. I wouldn't expect it to be as effective as returning a 404 or a 410.
The best way to get content de-indexed is to return a page with a meta noindex tag on it, if you're really keen on getting it removed immediately.
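Not from the thread, but if you're cleaning up many dev subdomains, a small script can confirm each one now returns the status code and noindex signal you expect. This is a sketch using only the Python standard library; the helper names and hostnames are illustrative, not from the original posts.

```python
# Sketch: audit dev subdomains to confirm each is set up for
# de-indexing (helper names and URLs are illustrative).
from html.parser import HTMLParser
import urllib.error
import urllib.request


class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.robots_values = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots_values.append((attrs.get("content") or "").lower())


def has_noindex(html: str) -> bool:
    """True if any robots meta tag in the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in value for value in parser.robots_values)


def audit(url: str) -> None:
    """Print the HTTP status and noindex state for one URL."""
    try:
        with urllib.request.urlopen(url) as resp:
            status, body = resp.status, resp.read().decode("utf-8", "replace")
    except urllib.error.HTTPError as err:
        status, body = err.code, ""
    tag = "noindex present" if has_noindex(body) else "no noindex tag"
    print(f"{url}: HTTP {status}, {tag}")


# Example (hypothetical hostname):
# audit("http://dev.example.com/")
```

Run against a list of your test subdomains, a 404/410 or a 200 with "noindex present" means the page is set up for removal; a plain 200 with no noindex tag means it can still be indexed.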
Related Questions
-
How to sunset language subdomains that we don't want to support anymore?
We have a primary domain www.postermywall.com. We have used subdomains for offering the same site in different languages, like es.postermywall.com, fr.postermywall.com, etc. There are certain language subdomains that have low traffic and are expensive to get translated. We have decided to sunset 3 subdomains that match those criteria. What is the best way of going about removing those subdomains? Should we just redirect from those subdomains to www.postermywall.com? Would that have any negative impact on our primary domain in Google's eyes, etc.? Anything other than a redirect that we should be considering?
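Not part of the original question, but for illustration: a site-wide 301 from a retired language subdomain can be a one-rule rewrite. This sketch assumes Apache with mod_rewrite enabled on, say, es.postermywall.com; adapt to your server.

```apache
RewriteEngine On
# Send every path on this subdomain to the same path on the primary domain
RewriteRule ^(.*)$ https://www.postermywall.com/$1 [R=301,L]
```

If the translated URLs don't map one-to-one to the English ones, redirecting everything to the homepage instead avoids a wave of 404s on the target side.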
Technical SEO | 250mils0
-
Google Crawling Issues! How Can I Get Google to Crawl My Website Regularly?
Hi everyone! My website is not being crawled regularly by Google. There are weeks when crawling is regular, but for the past month or so it goes uncrawled for seven to eight days at a time. There are some specific pages that I want to get ranked, but of late they are not being crawled AT ALL unless I use the 'Fetch As Google' tool! That's not normal, right? I have checked and re-checked the on-page metrics for these pages (and the website as a whole); backlinking is a regular and ongoing process as well! A sitemap is in place too, and I have resubmitted it once. This issue is detrimental to website traffic and rankings! Would really appreciate insights from you guys! Thanks a lot!
Technical SEO | farhanm1
-
How is Google finding our preview subdomains?
I've noticed that Google is able to find, crawl, and index preview subdomains we set up for new client sites (e.g. clientpreview.example.com). I know now to use meta robots tags (and robots.txt) to block the search engines from crawling these subdomains. My question, though, is how is Google finding these subdomains? We don't link to these preview domains from anywhere else, so I can't figure out how Google is even getting there. Does anybody have any insight on this?
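A sketch, not from the thread: besides the meta tag, an X-Robots-Tag response header set at the server level covers every file type on a preview subdomain, including PDFs and images that can't carry a meta tag. This assumes Apache with mod_headers enabled; the hostname is the example from the question.

```apache
<VirtualHost *:80>
    ServerName clientpreview.example.com
    # Tell crawlers not to index anything served from this vhost
    Header set X-Robots-Tag "noindex, nofollow"
</VirtualHost>
```

Setting it once per preview vhost also removes the risk of a client site going live with a stray noindex meta tag left in its templates.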
Technical SEO | | ZeeCreative0 -
Can someone help me get this site ranked? www.2sponsors.com
Hi, I have been trying for months to get a site ranked for one of my customers and I am not doing very well. I have been doing SEO for years and have gotten lots of sites ranked, but this one has been the most difficult. Does anyone have time to look at it for me? Thanks. The site's PR = 4. I am trying to get it ranked in www.google.com.ar. Thanks, Carla. Skype: carla.dawson78
Technical SEO | Carla_Dawson0
-
Subdomain for a blog
My client has a site hosted with a company that allows very little customization; among other things, I am unable to add a blog to the site. As he has a fair amount of time and money invested in the site, he is reluctant to start over. So my question is this: his blog is currently hosted off-site. Would it benefit him if I had them add a CNAME or A record to show his blog at blog.mydomain.com? Or does Google recognize that it is still a separate site and treat it as such? Finally, does it matter how they set it up: CNAME, A record, or redirect? This is definitely not my area of expertise (if that is not already obvious from the question!). Thanks for your help! Matthew
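For illustration only, a CNAME entry in the domain's zone file might look like the line below; the blog host name is hypothetical:

```
; Zone file for mydomain.com (target host is a placeholder)
blog  3600  IN  CNAME  hosted-blogs.example-provider.com.
```

Either record type serves the blog at blog.mydomain.com; search engines see the URL the content is served from, not the DNS record behind it. A redirect is different, though: it sends visitors to the blog host's own URL, which keeps the content on a separate domain.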
Technical SEO | farlandlee0
-
Where to get expert SEO help?
I joined SEOmoz knowing very little about SEO (it turns out even less than I thought!). I signed up because my business website, which had been ranking very well for years, made a fast and furious fall to the purgatory of page 2, 3, whatever. Well, I've definitely learned a lot and made several changes that have helped, specifically link building (directory submissions) and eliminating duplicate content. But we're still far below where we used to be, and I've done everything I can do without making a career change to SEO. I've hired a few offshore SEOs to help, but they have all failed to live up to their promises. So I would love to find a GOOD SEO who can 1. fix the remaining on-page technical issues in our CMS website (Business Catalyst), and 2. help us develop an SEO strategy for the next year. (I prefer not to post the name of the website for competitive reasons.) Our keywords are really not very competitive at all due to the uniqueness of the business. Where should I look for help? Thanks
Technical SEO | Placeboo0
-
How to Submit XML Site Map with more than 300 Subdomains?
Hi, I am creating sitemaps for a site which has more than 500 subdomains. Pages per subdomain vary from 20 to 500, and more will be added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain, which they reference in a separate robots.txt file: http://windows7.iyogi.com/robots.txt. Example XML sitemap for a subdomain: http://windows7.iyogi.com/sitemap.xml.gz. Currently my website has only one robots.txt file for the main domain and subdomains. Please tell me: should I create a separate robots.txt and XML sitemap file for each subdomain, or one file? Creating a separate XML sitemap for each subdomain is not feasible, as we would have to verify each one in GWT separately. Is there an automatic way, and do I have to ping separately if I add new pages to a subdomain? Please advise.
Technical SEO | vaibhav45
-
Switching subdomains
A few years ago our company decided to merge all its websites (magazine brands) as subdomains under one new root domain. So the current situation is like this:
brand1.rootdomain.com
brand2.rootdomain.com
brand3.rootdomain.com
...
For the moment the root domain has a domain authority of 66. In a few weeks we would like to switch that root domain to the strongest brand (highest trust, PageRank, ...). So we get this:
www.brand1.com
brand2.brand1.com
brand3.brand1.com
Before we make the switch I'll have to make a pro and con list, so I hope I can get some advice from you guys on whether this is a good idea or not.
Technical SEO | WDN
brand3.brand1.com Before we make the switch i'll have to make a pro and con list. So I hope I can get some advice for you guys if this is a good idea or not.0