Getting subdomains unindexed
-
If I turn an application off, displaying a 503 error, will that get my site unindexed from search engines?
-
Subdomains can be verified as their own sites in GWT. Verify the subdomain in GWT, then put a robots.txt on that subdomain excluding the entire subdomain, then request removal of that entire subdomain in GWT. I've had to remove staging and dev sites a couple of times myself.
A couple of things I've found useful in this situation are to make the robots.txt files for both the dev and live sites read-only, so you don't accidentally overwrite one with the other when pushing a site live. You can also sign up for a free tool like Pole Position's Code Monitor, which will look at the code of a page (including your robots.txt URL) once a day and email you if there are any changes, so you can fix the file and then go hunt down whoever changed it.
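For reference, a disallow-all robots.txt for a staging subdomain might look like this (the hostname is illustrative; the file goes at the root of that subdomain, e.g. staging.example.com/robots.txt):

```
User-agent: *
Disallow: /
```

Note that robots.txt is per-host, so each subdomain needs its own copy; a file on the main domain does nothing for the subdomains.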
-
GWT was the first place I checked; unfortunately, you can only remove directories or pages there. I need entire subdomain sites to be removed (in fact, they shouldn't have been indexed in the first place).
We use subdomains for our development testing environment when creating client sites, and once a site is approved we push it live, replacing the old site. Somehow these testing sites are getting indexed, which may create duplicate-content problems across domains. So I am trying to find a solution to get the subdomains (hundreds of them) unindexed.
I understand a 301 redirect is best, but that isn't really applicable since these test sites still need to be reached by clients.
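One way to keep test sites reachable for clients while telling search engines not to index them is an X-Robots-Tag response header. A minimal sketch for Apache, assuming mod_headers is enabled, placed in the staging vhost or .htaccess:

```apache
# Tell crawlers not to index or follow anything served from this host
Header set X-Robots-Tag "noindex, nofollow"
```

Putting HTTP authentication on the staging subdomains is another option; crawlers can't fetch password-protected pages at all, and clients just get a login prompt.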
-
With a robots.txt blocking it, you can then go into Google Webmaster Tools and request removal of that particular page or folder from Google's index.
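Before requesting removal, it's worth confirming that the robots.txt rules actually block crawlers. A quick check with Python's standard-library robot parser (the staging URL is illustrative; here the rules are parsed directly rather than fetched over the network):

```python
from urllib import robotparser

# Parse a disallow-all robots.txt the way a crawler would
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# No path on the staging host should be fetchable by any crawler
print(rp.can_fetch("Googlebot", "https://staging.example.com/index.html"))  # False
```

To test the live file instead, use `rp.set_url("https://staging.example.com/robots.txt")` followed by `rp.read()`.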
-
A noindex tag on the pages works, and putting up a robots.txt that disallows everyone should work as well.
-
Thanks for the quick reply; I will have to try that. Essentially I am trying to get the site unindexed, but I wasn't sure if a 503 would do the trick.
-
Eventually, but a 503 is the code Google recommends returning when your site is having downtime, so I would expect them to hold off on removing things right away. I wouldn't expect it to be as efficient as returning a 404 or a 410.
If you're really keen on getting content removed immediately, the best way to get it de-indexed is to return the page with a meta noindex tag on it.
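The tag in question is the standard robots meta tag in the page's head:

```html
<meta name="robots" content="noindex, nofollow">
```

One caveat worth remembering: for the noindex to take effect, the page must remain crawlable. If robots.txt blocks the page, Google never fetches it and so never sees the tag, which can leave already-indexed URLs in the index.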