Blocking subdomains without blocking sites...
-
So let's say I work for bloggingplatform.com, and people can create free sites through my tools; those sites show up as myblog.bloggingplatform.com. However, the same site can also be accessed at myblog.com.
Is there a way, without editing the myblog.com site's code or files, to tell Google to stop indexing myblog.bloggingplatform.com while still letting it index myblog.com, and without inserting any code into the page load?
This is a simplification of a problem I am running across.
Basically, Google is associating subdomains with my domain that it shouldn't even index, and it is adversely affecting my main domain. Other than contacting the offending subdomain holders (which we do), I am looking for a way to stop Google from indexing those subdomains at all (they are used for technical purposes, not for users to find the sites).
Thoughts?
-
Ah, I see now. Try this out: http://moz.com/community/q/block-an-entire-subdomain-with-robots-txt#reply_26992 - basically, when a request comes in on a subdomain, the server pulls a different file into the robots.txt location (one containing the Disallow: / directive).
Read the remaining comments there about getting the subdomain removed via Google Webmaster Tools (GWT).
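As a concrete sketch of that approach (assuming Apache with mod_rewrite; the file name robots_block.txt is made up for illustration), a rewrite rule can serve a blocking file at the robots.txt location for any host other than the primary domain:

```apache
# Sketch only: serve robots_block.txt (which would contain "Disallow: /")
# in place of robots.txt for any host that is not the apex domain or www.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?bloggingplatform\.com$ [NC]
RewriteRule ^robots\.txt$ /robots_block.txt [L]
```

With this in place, each subdomain gets the blocking file even though every host is served from the same document root.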
-
You are correct, but that isn't what I was asking.
user1.bloggingplatform.com and myblog.com point to the same web server files. If I put up a robots.txt on user1.b..., I would effectively de-index myblog.com.
The problem we have run across is that user205.bloggingplatform.com might be doing something shady, but instead of de-listing just that subdomain, Google kills the primary domain in the index as well.
Because user205.bloggingplatform.com is only used for technical reasons and should not be in Google's index, I am looking for a way to tell Google not to index the subdomain.
I think the better way to solve the problem, though, would be to move the technical subdomains to a separate domain: change user205.bloggingplatform.com to user205.bloggingplatformtesting.com.
Then Google can kill that URL all it wants; I don't care.
-
bloggingplatform.com/robots.txt
and
user1.bloggingplatform.com/robots.txt
can and should be different. If you disallow at the subdomain level, only that subdomain will be affected. You can search around for other examples, but I'm certain this works: we have a development domain that is indexed, and we create subdomains for all clients that are kept out of the index via individual robots.txt files.
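To make the contrast concrete, the two files would look like this (standard robots.txt syntax; an empty Disallow line means "allow everything"):

```
# bloggingplatform.com/robots.txt  (main site: allow crawling)
User-agent: *
Disallow:

# user1.bloggingplatform.com/robots.txt  (subdomain: block everything)
User-agent: *
Disallow: /
```

Crawlers only ever fetch robots.txt from the host they are crawling, which is why the two hosts can have different rules even on the same server.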
-
I don't think that works. Since both URLs point to the same server, the robots.txt file for the subdomain would kill the main URL as well.
Or am I missing something?
-
Each subdomain should have a robots.txt file that blocks that specific subdomain. e.g. user1.bloggingplatform.com/robots.txt should have:
User-agent: *
Disallow: /
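One caveat worth noting: robots.txt blocks crawling, but URLs Google has already indexed can linger in results. Since the original question rules out touching page code, an X-Robots-Tag response header is an alternative that works at the server level. A sketch, assuming Apache with mod_setenvif and mod_headers (the primary_host variable name is made up):

```apache
# Sketch only: send a noindex header on every response served to a
# non-primary host, without touching the page HTML.
SetEnvIfNoCase Host "^(www\.)?bloggingplatform\.com$" primary_host
Header set X-Robots-Tag "noindex, nofollow" env=!primary_host
```

Note that Google only sees this header if it can crawl the subdomain, so it's an alternative to the robots.txt block rather than a complement.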