Can I disallow my subdomain for Penguin recovery?
-
Hi,
I have a site, BannerBuzz.com. Before the last Penguin update, all of my site's keywords ranked well in Google, but since Penguin hit my website, all of my keywords have been dropping day by day. I have made some changes to my website to improve things, but one change has me confused.
I have one sub-domain (http://reviews.bannerbuzz.com/) that displays user reviews for all of my website's keywords. Fifteen reviews from each category are also displayed on my main website, http://www.bannerbuzz.com. Would those user reviews be considered duplicate content between the sub-domain and the main website?
Can I disallow the sub-domain from all search engines? It is currently open to all of them. Would blocking it help?
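For reference, blocking the sub-domain would mean serving a robots.txt of its own at the sub-domain root (robots.txt rules apply per host, so this would not touch the main www site). A minimal sketch of what that file would contain:

```
# Served at http://reviews.bannerbuzz.com/robots.txt
# Applies only to this host, not to www.bannerbuzz.com
User-agent: *
Disallow: /
```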
Thanks
-
Hello Rafi,
I am going to make the necessary changes. I have also started work on gathering backlinks to the home page for the keyword "Vinyl Banners" from various sources. Hopefully this will help me recover my old rankings!
-
No problem my friend. You are most welcome.
So if you are using a 3rd-party service to fill in the reviews content on the sub-domain, you can do the following:
1. Stop using the sub-domain for the reviews content from now on, and have the reviews content filled into a new reviews sub-folder instead.
2. 301-redirect the old reviews content on the sub-domain to the new reviews sub-folder.
This will make sure you don't lose the SEO equity the sub-domain has acquired to date, and almost all of it will be passed on to the new sub-folder.
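If the sub-domain is served from an Apache host you control, the 301 in step 2 could be sketched with mod_rewrite along these lines (this assumes mod_rewrite is enabled and that the new sub-folder will live at www.bannerbuzz.com/reviews/; adjust the host and path to your setup):

```apache
# .htaccess (or vhost config) for reviews.bannerbuzz.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^reviews\.bannerbuzz\.com$ [NC]
# 301 each old review URL to the matching path
# under the new /reviews/ sub-folder on the main domain
RewriteRule ^(.*)$ http://www.bannerbuzz.com/reviews/$1 [R=301,L]
```

A path-preserving redirect like this (rather than sending everything to one landing page) is what lets the sub-domain's link equity map onto the corresponding new URLs.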
Please feel free to post any queries you have in this regard.
Best regards,
Devanur Rafi.
-
Thanks for the information, Devanur Rafi.
You gave us really great information, but I have one question: I am currently using a 3rd-party reviews service (powerreviews.com) for customer reviews, so is it possible to create a sub-folder and redirect the sub-domain to it?
-
Hi there,
Here are my two cents in this regard. Instead of showing 10 or 15 reviews on the root domain, show no more than 2, and for more reviews send visitors to the reviews sub-domain (using the 'view more reviews' button you currently have). This will mitigate duplicate content issues, if there are any, to a great extent. I do not recommend blocking the sub-domain from the search engines. However, you can move the content of the sub-domain to something like a reviews sub-folder.
From an SEO standpoint, a sub-folder is a safer bet than a sub-domain. Here is what Rand Fishkin has to say on this (http://www.seomoz.org/q/subdomains-vs-subfolders):
“All the testing, research and examples I've seen in the past few years (and even the past few months) strongly suggest that the same principles still hold true.
Subdomains SOMETIMES inherit and pass link/trust/quality/ranking metrics between one another.
Subfolders ALWAYS inherit and pass link/trust/quality/ranking metrics across the same subdomain.
Thus, having a single subdomain (even just domainname.tld with no subdomain extension) with all of your content is absolutely ideal from an SEO perspective. It's also more usable and brandable, too IMO.”
Here is an interesting discussion about this on Moz.com:
http://www.seomoz.org/q/multiple-subdomains-my-worst-seo-mistake-now-what-should-i-do
Hope these help.
Best regards,
Devanur Rafi.