Deleting Subdomain - 301 to Homepage Best Option?
-
We have a subdomain with lots of content that we think Google may consider thin, so we're thinking of removing it to improve our SEO. Is the best way to do this to simply remove the directory and then 301 everything to our homepage?
The subdomain consists of product images with links to the retailers where each product can be purchased. We've mainly used it as a destination for Facebook fans who like a product they've seen on our Facebook page, so it was never meant to rank for SEO purposes. However, it is integrated with our main website, and it is possible that it is hurting our SEO efforts.
The subdomain is photos.yournextshoes.com and the main domain is www.yournextshoes.com.
-
Well, I just took a look at your subdomain links and there's nothing bad there. You can go ahead and remove the subdomain, placing a 301 redirect to your homepage.
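For reference, a minimal Apache sketch of that 301 (assuming the subdomain has its own document root with an .htaccess file and that mod_rewrite is enabled; adapt the host names to your setup):

```apache
# Hypothetical .htaccess in the photos. subdomain's document root.
# Sends every request on the subdomain to the main homepage with a 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^photos\.yournextshoes\.com$ [NC]
RewriteRule ^ http://www.yournextshoes.com/ [R=301,L]
```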
-
Basically it takes a lot of time to update the subdomain, so we'd rather just remove it and post more content on Facebook instead.
-
Why don't you just block Google and other search engines from crawling/indexing the subdomain?
That will keep the content available for your Facebook fans without affecting your SEO at all, and it avoids passing any bad backlinks the subdomain may have on to the main domain via a 301.
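A robots.txt like this, served at the root of the subdomain, would block crawling of the whole subdomain (note this is a sketch; Disallow stops crawling, while getting already-indexed pages out of the index may also need noindex or a removal request):

```text
# robots.txt at photos.yournextshoes.com/robots.txt
User-agent: *
Disallow: /
```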
Related Questions
-
I have a GoDaddy website and have multiple homepages
I have GoDaddy Website Builder and a new website, http://ecuadorvisapros.com, and I noticed through your crawl test that there are 3 home pages: http://ecuadorvisapros.com with a 302 temporary redirect, http://www.ecuadorvisapros.com/ with no redirect, and http://www.ecuadorvisapros.com/home.html. GoDaddy says there is only one home page. Is this going to kill my chances of having a successful website, and can it be fixed? I actually went with the SEO version thinking it would be better, but it wants to auto-change the settings that I worked so hard on with your site's help. Please keep it simple; I am a novice, and although I have had websites in the past, I know more about the whats than the hows of websites. Thanks,
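For what it's worth, the usual fix for multiple homepage variants is a single 301 to one canonical version. A hedged Apache sketch, assuming you have .htaccess access (which GoDaddy Website Builder may not expose):

```apache
# Sketch only: force the www. version as the canonical homepage.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^ecuadorvisapros\.com$ [NC]
RewriteRule ^(.*)$ http://www.ecuadorvisapros.com/$1 [R=301,L]
```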
Technical SEO | | ScottR.0 -
Best Practices for Image Optimisation
Hi Guys, I would love some recommendations from you all. A potential client of mine is currently hosting all their website image galleries (of which there are many) on a Flickr account, and they realise that they could gain more leverage in Google Images; currently none of their images cover any of the basics of optimisation, e.g. filename, alt text, etc. I did say that these basics would at least need to be covered, and that image hosting is supposedly an important factor when it comes to driving traffic from Google Image Search (images hosted on the same domain as the text are potentially given more value than images hosted on another domain, such as Flickr). The client has now come back saying they have done some 'reading' which suggests a subdomain could be the way to go, e.g. images.mydomain.com. I would love feedback on this before I go back to them, as it would be a huge undertaking for them. Cheers
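As a minimal illustration of the on-page basics mentioned above (the filename and alt text here are purely illustrative):

```html
<!-- Descriptive filename plus alt text, hosted on the client's own domain -->
<img src="/images/red-leather-ankle-boots.jpg"
     alt="Red leather ankle boots with block heel">
```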
Technical SEO | | musthavemarketing0 -
Redirect URLS with 301 twice
Hello, I had asked my client to ask her web developer to move to a more simplified URL structure. There was a folder called "home" after the root which served no purpose, so I asked for the old URLs to be 301 redirected to new URLs without it. However, the web developer didn't agree and instead just renamed the "home" folder "p"; I don't know why he did this. We argued the case and he then created the URL structure we wanted. Initially he had 301 redirected the old URLs (with "home") to his new version (with "p"). When we asked again for the more simplified URLs, he redirected all the "p" URLs to a PAGE NOT FOUND, which means all the original URLs now also end up at the PAGE NOT FOUND. Unless he redirects again, the problems I see are these: 1) the new simplified URLs have to start ranking from scratch; 2) we have duplicate content (two URLs with the same content); 3) customers clicking products in the SERPs are currently being redirected to the 404 page. I understand that redirection has to occur, but my questions are these: Is it OK to redirect twice with a 301, i.e. old URL to the "p" version and then to the final simplified version? Will link juice be lost by doing this twice? If he redirects from the original URLs straight to the final version, skipping the "p" version, what should happen to the "p" URLs, which are currently indexed? Any help would be appreciated. Thanks
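A hedged Apache sketch of collapsing both legacy versions into a single 301 hop (the folder names "home" and "p" are from the question; mod_alias is assumed):

```apache
# Both legacy URL shapes go straight to the final simplified URL
# in one 301 hop, avoiding a redirect chain.
RedirectMatch 301 ^/home/(.*)$ /$1
RedirectMatch 301 ^/p/(.*)$ /$1
```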
Technical SEO | | AL123al0 -
Canonical tag or 301
Hi, Our crawl report is showing duplicate content. Some of the report I am clear about what to do with, but other parts I am not. Some of the duplicate content arises from a 'theme=default' parameter on the end of the URL. Is this version of a page necessary for people to see when they visit the site (like a theme=print page is)? In that case I think we should use a canonical tag. Or is it not necessary, in which case we should use a 301? Thanks
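If the canonical route fits, it's a one-line element in the head of the parameterised page (the page path here is illustrative):

```html
<!-- Placed in the <head> of /some-page?theme=default,
     pointing search engines at the clean URL -->
<link rel="canonical" href="http://www.example.com/some-page" />
```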
Technical SEO | | Houses0 -
Setting up a 301 redirect from expired webpages
Hi Guys, We have recently created a new website for one of our clients and replaced their old website on the same domain. One problem we are having is that all of the old pages (thousands of them) are still indexed in Google and are just getting sent to our custom 404 page. We are seeing a large bounce rate from this, and from an SEO point of view I am worried that the site could lose ranking positions through the number of crawl errors Google is encountering. What I want is to set up a 301 redirect from these pages to the 'our brands' page. The reason for this is that the majority of the old URLs pointed to individual product pages, and one thing to note is that they are all .asp pages. Is there a way of setting up a rule in the htaccess file (or another way) to say that all webpages ending with the suffix .asp will be 301 redirected to the 'our brands' page? (There are no .asp pages on the new site, as it is all done in PHP.) If so, I would love it if someone could post the code snippet. Thanks in advance guys, and if you have any other ideas then be my guest to suggest 🙂 Matt.
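A hedged sketch of the requested rule, assuming Apache with mod_rewrite and that the 'our brands' page lives at /our-brands/ (that path is an assumption; substitute the real one):

```apache
# 301 any request whose path ends in .asp to the 'our brands' page.
RewriteEngine On
RewriteRule \.asp$ /our-brands/ [R=301,L]
```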
Technical SEO | | MatthewBarby0 -
Duplicate homepage content
Hi, I recently did a site crawl using the SEOmoz crawl test. My homepage seems to have 3 cases of duplicate content at these URLs: www.example.ie/ www.example.ie/%5B%7E19%7E%5D www.example.ie/index.htm Does anyone have any advice on this? What impact does this have on my SEO?
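If the /index.htm variant serves no purpose on its own, a hedged htaccess sketch (Apache assumed) to collapse it onto the root:

```apache
# 301 the /index.htm duplicate onto the root URL.
RewriteEngine On
RewriteRule ^index\.htm$ / [R=301,L]
```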
Technical SEO | | Socialdude0 -
What are the SEOmoz-suggested best practices for limiting the number of 301 redirects for a given site?
I've read some vague warnings of potential problems with having a long list of 301 redirects within an htaccess file. If this is a problem, could you provide any guidance on how much is too much? And if there is a problem associated with this, what is that problem exactly?
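One common way to keep the list short is to replace many one-off Redirect lines with a single pattern rule where the old and new URLs share a structure; a hedged sketch (Apache mod_alias assumed, paths illustrative):

```apache
# One RedirectMatch can replace hundreds of individual Redirect
# lines when the mapping follows a pattern.
RedirectMatch 301 ^/old-blog/(.*)$ /blog/$1
```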
Technical SEO | | roush0 -
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: You've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again, for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it: www.example.com/beers/brandofbeer/beerbottles/1 www.example.com/beers/brandofbeer/beerbottles/2 www.example.com/beers/brandofbeer/beerbottles/3 etc. To remove the pages from the index, should you: 1) add the meta noindex,follow tag to each URL you want de-indexed, 2) use GWT to help remove the pages, and 3) wait for Google to crawl again? If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt: DISALLOW */beerbottles/ or this line: DISALLOW: /beerbottles/? "To add the * or not to add the *, that is the question" Thanks! Dave
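On the syntax point, a sketch of a robots.txt that would match /beerbottles/ at any depth (Google supports the * wildcard in paths, and the colon after the directive name is required):

```text
# /*/beerbottles/ matches e.g. /beers/brandofbeer/beerbottles/1,
# since * matches any sequence of characters, including slashes.
User-agent: *
Disallow: /*/beerbottles/
```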
Technical SEO | | goodnewscowboy0