To merge or not to merge? That is the question.
-
I am planning to do something I've never done before, and I'm wondering whether it's really a good idea.
I have four websites, all belonging to the same company, each with a different domain and different content:
- one has been the main official site for 16 years: 200 unique visitors per month, indexed for 134 keywords, Domain Authority 17, 13 linking root domains
- one was the main site from 2003 to 2006; it focuses on a specific business they have since discontinued. Still online, not updated since 2006: 500 unique visitors per month, indexed for 92 keywords, Domain Authority 13, 8 linking root domains
- another was built in 2010 and maintained for less than a year; it focuses on a business they never really launched. Still online, not updated since 2010: 3,000 unique visitors per month, indexed for 557 keywords, Domain Authority 25, 84 linking root domains
- a fourth was also built in 2010 and focuses on a business they never really launched. Still online, not updated since 2010: 100 unique visitors per month, indexed for 4 keywords, Domain Authority 6, 3 linking root domains
Each website has traffic and links, and all the links are natural: they never tried to build links in any way, never did on-page optimization, and never really thought about SEO. The sites aren't even interlinked.
So my idea is to merge all of them, moving websites 2, 3 and 4 into subfolders of the main site and replicating the old content there, because those sites have traffic. Incredibly, one of the abandoned sites gets 3,000 unique visitors per month while the main site gets just 200!
My doubts are:
- does it make sense to merge everything from an SEO perspective?
- apart from implementing the 301 redirects correctly, what else should I be careful to do or to avoid?
- website 4 is really outdated, its content and structure are hard to merge with the rest, and its traffic is tiny; is it worth spending the time to merge it?
Finally, I have a complication: the client was reluctant to merge the sites. They have agreed to it, but they don't want visitors of the main site to be able to navigate to the old content. So once everything is moved and redirected, I would have to include the old pages in the main site's sitemap while avoiding any links to them on the actual "main" site.
As far as I know, Google's crawler doesn't like finding pages in a sitemap that aren't reachable through any linking path on the website. Is that correct? Would that make all the merging work useless?
Should I convince the client to at least put small links in the footer, or on a page linked from the footer?
-
Thanks for your answer.
Yes, the sitemap is one of the first places Googlebot goes, but is it true that finding a page in the sitemap that it cannot reach while crawling the website makes Google devalue that page's link equity?
As for the footer, I would just put one link to each subfolder, so only 3 or 4 links. What I don't know is whether small footer links are enough to keep Googlebot happy, or whether those pages would still be devalued, since they wouldn't have much internal linking elsewhere on the main site.
-
Well, from a business perspective, sites 2 and 3 cover technologies that my client's customers could still be using, even though the client no longer provides them, so in my mind they could still bring in leads. Site 3 is still attracting natural links even though it hasn't been updated in four years, so it must still be valuable to someone.
The 301 redirects don't scare me for sites 2 and 3 because they are WordPress installations: I will export the content and sitemaps, import the content into the main site, and use the sitemaps to generate one-to-one redirect rules with a script. It's not complex.
Site 4 is an application and I have no idea where to start moving it, which is why I now think it's better to drop it.
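For what it's worth, the sitemap-to-redirects script can be very short. Here is a minimal sketch (hypothetical domain names; it assumes a standard sitemaps.org XML file exported from WordPress) that turns every `<loc>` entry into an Apache `Redirect 301` rule mapping the old path to the same path under the new subfolder:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Hypothetical target: old-site content lives under a subfolder of the main site.
NEW_PREFIX = "https://main-site.example/old-site"

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def redirect_rules(sitemap_xml, new_prefix=NEW_PREFIX):
    """Turn every <loc> in a sitemap into a one-to-one 'Redirect 301'
    rule (Apache mod_alias syntax), preserving the original path."""
    root = ET.fromstring(sitemap_xml)
    rules = []
    for loc in root.findall(".//sm:loc", SITEMAP_NS):
        path = urlparse(loc.text.strip()).path or "/"
        rules.append("Redirect 301 {} {}{}".format(path, new_prefix, path))
    return rules

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://old-site.example/about</loc></url>
  <url><loc>https://old-site.example/products/widget</loc></url>
</urlset>"""

for rule in redirect_rules(sample):
    print(rule)
# -> Redirect 301 /about https://main-site.example/old-site/about
# -> Redirect 301 /products/widget https://main-site.example/old-site/products/widget
```

The output can be pasted into the old domain's .htaccess. Real sitemaps may be split into an index of sub-sitemaps, in which case you'd loop over each one.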
-
Thanks. I think I will combine the three more closely related sites and discard the last one; its traffic is small, and the client wouldn't care about pennies from AdSense.
-
I understand that all the websites get traffic and rankings for different keywords. However, you state that some of the services and products are no longer offered. Since that's the case (as noted above, one service never even launched), I would question the point of keeping that content alive: it won't convert any new clients if the service isn't provided. How would you get a return on investment for all the merge work?
Like sureshchowdary said above, making a list of all the pages and doing one-to-one redirects is a lot of work (believe me, I know: in February 2014 I did the same thing for a client, redirecting an entire site of roughly 1,000 pages to a new location).
So if I were you, I would look at the effort needed to perform all the work, estimate what the investment would be, and weigh it against the expected return. It might be wiser to add some content to the oldest site and redirect all its links, but leave the rest of the content behind.
Just my 2 cents, for your consideration.
Jarno
-
I agree. I would not mix diverse topics. But if they are related, I would combine them, as long as the content is meaningful to someone. I would monetize the traffic with AdSense or other ads.
-
That was also my concern, but I would say all four websites relate to the same branch of business.
It's basically a software house. Site 1 is just about them as a company. Sites 2 and 3 are about old technologies they no longer resell or implement. Site 4 is an online web service for exchanging messages anonymously, which they never actually launched. It's the one furthest from what they do now, and I agree with them that it looks unfinished and not very professional.
-
How closely related are the topics of these websites?
Are you mixing fishing, knitting and hydraulic jacks?
-
Hi Max,
It's good to know that even though the websites are not optimized, the keywords are ranking and generating traffic.
1) It does make sense to merge, and all the link equity will consolidate on the first website. You need to do a one-to-one mapping of all the URLs when redirecting, which I think is a big task. And as you said, the client was reluctant to merge them.
2) When you 301-redirect a whole website, I don't see anything else in particular you need to watch out for.
3) If keywords are ranking for the 4th website, then it's definitely worth redirecting that one too.
The first place Googlebot goes when it enters your website is the sitemap.
It's also not good practice to put many links in the footer section, so limit the number of links there.
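To make the whole-site 301 concrete, here is a minimal sketch (Apache, with made-up domain names, assuming the retired domain still resolves to a server you control) that redirects every URL of an old domain into a subfolder of the main site while preserving each path, so the mapping stays one-to-one:

```apache
# Hypothetical vhost for the retired domain (old-site.example).
# Every request is 301-redirected to the same path under a
# subfolder of the main site.
<VirtualHost *:80>
    ServerName old-site.example
    RewriteEngine On
    RewriteRule ^/?(.*)$ https://main-site.example/old-site/$1 [R=301,L]
</VirtualHost>
```

If the old URL structure doesn't match the new one exactly, you'd replace the catch-all rule with explicit per-URL redirects instead.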