Temporary Duplicate Sites - Do I need to do anything?
-
Hi Mozzers -
We are about to move one of our sites to Joomla. This is one of our main sites and it receives about 40 million visits a month, so the dev team is a little concerned about how the new site will handle the load.
Dev's solution, since we control about 2/3 of that traffic through our own internal email and cross promotions, is to launch the new site and not take down the old site. They would leave the old site on its current URL and make the new site something like new.sub.site.com. Traffic we control would continue to the old site, while traffic we detect as new would be redirected to the new site. Over time (they think about 3-4 months) they would shift all the traffic to the new site, then eventually change the URL of the new site to be the URL of the old site and be done.
So at the outset this seems to be a whole-site duplicate content issue. I think the best course of action is to try to preserve all SEO value on the old URL, since the new URL will eventually go away and become the old URL. I could consider temporary no-crawl/no-index tags on the new site while both sites exist, but would that be risky since that site will eventually need to drop those tags and become the only site? A temporary rel=canonical from the new site to the old site also seems like it might not be the best answer.
Any thoughts?
-
I'm going to throw in a completely different option, because in my opinion, messing with this kind of multiple version situation is going to put your huge website at massive risk of screwed up rankings and lost traffic no matter how tricky you get.
First, I'm assuming that significant high-level load testing has been done on the dev site already. If not, that's the place to start. (I'm suspecting a Joomla site for 40 million visits a month will have lots of load-balancing in place?)
Since by all indications, the sites will be identical to the visitor, I'd suggest switching to the new site, but keeping the original site immediately available in near-line status. By setting the TTL of the DNS to a very short duration while in transition, the site could be switched back to the old version within a minute or two just by updating the DNS if something goes pear-shaped on the new site.
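To make the low-TTL idea concrete, here is a hypothetical BIND-style zone fragment; the hostname and IP addresses are placeholders, not details from this thread:

```text
; Hypothetical zone fragment - hostname and IPs are placeholders.
$TTL 300                           ; default TTL cut to 5 minutes during the transition
www   300   IN   A   203.0.113.10  ; record currently points at the NEW site's server
; Rollback plan: repoint this record at the old server (e.g. 203.0.113.20).
; Most resolvers will pick up the change within roughly the TTL window.
```

Once the new site has proven itself stable, the TTL can be raised back to a normal value (hours) to take the extra load off the DNS servers.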
Then, while the old site continues to serve visitors as it always has, devs can fix whatever issue was discovered on the new site.
This would mean keeping both sites' content updated concurrently during the period of the changeover, but it sounds like you were going to have to do that anyway. There's also the small risk that some visitors would have cached DNS on their own computers and so might still get sent to the new site for a while even after the DNS had been set back to the old site, but I'd say that's a vastly smaller risk than screwing up the rankings of the whole site.
Bottom line, there are plenty of load testing/quality assurance/server over-provisioning methods for making virtually certain the new site will be able to perform before going live. Having the backup site should be a very short term insurance, rather than a long term duplication process.
That's my perspective, anyway, having done a number of large-site migrations (though certainly nothing approaching 40M visits/month).
Paul
Just for reference, I was involved in helping after just such a major migration where the multiple sites did get indexed. It took nearly a year to rectify the situation and get the rankings/traffic/usability back in order.
-
Arghhh... This sounds like a crazy situation.
If the temp site is on a temporary subdomain, you definitely don't want any of those pages seeping into the index. But 3-4 months seems like an incredibly long time to sustain this. 3-4 days seems more reasonable to handle load testing.
For example, what happens when someone links to one of the temporary pages? Unless you put a rel=canonical on the page and allow robots to crawl it, you won't gain any of that link equity.
For a shorter time period, I'd simply block all crawlers via robots.txt, add a meta "noindex, nofollow" tag to the header, and hope for the best.
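A minimal sketch of those two pieces, assuming the temporary subdomain naming from the question (new.sub.site.com is illustrative):

```text
# robots.txt served at http://new.sub.site.com/robots.txt
User-agent: *
Disallow: /

<!-- and in the <head> of every page on the temporary site -->
<meta name="robots" content="noindex, nofollow">
```

One caveat worth flagging: these are belt-and-suspenders for a short window, and you'd remove both the robots.txt block and the meta tag the moment the temporary site becomes the permanent one.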
But for 3-4 months, you're taking the chance of sending very confusing signals to search engines, or losing out on new link equity. You could still use the meta "noindex, nofollow" on the temp domain if you need to, and also include rel=canonical tags (these are separate directives and are actually processed differently), but there's no guarantee of a smooth transition once you ditch the temp URLs.
So... my best advice is to convince your dev team to shorten the 3-4 month time frame. Not an easy job.
-
Wow, 40 million visitors a month is no joke and nothing to be taken lightly. If this isn't done right, the loss of traffic could be huge.
The new site should be non-indexable, and you can redirect a percentage of traffic to it (beta.site.com) for server load-testing purposes. Once you determine it is stable, you can complete the move to the new site.
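One way to implement that percentage split, if a reverse proxy sits in front of both sites, is something like nginx's split_clients directive. This is a hedged sketch only - the thread doesn't say what stack is in front of these sites, and the hostnames and IPs are placeholders:

```nginx
# Sketch: assumes nginx fronts both sites; names and IPs are illustrative.
# Goes in the http {} context.
split_clients "${remote_addr}" $site_pool {
    10%     new_site;    # roughly 10% of visitors exercise the new stack
    *       old_site;    # everyone else stays on the proven site
}

upstream old_site { server 10.0.0.10; }   # old site's app server (placeholder)
upstream new_site { server 10.0.0.20; }   # new/beta site's app server (placeholder)

server {
    listen 80;
    location / {
        proxy_pass http://$site_pool;
    }
}
```

Keying the split on the client IP means a given visitor lands on the same version consistently, and the percentage can be dialed up as confidence in the new site grows.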
Are URLs and site structure etc remaining the same? I wouldn't change too much at once or you won't know what happened if something tanks.
-
Thanks for the response.
It might have been just an unfounded concern, based on a vague memory of something I read about rel=canonical on here, but I cannot find it now.
I was just concerned that if you have site A and B and rel=canonical from B to A, then eventually get rid of A and have B take on the URL of A, that the engines might interpret this oddly and have it affect domain authority.
-
Why do you think that canonical tags won't work?
That's what I would suggest. Those tags simply tell Google which of the duplicates is the authoritative site. If you are preserving the original domain, canonical to that one, and when you make the switch nothing will change. Do keep in mind that if any of your directories or file structures are altered you will want to put in redirects, but it sounds like your web team knows what they're doing here.
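For reference, the tag itself is a single line in the head of each page on the duplicate site, pointing at the matching page on the original domain (the URLs here are placeholders, not from the thread):

```html
<!-- On each page of the temporary site, referencing the equivalent
     page on the original domain (illustrative URL) -->
<link rel="canonical" href="http://www.site.com/example-page/">
```

Note it needs to be page-to-page (each duplicate page pointing at its own original), not a single site-wide tag pointing at the old homepage.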