Entire website is duplicated on 2 domains - what to do?
-
My client's website has 1000+ pages and a Domain Authority of 23.
I have just discovered that the entire site is duplicated on a second domain (main URL = companyname.com - duplicate site URL = company-name.com).
The home page of the duplicate domain has a 301 redirect going to the main domain. However, none of the 1000+ other pages have any redirect set up, so Google is indexing the entire duplicate site. I'm assuming this is a bad thing for SEO.
The duplicate site has a Domain Authority of 4, so I'd like to transfer whatever link juice it has to the main site.
What's the best thing to do?
Ultimately I think it would be best to delete the duplicate site. So would it be a case of adding a redirect to the .htaccess file along the lines of:
redirect company-name.com/?slug? to https://companyname.com/?slug? (I realise this isn't the correct syntax, but is the concept correct?)
Has anyone ever dealt with this successfully?
-
You are correct: delete the duplicate website and set up the redirects in the .htaccess file. Once you have done that, I would fetch the home page and other top-level pages in Search Console (URL Inspection, formerly Fetch as Google) to force Google to crawl. This will help with removing the duplicate pages from Google's index.
Also, make sure they don't have a Search Console property set up for the duplicate website with a submitted sitemap.
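For reference, the catch-all redirect could look something like this in the duplicate domain's .htaccess (a sketch assuming Apache with mod_rewrite enabled; the domain names follow the example in the question):

```apache
# Redirect every request on the duplicate domain to the same path on the main domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?company-name\.com$ [NC]
RewriteRule ^(.*)$ https://companyname.com/$1 [R=301,L]
```

The [R=301,L] flags make the redirect permanent (so link equity passes) and stop further rule processing; mod_rewrite carries the original query string over automatically.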
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
The whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of access via HTTP/2 working
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still shows the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so.
Totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go only through HTTP/1.1, not HTTP/2. A possibly related issue - and of course what is causing concern - is that new pages of the site seem to index and perform well in the SERPs... except the home page. It never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed, etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | AKCAC1
-
Spammers created bad links to old hacked domain, now redirected to our new domain. Advice?
My client had an old site hacked (let's call it "myolddomain.com"), and the hackers created many links in other hacked sites, such as http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html. The old myolddomain.com site has since been redirected to a different new site, but we still see over a thousand spam links showing up in the new site's Search Console 404 crawl errors report. Also, using the links: operator in Google search, we see many results of spam links.
Should we be worried about these bad links pointing to our old site and redirecting to 404s on the new site? What is the best recommendation to clean them up? Ignore? 410s? Other? I'm seeing conflicting advice out there.
The old site is hosted by the client's previous web developer, who doesn't want to clean anything up on their end without an ongoing hosting contract. So beyond turning redirects on or off, the client doesn't want to pay for any additional hosting, and we don't have much control over anything related to "myolddomain.com". 😞 Thanks in advance for any assistance!
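For reference, if 410s are chosen and the old host will allow a one-line config change, the spam URLs could be answered with 410 Gone rather than redirected (a sketch assuming Apache on the old domain and that the spam links share the /styless.asp path shown above):

```apache
# Return 410 Gone for the hacked-era spam URLs instead of redirecting them
RedirectMatch 410 ^/styless\.asp
```

A 410 signals the pages are intentionally gone, which Google treats as slightly more permanent than a 404.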
Technical SEO | usDragons0
-
Merging Domains
Hi, Everyone, My company is currently working with a client that has multiple websites and is interested in merging them into one. One is a primary corporate site; the other is a site for a single line of products. They obviously want to merge the product site into the corporate site.
The interesting thing is that the product site outperforms the corporate site. It has the highest traffic, and it has far more links/linking domains, a higher Domain Authority (although only by two points), and much more social activity. However, their reasons for wanting to merge the two are completely valid: less management, the URL would match print collateral, etc.
They're asking our opinion on whether or not to move forward with the merger. I'm leaning toward no, simply because the site they want to merge is outperforming the other. I'm curious, though, to get some other opinions on this. Would a merger be worth the work in this case? Any advice would be appreciated. Thanks!
Technical SEO | PapercutInteractive0
-
Do bad links to a sub-domain which redirects to our primary domain pass link juice and hurt rankings?
Sometime in the distant past there existed a blog.domain.com for domain.com. This was before we started work for domain.com. During the process of optimizing domain.com, we decided to 301 blog.domain.com to www.domain.com. Recently, we discovered that blog.domain.com actually has a lot of bad links pointing towards it. By a lot I mean 5000+. I am curious to hear people's opinions on the following:
1. Are they passing bad link juice?
2. Does Google consider links to a sub-domain being passed through a 301 to be bad links to our primary domain?
3. What is the best approach to having these links removed?
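For reference, if the decision is to disavow rather than chase removals, Google's disavow file is just a plain-text list uploaded in Search Console (a sketch; the domains below are placeholders standing in for the actual linking sites):

```text
# Spammy domains pointing at blog.domain.com
domain:spam-example-1.com
domain:spam-example-2.com
# Individual URLs can also be listed
http://spam-example-3.com/bad-page.html
```

Lines starting with # are comments; domain: entries disavow every link from that domain, while bare URLs disavow a single page.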
Technical SEO | Shredward0
-
Will Links to one Sub-Domain on a Site hurt a different Sub-Domain on the same site by affecting the Quality of the Root Domain?
Hi, I work for a SaaS company which uses two different subdomains on our site: a public one for our main site (which we want to rank in SERPs for), and a secure subdomain, which is the portal for our customers to access our services (which we don't want to rank for). Recently I realized that, by using our product, our customers are creating large amounts of low-quality links to our secure subdomain, and I'm concerned that this might affect our public subdomain by bringing down the overall authority of our root domain. Is this a legitimate concern? Has anyone ever worked through a similar situation? Any help is appreciated!
Technical SEO | ifbyphone0
-
Only my website homepage is appearing in search and the other individual pages are not coming up? This happened after the website revamp
We have revamped our website http://www.wsinetpower.com/ After the revamp, the SEO rankings went down and the inner pages are not appearing in search. What could be the reason?
Technical SEO | Muna0
-
Duplicate titles - what qualifies??
What qualifies as a duplicate title? If I have one title, Kelowna Real Estate Smith McLellan Group, and another title, Kelowna Condos Smith McLellan Group, is that a duplicate title??
Technical SEO | Realtor1010
-
SEOmoz API works for domains, but not for domains + directory?
We're working on a tool using the SEOmoz API... For domains we're always getting the right values, but for longer URLs we're having trouble... Example: http://www.seomoz.org/blog/6-reasons-why-qa-sites-can-boost-your-seo-in-2011-despite-googles-farmer-update-12160 won't work, but http://www.seomoz.org/blog works. Any idea what we might be doing wrong?
Technical SEO | gmellak0