How does Google treat my two different domains with the same content?
-
I have two internet stores for two different markets but in the same language (English), with the same content and the same URLs (only the domains differ). They are on different servers, one in the USA and another in the UK. Example: sample.com (global) and sample.uk (for the UK).
Currently sample.com (7 years old) is doing better, but not very well; sample.uk (2 years old) is ranked poorly. My question is whether it's possible that Google will rank both stores well in the future?
Thanks
Vaidas
-
Hi there,
They are both competing for the same search terms. Most likely, one will never outrank the other, and you have double the work to do. The duplicate content may also be hurting both sites.
My advice is to create a second directory within the older, global site, then use hreflang annotations to tell Google the country targeting. That way you can use some of the leverage the older domain has.
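As a sketch of those hreflang annotations (the URLs below are placeholders, not real pages), each page variant would reference itself and its alternates in the <head>:

```html
<!-- On both https://sample.com/widgets and https://sample.com/uk/widgets -->
<link rel="alternate" hreflang="en-us" href="https://sample.com/widgets" />
<link rel="alternate" hreflang="en-gb" href="https://sample.com/uk/widgets" />
<!-- x-default: fallback for searchers outside the targeted regions -->
<link rel="alternate" hreflang="x-default" href="https://sample.com/widgets" />
```

Note the annotations must be reciprocal: every URL in the set lists every other URL, including itself.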
I'd do: sample.com (global) and sample.com/uk/ (for the UK). Here are some resources:
International SEO - Moz.com
FAQ: Internationalisation - Google WMT central
The International SEO Checklist - Moz Blog
Best of luck.
GR.
Related Questions
-
Keyword in Domain AND Title. Yes or No?
We're working on a new buildout, and this one is really important to us. We've put a lot of resources into it. Before we launch, we want the structure to be just right... and this one question is nagging at me: how should we structure the URLs? Consider these two options. The fictitious domain is "icesurfing.org". Including all 50 states in the keyword, there are nearly one million searches per month for "ice surfing [state]". We have a page for each state to focus on this traffic. But how would you structure the URLs and titles?
1. icesurfing.org/state
2. icesurfing.org/ice-surfing-state
One concern is that the duplicate keywords in option 2 seem redundant, and a little spammy. When presented in Google search, the matching titles are not as clean:
Texas - IceSurfing.org
Ice Surfing Texas - IceSurfing.org
But Yoast automatically suggests option 2. Is this really the best practice? Is there a definitive article on this? THANK YOU!
On-Page Optimization | RetBit0
-
Some Content The Same
Hello. I am about to publish some landing pages that target different industries we are trying to market to:
X for Accountants
X for Financial Advisors
X for Fitness Trainers
X for X
While a good portion of the content on each page is unique ("the benefits of using X for accountants"), some of it is duplicated: the part that explains more about how our software works (the features) will be the same on every page. Is this considered duplicate content? What should I be aware of in terms of Google rankings and penalties? Thanks,
David
On-Page Optimization | smithandco0
-
Can Robots.txt on Root Domain override a Robots.txt on a Sub Domain?
We currently have beta sites on subdomains of our own domain. We have had issues where people forget to change the robots.txt, and these non-relevant beta sites get indexed by search engines (a nightmare). We are going to move all of these beta sites to a new domain that disallows everything at the root. If we put fully configured robots.txt files on these subdomains (ready to go live and open for crawling by the search engines), is there a way for the robots.txt on the root domain to override the robots.txt on these subdomains? Apologies if this is unclear. I know we can handle this relatively easily by changing the robots.txt on the subdomain when going live, but due to a few instances where people have forgotten, I want to reduce the chance of human error! Cheers, Dave.
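For reference, crawlers request robots.txt separately for each host, subdomains included, so a file at the root domain cannot override one served by a subdomain. A disallow-all robots.txt for the new beta domain would be a minimal sketch like this (the hostname is a placeholder):

```
# Served at https://beta-domain.example/robots.txt
# Applies only to this exact host; each subdomain
# answers for its own /robots.txt.
User-agent: *
Disallow: /
```

Because of this per-host scoping, each beta subdomain still needs its own blocking file until launch.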
On-Page Optimization | davelane.verve0
-
Removing old URLs from Google
We rebuilt a site about a year ago on a new platform; however, Google is still indexing URLs from the old site that we have no control over. We had hoped that time would have 'cleaned' these out, but they are still being flagged in HTML Improvements in GWT. Is there anything we can do to get these 'external' URLs to drop out of the index, given that they are still being picked up after a year?
On-Page Optimization | Switch_Digital0
-
Why do I have such drastic differences in my rankings?
I work for an e-commerce site that has quite a few categories. Some of these categories rank really well and others don't rank at all. What causes such a drastic difference? I understand that there are a lot of factors in ranking, so I am not asking why I rank #1 for some keywords and #2 for others. What I am trying to figure out is why I rank #1 for some keywords and don't appear in the first ten pages for others. The pages and optimization are the same. Why wouldn't a page rank at all?
On-Page Optimization | EcommerceSite0
-
Why isn't our site shown on the first page of Google for a query of the exact domain, when its pages are indeed indexed by Google?
When I type our domain.com as a query into Google, I only see one of our pages on the first results page, and it's in 4th position. It seems, though, that all pages of the site are indexed by Google when I type in the query "site:domain.com". There was an issue at the site launch where the robots.txt file was left active for around two weeks. Would this have been responsible for the fact that another domain ranks #1 when we type in our own domain? It has been around a couple of months now since the site was launched. Thanks in advance.
On-Page Optimization | featherseo0
-
Geo-targeted content and SEO?
I am wondering what effect geo-targeted "cookie cutter" content has on SEO. For example, one might have a list of "Top US Comedians", which appears as "Top UK Comedians" for users from the United Kingdom. The data would be populated from a database in both cases, but would be completely different for each region, with the exception of a few words. Is this essentially giving Google's (US-based) crawler different content from what users see? I know that plenty of sites do it, but is it legitimate? Would it be better to redirect to a unique page based on location, rather than change the content of one static page? I know what the logical SEO answer is here, but even some of the big players use the "wrong" tactic. I am very interested to hear your thoughts.
On-Page Optimization | HalogenDigital0
-
Exact Match Domains
Hi All
Which strategy from below would be best for the purchase of an exact match domain?
1) www.shiny-blue-widgets.com
2) www.shinybluewidgetsshop.com
Wondered if there is much difference between the two, as I know both have plus and minus points.
On-Page Optimization | PerchDigital0