International web site - duplicate content?
-
I am looking at a site offering different language options via a JavaScript drop-down chooser. Will Google flag this as duplicate content? Should I recommend the purchase of individual domains for each country, e.g. a .uk domain for the UK?
-
To avoid duplicate content you need to use the rel="alternate" hreflang="x" annotation. You don't have to buy separate domains; there are several ways you can organise your website:
ccTLDs - [example.ie]
Pros
- Clear geotargeting
- Server location irrelevant
- Easy separation of sites
Cons
- Expensive (and may have limited availability)
- Requires more infrastructure
- Strict ccTLD requirements (sometimes)
Subdomains with gTLDs [de.example.com]
Pros
- Easy to set up
- Can use Webmaster Tools geotargeting
- Allows different server locations
- Easy separation of sites
Cons
- Users might not recognize geotargeting from the URL alone (is “de” the language or country?)
Subdirectories with gTLDs [example.com/de/]
Pros
- Easy to set up
- Can use Webmaster Tools geotargeting
- Low maintenance (same host)
Cons
- Users might not recognize geotargeting from the URL alone
- Single server location
- Separation of sites harder
URL parameters [site.com?loc=de]
Not recommended.
Cons
- URL-based segmentation difficult
- Users might not recognize geotargeting from the URL alone
- Geotargeting in Webmaster Tools is not possible
The content itself can even be largely the same: for example, if you have a version for the UK and another for the US, the two will be very similar, and the hreflang annotation tells Google they are alternates rather than duplicates.
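As a sketch of the hreflang annotation mentioned above (the URLs here are hypothetical), paired UK/US pages would each carry the same set of link tags in their head:

```html
<!-- Place the same annotations on both the UK and US versions of the page -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<!-- Optional fallback for visitors who match neither region -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Each version points at all alternates, including itself, which is how Google ties the set together.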
Hope this helps
-
It's been my experience that if you're going to have additional languages on your site, you can structure the URLs in one of a few ways: a country-code domain (example.de), a subdomain (de.example.com), or a subdirectory (example.com/de/).
Then you can place the content in the respective languages at those URLs. They won't necessarily be seen as duplicate content, because semantics change as you translate into different languages. For the same reason, Google won't necessarily penalize you if you keep everything on site.com and just switch the language.
I worked with several sites that translated content from English to Spanish or English to Hebrew, and they were never once penalized for duplicate content.
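For translated versions like the English/Spanish/Hebrew sites described above, the same hreflang annotation applies, using bare language codes rather than language-country pairs (URLs here are hypothetical):

```html
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<link rel="alternate" hreflang="he" href="https://example.com/he/" />
```

This makes the relationship between the translations explicit rather than relying on Google to infer it.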