Multiple Countries, Same Language: Receiving Duplicate Page & Content Errors
-
Hello!
I have a site that serves three English-speaking countries, using subfolders for each country version:
- United Kingdom: https://site.com/uk/
- Canada: https://site.com/ca/
- United States & other English-speaking countries: https://site.com/en/
The site version displayed depends on where the user is located, and users can also change the country version via a drop-down flag element in the navigation bar. If a user switches versions using the flag, the first URL of the new country version has a "?language=" parameter appended to it.
In the Moz crawl diagnostics report, this site is getting dinged for lots of duplicate content because the crawler is finding both versions of each country's site, with and without the language parameter.
However, the site has rel="canonical" tags set up on both URL versions and none of the URLs containing the "?language=" parameter are getting indexed.
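For reference, a canonical setup like the one described would look something like this in the head of the parameterized UK page (the exact URLs here are illustrative):

```html
<!-- In the <head> of https://site.com/uk/?language=en (illustrative URL) -->
<!-- Both the parameterized and clean versions point at the clean URL -->
<link rel="canonical" href="https://site.com/uk/">
```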
So...my questions:
1. Are the Duplicate Title and Content errors found by the Moz crawl diagnostic really an issue?
2. If they are, how can I best clean this up?
Additional notes: the site currently has no sitemaps (XML or HTML), and is not yet using the hreflang tag. I intend to create sitemaps for each country version, like:
- .com/en/sitemap.xml
- .com/ca/sitemap.xml
- .com/uk/sitemap.xml
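As a sketch, each country sitemap would simply list that subfolder's URLs. For example, .com/uk/sitemap.xml might start like this (the product URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://site.com/uk/</loc>
  </url>
  <url>
    <loc>https://site.com/uk/products/example-product/</loc>
  </url>
</urlset>
```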
I thought about putting a nofollow on the flag navigation links, but since no sitemaps are in place I didn't want to accidentally cut off crawler access to the alternate versions.
Thanks for your help!
-
Yep, given your resource constraints, I'd focus on translations for now. If you ever get to the point where something bigger than price differentiates your content, then you can think about geo-targeting. You will need the resources to differentiate the content, though.
Right now, my recommendation is to drop the country-specific content and just offer English for now. Your content can rank for any English-language search, regardless of country. However, if the terms people use in the US, UK and Canada differ that much, you can "translate" the content (en-us, en-gb, en-ca) and use the HREFLANG tag.
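For illustration, hreflang annotations for those three variants could go in the head of every page (the URLs map to your subfolders; note that each page must list the full set of alternates, including itself):

```html
<link rel="alternate" hreflang="en-us" href="https://site.com/en/">
<link rel="alternate" hreflang="en-gb" href="https://site.com/uk/">
<link rel="alternate" hreflang="en-ca" href="https://site.com/ca/">
<!-- Optional catch-all for all other English-speaking visitors -->
<link rel="alternate" hreflang="x-default" href="https://site.com/en/">
```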
For price changes, that's trickier, but do you offer the price in search results via schema? Does it show up? If not, then you can use cookies to set the prices depending on the country the person chooses (try not to use IP address, and if you do, make people confirm the setting).
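As a rough sketch of that cookie approach (the country codes, helper names, and cookie format are all made up for illustration, not tied to any framework): store only an explicit user choice, and fall back to a default instead of silently trusting IP geolocation.

```python
# Minimal sketch of currency selection via an explicit user choice.
# Country codes and the Set-Cookie format are illustrative.

SUPPORTED = {"us": "USD", "ca": "CAD", "uk": "GBP"}
DEFAULT_COUNTRY = "us"

def currency_for(country_choice):
    """Return the currency for an explicitly chosen country,
    falling back to the default for unknown or missing values."""
    country = (country_choice or DEFAULT_COUNTRY).lower()
    return SUPPORTED.get(country, SUPPORTED[DEFAULT_COUNTRY])

def set_country_cookie(country_choice):
    """Build a Set-Cookie header value persisting the user's choice."""
    country = (country_choice or DEFAULT_COUNTRY).lower()
    if country not in SUPPORTED:
        country = DEFAULT_COUNTRY
    return f"country={country}; Path=/; Max-Age=31536000"
```

The key point is that the price shown never depends on anything the crawler can't see consistently: bots with no cookie get the default country's prices, same as any first-time visitor.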
For now, focus your time and effort on getting the flow right for the user. Only worry about HREFLANG if your English content needs to be differentiated by term usage. Then focus your efforts on getting those upcoming translations right. When that is ready, then really use HREFLANG.
Hope that helps!
-
Hi Kate,
Nifty quiz and flowchart! Thanks for sharing it. All the countries targeted are English-speaking, though further expansion to non-English speaking countries is planned for 2015. Here are the answers to the questions:
1. Does your business/product/content change in different countries?
A: Not really. 90% of the products are available in all three countries; one country currently lacks the remaining 10% but will start selling those products in 2015.
2. Would it make sense to an international visitor to see different site content? (ex. currency, localization, etc.)
A: Currency - yes. Otherwise, not really.
3. Do you have the resources to differentiate the content?
A: Not currently. This is a set of branded products, and the product descriptions use extensive "on-brand" language.
4. Are there multiple official languages for any of these countries?
A: Yes, Canada's official languages are English and French. There is no French version currently available.
5. Do you plan on offering the site content in all official languages?
A: Next six months - no. Late 2015 - maybe.
Going through the quiz, if I answer:
1. No, 2. Yes, 3. No
This is the recommendation:
Your International Strategy is:
Translate Only
- Don’t machine translate; while manual translation is costly, it’s the best option for your goals.
- Put your HREFLANG in XML sitemaps.
- Use the Language Meta tag for Bing translation targeting.
- Don’t use a ccTLD. That is for Geo-Targeting only.
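To make the second and third bullets concrete (URLs illustrative): in an XML sitemap, hreflang goes on each URL entry via the xhtml namespace, and every entry lists all of its alternates, including itself:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://site.com/uk/</loc>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://site.com/uk/"/>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://site.com/en/"/>
    <xhtml:link rel="alternate" hreflang="en-ca" href="https://site.com/ca/"/>
  </url>
</urlset>
```

The language meta tag for Bing, by contrast, lives in each page's head, e.g. `<meta http-equiv="content-language" content="en-gb">`.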
Aside from the manual translation portion, do you think #2 and #3 are still the best solutions for this situation?
Thanks for your help!
-
Hi!
This is a tough one because I can't tell if you mean to geo-target or to translate. It's not normally a one-or-the-other thing, but it usually is when you're only targeting English-speaking countries. Can you do me a favor and go to http://www.katemorris.com/issg/ and go through the questions there? Let me know what the "answer" is for your situation and I'll help you get to the right solution.
But in short, yes, the duplicate content is a real issue with or without the lang parameter.
Let me know!
-
Oh, this is a tough one. The problem is that no matter the tags and language settings, the content is the same. It is flagged as duplicate content because it is duplicate content. Duplicate content within your site is serious, especially if you are trying to target keywords on those pages.
The hreflang tags should help you serve the right language versions without creating so many duplicate pages. I don't have much experience with that tag, but my advice would be to look into it further to help with your duplicate content issue. Nofollowing the duplicate pages will ultimately affect their rankings, so that probably isn't the best thing to do.