International SEO and duplicate content: what should I do when hreflangs are not enough?
-
Hi,
A follow up question from another one I had a couple of months ago:
My hreflang tags have been in place for almost 2 months now. Google recognises them well and GSC is clean (no hreflang errors).
Though I've seen some positive changes, I'm quite far from sorting out that duplicate content issue completely, and some entire sub-folders remain hidden from the SERPs.
I believe this happens for two reasons:
1. Fully mirrored content - as per the link to my previous question above, some parts of the site I'm working on are 100% identical. It's quite a "gravity issue", as there is nothing I can do to fix the site architecture or to get bespoke content in place.
2. Sub-folder "authority". I'm guessing that Google prefers some sub-folders over others due to their legacy traffic/history, meaning that even with hreflang in place, the older sub-folder ranks over the right one because Google believes it provides better results to its users.
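For reference, "hreflangs in place" for the folders discussed here would mean every page in the cluster listing every language/region alternate, including itself (the return tags). A minimal sketch, using site.com as a stand-in for the real domain:

```python
# Language/region alternates for one page cluster, as discussed above.
# Each page in the cluster must output this full set, including itself.
ALTERNATES = {
    "fr": "/fr/",
    "fr-BE": "/be/fr/",
    "nl": "/nl/",
    "nl-BE": "/be/nl/",
}

def hreflang_tags(base="https://site.com"):
    """Render the hreflang <link> tags for the cluster."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{base}{path}" />'
        for code, path in ALTERNATES.items()
    )

print(hreflang_tags())
```

If GSC reports no hreflang errors, a set equivalent to this is presumably already being emitted.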
Two questions follow from these reasons:
1. Is the latter correct? Am I guessing correctly re sub-folder "authority" (if such a thing exists), or am I simply wrong?
2. Can I solve this using canonical tags?
Instead of trying to fix and "promote" hidden sub-folders, I'm thinking to actually reinforce the results I'm getting from stronger sub-folders.
E.g. if a user based in Belgium Googles something related to my site, the site.com/fr/ sub-folder shows up instead of the site.com/be/fr/ sub-sub-folder.
Or if someone based in Belgium searches in Dutch, they get site.com/nl/ results instead of the site.com/be/nl/ sub-sub-folder.
Therefore, I could canonicalise /be/fr/ to /fr/ and do something similar for that second one.
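To make the proposal concrete, the mapping being asked about could be sketched like this (this only illustrates the question, not a recommendation; site.com and the folder pairs are the examples above):

```python
# The proposed canonical mapping: each geo sub-sub-folder points its
# canonical at the stronger language-level folder discussed above.
CANONICAL_MAP = {
    "/be/fr/": "/fr/",
    "/be/nl/": "/nl/",
}

def canonical_tag(path, base="https://site.com"):
    """Return the canonical <link> tag for a page at `path`.
    Unmapped folders stay self-canonical."""
    target = CANONICAL_MAP.get(path, path)
    return f'<link rel="canonical" href="{base}{target}" />'

print(canonical_tag("/be/fr/"))  # points at /fr/
print(canonical_tag("/de/"))     # self-canonical
```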
I'd prefer traffic to come to the right part of the site for tracking and analytics reasons. However, instead of trying to move mountains by changing Google's behaviour (if I could ever do that?), I'm thinking of reinforcing the current flow (also because it's not completely wrong, as it brings traffic to pages featuring the correct language no matter what).
That second question is the main reason why I'm looking for the Moz community's advice: am I going to damage the site badly by using canonical tags that way?
Thank you so much!
G -
Apologies for the delay coming back to you - Christmas didn't help.
And thanks for your answer; I will give this specific use of canonical a shot starting with small subsets of the site and monitor the impact on my ranking first.
Another question, on top of its impact on the site, is whether it's worth the effort.
But I guess I'll only find out by trying. -
1. Is the latter correct? Am I guessing correctly re sub-folder "authority" (if such a thing exists), or am I simply wrong?
Your two points are valid ones. I don't want to say correct, as in that is the cause for sure, but in my experience the age of content does play a role in which version Google picks when content is duplicated.
2. Can I solve this using canonical tags?
Canonicals can go wrong with hreflang, but it isn't a bad idea if you get it right. However, you know your content and your users better than we do.
Another possible solution to help everything is to detect the user's location and ASK (don't redirect on IP alone) if they prefer to see that location's content. This will encourage the sharing of all of your content over time.
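The "detect and ask" idea can be sketched as follows (a minimal illustration: `FOLDER_BY_COUNTRY` and the country codes are hypothetical, and the GeoIP lookup that produces `visitor_country` is assumed to exist elsewhere):

```python
# Suggest, never redirect: compare the visitor's likely country with the
# folder they landed on, and only offer the local version in a banner.
FOLDER_BY_COUNTRY = {
    "BE": "/be/fr/",  # hypothetical defaults for illustration
    "CH": "/ch/de/",
}

def banner_suggestion(current_folder, visitor_country):
    """Return a folder to suggest in an on-page banner, or None.
    The page itself is never redirected, so users (and Googlebot)
    always get the URL they asked for."""
    local = FOLDER_BY_COUNTRY.get(visitor_country)
    if local and local != current_folder:
        return local
    return None
```

A Belgian visitor on /fr/ would see a banner offering /be/fr/; a visitor already on their local folder, or from an unmapped country, sees nothing.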
But if I am completely realistic, nothing is going to show up perfectly if you are trying to geo-target without actual geo-targeted content. Sometimes you just need to tell the business owners who made this decision that opening a shop in another country and trying to act like a local business with zero changes to the content just isn't going to work out for every business in every country.
-
Great, thanks for your reply!
How should I use canonical tags though?
I assume that blindly canonicalising parts of the site would be pretty silly.
As in, I've pulled analytics reviewing the volume of page views for an entire sub-folder against the sub-folder it could be canonicalised to. E.g.:
site.com/fr/ gets 100k visits.
site.com/be/fr/ gets 1k visits.
Therefore it should be canonicalised, as it receives very low traffic (1% of /fr/).
site.com/de/ gets 100k visits.
site.com/ch/de/ gets 50k visits.
Therefore it should not be canonicalised, as it receives a fair bit of traffic (50% of /de/).
Or does it not matter, and should both sub-folders be canonicalised no matter what?
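The rule of thumb being proposed could be written down like this (the 5% cut-off is an arbitrary illustration, not an established best practice):

```python
def should_canonicalise(sub_visits, parent_visits, threshold=0.05):
    """Canonicalise a geo sub-folder to its language folder only when it
    draws a small fraction of the parent's traffic. The 5% default is an
    arbitrary illustrative threshold."""
    return sub_visits / parent_visits < threshold

print(should_canonicalise(1_000, 100_000))   # /be/fr/ at 1% of /fr/
print(should_canonicalise(50_000, 100_000))  # /ch/de/ at 50% of /de/
```

Under that rule, /be/fr/ (1%) would be canonicalised and /ch/de/ (50%) would not, matching the two examples above.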
-
Hi - Pages have authority, and this forms part of the domain authority. And yes, use canonical tags to avoid being penalised for duplicate content.