How to Handle International Duplicate Content?
-
Hi,
We have multiple international e-commerce websites. Usually our content is translated and the sites don't interfere with each other, but how do search engines react to duplicate content on different TLDs?
We have copied our Dutch (NL) store for Belgium (BE) and I'm wondering if we could be inflicting damage on ourselves...
Should I use hreflang annotations on every page? Are there other options so we can be sure that our websites aren't conflicting? Are they conflicting at all?
Alex
-
Hi Alexander,
The hreflang link in the header is probably the best way to do it. As to whether it is impacting you, that depends to some degree on how much duplicate content there is. If you have set up both sites in Google Webmaster Tools with separate sitemaps, you can keep an eye on how well both sites are being indexed; a lot of unindexed pages on one or the other might indicate a problem. If in doubt, best practice would be to put in the hreflang annotations, as you mention.
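A minimal sketch of what such annotations could look like for an NL/BE pair (the domains here are hypothetical placeholders, not the asker's real sites):

```html
<!-- In the <head> of both the NL page and its BE twin.
     example.nl / example.be are hypothetical URLs. -->
<link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/product" />
<link rel="alternate" hreflang="nl-be" href="https://www.example.be/product" />
<link rel="alternate" hreflang="x-default" href="https://www.example.nl/product" />
```

Note that every page in the set lists all variants, including itself, and the annotations must be reciprocal: if the NL page points at the BE page, the BE page must point back, or search engines may ignore the markup.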
Related Questions
-
How to deal with 100s of Wordpress media link pages, containing images, but zero content
I have a Wordpress website with well over 1000 posts. I had an SEO audit done and it was highlighted that every post had clickable images. If you click the image, a new webpage opens containing nothing but the image. I was told these image pages with zero content are very bad for SEO and that I should get them removed. I have contacted several Wordpress specialists on People Per Hour. I have basically been offered two solutions. 1 - return a 404 for all these image pages, so they are not found by Google. 2 - redirect each image page to the main post page the image is from. What's my best option here? Is there a better option? I don't care if these pages remain, providing they are not crawled by Google and classified as spam etc. All suggestions greatly received!
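If the parent-post redirect (option 2) is chosen, one common sketch is a small snippet in the theme's functions.php. The function names below are real WordPress APIs, but this is just one way to do it, assuming images are attached to their posts in the usual way:

```php
<?php
// Redirect bare attachment pages to the post the image belongs to,
// falling back to the home page for orphaned attachments.
add_action( 'template_redirect', function () {
    if ( is_attachment() ) {
        $parent_id = wp_get_post_parent_id( get_the_ID() );
        $target    = $parent_id ? get_permalink( $parent_id ) : home_url( '/' );
        wp_safe_redirect( $target, 301 );
        exit;
    }
} );
```

A 301 here preserves any link equity the image pages have accumulated, which a plain 404 would throw away.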
Web Design | xpers0 -
Redirects Not Working / Issue with Duplicate Page Titles
Hi all,
We are being penalised in Webmaster Tools and Crawl Diagnostics for duplicate page titles and I'm not sure how to fix it. We recently switched from HTTP to HTTPS, but when we first switched over, we accidentally set a permanent redirect from HTTPS to HTTP for a week or so(!). We now have a permanent redirect going the other way, HTTP to HTTPS, and we also have canonical tags in place pointing to HTTPS. Unfortunately, it seems that because of this short time with the permanent redirect the wrong way round, Google is confused and sees our HTTP and HTTPS sites as duplicate content. Is there any way to get Google to recognise this new (correct) permanent redirect and completely forget the old (incorrect) one? Any ideas welcome!
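For reference, a typical Apache setup for the correct direction (site-wide 301 from HTTP to HTTPS) looks like the following. This is a generic sketch, not the asker's actual configuration:

```apache
# .htaccess — permanently redirect all HTTP requests to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Once the redirect is consistently in place, Google drops the old signal as it recrawls; there is no switch to make it forget faster, though resubmitting the HTTPS sitemap can speed recrawling.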
Web Design | HireSpace0 -
Fixing my sites problem with duplicate page content
My site has a problem with duplicate page content. SEOmoz is telling me 725 pages' worth. I have looked a lot into the 301 redirect and the rel=canonical tag and I have a few questions: First of all, I'm not sure which one I should use in this case. I have read that the 301 redirect is the most popular path to take. If I take this path, do I need to go in and change the URL of each of these pages, or does it automatically change within the redirect when I plug in the old URL and the new one? Also, do I need to just go to each page that SEOmoz is telling me is a duplicate and make a redirect of that page? One thing that I am very confused about is the fact that some of these duplicates listed are actually different pages on my site. So does this just mean the URLs are too similar to each other, and therefore need the redirect to fix them? Then on the other hand I have a login page that says it has 50 duplicates. Would this be a case in which I would use the canonical tag and would put it into each duplicate so that the SE knew to go to the original file? Sorry for all of the questions. Thank you for any responses.
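For the login-page case described above, a canonical tag is a one-line addition to each duplicate's head; the URL here is a hypothetical placeholder:

```html
<!-- On every duplicate/parameterized login URL,
     point search engines at the one canonical version -->
<link rel="canonical" href="https://www.example.com/login" />
```

Unlike a 301, the visitor stays on the page they requested; only search engines consolidate the duplicates' ranking signals onto the canonical URL, which is why it suits cases like login pages that must remain reachable at many URLs.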
Web Design | JoshMaxAmps0 -
How To Avoid Duplicate Content
We are an eCommerce site for autoparts. It is basically impossible to avoid duplicate content, and I think we are getting penalized by Google for it. Here is why it is impossible. Let's say I sell a steering rack for a 2000 Honda Accord. I need an SEO-rich page for 2000 Honda Accord Steering Rack. I sell steering racks for more than 25 years of Honda Accords. I can try to make the copy different, but there is no way to spin the copy that many times and make it seem like it is not duplicate copy. This gets even more complicated because I sell hundreds of parts for each year of a Honda Accord, plus a lot of times you even have to go down to the engine size of the car for the right part. I can't use a redirect, i.e. a 301 redirect, because they are not the same pages. One is for a 2000 Honda Accord and the other a 2001 Honda Accord, and so on. Is there a redirect out there that I do not know about that would help me out in this case? Also, if there is no way around this and I am getting penalized, would it be better to eliminate all these pages, possibly losing my ability to rank high on searches such as "2000 Honda Accord Steering Rack," and just replace them with a page that has Year, Make, Model, and Part dropdowns which just takes the customer to a checkout page?
Web Design | joebuilder0 -
Multiple Sites, multiple locations similar / duplicate content
I am working with a business that wants to rank in local searches around the country for the same service. So they have websites such as OURSITE-chicago.com and OURSITE-seattle.com -- All of these sites are selling the same services, but with small variations in each state due to different legal standards in the state. The current strategy is to put up similar "local" websites with all the same content. So the bottom line is that we have a few different sites with the same content. The business wants to go national and is planning a different website for each location. In my opinion the duplicate content is a real problem. Unfortunately the nature of the service makes it so that there aren't many ways to say the same thing on each site 50 times without duplicate content. Rewriting content for each state seems like a daunting task when you have 70+ pages per site. So, from an SEO standpoint we have considered: Using the canonicalization tag on all but the central site... I think this would hurt all of the websites' SERPs because none will have unique content. Having a central site with directories OURSITE.com/chicago -- but this creates a problem because we need to link back to the relevant content in the main site and ALSO have the unique "Chicago" content easily accessible to Chicago users while having Seattle users able to access their Seattle data. The best way we thought to do this was using a frame with a universal menu and a unique state-based menu... Also not a good option, because frames will also hurt SEO. Rewrite all the same content 50 times. You can see why none of these are desirable options. But I know that plenty of websites have "state maps" on their main site. Is there a way to accomplish this in a way that doesn't make our copywriter want to kill us?
Web Design | SysAdmin190 -
Google Bot cannot see the content of my pages
When I go to Google Webmaster Tools and type any URL from the site http://www.ccisolutions.com into the "Fetch as Googlebot" feature, and then click the link that says "success," Googlebot is seeing my pages like this: <code>HTTP/1.1 200 OK
Date: Tue, 26 Apr 2011 19:11:50 GMT
Server: Apache/2.2.6 (Unix) mod_ssl/2.2.6 OpenSSL/0.9.7a DAV/2 PHP/5.2.4 mod_jk/1.2.25
Set-Cookie: CCISolutions-UT-Status=66.249.72.55.1303845110495128; path=/; expires=Thu, 25-Apr-13 19:11:50 GMT; domain=.ccisolutions.com
Last-Modified: Tue, 28 Oct 2008 14:36:45 GMT
ETag: "314b26-5a-2d421940"
Accept-Ranges: bytes
Content-Length: 90
Keep-Alive: timeout=15, max=99
Connection: Keep-Alive
Content-Type: text/html</code> Any clue as to why this could be happening?
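One telling detail in the fetch above is `Content-Length: 90` -- only 90 bytes of body, far too small for a real page, which suggests Googlebot is being served a near-empty response. A quick way to spot this when comparing fetches is to parse the header block; the `raw_headers` string below is an abridged copy of the headers in the question:

```python
# Abridged copy of the headers shown in the question above.
raw_headers = """HTTP/1.1 200 OK
Content-Type: text/html
Content-Length: 90"""

def body_size(header_block: str) -> int:
    """Return the Content-Length from a raw HTTP header block, or -1 if absent."""
    for line in header_block.splitlines()[1:]:  # skip the status line
        name, _, value = line.partition(":")
        if name.strip().lower() == "content-length":
            return int(value.strip())
    return -1

size = body_size(raw_headers)
print(size)          # 90
print(size < 1000)   # True: suspiciously small for a full HTML page
```

If the same URL returns a much larger body for a normal browser user agent, the server is treating Googlebot differently (often cookie- or user-agent-based logic), which matches the symptom described.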
Web Design | danatanseo0 -
Best way to handle related content links in a sidebar?
My site contains tens of thousands of articles, studies, multimedia files, biographies, etc. To assist users with finding content that might be related to the page they're on, I use a sidebar with 'also of interest' links to other, similar content on my site. This is, of course, pretty standard practice. Search engines -- Google in particular -- index these pages and then include the text in the sidebar links in search results. So, for example, on a given page I may have 20 links to related content, and the text in those links might be, 'A story about subject ABC.' When I search for 'A story about subject ABC,' Google returns not only the page titled (and containing the content) 'A story about subject ABC.' but also every page that links to it and happens to have that link text in the sidebar. What is the proper way to handle this kind of thing?
Web Design | smorrison0 -
What is the optimal URL Structure for Internal Pages
Is it more SEO-friendly to have an internal page URL structure that reads like www.smithlawfirm.com/personal-injury/car-accidents or www.smithlawfirm.com/personal-injury-car-accidents? The former structure has the benefit of showing Google all the sub-categories under personal injury; the latter, the benefit of a flatter structure. Thanks
Web Design | rarbel0