Is this duplicate content?
-
My client has several articles and pages that each resolve at two different URLs.
For example:
is the same article as:
I wasn't sure whether this counts as duplicate content, or whether I should be adding "/article.cfm" to the robots.txt file...
If anyone could help me out, that would be awesome!
Thanks
-
Agreed - although I think a 301 redirect or canonical tag implementation would probably be OK. If there's a database lookup that can translate the DocID into a URL string, the canonical is easy (I write some CF code, so I can at least tell you it's doable). Keep in mind that "article.cfm" is only one template, so if you can find a data-driven solution, it's just as easy for 1,000 pages as it is for 10.
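A minimal sketch of what that data-driven canonical could look like in the article.cfm template. The "documents" table, its doc_id and friendly_url columns, the "site" datasource, and the domain are all placeholders for whatever your CMS actually uses:

```cfml
<!--- Sketch only: "documents", "doc_id", "friendly_url", and the
      "site" datasource are assumed names - swap in the real CMS
      schema. --->
<cfquery name="qDoc" datasource="site">
    SELECT friendly_url
    FROM   documents
    WHERE  doc_id = <cfqueryparam value="#url.intDocID#" cfsqltype="cf_sql_integer">
</cfquery>
<cfif qDoc.recordCount>
    <cfoutput><link rel="canonical" href="https://www.example.com/#qDoc.friendly_url#"></cfoutput>
</cfif>
```

Because article.cfm is a single template, dropping something like this into its head section covers every article at once.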
You could also create a dynamic 301 redirect via <cfheader> - the core logic is the same. Basically, you look up the URL from the DocID and dynamically create the header. You just need someone who understands your CMS and data. The actual code is only a few lines, but understanding your setup is the time-consuming part.
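A sketch of that dynamic 301, under the same assumptions (hypothetical table, column, and datasource names):

```cfml
<!--- Sketch only: look up the friendly URL for this DocID and issue a
      301 before any page output. Schema names are assumptions. --->
<cfquery name="qDoc" datasource="site">
    SELECT friendly_url
    FROM   documents
    WHERE  doc_id = <cfqueryparam value="#url.intDocID#" cfsqltype="cf_sql_integer">
</cfquery>
<cfif qDoc.recordCount>
    <cfheader statuscode="301" statustext="Moved Permanently">
    <cfheader name="Location" value="https://www.example.com/#qDoc.friendly_url#">
    <cfabort>
</cfif>
```

The <cfabort> matters: it stops the template so the duplicate content is never rendered after the redirect headers are sent.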
-
ATMOS, those are just the same page, so a canonical tag should do it. But since you also want to stop Google indexing the duplicate, you could detect when the page is called via article.cfm and add a noindex meta tag as well - just make sure it isn't added when the friendly URL is used.
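One way that detection could be sketched in the template. A caveat: if your rewrite rules run friendly URLs through article.cfm internally, CGI.SCRIPT_NAME will report "article.cfm" for both versions, and you would need to inspect the originally requested URL instead:

```cfml
<!--- Sketch only: emit noindex when the request arrived via the
      article.cfm URL rather than the friendly one. May need to check
      the originally requested URL if friendly URLs rewrite internally
      to article.cfm. --->
<cfif findNoCase("article.cfm", CGI.SCRIPT_NAME)>
    <meta name="robots" content="noindex, follow">
</cfif>
```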
-
I mostly agree with kchan.
- It is considered duplicate content.
- Simplest way is to do rel canonical for the pages with ids.
However, I suspect a 301 redirect is not the best option here. In particular, if your website runs Omniture and/or Google Analytics tracking code, the redirect can skew the traffic those tools report.
Be careful if you choose that route.
-
Awesome Chan, thanks. That was my thought as well. The most difficult part will be figuring out how to get that script in place.
-
Any chance you can spend a little time writing it out?
My guess is that we should be putting a rel canonical tag on all the article.cfm?intDocID=22572-style pages, pointing the bots to /bc-blazes-construction-trail. But what's the easiest way to do that across the whole site?
-
Hello,
It sure is duplicate content. Putting "/article.cfm" into robots.txt won't solve it - that only blocks crawling of those URLs, it doesn't consolidate the duplicates. You need a permanent redirect. I had a brief look at the site and it seems there are over 1,000 pages. This might take a while, but it is necessary; otherwise your client's rankings will suffer and the site will most likely be penalised. A simple approach would be a canonical link on /article.cfm?intDocID=22572, so you show Google the main article is located at /bc-blazes-construction-trail.
However, the best way would of course be a 301 permanent redirect. I'm sure you could get a web dev to write a script that runs through the database and outputs the redirects, rather than manually redirecting 1,000+ pages. If it can't be done in-house, you could outsource it on freelancer.com for around $200-300.
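A one-off sketch of such a generator script: it dumps every DocID-to-slug pair into a text file a developer can turn into server-level redirect rules. Table, column, and datasource names are assumptions, not the client's actual schema:

```cfml
<!--- Sketch only: write one "old-URL new-URL" pair per line for every
      article in the assumed "documents" table. --->
<cfquery name="qAll" datasource="site">
    SELECT doc_id, friendly_url FROM documents
</cfquery>
<cfsavecontent variable="redirectMap"><cfoutput query="qAll">/article.cfm?intDocID=#doc_id# /#friendly_url#
</cfoutput></cfsavecontent>
<cffile action="write" file="#expandPath('redirect-map.txt')#" output="#redirectMap#">
```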
Thanks
-
That would definitely be considered duplicate content. There are a few things you can do to fix it, but rather than writing it all out here, I'd recommend visiting the link below for more detailed info: