Duplicate Terms of Use and Privacy Policy, is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does it amount to a duplicate content issue? Does it affect my websites in any way?
Regards
-
Duplicate content is one of many hundreds of factors. If you have a very well-crafted site, highly optimized, and with a very strong inbound link profile, but only a couple of pages (ones that are not highly relevant to your primary topical focus) are duplicated, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but the flaw does not apply to the whole site, that single factor will have minimal impact on the overall site.
-
You can do almost anything you wish on a "noindex" tagged page. You are telling the search engine bot to exclude the page from the search index, so the page should not affect your ranking.
The reason your site's number of pages is a factor is that your site is evaluated as a whole. If you have a basic site with 10 pages and 1 of them has duplicate content, then 10% of your site is affected, and that can influence how the search engine views your site. If your site hosted a forum with 10k pages, that 1 page would represent 0.01% of your site, so the impact would have no real effect.
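That proportion is just duplicate pages divided by total pages. A quick sketch of the two scenarios above (the function name is ours, purely for illustration):

```python
# Share of a site made up of duplicate-content pages
def duplicate_share(duplicate_pages: int, total_pages: int) -> float:
    return duplicate_pages / total_pages

# A 10-page site with 1 duplicate page: 10% of the site is affected
print(f"{duplicate_share(1, 10):.2%}")

# A 10,000-page forum with 1 duplicate page: only 0.01%
print(f"{duplicate_share(1, 10_000):.2%}")
```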
-
Thanks for the helpful reply, Alan! Can you please explain this: "If it's only a few pages, sure, duplicate content there could have an impact." How do duplicate content issues vary between small and big sites? I was under the impression that the number of pages does not have any influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I noindex,follow them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of keyword phrases that either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, instead of blocking completely in the robots.txt file).
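That meta tag is a one-line addition inside each policy page's head; a minimal sketch:

```html
<head>
  <!-- Keep this page out of the search index, but let crawlers follow its links -->
  <meta name="robots" content="noindex,follow">
</head>
```

Note that for the tag to be honored, the page must remain crawlable: if you also Disallow it in robots.txt, the crawler never fetches the page and never sees the tag.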
-
Thanks for the reply, Ryan! But if I don't block them via the robots.txt file or a noindex tag, will they affect my site negatively? I mean the overall site.
-
I would recommend blocking pages such as privacy policy, terms of use, legal, etc. It is unlikely these pages would ever bring traffic to your site, and even if they did, it would not be the quality traffic you desire.
In robots.txt you can add
Disallow: /pages/privacy/
substituting your site's actual path for /pages/privacy/.
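A Disallow rule only takes effect inside a User-agent group, so a complete robots.txt entry looks like this (the paths are examples; substitute your own):

```
User-agent: *
Disallow: /pages/privacy/
Disallow: /pages/terms/
```

Keep in mind that robots.txt blocking and the noindex meta tag behave differently: a disallowed URL can still appear in the index if other sites link to it, because the crawler never fetches the page and so never sees any noindex directive on it.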