Duplicate Terms of Use and Privacy Policy, is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does that amount to a duplicate content issue? Does it affect my websites in any manner?
Regards
-
Duplicate content is one of many hundreds of factors. If you have a very well-crafted, highly optimized site with a very strong inbound link profile, and only a couple of pages (ones that are not highly relevant to your primary topical focus) are duplicates, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but the flaw does not apply to the whole site, that factor is going to have minimal impact on the overall site.
-
You can do almost anything you wish on a "noindex" tagged page. You are telling the search engine bot to exclude the page from the search index, so the page should not affect your rankings.
The reason your site's number of pages is a factor is that your overall site is viewed as a whole. If you have a basic site with 10 pages and one of them has duplicate content, then 10% of your site is affected, and that can influence how the search engine views your site. If your site hosted a forum with 10,000 pages, that one page would represent only 0.01% of your site, so the impact would not have any real effect.
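The proportion argument above is easy to check with a line of arithmetic; the two site sizes below are just the examples from the answer:

```python
# Fraction of a site affected by a single duplicate page,
# using the two example site sizes from the answer above.
for total_pages in (10, 10_000):
    affected_share = 1 / total_pages
    print(f"{total_pages:>6} pages: {affected_share:.2%} of the site is duplicate")
```

The same one duplicate page is 10.00% of a 10-page site but only 0.01% of a 10,000-page site.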
-
Thanks for the helpful reply, Alan! Can you please explain this: "If it's only a few pages, sure, duplicate content there could have an impact." How do duplicate content issues vary between small and big sites? I was under the impression that the number of pages has no influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I apply noindex,follow to them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily built around the keyword phrases that either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, instead of blocking completely in the robots.txt file).
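For reference, the robots meta tag described above goes in the page's head section; a minimal sketch (the surrounding markup is illustrative):

```html
<head>
  <!-- Keep this page out of the index, but let the bot follow its links -->
  <meta name="robots" content="noindex,follow">
</head>
```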
-
Thanks for the reply, Ryan! But if I don't block it via the robots.txt file or a noindex tag, will it affect my site negatively? I mean the overall site.
-
I would recommend blocking pages such as privacy policy, terms of use, legal, etc. It is unlikely these pages would ever bring traffic to your site. Even if they did, it is not going to be the quality traffic you desire.
In robots.txt you can add:
User-agent: *
Disallow: /pages/privacy/
(A Disallow line needs to sit under a User-agent line to apply.) Substitute your own path for /pages/privacy/.
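If you want to confirm a rule like this behaves as intended before deploying it, Python's standard-library robotparser can evaluate it locally. The domain and paths below are placeholders for your own site:

```python
from urllib import robotparser

# The rule suggested above, under a wildcard user-agent group.
rules = [
    "User-agent: *",
    "Disallow: /pages/privacy/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# example.com and the paths are placeholders.
print(rp.can_fetch("*", "https://example.com/pages/privacy/"))  # False
print(rp.can_fetch("*", "https://example.com/about/"))          # True
```

This is only a local sanity check of the rule's matching behavior; it does not tell you how any particular crawler will treat the file.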