Hotel affiliate website - noindex pages with little unique content?
-
We are well into development of a hotel affiliate website (using Expedia Affiliate Network), and I know there are many challenges to SEO when using an affiliate system - one of the biggest being how to handle duplicate content.
Outside of blog posts and static marketing pages, the majority of the textual content is contained in hotel descriptions. We will be creating unique descriptions over time, but we are a small team and this will be a lengthy process.
My question for you Mozzers is whether it's advisable, for ranking purposes, to noindex any page with mostly 'stock' content and only allow Google to index hotel pages with unique descriptions.
Thanks for any input!
-
whether it's advisable, for ranking purposes, to noindex any page with mostly 'stock' content and only allow Google to index hotel pages with unique descriptions?
This is what I would do.
In the past you could leave those pages open for indexing and Google would simply ignore them. Now you run the risk of tripping a Panda filter that will reduce rankings for the rest of your site.
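To make that concrete, here's a minimal sketch of the per-page decision (in Python, with illustrative names — this isn't any particular platform's API): pages still carrying stock feed copy get `noindex, follow`, so they stay out of the index but crawlers can still follow their links, while pages with unique descriptions are indexed normally.

```python
# Hypothetical sketch: choose the robots meta tag for a hotel detail
# page based on whether its description is still the stock feed copy.
# The function name and flag are illustrative, not a real API.

def robots_meta(has_unique_description: bool) -> str:
    """Return the robots meta tag to emit on a hotel detail page."""
    if has_unique_description:
        # Unique copy: let Google index the page normally.
        return '<meta name="robots" content="index, follow">'
    # Stock feed copy: keep the page out of the index, but let
    # crawlers follow its links so internal link equity still flows.
    return '<meta name="robots" content="noindex, follow">'
```

As descriptions get rewritten over time, each page simply flips from the `noindex` branch to the indexable one — no site-wide change needed.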
Good luck in this niche. Really really difficult. You need either enormous site strength or lots of unique substantive content (which you are doing).
-
That is exactly how I would handle it. If you get penalised for having large amounts of duplicate content, it can be a slow process to get Googlebot to recrawl the whole site and lift that penalty, particularly if it is a large site.
I'd also be looking at other ways to bring unique content into those pages beyond just descriptions.
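For example, structured data you already hold per hotel can be assembled into sections that differ on every page, even before a hand-written description exists. A hedged sketch (field names here are hypothetical, not the Expedia Affiliate Network schema):

```python
# Hypothetical sketch: build unique per-hotel page sections from
# structured data (nearby attractions, amenities, review counts).
# The dict fields are illustrative, not a real feed schema.

def unique_sections(hotel: dict) -> list[str]:
    """Return text sections that vary per hotel, usable as
    supplementary on-page content alongside the description."""
    sections = []
    if hotel.get("nearby"):
        pois = ", ".join(hotel["nearby"][:5])
        sections.append(f"What's nearby: {pois}")
    if hotel.get("amenities"):
        # Sort so the output is stable across rebuilds.
        sections.append("Amenities: " + ", ".join(sorted(hotel["amenities"])))
    if hotel.get("reviews"):
        sections.append(f"{len(hotel['reviews'])} traveller reviews")
    return sections
```

Even a couple of such sections per page adds differentiating text, though it's no substitute for eventually writing unique descriptions.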