Noindexing a large number of product pages on a webshop to counter Panda
-
A Dutch webshop with 10,000 product pages is experiencing lower rankings and reduced indexation. The problems started last October, shortly after the Panda and Penguin updates.
One of the problems diagnosed is a lack of unique content. Many of the product pages lack a description, and some are variants of each other (color, size, etc.). So one solution could be to write unique descriptions and use rel=canonical to consolidate the color/size variations onto one product page.
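To illustrate, consolidating a variant onto its main product page would look something like this (the URLs here are hypothetical examples, not the shop's actual paths):

```html
<!-- In the <head> of a variant URL such as /product/widget?color=blue
     (hypothetical), pointing at the main product page: -->
<link rel="canonical" href="https://www.example.com/product/widget" />
```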
There is, however, no capacity to do this on short notice. So now I'm wondering whether the following approach would be effective.
Exclude all product pages via noindex or robots.txt, in the same way you can with search pages. The only pages left for indexation would be the homepage and the 200-300 category pages. We then write unique content for the category pages and work on their rankings. When this works, the product pages are rewritten and slowly re-included, category by category.
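Note that noindex and robots.txt are not interchangeable here: a robots.txt Disallow blocks crawling, so Googlebot would never see a noindex tag on the blocked pages. To deindex pages while still letting value flow through their links, the usual approach is a meta robots tag on each product page:

```html
<!-- On each product page: keep it out of the index, but still follow
     (and pass value through) its links: -->
<meta name="robots" content="noindex,follow">
```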
My first worry is the loss of rankings for the product pages, although their rankings are minimal currently. My second worry is the high number of links on category pages leading to product pages that are excluded from Google. Thirdly, I am wondering if this works at all: using noindex on 10,000 product pages consumes crawl budget and dilutes the internal link structure.
What do you think?
-
I see. There's a pretty thorough discussion on a very similar situation here: http://moz.com/community/q/can-i-use-nofollow-tag-on-product-page-duplicated-content. Everett endorsed Monica's answer with, "... you might consider putting a Robots Noindex,Follow meta tag on the product pages. You'll need to rely on category pages for rankings in that case, which makes sense for a site like this." Monica's long term solution was to also work on getting specific user-generated content on as many product pages as possible. Cheers!
-
@Ryan, thanks for your answer. The PageRank flow is indeed one of the things I worry about when deindexing large parts of the site, especially since the category pages will be full of internal links to product pages that are excluded from indexation by robots.txt or a robots meta tag.
The problem I am trying to solve, however, has nothing to do with PageRank sculpting. I suspect an algorithmic drop due to thin, duplicate, and syndicated content; the drop is sitewide. Assuming the drop is due to Panda, I suspect the percentage of low-quality pages needs to be reduced. Would outbound linking and better DA really be sufficient to counter a suspected Panda problem, or is it necessary to make the 10,000 product pages better quality? I would think the latter. Since there is no budget to do so, I wonder if it is possible to drop these low-quality pages from the index (but keep them on the website). Would this strengthen the remaining pages enough to bounce back up, assuming those remaining pages are of good quality, of course?
Since SEO is not the only factor to take into account, I'd rather not delete these pages from the website.
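Since the question hinges on the percentage of low-quality pages, here is a minimal sketch (Python, with purely illustrative thresholds; Google has never published Panda's actual criteria) of how one might audit what share of product pages is thin or duplicated:

```python
def thin_page_ratio(pages, min_words=50):
    """Estimate the share of pages that are thin or duplicated.

    pages: list of (url, description) tuples, e.g. from a crawl export.
    min_words: illustrative threshold for calling a description "thin".
    """
    seen_descriptions = set()
    thin = 0
    for url, description in pages:
        # A page counts as low quality if its description is very short
        # or an exact duplicate of one we have already seen.
        if len(description.split()) < min_words or description in seen_descriptions:
            thin += 1
        seen_descriptions.add(description)
    return thin / len(pages) if pages else 0.0
```

Running this over a crawl export before and after rewriting a category would give a rough number to track while re-including pages.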
-
Matt Cutts speaks to part of what you're thinking about doing here: https://www.mattcutts.com/blog/pagerank-sculpting/ and it's important to note that it's not nearly as effective as it once was. The thing I would focus on more is the DA and the quality of referrals to your site. Secondly, linking out from pages is actually a positive strength signal when done the right way; per Cutts in the same article, "In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites." Perhaps your product pages could be strengthened further by this as well.
Related Questions
-
Extremely high number of pages found on webshop
Hi, I'm working for the first time on a Magento webshop, but I've run into a problem where crawlers find tens of thousands of pages while there are only a few hundred products. I expect it has something to do with filters that generate dynamic URLs. I can't find any setting in Magento to prevent this, and I think it will hurt SEO performance because of duplicate content and the high number of pages that need to be crawled while the site has no authority. What would my approach be to solve this? Do I need to add certain tags to the pages, or are these settings in my robots.txt file?
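One common approach is to keep crawlers out of the filter URLs via robots.txt; the parameter names below are hypothetical examples, since Magento's layered-navigation parameters vary by setup:

```
User-agent: *
# Block hypothetical layered-navigation filter parameters:
Disallow: /*?dir=
Disallow: /*?price=
Disallow: /*?color=
```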
Technical SEO | J05B0
-
Help: Crawl friendliness for a large site
After watching Rand's video, I am trying to think of the best way to make my large site more crawl friendly. Background: I have a large site with over 100k product SKUs, so when you get to a particular page of products there are tons of different refinements and options that help you sort the products. Most of these are noindex,follow, but I was wondering if I should be nofollowing the internal links as well in order to keep bots out of those pages and send them to the pages that I want them to go to. Is this a good way to handle it? Also, does anyone have good recommendations of links to posts that deal with the crawl friendliness of a large site? Thanks!
Technical SEO | Gordian0
-
Lost with canonical, nofollow, noindex. Not sure how to use them on a dynamic PHP site with multiple region select options
I have a site with multiple regions. The main page after a region is selected is login.php, but the regions are defined by ?rid=11, ?rid=12, etc. These are being picked up as duplicate content, but they are all different regions. As I hired external PHP coders to develop most of the site, I am scared to start meddling with any of the raw code and would like some advice on how to stop these being flagged as duplicate content. Should I use noindex, nofollow, or canonical? If canonical, how do I set it up on the main login.php page? P.S. I am an extreme newbie to SEO.
Technical SEO | moby1230
-
NOINDEX,FOLLOW on product pages
Hi, can I have people's thoughts on something please? We sell wedding stationery, and whilst we can generate lots of good content describing a particular range of stationery, we can't realistically differentiate at a product level. So imagine we have three ranges: Range 1 - A Bird, Range 2 - A Heart, Range 3 - A Flower. Within each of these ranges we would have invitations, menus, place cards, magnets, etc. The ranges vary quite a lot, so we can write good textual, keyword-rich descriptions that attract traffic (i.e. one about the bird, one about the heart, and one about the flower). However, the individual products within a range just reflect the design of the range as a whole (as all items in a range match). Therefore we can't just copy the content down to the product level, and if we just describe the generic attributes of the products they will all be very similar. We easily have over 1,000 "products", so I am conscious of creating too much duplication across the site in case Mr. Panda comes to call. So I was thinking that I "might" NOINDEX,FOLLOW the product pages to avoid this duplication and put lots of effort into making my category pages much better and content rich. The site would be smaller in the index, but I do not really expect to generate traffic from the product pages because they are not branded items, and any searches looking for particular features of our stationery would be picked up, much more effectively, by the category pages. Any thoughts on this one? Gary
Technical SEO | gtrotter6660
-
Panda: Are our ads duplicate content, or just structural and not even considered?
We have hundreds and hundreds of pages with similar ads on them. We are getting content written for these pages right now, and we're removing some pages, but we're wondering how Panda might see the ads we have across the site. The ads consist of the name of a company, a description, and a few other bits. The description is the same on all pages that a company's ad is listed on - and that can be hundreds of pages. You can see some examples here: http://www.agencycentral.co.uk/agencysearch/accounting/skills/indandcomm/financialanalyst.htm http://www.agencycentral.co.uk/agencysearch/accounting/skills/indandcomm/financialaccountant.htm http://www.agencycentral.co.uk/agencysearch/accounting/skills/indandcomm/assistantaccountant.htm What we're wondering is whether Google Panda might be seeing the description of the company as internal duplicate content, or as just structural and not even considered as part of the Panda algorithm? Or something else? Or wouldn't it be clear in this case? Clearly Panda wouldn't hit duplicate content in nav bars, sidebars, etc., but this is in the content area of the page, so it did make us wonder. This could make a difference to how we proceed, so we appreciate your thoughts. Regards, Phil
Technical SEO | agencycentral0
-
Pages noindexed. Submit removal request too?
We had a bunch of catalog pages marked "noindex,follow". Now should we also submit a removal request in WMT for these pages? Thank you! LL
Technical SEO | LocalLocal0
-
Google Panda and ticketing sites: quality of content
Hi from Madrid! I am managing the marketing department of a ticketing site in Europe similar to StubHub.com. We have thousands of events and, until now, we have used templates for their descriptions. A lot of events share the same description with minor changes. They also have a lot of tickets on sale, so that is unique content that differs for each event. Now the last Google Panda update has hit Europe and I was wondering if it will affect us a lot. It's hard to tell for now, because we are in the middle of the summer and the volume of searches in our industry decreases a lot during this time of the year. I know that ideally we should have unique descriptions, but that would require a lot of resources, and they are not important for our users: they just want to know the venue, the time, and the price of the tickets! Have you experienced anything related to the Google Panda update with a similar site or in another e-commerce industry? Thanks!
Technical SEO | jorgediaz0
-
PageRank rebound after Panda
First, I want to say thank you for the support I've been getting from everyone in this community over the last few months. I'm glad to be a part of something that isn't full of flamers and haters... and happy late 4th of July... I implemented canonical tags on a client's website (a job board with tons of job postings) and the PageRank has almost immediately rebounded from 2.3 to 4.6. It was a little higher pre-Panda, but I'm happy with the rebound. Does anyone have experience with how long it usually takes for the keyword rankings to rebound after changes are made? Google recached the site this morning, but I'm still waiting to see traffic bumps. Thanks!
Technical SEO | malachiii0