20-30% of our ecommerce categories contain no extra content. Could this be a problem?
-
Hello,
About 20-30% of our ecommerce categories have no content beyond the products that are in them. Could this be a problem with Panda?
Thanks!
-
It's not an exact science with regard to any one signal. However, yes: the more you can reinforce topical focus, the less likely Panda is to find your category pages weak.
-
No worries, Bob. Ignore my original suggestion, then.
Alan has some good suggestions for you to follow.
-Andy
-
Thanks, Alan, this is perfect.
So if we had at least a couple of good paragraphs on every category page, and a few highly relevant internal links pointing to each of those category pages, then we would be in good shape as far as Panda and category strength are concerned. Correct?
-
Hi Andy,
Sorry for the confusion. This is an ecommerce site. I edited the original question to be clear.
-
I'm assuming that this is a WordPress site (more info would be useful). A common issue is category pages causing problems because they show the same post excerpts over and over; noindexing them gets around this.
If I have misread the type of issue this is, then of course this doesn't apply. Since this was posted in Blogging and Content, that was my assumption.
A URL to look at would, I'm sure, help confirm the problem.
-Andy
-
Andy,
Why would you noindex/follow category pages? That's like saying, "Hey, we have X products for this category, so it's really a high-value and important page we deserve to rank for. Except we don't have the willingness to boost the trust signals on the category page itself, so don't bother."
That in turn negatively impacts the site's ability to gain maximum ranking signals for any products in those categories (at least in highly competitive fields).
So I'm curious why you'd take that path.
-
It could be, Bob. I always advise that category pages are noindex/follow to avoid issues.
If you are using WordPress and Yoast, this is just a setting.
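For reference, what that setting produces is a robots meta tag in the category archive's head. This is only a sketch of the end result; the exact output depends on your Yoast version and configuration:

```html
<!-- What a noindex/follow category archive's <head> ends up containing.
     Yoast adds this when category archives are set not to show in search
     results; shown here purely to illustrate the end result. -->
<meta name="robots" content="noindex, follow" />
```

The "follow" half is the important part: the page is kept out of the index, but link equity still flows through to the products it links to.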
-Andy
-
If a category page has almost no content (other than photos and product names), then that's a potential "thin content" issue, though the way your question is worded, I'm not confident my interpretation is actually what you meant by "no content beyond".
If product names don't reference the category name, and if there's a lack of any descriptive content on the category page, that's likely even more of a problem - thin content and lack of topical reinforcement of the category itself.
A general rule (barring other issues or considerations) is to have at least a couple of paragraphs of unique, descriptive, paragraph-based text that reinforces the topical focus of each category page. There are numerous ways to split that content out across a category page, and in highly competitive categories, more content may be needed if not enough products exist in the category.
Other factors that can help mitigate this to a certain degree include (but aren't necessarily limited to):
- Hierarchical URL structure (nested URLs, so product detail pages are seen at the URL level as being "beneath" their category)
- Proper nested breadcrumbs to reinforce that hierarchical structure
- Strong internal linking: a) within categories, this includes pagination code (rel-next/rel-prev); b) outside a category, this includes links from highly refined, relevant content elsewhere on the site pointing to the category page.
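Taken together, those factors might look something like this on a hypothetical category page at /kitchen/cookware/page/2/ (the URLs, category names, and product names here are made up for illustration):

```html
<!-- Illustrative markup for a hypothetical paginated category page.
     All URLs and names are invented examples, not a specific site. -->
<head>
  <!-- Pagination code tying the paginated set together -->
  <link rel="prev" href="https://example.com/kitchen/cookware/" />
  <link rel="next" href="https://example.com/kitchen/cookware/page/3/" />
</head>
<body>
  <!-- Nested breadcrumb reinforcing the hierarchical URL structure -->
  <nav aria-label="Breadcrumb">
    <a href="/">Home</a> &gt;
    <a href="/kitchen/">Kitchen</a> &gt;
    <a href="/kitchen/cookware/">Cookware</a>
  </nav>
  <!-- Product detail page nested "beneath" its category in the URL -->
  <a href="/kitchen/cookware/10-inch-skillet/">10-inch Skillet</a>
</body>
```

The point is simply that the breadcrumb trail, the URL path, and the internal links all tell the same hierarchical story, which reinforces the category's topical focus.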