Does a large number of thin-content pages indexed affect overall site performance?
-
Hello Community,
Question on negative impact of many virtually identical calendar pages indexed.
We have a site for a B2B software product. There are about 150 product-related pages, and another 1,200 or so short articles on industry-related topics. In addition, we recently (~4 months ago) had Google index a large number of calendar pages used for webinar schedules. This boosted the indexed-pages count shown in Webmaster Tools to about 54,000.
Since then, we "nofollowed" the links on the calendar pages that let you view future months, and added "noindex" meta tags to all future-month pages (beyond 6 months out). Our indexed-pages count seems to be dropping, and is now down to 26,000.
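In case it helps anyone reading, a minimal sketch of the tags we used (the calendar URL here is simplified/hypothetical; the attribute values are standard HTML):

```html
<!-- In the <head> of each future-month calendar page
     (beyond 6 months out), to keep it out of the index: -->
<meta name="robots" content="noindex">

<!-- On the calendar navigation, the link to a future month
     (hypothetical URL), so we stop passing equity to those pages: -->
<a href="/webinars/calendar/next-month" rel="nofollow">Next month</a>
```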
When you look at Google's report showing pages appearing in response to search queries, a more normal 890 pages appear. Very few calendar pages show up in this report.
So, the question that has been raised is: Does a large number of pages in a search index with very thin content (basically blank calendar months) hurt the overall site? One person at the company said that because Panda/Penguin targeted thin-content sites that these pages would cause the performance of this site to drop as well.
Thanks for your feedback.
Chris
-
Unless a page can give value to a searcher (not just an existing customer), it shouldn't be in Google's search index.
Sometimes I like to go back to the basics. Remember that search engines exist to help people find information that they WANT to find. Realistically, people are not going to want to find every page on your website in the SERPs. I suggest you ask yourself this question: does this page offer information that someone would actually want to search for? Then make your decision accordingly.
P.S. Having said all of that, I'll answer your question. The answer is yes: having thin pages on your site can hurt your domain. If your pages offer value to searchers, I suggest you improve them instead of removing them; but if they don't offer value to searchers, don't waste your time and just noindex them.
-
So, the question that has been raised is: Does a large number of pages in a search index with very thin content (basically blank calendar months) hurt the overall site?
Yes.
We had a site with some image content pages that had very little text. They ranked great for years. Then, BAM, rankings across the site dropped on a Panda update.
We added noindex/follow to these pages, redirected some that were obsolete, and our rankings came back with the next update.
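For clarity, by noindex/follow I mean a robots meta tag like the one below, which asks Google to drop the page from the index while still following (and passing equity through) its links:

```html
<!-- In the <head> of each thin page: drop from the index,
     but keep crawling the links on it -->
<meta name="robots" content="noindex, follow">
```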
Related Questions
-
On-page vs. off-page vs. technical SEO: priority, ease of handling, ease of measurement
Hi community, I am just trying to figure out which should take priority among on-page, off-page, and technical SEO. Which one would you tackle first? Which one is easiest to handle? Which one is easiest to measure? Your opinions and suggestions, please. I'm hoping for realistic answers rather than the usual checklist. Thanks
Algorithm Updates | vtmoz
-
Google Search Analytics: desktop site losing page position compared to the mobile version of the site
Looking at Google Search Analytics page position by device, the desktop version has seen a dramatic drop in the last 60 days compared to the mobile site. Could this be caused by mobile-first indexing? Has Google had any releases that might have caused this?
Algorithm Updates | merch_zzounds
-
Doorway Algorithm Update Affecting Location Based Pages?
Hi all, I read this article concerning the doorway algorithm update - http://searchengineland.com/google-to-launch-new-doorway-page-penalty-algorithm-216974 This quote is what got my attention: "How do you know if your web pages are classified as a “doorway page?” Google said to ask yourself these questions: Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site’s user experience? Are the pages intended to rank on generic terms yet the content presented on the page is very specific? Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic? Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality? Do these pages exist as an “island?” Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?" We utilize location-based pages for ourselves and a few clients too. **Example case:** We attempt to rank for "keyword city/state" on a separate page for each city/state. The keywords will often be the same, such as "AC Repair" or "Physical Therapy," with city/state combinations such as "Tulsa, OK" or "Seattle, WA." The goal is to rank locally for those terms (NAP is applicable in some circumstances). Does the above case classify as a doorway page? According to that definition, it does. However, this is a business that services that area. Some don't have a physical address there, but they do service that area (whether it be AC repair or website design). Please advise me as to what a doorway page is exactly & if my practice is in line. Thanks, Cole
Algorithm Updates | ColeLusby
-
Do we take an SEO hit for having multiple URLs on an infinite-scroll page vs. a site with many pages/URLs? If we do take a hit, how big would it be?
We are redesigning a preschool website that has over 100 pages. We are looking at 2 options and want to make sure we deliver the best user experience and SEO. Option 1 is to condense the site into perhaps 10 pages and window-shade the content. For instance, on the curriculum page there would be an overview, and each age-group program would open via window shade. Option 2 is to have an overview, with each age program linking to its own page. Do we lose out on SEO if there are not unique URLs? Or is there a way, using meta tags or other programming, to have the same effect?
Algorithm Updates | jgodwin
-
Duplicate Content?
My client is a manufacturers representative for highly technical controls. The manufacturers do not sell their products directly, relying on manufacturers reps to sell and service them. Most but not all of them publish their specs on their sites, usually in PDF only. As a service to our customers, and with permission of the manufacturers, we publish the manufacturers' specs on our site in HTML with images and downloadable PDFs — this constitutes our catalogue. The pages are lengthy and technical, and are pretty much the opposite of thin content. The URLs for these (technical) queries rank well, so Google doesn't seem to mind. Does this constitute duplicate content, and can we be penalized for it?
Algorithm Updates | waynekolenchuk
-
Sudden drop in rankings and indexed pages!
Over the past few days I have noticed some apparent major changes. Before I explain, let me say this: Checking my analytics and WMT: There is an increase in traffic (even via Google organic) There is no drop in impressions or clicks There is no drop in indexed pages in GWT Having said that; When I check my indexed pages using site:www.mywebsite.com, I see only 30 results as opposed to the 120K that I was seeing before (it was steadily climbing). The indexed pages have increased threefold in the past year, because of the increase in pages, updates, and products on the site. I see a sudden drop in rankings for major keywords that had been steadily rising. For example, I had some major keywords that were on page 7-8; now they are on page 20+ or not at all. Also, the page that used to show in the rankings has changed. I have only done white-hat guest blogging in the past year for link building, on a small scale (maybe 20-30 links in a year). The only other changes recently are that we are: Posting products on Houzz and Pinterest daily adding our site to all local directories (White Pages, Yelp, Citysearch, etc.) My site got hit by Penguin more than a year ago, but we have done everything right since, and our traffic via organic results has more than doubled since the Penguin release. What the hell is going on? Should I be concerned?
Algorithm Updates | inhouseseo
-
Why does Google say they have more URLs indexed for my site than they really do?
When I do a site search with Google (i.e. site:www.mysite.com), Google reports "About 7,500 results" -- but when I click through to the end of the results and choose to include omitted results, Google really has only 210 results for my site. I had an issue months back with a large number of URLs being indexed because of query strings and some other non-optimized technicalities - at that time I could see that Google really had indexed all of those URLs - but I've since implemented canonical URLs and fixed most (if not all) of my technical issues in order to get our index count down. At first I thought it would just be a matter of time for them to reconcile this, perhaps they were looking at cached data or something, but it's been months and the "About 7,500 results" just won't change even though the actual number of pages indexed keeps dropping! Does anyone know why Google would still be reporting a high index count, which doesn't actually reflect what is currently indexed? Thanks!
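For context, the canonical tags I implemented look like the sketch below (the product path and query string are hypothetical examples, not my actual URLs):

```html
<!-- In the <head> of a query-string variant such as
     www.mysite.com/products?sort=price (hypothetical URL),
     pointing Google at the one URL that should be indexed: -->
<link rel="canonical" href="http://www.mysite.com/products">
```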
Algorithm Updates | CassisGroup
-
SEOmoz suddenly reporting duplicate content with no changes???
I am told the crawler has been updated and wanted to know if anyone else is seeing the same thing I am. SEOmoz reports show many months of no duplicate content problems. As of last week, though, I get a little over a thousand pages reported as dupe content errors. Checking these pages, I find there is similar content (which hasn't changed) with keywords that are definitely different. Many of these pages rank well in Google, but SEOmoz is calling them out as duplicate content. Is SEOmoz attempting to closely imitate Google's perspective in this matter, and therefore telling me that I need to seriously change the similar content? Is anyone else seeing something like this?
Algorithm Updates | Corp