When Pandas Attack...
-
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the algorithm seems rigged against this site's purpose. I need some advice on what I'm planning and what else could be done.
First, the issues:
Content Length
The site is a legal reference, including a dictionary and citation lookup. Hundreds of pages (perhaps upwards of 1,000) are, by the nature of the content, thin. For example, the acronym C.B.N.S. stands for "Common Bench Reports, New Series," part of the English Reports. There really isn't much more to say, nor is there much value to the target audience in saying it.
Visit Length as a Metric
There is chatter claiming Google watches how long a person stays on a page to gauge its value. Fair enough, but a large number of people who visit this site are looking for one small piece of data: they want the definition of a term or a citation, and then they return to whatever caused the query in the first place.
My strategy so far…
Noindex some Pages
Identify terms and citations that are really small (less than 500 characters) and put a noindex tag on them. I will also remove the directory links to those pages and clean up the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.
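A minimal sketch of that audit step, assuming the pages are static HTML you can read as strings (the 500-character threshold comes from the plan above; the tag-stripping regexes are an illustrative rough cut, not a full HTML parser):

```python
import re

THRESHOLD = 500  # pages with less visible text than this get a noindex tag

NOINDEX_TAG = '<meta name="robots" content="noindex">'

def visible_text_length(html: str) -> int:
    """Rough count of visible characters: drop script/style blocks,
    strip remaining tags, and collapse whitespace runs."""
    html = re.sub(r'(?s)<(script|style)\b.*?</\1>', ' ', html)
    text = re.sub(r'<[^>]+>', ' ', html)
    return len(re.sub(r'\s+', ' ', text).strip())

def add_noindex(html: str) -> str:
    """Insert a robots noindex tag after <head> if the page is thin."""
    if visible_text_length(html) >= THRESHOLD:
        return html
    if 'name="robots"' in html:
        return html  # already carries a robots directive; leave it alone
    return html.replace('<head>', '<head>\n' + NOINDEX_TAG, 1)
```

Run over the dictionary and citation directories, this would tag only the sub-500-character pages while leaving the longer entries indexable.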
Create more click incentives
We already started with related terms, and now we are looking at diagrams and images: anything to punch up the content for that ever-important second click.
Expand Content (of course)
The author will spend the next six months doing his best to extend the content of these short pages. There are images and text to be added in many cases, perhaps 200 pages, but we still won't be able to cover them all without a heavy cut-and-paste feel.
Site Redesign
Looking to lighten up the code and boilerplate content shortly; we were working on this anyway. The resulting pages should have fewer than 15 hard-coded site-wide links, the disclaimer will be loaded via AJAX on scroll, and ad units will be kept to 3 per page.
What do you think? Are the super-light citation and dictionary pages the reason site traffic is down 35% this week?
-
Traffic (and income) is now down over 55%, which is really too bad: the content is unique and highly valuable to the target market.
Any advice about why this is happening would be really appreciated.
-
All content is unique; much of it is 10 years old.
It gets duplicated/syndicated to other sites: some legit, others we constantly fight to have removed. One site in India completely copied this site a few years ago and changed most of the links to internal addresses.
However, the owner wrote all of the non-quoted, non-referenced material.
-
"Google watches how long a person uses a page to gauge it’s value"
Perhaps, but I wouldn't stress about that metric in particular. As you correctly pointed out, a visitor who is looking for a specific item and finds it will leave a site rather quickly.Is the content unique or duplicate?
EDIT: According to a quick check on Copyscape, your content is duplicated across other sites. You definitely need unique content as a starting point.
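Before paying for a Copyscape run across hundreds of pages, the same kind of overlap check can be approximated locally. A rough sketch, assuming you have the plain text of two pages to compare (the 5-word shingle size is an arbitrary illustrative choice):

```python
def shingles(text: str, size: int = 5) -> set:
    """Break text into overlapping word n-grams ('shingles')."""
    words = text.lower().split()
    return {' '.join(words[i:i + size])
            for i in range(max(0, len(words) - size + 1))}

def overlap_ratio(a: str, b: str) -> float:
    """Fraction of page A's shingles that also appear in page B.
    Near 1.0 means B is substantially a copy of A."""
    sa, sb = shingles(a), shingles(b)
    if not sa:
        return 0.0
    return len(sa & sb) / len(sa)
```

Running this between your pages and suspected scrapers gives a quick triage list of the worst copies, which is useful when filing removal requests.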