Thin Content Pages: Does Adding More Content Really Help?
-
Hello all,
So I have a website that was hit hard by Panda back in November 2012, and ever since, the traffic has continued to decline week by week. The site doesn't have any major Moz errors (aside from too many on-page links).
The site has about 2,700 articles, and the text-to-HTML ratio is about 14.38%, so clearly we need more text in our articles and should relax a little on the number of pictures/links we add.
We have increased the text-to-HTML ratio for all of the new articles we put out, but I was wondering how beneficial it would be to go back and add more text content to the 2,700 old articles we have just sitting there.
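For anyone who wants to sanity-check that ratio themselves, here is a rough sketch of how a text-to-HTML ratio can be computed with the Python standard library. Audit tools differ in exactly what they count, so treat the figure as a relative signal rather than an absolute one:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.skip_depth = 0   # depth inside <script>/<style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.chunks.append(data)

def text_to_html_ratio(html: str) -> float:
    """Visible-text characters divided by total HTML characters."""
    parser = TextExtractor()
    parser.feed(html)
    # Collapse runs of whitespace so indentation doesn't inflate the count.
    visible = " ".join("".join(parser.chunks).split())
    return len(visible) / max(len(html), 1)

page = "<html><head><style>p{}</style></head><body><p>Short article.</p></body></html>"
print(f"{text_to_html_ratio(page):.2%}")  # → 17.95%
```

A page stuffed with markup, scripts, and image tags around a couple of sentences of copy will score low here, which is essentially what the 14.38% figure is flagging.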
Would this really be worth the time and investment? Could it stop the drastic decline in traffic and maybe even help it grow again?
-
Just saying what we did...
We had a site that was hit by Panda. We had lots of very short news blurbs and some republished content from government agencies and academic institutions; much of that was done at their request, for exposure to our visitors. Immediately after the hit, we noindexed/followed or deleted/redirected the republished content. We also noindexed/followed or deleted all of the short content. The site got out of Panda a few weeks later, with some traffic loss, but nothing substantial.
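For reference, the noindex/follow treatment described above is just a robots meta tag in the head of each thin or republished page; the page can still be crawled and its links followed, but it drops out of the index:

```html
<!-- In the <head> of each thin or republished page -->
<meta name="robots" content="noindex, follow">
```

The delete/redirect alternative removes the page entirely and 301s its URL to the closest surviving equivalent instead.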
As for improving short content: we have done a lot of that. We had lots of very short descriptions of two sentences plus one or two images that were getting nice amounts of traffic. We improved those to a few hundred words and two or three images (very time consuming, very expensive: a few hours per page). The rankings for short-tail queries went up nicely, and there was a huge increase in long-tail traffic. We later started improving the few-hundred-word pages with two or three images into one to two thousand words plus four to eight images (even more time consuming: a day or two per page). Again, rankings and traffic went up nicely.
Today, for each new article that I publish, I am making a huge improvement to a page that is a proven traffic getter but could be improved a lot.
For you: take a look at the traffic to those 2,700 old articles prior to your Panda problem. Some might not be worth much, but others might be golden. Then decide what to delete/redirect, what to noindex/follow, and what to improve. Then begin working.
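That triage can be sketched as a simple pass over URL/pageview pairs exported from analytics for the pre-Panda period. The thresholds below are invented placeholders, not recommendations - tune them to your own data:

```python
def triage(pages, noindex_below=50, improve_above=500):
    """Map each (url, pre-Panda pageviews) pair to a suggested action.

    Thresholds are hypothetical cut-offs; adjust for your own traffic levels.
    """
    plan = {}
    for url, views in pages:
        if views == 0:
            plan[url] = "delete/redirect"        # never earned traffic
        elif views < noindex_below:
            plan[url] = "noindex/follow"         # not worth improving yet
        elif views >= improve_above:
            plan[url] = "improve"                # proven traffic getter
        else:
            plan[url] = "keep as-is for now"
    return plan

old_articles = [("/a", 0), ("/b", 12), ("/c", 2400)]
print(triage(old_articles))
# → {'/a': 'delete/redirect', '/b': 'noindex/follow', '/c': 'improve'}
```

The point is simply to spend improvement hours on pages with a demonstrated history of earning traffic, and to get the rest out of the index cheaply.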
Good luck.
Related Questions
-
A Sitemap Web page & A Sitemap in htaccess - will a website be penalised for having both?
Hi, I have a sitemap URL already generated by Yoast SEO in the .htaccess file, and I have submitted it to the search engines. I'd also already created a sitemap web page on the website, as a helpful aid for users to see a list of all page URLs. Is this a problem, and could this scenario create duplicate issues or any other problems with search engines? Thanks.
White Hat / Black Hat SEO | | SEOguy10 -
Is Syndicated (Duplicate) Content considered Fresh Content?
Hi all, I've been asking quite a few questions lately and sincerely appreciate your feedback. My co-workers and I have been discussing content as an avenue outside of SEO. There are a lot of syndicated content programs/plugins out there (in a lot of cases duplicate) - would this be considered fresh content on an individual domain? An example may clearly show what I'm after: domain1.com is a lawyer in Seattle; domain2.com is a lawyer in New York. Both need content on their websites relating to being a lawyer for Google to understand what each domain is about. Fresh content is also a factor within Google's algorithm (source: http://moz.com/blog/google-fresh-factor). Therefore, fresh content is needed on their domains. But what if that content is duplicate - does it still hold the same value?
Question: Is fresh content (adding new / updating existing content) still considered "fresh" even if it's duplicate (across multiple domains)?
Purpose: domain1.com may benefit from a resource for his/her local clientele, as would domain2.com. And both sets of customers would be reading the "duplicate content" for the first time. Therefore, both lawyers would be seen as authorities and improve their websites' ability to rank well. We weren't interested in ranking the individual article and are aware of canonical URLs. We aren't implementing this as a strategy - just as a means to really understand content marketing outside of SEO.
Conclusion: IF duplicate content is still considered fresh content on an individual domain, then couldn't duplicate content (that obviously won't rank) still help SEO across a domain? This may sound controversial, and I'd welcome an open-ended discussion with linked sources / case studies. This conversation may tie into another Q&A I posted: http://moz.com/community/q/does-duplicate-content-actually-penalize-a-domain.
TL;DR version: Is duplicate content (the same article across multiple domains) considered fresh content on an individual domain? Thanks so much, Cole
White Hat / Black Hat SEO | | ColeLusby0 -
Need help determining how toxic this backlinking is
Okay, so my company has an SEO company already. However, we're trying to get people internally cross-trained on SEO, so I've been selected to do a kind of crash course in SEO and look at our site from a new perspective. We are in the process of getting our old site ported over to a new one we've created on WordPress. I've been doing a LOT of online research, but this is definitely a very new field for me. Here's our current site: www.cedrsolutions.com
So, here's my question: While running some automated SEO tests on our site, I came across some weird backlinks to one of our pages: http://www.cedrsolutions.com/dental-office-manual/ http://en.calameo.com/read/003415063525a885728e7
Here's the thing: We didn't make this. It looks HORRIBLE, the copy is gibberish, and it looks weird. Doing some more searching, I started finding stuff like this: https://lessons.engrade.com/dentalofficemanual/1 http://pumosust.over-blog.com/2014/09/how-to-get-customized-dental-office-manuals-online.html https://www.youtube.com/watch?v=egMonqa5eRo (???? I don't even understand how someone did this; the photo in the book is just the photo from our page) http://www.tuugo.in/Companies/cedr-hr-solutions/0150008267958#! http://www.webjam.com/dental_office_manual/$my_blog/2014/09/12/how_to_get_customized_dental_office_manuals_online
Conservatively, I'd say there are at least 100 of these types of pages out there linking to us, maybe more. Then I started finding comments on blogs: http://blog.kenexa.com/hr-focus-on-increasing-revenue-not-just-managing-costs/ http://geekologie.com/2012/05/bad-ideas-boyfriend-visits-dentist-ex-da.php (some NSFW language on that one)
So, my first thought is obviously: "Okay, these are gibberish, over-optimized, and ALL of them are trying to bump our relevancy for something along the lines of 'dental office manual'."
EDIT: I should also mention these links ALL just appeared out of thin air - a whole bunch in early July, and more in mid-September. They didn't just slowly accumulate.
So (finally) here are my questions:
1. Did our current SEO company probably do this? The only thing they've mentioned before is that they were going to create some backlinks for us, with an assurance they'd be genuine links that would build PageRank without getting us slapped by Google.
2. Am I correct in my opinion that these are toxic links that could get manual action taken against us by Google? I'm not sure how LIKELY it is (as again, there are only about 100 or so), but they seem to be violating multiple Google principles. With how often Google pushes out algorithm updates, I feel like we could still get busted for this even if the links are 6-7 months old and not sending us much traffic.
I'm asking because I've been told to set up a conference call with the account manager at our current SEO place, and I want to know what I'm getting into. I might be wildly over-reacting about nothing, I might be kind of right but it's not that bad, or I might be 100% right and what they are doing is not cool at all, and could kill our SEO if we get busted by Google. I'm not sure which it is.
Checking Google Webmaster Tools and Analytics, I don't see any drops in organic traffic between July '14 and now, so I don't think we've been smacked by a Google algorithm. And there's no notice from Google of manual action being taken, or of anything being wrong with our backlinks, so I'm fairly confident these links haven't hurt us, at least as of today. I'm just worried going forward (especially when we finish the new site and submit it to Google to get crawled; the URLs will be the same).
Sorry this was so long. I'm kind of nervous, honestly. On the one hand, these backlinks seem SUPER sketchy to me, but on the other hand, I don't KNOW any of this stuff. It sounds kind of ridiculous for me, someone with maybe 3 weeks of intense Google-education in SEO, to be questioning something a real, established SEO company is doing. I mean, I kind of have to assume they know better, right?
White Hat / Black Hat SEO | | CEDRSolutions1 -
Are landing pages making a comeback?
Just recently I have noticed an ever-increasing number of landing pages on websites. The ones I have come across have been in the sports industry, like rugby/football, and their landing pages are sparse but offer the social avenues on a plate. Are landing pages making their way back into the SEO industry?
White Hat / Black Hat SEO | | TeamacPaints0 -
Duplicate content or not? If you're using abstracts from external sources you link to
I was wondering if a page (a blog post, for example) that offers links to external web pages along with abstracts from those pages would be considered a duplicate content page and therefore penalized by Google. For example, I have a page that has very little original content (just two or three sentences that summarize or sometimes frame the topic) followed by five references to different external sources. Each reference contains a title, which is a link, and a short abstract, which is basically the first few sentences copied from the page it links to. So, except for a few sentences at the beginning, everything is copied from other pages. Such a page would be very helpful for people interested in the topic, as the sources it links to have been analyzed, handpicked, and placed there to enhance the user experience. But will this format be considered duplicate or near-duplicate content?
White Hat / Black Hat SEO | | romanbond0 -
Does posting a source to the original content avoid duplicate content risk?
A site I work with allows registered users to post blog posts (longer articles). Often, the blog posts have been published earlier on the writer's own blog. Is posting a link to the original source a sufficient preventative measure against getting dinged for duplicate content? Thanks!
White Hat / Black Hat SEO | | 945010 -
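One common way to handle the cross-posting situation above, beyond a visible link, is a cross-domain canonical tag on the republished copy pointing at the writer's original (the URL below is a placeholder, not from the question):

```html
<!-- In the <head> of the republished post on the community site -->
<link rel="canonical" href="https://writers-own-blog.example.com/original-post/">
```

This tells search engines which copy should be treated as the original, which a plain body link does not reliably do.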
My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we were being penalized for something. We immediately went through the list of things that changed, and the most obvious is that we were migrating domains. On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues. When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not unpenalize bbqguys.
We've been looking for two days and had not been able to find what we did wrong, at least not until tonight. I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/".
It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could see even one or two pages, I could probably figure out what I am doing wrong. I find this most shocking since we go out of our way not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could possibly have triggered this message and the penalty.
Does anyone know how to go about figuring out which pages specifically are causing the problem, so I can change them or take them down? We are slowly canonicalizing URLs and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work.
We were in the process of discontinuing some sites and 301 redirecting pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects. Since the Webmaster Tools notifications are different (i.e., too many URLs is a notice-level message and doorway pages is a separate alert-level message), and the too-many-URLs notice has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with a specific page. The architecture suggestions are helpful, and they reassure us that we should be working on them, but they don't help me solve my immediate problem.
I would really be thankful for any help identifying the pages that Google thinks are "doorway pages", since this is what I am being immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! Thanks for any help identifying the problem!
It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem, and we got nailed almost immediately when we instituted the 301 redirects.
White Hat / Black Hat SEO | | CoreyTisdale0
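For reference, the per-page 301s described in the question above are typically just Redirect directives in the old domain's Apache config or .htaccess file (the path below is an invented example, not one of the actual URLs involved):

```apache
# In thegrillstoreandmore.com's .htaccess -- example path only
Redirect 301 /example-grill-page.html http://www.bbqguys.com/example-grill-page.html
```

The redirects themselves are the standard consolidation mechanism; the doorway-page flag in the question came from something about the pages or timing, not from 301s being inherently risky.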