What Should I Do With Low Quality Content?
-
As my site was definitely hit by Panda, I am in the process of cleaning it of low-quality content.
Needless to say, the truly bad articles are being removed completely, but I think a lot of this content is now low quality simply because it is obsolete and dated.
So what should I do with this content?
Should I rewrite those articles as completely new posts and link from the old posts to the new ones? Or should I delete the old posts and 301 redirect them to the new posts?
Or should I rewrite the content of these articles in place so I can keep the old URLs and backlinks?
One thing to note is that I've got a lot more followers than I used to, so publishing a new post gets far more views, likes, and shares from social networks.
-
I wouldn't rewrite old posts in place. If they can be refreshed or expanded with recent updates, go ahead and redirect (provided the redirect doesn't lose any additional info) or link to the new version.
Things get tricky if there's nothing new that can be written about the topic. First, kill the really bad stuff, as Mike suggested, and keep the good stuff. Anything on the borderline is probably not worth keeping unless it is still receiving traffic. In my experience with Panda, serving a 410 on bad pages works better than redirecting, but you will probably want to 301 redirect to the next-best page if a page has good links.
If it was still receiving organic traffic, think about what you can do to make it better or provide additional resources and reading. Try to save traffic-generating pieces by improving them and making them useful to the people who were landing on them. For high-traffic pieces, you will want to look at the organic keywords and make sure the page somehow answers the query.
As always with Panda, make sure your design doesn't turn people off and that you're not filling the template with too many ads.
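The 410-vs-301 split described above can be sketched in Apache's .htaccess with mod_alias; the paths here are hypothetical, so substitute your own URLs:

```apache
# Thin page with no replacement and no valuable links:
# tell Google it is gone for good (410)
Redirect gone /old-thin-article.html

# Dated page with good backlinks:
# pass them to the closest relevant replacement (301)
Redirect 301 /dated-guide-2008.html /updated-guide.html
```

Note that `Redirect gone` takes no target URL, since there is nothing to redirect to.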
-
No, 404s are fine. They just won't pass link juice, and Google will eventually stop crawling them; ideally they will phase out regardless as the web moves along.
-
And what about really crappy content? If I delete it, I end up with lots of 404 errors. Does that pose a problem for Google?
-
In this case, would you change the date to post it as "new" content? Because even if I rewrite it, I can't post an article from 2008 to the website's Facebook page.
-
I think it all depends on how bad the content is. If you have content that is complete and total crap (10+ instances of the same keyword, reads like a toddler wrote it, etc.), it is better just to kill it and redirect those pages elsewhere. On the other hand, if the content is salvageable, take the time to rewrite it and make it good. The benefit is that at the end of the day you have good content instead of a bunch of links redirected to pages that don't necessarily have anything to do with the old content.
Good luck!
P.S. Don't forget the disavow tool if you need it!
-
There is absolutely nothing wrong with multiple 301s pointing to the same page.
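For instance, consolidating several dated posts into one page is just multiple mod_alias rules in .htaccess, all pointing at the same target (hypothetical paths):

```apache
# Both old articles 301 to the single merged version;
# each old URL's link equity flows to the same page
Redirect 301 /old-article-2008.html /merged-updated-article.html
Redirect 301 /old-article-2009.html /merged-updated-article.html
```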
-
What if I have two dated articles that I merge into one updated article? Does it matter if I have two 301 redirects to the same URL?
-
With a 301 redirect? That's gonna be a huge .htaccess file.
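If the list does get long, one way to keep the file small is a single regex rule instead of hundreds of one-off lines. This sketch assumes the retired posts share an /archive/ URL prefix, which may not match your setup:

```apache
# Send everything under /archive/ to one updated hub page with a single rule
RedirectMatch 301 ^/archive/.*$ /updated-guide.html
```

Redirecting many unrelated pages to one target can be treated like a soft 404 by Google, so this works best when the hub page genuinely covers the old topics.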
-
Kill it, and if there are any incoming backlinks, definitely redirect.