Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
-
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (by which I mean changing the publish date from the original date to today's date, not republishing on other sites).
I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors.
It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble.
Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do?
Thanks Moz community!
-
Awesome, thank you so much for the detailed response and ideas - this all makes a good deal of sense and we really appreciate it!
-
We have actually been doing this on one of our sites where we have several thousand articles going all the way back to the late 90s. Here is what we do / our process (I am not including how to select articles here, just what to do once they are selected).
1. Really take the time to update the article. Ask the questions: "How can we improve it? Can we give better information? Better graphics? Better references? Can we improve conversion?"
2. Republish with a new date on the page. Sometimes add an editor's note explaining that this is an updated version of an older article.
3. Keep the same URL to preserve link equity, etc., or 301 to a new URL if needed.
4. Mix these in with new articles as part of our publication schedule.
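If the URL does change when you republish (step 3), the 301 can be a one-line server rule. Here is a minimal Apache `.htaccess` sketch; the paths and domain are made up for illustration:

```apache
# Hypothetical example: permanently redirect an old post URL to its
# updated home. Only needed when the URL changes on republish;
# if you keep the same URL, no redirect is required.
Redirect 301 /blog/2012/old-post-title/ https://www.example.com/blog/updated-post-title/
```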
We have done this for years and have not run into issues. I do not think Google sees this as spammy as long as you are really taking the time to improve your articles. John M. and Gary I. have stated unequivocally that Google likes it when you improve your content. We have done the above and it has not been dangerous at all; our content is better overall. In some cases where we really focused on conversion, we not only got more traffic but converted better. Doing this will only benefit your visitors, which usually translates into Google liking the result.
I would ask: why designate a few months where you only recycle content, rather than mixing it in all year long? If you were going to dedicate three months of the year to updating content, why not take the third week of every month, or every Wednesday, and do the same thing instead? You accomplish the same amount of work, but spread it out. Make it a feature! Flashback Friday, etc.
Bonus idea - make sure you get the schema right
We have recently changed something in our process. Previously, we only marked up the publication date in schema, so when we republished, we would change that schema publication date to the new date. Now that Google requires both a publication date and a last-modified date in schema, we have changed our approach: when we republish content, we leave the original publication date marked up as the publication date in schema, and mark up the date the article is being republished as the last-modified date. This is a much clearer and more accurate representation to Google of what you are doing with the article.
We are also displaying the last-modified date to the user as the primary date, with the publication date made secondary. The intent is to show the user that the article has been recently updated, so they know the information is current.
To get this to work properly, we had to rework how our CMS handles both the published date and the last-modified date, but in the end I think we are sending better signals to Google and to users about the status of our articles.
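To make the setup above concrete, here is a minimal sketch of what the markup for a republished article could look like: the original publication date preserved in `datePublished`, the republish date recorded as `dateModified`, and the visible dates ordered the same way. The headline, URL, and dates are hypothetical.

```html
<!-- Hypothetical republished article: the schema keeps the original
     publish date in datePublished and records the update in dateModified. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example: Our Updated Guide",
  "mainEntityOfPage": "https://www.example.com/updated-guide/",
  "datePublished": "2012-04-10",
  "dateModified": "2024-06-18"
}
</script>

<!-- Visible dates: last-modified shown as the primary date,
     original publication date made secondary. -->
<p>Updated <time datetime="2024-06-18">June 18, 2024</time>
  <small>(originally published <time datetime="2012-04-10">April 10, 2012</time>)</small></p>
```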
-
You'll probably experience a dip from not publishing new content but I don't believe there will be any other issues.
Updating old content (drip-fed or in bulk) won't trigger any spam/manipulation flags.