I'm updating content that is out of date. What is the best way to handle this if I want to keep the old content as well?
-
So here is the situation.
I'm working on a site that offers "Best Of" Top 10 list type content. They have a list that ranks very well but is out of date.
They'd like to create a new list for 2014, but have the old list exist. Ideally the new list would replace the old list in search results.
Here's what I'm thinking, but let me know if you think there's a better way to handle this:
- Put a "View New List" banner on the old page
- Make sure all internal links point to the new page
- Add a rel=canonical tag on the old list pointing to the new list

Does this seem like a reasonable way to handle this?
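As a sketch, the canonical tag and banner from the plan above could look like this on the old page (the 2014 URL and CSS class are placeholders, not from the original thread):

```html
<!-- On the OLD list page -->
<head>
  <!-- Tells search engines the new list is the preferred version -->
  <link rel="canonical" href="https://domain.com/top-10-list-2014/" />
</head>
<body>
  <!-- "View New List" banner pointing visitors to the updated page -->
  <a href="https://domain.com/top-10-list-2014/" class="new-list-banner">
    View the updated 2014 list
  </a>
</body>
```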
-
Thanks Anthony, that's a great solution.
-
You could handle this much like any other seasonal or yearly content.
Ideally, I'd set it up like this:
domain.com/top-10-list is your main piece of content. This is the page that currently ranks well. I would update that URL with your new, updated content. I would move the content that was originally on that list to a new URL like domain.com/top-10-list/2011, or whatever year the original article was created. Then, on the primary page, I would link to all previous/archive lists. You could update the page every year to keep it fresh.
This keeps all rankings/authority focused on your original URL. If your original URL doesn't work with this plan (for instance, if it was domain.com/2011-top-10-list), I would consider 301ing it to a newly created page focused on being an evergreen list.
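If the 301 route is the one you take, on Apache the redirect is a one-liner (a minimal sketch assuming mod_alias is enabled; both paths are placeholder examples):

```apache
# .htaccess on domain.com: permanently redirect the dated URL
# to the evergreen list page
Redirect 301 /2011-top-10-list /top-10-list
```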
Related Questions
Best way to do site seals for clients to have on their sites
I am about to help release a product which also gives people a site seal to place on their website, just like GeoTrust, Comodo, Symantec, RapidSSL and other web security providers do.
Intermediate & Advanced SEO | ssltrustpaul
I have noticed that the site seals from these companies never have nofollow on the links back to their websites. So I am wondering what is the best way to do this: should I put a nofollow on the site seal that links back to our domain, or is it safe to leave the nofollow off?
It won't be doing any keyword stuffing or anything; it will probably just have our domain in the link and that is all. The problem is, we won't have any control over where customers place these site seals. From experience, I would say they will most likely be placed in the footer on every page of the client's website. I would like to hear any and all thoughts on this, as I can't get a proper answer anywhere I have asked.
How should I handle URLs created by an internal search engine?
Hi, I'm aware that internal search result URLs (www.example.co.uk/catalogsearch/result/?q=searchterm) should ideally be blocked using the robots.txt file. Unfortunately the damage has already been done, and a large number of internal search result URLs have already been created and indexed by Google. I have double-checked, and these pages only account for approximately 1.5% of traffic per month. Is there a way I can remove the internal search URLs that have already been indexed and then stop this from happening in the future? I presume the last part would be to disallow /catalogsearch/ in the robots.txt file. Thanks
Intermediate & Advanced SEO | GrappleAgency
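For reference, the robots.txt rule described in this question would look like the following (using the /catalogsearch/ path from the example URL):

```
User-agent: *
Disallow: /catalogsearch/
```

One caveat worth hedging: robots.txt only blocks crawling, so URLs that are already indexed are usually dropped faster by letting Google recrawl them with a noindex directive (or via the URL removal tool) and adding the Disallow rule only afterwards.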
Using the same content on different TLDs
Hi everyone, we have clients we are going to work with in different countries, but sometimes with the same language. For example, we might have a client in a competitive niche working in Germany, Austria and Switzerland (Swiss German), i.e. we're potentially going to rewrite our website three times in German. We're thinking of using Google's hreflang tags with pretty much the same content. Is this a safe option? Has anyone actually tried this, successfully or otherwise? All answers appreciated. Cheers, Mel.
Intermediate & Advanced SEO | dancape
What is the best way to handle special characters in URLs
What is the best way to handle special characters? We have some URLs that use special characters, and when a sitemap is generated using Xenu it changes the characters to something different. Do we need to physically change the URL back to display the correct character? Example: URL: http://petstreetmall.com/Feeding-&-Watering/361.html Sitemap link: http://www.petstreetmall.com/Feeding-%26-Watering/361.html
Intermediate & Advanced SEO | WebRiverGroup
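As a quick sanity check on the question above: %26 is simply the percent-encoding of the ampersand, so the two URLs refer to the same resource and no manual fix-up should be needed. A short Python sketch illustrates the round trip:

```python
from urllib.parse import quote, unquote

# "&" percent-encodes to "%26"; the sitemap link and the original URL
# are two spellings of the same path.
encoded = quote("/Feeding-&-Watering/361.html")
print(encoded)            # /Feeding-%26-Watering/361.html
print(unquote(encoded))   # /Feeding-&-Watering/361.html
```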
Panda Recovery - What is the best way to shrink your index and make Google aware?
We have been hit significantly by Panda and assume the reason is our large index, with some pages holding thin/duplicate content. We have reduced our index size by 95% and have done significant content development on the remaining 5% of pages. For the old, removed pages, we have installed 410 responses (page no longer exists) and made sure that they are removed from the sitemap submitted to Google; however, after over a month we still see Google's spider returning to the same pages, and Webmaster Tools shows no indication that Google is shrinking our index size. Are there more effective and automated ways to make Google aware of a smaller index size in the hope of Panda recovery? Potentially using the robots.txt file, the GWT URL removal tool, etc.? Thanks /sp80
Intermediate & Advanced SEO | sp80
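For reference, one way to serve the 410 responses mentioned in this question on Apache is with mod_alias (the path pattern below is a placeholder, not from the original post):

```apache
# Return 410 Gone for an entire removed thin-content section
RedirectMatch gone ^/old-thin-section/.*$
```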
Will pages irrelevant to a site's core content dilute the SEO value of core pages?
We have a website with around 40 product pages. We also have around 300 pages for the individual ingredients used in the products, and on top of that we have some 400 pages for the individual retailers which stock the products. The ingredient pages have the same basic short info about each ingredient, and the retailer pages just have the retailer's name, address and contact details. The question is: should I add noindex to all the ingredient and/or retailer pages so that the focus is entirely on the product pages? Thanks for your help!
Intermediate & Advanced SEO | ArchMedia
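If noindexing those pages does turn out to be the right call, the tag itself is small; a sketch of what each ingredient/retailer page's head could contain:

```html
<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```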
Best way to view the global navigation bar from Googlebot's perspective
Hi, links in the global navigation bar of our website do not show up when we look at the Google cache (text-only version of the page). These links use style="display:none;" when we look at the HTML source. But if I use the "User Agent Switcher" add-on in Firefox and set it to Googlebot, the links in the global nav are displayed. I am wondering what is the best way to find out whether Google can or cannot see the links. Thanks for the help! Supriya.
Intermediate & Advanced SEO | SShiyekar
Removing hundreds of old product pages - best process
Hi guys, I've got a site about discounts/specials etc. A few months ago we decided it might be useful to have shop specials in PDF documents "pulled" and put on the site individually so that people could find the specials easily. This resulted in over 2,000 new pages being added to the site over a few weeks (there are lots of specials).
Intermediate & Advanced SEO | cashchampion
However, two things have happened:
1 - We have decided to go in another direction with the site and are no longer doing this.
2 - The specials that were uploaded have now ended, but the pages are still live.
Google has indexed these pages already. What would be the best way to deal with them? Do I just delete them, or do I 301 them to the home page? PS: the site is built on WordPress. Any ideas, as I am at a complete loss? Thanks,
Marc