I'm updating content that is out of date. What is the best way to handle this if I want to keep the old content as well?
-
So here is the situation.
I'm working on a site that offers "Best Of" Top 10 list type content. They have a list that ranks very well but is out of date.
They'd like to create a new list for 2014 while keeping the old list live. Ideally, the new list would replace the old one in search results.
Here's what I'm thinking, but let me know if you think there's a better way to handle it:

- Put a "View New List" banner on the old page
- Make sure all internal links point to the new page
- Add a rel="canonical" tag on the old list pointing to the new list

Does this seem like a reasonable way to handle this?
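For reference, a minimal sketch of that canonical tag on the old list page might look like this (the domain and paths here are placeholders, not your actual URLs):

```html
<!-- In the <head> of the OLD list page; URLs are hypothetical -->
<link rel="canonical" href="https://domain.com/2014-top-10-list" />
```

The href should point at whatever final URL the new list ends up on, and the tag should appear only on the old page.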
-

Thanks Anthony, that's a great solution.
-
You could handle this the same way you'd handle any sort of seasonal/yearly content.
Ideally, I'd set it up like this:
domain.com/top-10-list is your main piece of content. This is the page that currently ranks well, so I would update that URL with your new, updated content. The content that was originally on that list, I would move to a new URL like domain.com/top-10-list/2011, or whatever year the original article was created. Then, on the primary page, I would link to all previous/archive lists. You could update the page every year to keep it fresh.
This would keep all rankings/authority focused on your original URL. If your original URL doesn't work with this plan, for instance if it was domain.com/2011-top-10-list, I would consider 301ing that to a newly created page that is focused on being an evergreen list.
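If you go the 301 route for an old year-based URL, a sketch of the rule might look like this on an Apache server (assuming an .htaccess setup; the paths are placeholders for your actual URLs):

```apache
# Hypothetical .htaccess rule: permanently redirect the old
# year-based URL to the new evergreen list page.
Redirect permanent /2011-top-10-list /top-10-list
```

Once the redirect is live, it's still worth updating internal links to point at the evergreen URL directly rather than through the redirect.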