An immediate and long-term plan for expired Events?
-
Hello all, I've spent the past day scouring guides, walkthroughs, advice, and Q&As on this topic (including on here), and while I'm pretty confident in my approach to this query, I wanted to crowdsource some advice in case I'm way off base. I'll start by saying that technical SEO is arguably my weakest area, so please bear with me. Anyhoozles, onto the question (and advance apologies for being vague):
PROBLEM
I'm working on a website that, in part, works with providers of a service to open their own programs/centers. Most programs tend to run their own events, which leads to an influx of Event pages, almost all of which are indexed. At my last count, there were approximately 800 indexed Event pages.
The problem? Almost all of these have expired, leading to a little bit of index bloat.
THINGS TO CONSIDER
-
A spot check revealed that traffic for each Event lasts for roughly a two-to-four-week period, then disappears completely once the Event expires.
-
About half of these indexed Event pages redirect to a new page. So the indexed URL will be /events/name-of-event but will redirect to /state/city/events/name-of-event.
QUESTIONS I'M ASKING
-
How do we address all these old events that provide no real value to the user?
-
What should a future process look like to prevent this from happening?
MY SOLUTION
Step 1: Add a noindex to each of the currently-expired Event pages. Since some of these pages have link equity (one event had 8 unique links pointing to it), I don't want to just 404 all of them, and redirecting them doesn't seem like a good idea since one of the goals is to reduce the number of indexed pages that provide no value to users.
Step 2: Remove all of the expired Event pages from the Sitemap and resubmit. This is an ongoing process due to a variety of factors, so we'd wrap this up into a complete sitemap overhaul for the client. We would also be removing the Events from the website so there are no internal links pointing to them.
Step 3: Write a rule (well, have their developers write a rule) that automatically adds noindex to each Event page once it's expired (a rough sketch of what this rule could look like follows Step 4).
Step 4: Wait for Google to re-crawl the site and hopefully remove the expired Events from its index.
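For illustration, here's roughly what I'd hand the developers for Steps 2 and 3. This is a minimal sketch that assumes a Flask app and a simple Event model with an end_date field; every name in it is hypothetical rather than our actual stack:

```python
# Hypothetical sketch of Steps 2 and 3: noindex any expired event page that
# still gets requested, and exclude expired events from the regenerated
# sitemap. Assumes Flask; all names and data are illustrative.
from dataclasses import dataclass
from datetime import date

from flask import Flask, abort, render_template

app = Flask(__name__)

@dataclass
class Event:
    slug: str
    end_date: date

# Stand-in for the real events database; the data here is made up.
EVENTS = {"spring-open-house": Event("spring-open-house", date(2018, 4, 1))}

@app.route("/events/<slug>")
def event_page(slug):
    event = EVENTS.get(slug) or abort(404)
    response = app.make_response(render_template("event.html", event=event))
    if event.end_date < date.today():
        # Step 3: the event has expired, so tell crawlers not to index it.
        response.headers["X-Robots-Tag"] = "noindex"
    return response

def live_sitemap_entries():
    # Step 2: only live events make it into the regenerated sitemap.
    today = date.today()
    return [f"/events/{e.slug}" for e in EVENTS.values() if e.end_date >= today]
```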
Thoughts? I feel like this is the simplest way to get things done quickly while preventing future expired events from being indexed. All of this is part of a bigger project involving the overhaul of the way Events are linked to on the website (since we wouldn't be 404ing them, I would simply suggest that they be removed entirely from all navigation), but ultimately, automating the process once we get this concern cleaned up is the direction I want to go.
Thanks. Eager to hear all your thoughts.
-
-
Great! Happy to help
-
Hi Robin, thanks for taking the time to write out such detailed and helpful responses. I think I've decided to go with the approach you've outlined above:
For those that are already indexed:
- Change the 302s to 301s (all of the expired events that are indexed are 302s for some reason)
- 404/410 those that don't have any equity
- Create a custom 404 page
- Wait for them to drop out of the index
For future expired Events:
- Wait about one month, then serve a 404 with a custom page
- Redirect any that have backlinks (a rough sketch of this monthly pass is below)
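For anyone following along later, the monthly pass would look something like the sketch below. The CSV export, its "Target URL" column name, and the helper functions are all assumptions, not any specific tool's API:

```python
# Rough sketch of the monthly clean-up: 301 expired event URLs that have
# earned backlinks, 410 the rest. Assumes a CSV export of backlinks from a
# link tool; the "Target URL" column name is an assumption.
import csv

def urls_with_backlinks(csv_path):
    """Collect every URL that appears as a link target in the export."""
    with open(csv_path, newline="") as f:
        return {row["Target URL"] for row in csv.DictReader(f)}

def plan_cleanup(expired_urls, backlinks_csv):
    """Decide, per expired URL, whether to 301 (has equity) or 410."""
    linked = urls_with_backlinks(backlinks_csv)
    return {
        url: ("301 to program page" if url in linked else "410")
        for url in expired_urls
    }
```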
It'll require a little more work, but it is, I think, the right thing to do in this very bizarre situation.
-
To be honest it sounds like you already have your plan.
One thing I'd bear in mind is that a crawl you run of your site won't line up with the pages Google is visiting. For one thing, the tools we use try to approximate Google but won't be exactly the same. More importantly, once Google knows of a page, it'll come back and check whether the content has changed, and the only way you'll see that is by looking at your log files.
Yeah, there's no point making it "noindex, follow". It's not that Google doesn't know what to do with the page; it's just that its attitude to the page will change over time.
In terms of the large number of redirects, there is some risk that Google could see that many 301s as spammy, but, to be honest, I've never directly seen evidence of that being a problem. The way I see it, the choice is fairly simple; you could:
-
404/410: that's the way the internet is meant to work when something no longer exists, but you'll lose link equity.
-
301: preserves link equity, but you're essentially misusing the status code.
-
Do a monthly check: 301 any expired pages with discovered backlinks and 410 the rest. This is the best of both worlds but much more time-consuming.
I think you can probably get away with the 301s but it all comes down to your appetite for risk.
Good luck!
-
-
Thanks for the detailed response and the suggestion. The problem is, I think, a little more complicated than that. So there are two main concerns:
**1. What do we do with the current expired pages?**
So one thing that happens is that the event pages are effectively orphaned once the event has passed. All trace of them is removed from the website, and, if my previous crawl is to be believed, they don't get crawled. Right now, the majority of these expired and indexed event pages are actually 302 redirects, so we're serving a temporary redirect to a page that has expired. Hardly a good user experience.
I do know that since it's a 302, Google is thinking, "Hey, the page is coming back, so we're going to keep that URL indexed but send visitors to the new page." This would be why the 302 URL is indexed. Am I correct in assuming that updating all of these to a 301 would result in the URL ultimately being removed? If so, then I think the best course of action is simply to 301 redirect all of the current 302 URLs, as well as the actual expired event pages, to the relevant event host / program pages.
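To make sure I ask their developers for the right thing, I'm picturing something like this: a hedged sketch assuming a Flask app, with a purely illustrative URL mapping:

```python
# Sketch of replacing the existing 302s with 301s. Assumes Flask; the
# mapping is illustrative and would really come from the events database.
from flask import Flask, redirect

app = Flask(__name__)

LEGACY_EVENT_URLS = {
    "name-of-event": "/state/city/events/name-of-event",
}

@app.route("/events/<slug>")
def legacy_event(slug):
    target = LEGACY_EVENT_URLS.get(slug)
    if target is None:
        return "Event not found", 404
    # code=301 (permanent) instead of the current 302 (temporary), so
    # Google consolidates on the target and drops the old URL.
    return redirect(target, code=301)
```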
Also, I did not know that _noindex_ was treated as noindex, nofollow after a while. Would it be beneficial to make them _noindex, follow_, or would that still be a redundancy that Google will ultimately ignore? I also do not think a pop-up is the way to go. These are very short-term events, so the issue is _less_ about user experience and _more_ about preventing them from clogging up the index. It would also just be more work for the client, and I'm trying to keep things as simple as possible.
**2. What do we do with future expired pages so they don't end up stuck in the index?**
This is probably the more pressing question. The main concern is that we want the Event pages to be indexed while they're live and then removed once they've expired. I'm okay with this process: write a script that auto-redirects expired events, remove all internal links to them from the website, and simply be patient. My main concern is just having way too many 301 redirects in place.
I'm hoping that the 301s, combined with the complete orphaning of the pages, will mean they simply won't be crawled and will eventually drop from the index, leaving them inaccessible to both Google and users, but I'm still a little wary. Thoughts? Is there any room for adding anything to robots.txt?
Thanks again for your help. It is much appreciated.
-
Hi there, thanks for posting!
I think my main question here is around the decision to not 404 or 301 these pages. I totally understand that you want to reduce the number of indexed pages which aren't providing value, but you also don't want to lose equity. I know you mention you're not super technical, so I'm going to break down how I expect link equity to be passed around a site and, therefore, how I expect each of these techniques to impact the page.
Equity is passed from page to page via links, so these event pages will pass equity to other pages on your site. This works by Google having a record of the page and the equity of that page, then distributing that equity through the links it can follow. Google representatives have said recently that, after a period of time, noindex pages are treated as noindex, nofollow, at which point we can't rely on equity being passed along any of the outbound links from these pages.
-
noindex: removes the page from the index; after a period of time, no equity will be passed from the noindexed page. Initially Google will continue to crawl the page, but that will reduce over time.
-
404: the page doesn't exist, so it will be removed from the index after a period of time. No equity will be passed from the page. Google should stop crawling the page fairly quickly.
-
410: more definitive than a 404. The page should drop out of the index more quickly. No equity will be passed from the page. Google should stop crawling the page fairly quickly.
-
301: we're telling Google that this address is no good any more and it should instead look at a different address. Again, the redirected page should drop out of the index, and some proportion of its equity should be transferred to the target page. Google should stop crawling the page more quickly than the noindexed version, but probably not as quickly as with a 404/410.
Based on all that, I don't think noindex is necessarily your best option. You'll still have a bunch of defunct pages, which Google may still spend time crawling, and you can't rely on them passing equity.
A custom 404/410 page explaining to users that the event has passed is probably a pretty good user experience and would be the most expected behaviour for a situation where content isn't there any more, but won't help you with equity.
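As a sketch of that, something along these lines would show users the friendly page while still sending crawlers the definitive status code (assuming Flask; the template name is hypothetical):

```python
# Minimal sketch of a custom "this event has passed" page served with a
# proper 410 status. Assumes Flask; the template name is hypothetical.
from flask import Flask, render_template

app = Flask(__name__)

@app.errorhandler(410)
def event_gone(error):
    # Users see a friendly explanation; crawlers see the definitive 410.
    return render_template("event_gone.html"), 410
```

Any route can then trigger it with Flask's abort(410) once the event's end date has passed.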
I think what you could do is automatically 301 redirect to a relevant category page with a pop-up message that explains to users what's happened. It doesn't sound like you expect the event pages to pop in and out of existence, so the logic should be fairly simple.
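A rough sketch of that logic, assuming Flask and a hypothetical Event model with end_date and category fields (meant to be called from inside the event route):

```python
# Sketch of the auto-301: once an event expires, redirect to its category
# page with a flag the template can use to show the pop-up message.
# Assumes Flask; the "category_page" endpoint and the Event model's
# end_date/category fields are hypothetical.
from datetime import date

from flask import redirect, url_for

def maybe_redirect_expired(event):
    """Return a 301 to the category page if the event has passed, else None."""
    if event.end_date < date.today():
        # The extra expired=1 becomes a query parameter the category
        # template can read to explain to the visitor what happened.
        return redirect(
            url_for("category_page", name=event.category, expired=1),
            code=301,
        )
    return None
```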
Hope that helps!