Correct usage of expired pages: 410 or not?
-
Hi Mozzers,
We're running a property portal that carries around 200,000 listings in two languages. All listings are updated several times per day, and when one of our ads expires we return a "410 Gone" status and show our users a link: "This ad has expired, click here to search for similar properties."
Looking at our competitors, it seems there are many different ways to deal with this; one popular option is a 301 redirect to the corresponding search result.
We've tried to get direction from Google on which method they prefer, but as usual, dead silence.
Advice is most welcome.
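For context, the behavior described above can be sketched as a small response-selection function. This is a minimal illustration, not the portal's actual code: the `Listing` shape, the `/search` URL scheme, and the wording are all assumptions.

```python
# Sketch of the approach described above: an expired listing returns
# "410 Gone" together with a body that points the visitor at a search
# for similar properties. Listing fields and URLs are invented here.
from dataclasses import dataclass

@dataclass
class Listing:
    listing_id: str
    city: str
    expired: bool

def response_for(listing: Listing) -> tuple[int, dict[str, str], str]:
    """Return (status, headers, body) for a listing page."""
    if not listing.expired:
        return 200, {"Content-Type": "text/html"}, f"Listing {listing.listing_id}"
    # 410 Gone: the ad no longer exists, but keep the visitor on the
    # site by linking to a search for similar properties.
    similar = f"/search?city={listing.city}"
    body = (
        'This ad has expired. '
        f'<a href="{similar}">Click here to search for similar properties.</a>'
    )
    return 410, {"Content-Type": "text/html"}, body
```

The key point is that a 410 response can still carry a helpful HTML body; "Gone" only describes the resource's status, not what you show the user.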
-
Matthew,
How would you go about tracking user vs. bot traffic on the pages returning a 410? We can see that we get plenty of hits on these pages in AWStats, but we have no way to measure what sort of traffic those hits really are.
Best
Johan
-
Thanks a lot for that, Matthew.
I will look into it, but my gut tells me that we don't get much user traffic from these pages. Google visits, though: tons. So hopefully the 301s will bring us some nice link juice.
Right after posting, I also ran into this great post on the subject: http://www.seomoz.org/blog/how-should-you-handle-expired-content
However, it says little about the 410.
Thanks
Johan
-
Hi Johan,
A 410 response code is perfectly acceptable for expired pages. With a 410 you are communicating that the page is "gone," and expired content usually is gone, so it fits. However, with a 410 that page will fall out of the index and lose traffic (assuming it was getting any; some expired content won't, since it is no longer timely) and, more importantly, lose link value (if you had any links pointing to those pages).
As for 301 redirects, I'd start by tracking visits to the 410 expired pages and links to them. How much traffic are you getting? How engaged is that traffic? How many links are there, and are they good quality? Links are easy enough to track in Open Site Explorer, and for traffic you can use Google Analytics events (http://antezeta.com/news/404-errors-google-analytics).
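On Johan's question about separating user from bot hits: one server-side option is to parse the access log and bucket 410 hits by User-Agent. This is a rough sketch; it assumes the common Apache/nginx "combined" log format, and the bot patterns are a simplistic assumption (real bot detection needs more than substring matching).

```python
# Rough user-vs-bot split for hits on the 410 pages, from an access log
# in "combined" format. Bot detection here is a naive substring check.
import re

BOT_PATTERN = re.compile(r"bot|crawl|spider|slurp", re.IGNORECASE)
# ... "GET /path HTTP/1.1" STATUS SIZE "REFERER" "USER-AGENT"
LINE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def count_410_hits(log_lines):
    """Count 410 responses, split into bot vs. (presumed) human hits."""
    counts = {"bots": 0, "users": 0}
    for line in log_lines:
        m = LINE.search(line)
        if not m or m.group("status") != "410":
            continue
        if BOT_PATTERN.search(m.group("ua")):
            counts["bots"] += 1
        else:
            counts["users"] += 1
    return counts
```

A log-level count like this complements the analytics-event approach: JavaScript-based tracking mostly misses bots, while the raw log sees both, so comparing the two gives a feel for the bot share.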
When I see a lot of links or a lot of traffic (especially traffic that leaves), I've converted a 410 page into a 301 redirect that goes to our best programmatic guess: for instance, 301 redirect the user to a search for properties in a similar location or a similar price range. What I've often found is that when I get the user redirected to the best page, they are more likely to engage and use the site. Along with the user benefit, I've also seen that help overall organic performance when there are a lot of links back to these pages.
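The "best programmatic guess" above amounts to a fallback chain: the most specific search you can still build from the expired listing's attributes, degrading toward the home page. A minimal sketch, where the field names and URL patterns are invented for illustration:

```python
# Best-guess 301 target for an expired listing: same city and property
# type if known, else a price-range search, else the home page.
# Field names and the /search URL scheme are assumptions.
from urllib.parse import urlencode

def redirect_target(listing: dict) -> str:
    """Pick the most specific search page we can build for this listing."""
    if listing.get("city") and listing.get("property_type"):
        return "/search?" + urlencode(
            {"city": listing["city"], "type": listing["property_type"]}
        )
    if listing.get("price"):
        # Fall back to a +/-20% price-range search (integer arithmetic
        # to keep the URL clean).
        price = listing["price"]
        lo, hi = price * 8 // 10, price * 12 // 10
        return "/search?" + urlencode({"min_price": lo, "max_price": hi})
    return "/"  # nothing better to guess: send the visitor to the start page
```

The server would then emit a 301 with this value in the `Location` header. The ordering of the fallbacks is the judgment call: more specific targets keep users engaged, but only if the resulting search actually has inventory.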
Hope that helps. Thanks,
Matthew