Benefit of using 410 Gone over 404?
-
It seems like it takes Google Webmaster Tools forever to realize that some pages, well, are just gone.
Truth is, the 30k-plus pages showing 404 errors were due to a big change in the site's URL architecture.
I wonder: is there any benefit to using 410 Gone as a temporary measure to speed things up in this case?
Or, when would you use a 410 gone?
Thanks
-
I had the (mis)fortune of trying to deindex nearly 2 million URLs across a couple of domains recently, so had plenty of time to play with this.
Like CleverPhD, I was not able to measure any real difference in the time it took to remove a page that had been 410'd vs. one that had been 404'd.
The biggest factor governing the removal of the URLs was getting all the pages recrawled. Don't underestimate how long that can take. We ended up creating crawlable routes back to that content to help Google keep visiting those pages and updating the results.
-
The 410 is supposed to be more definitive
http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
404 is "not found" vs. 410 is "gone".
10.4.5 404 Not Found
The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent. The 410 (Gone) status code SHOULD be used if the server knows, through some internally configurable mechanism, that an old resource is permanently unavailable and has no forwarding address. This status code is commonly used when the server does not wish to reveal exactly why the request has been refused, or when no other response is applicable.
10.4.11 410 Gone
The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 (Not Found) SHOULD be used instead. This response is cacheable unless indicated otherwise.
The 410 response is primarily intended to assist the task of web maintenance by notifying the recipient that the resource is intentionally unavailable and that the server owners desire that remote links to that resource be removed. Such an event is common for limited-time, promotional services and for resources belonging to individuals no longer working at the server's site. It is not necessary to mark all permanently unavailable resources as "gone" or to keep the mark for any length of time -- that is left to the discretion of the server owner.
That said, I had a similar issue on a site with a couple thousand pages and went with the 410; I'm not sure it really made things disappear any faster than the 404 (that I noticed).
I just found a post from John Mueller of Google:
https://productforums.google.com/forum/#!topic/webmasters/qv49s4mTwNM/discussion
"In the meantime, we do treat 410s slightly differently than 404s. In particular, when we see a 404 HTTP result code, we'll want to confirm that before dropping the URL out of our search results. Using a 410 HTTP result code can help to speed that up. In practice, the time difference is just a matter of a few days, so it's not critical to return a 410 HTTP result code for URLs that are permanently removed from your website, returning a 404 is fine for that. "
So, use the 410; even if it only saves a matter of a few days, with 30k pages you may see a difference.
All of that said, are you sure that with a site that big you would not need to 301 some of those pages? If you have a bunch of old news items or blog posts, would you not want to redirect them to the new URLs for those same assets? Seems like you should be able to recover some of them - at least your top traffic pages, etc.
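That decision logic - 301 where an equivalent new page exists, 410 where nothing replaces the old URL - can be sketched as a simple lookup. The URL pairs below are hypothetical examples, not from the original poster's site:

```python
# Hypothetical mapping from old URLs to their new homes after the
# architecture change; anything not mapped is treated as permanently gone.
REDIRECT_MAP = {
    "/news/2012/big-announcement": "/blog/big-announcement",
    "/products/old-widget": "/catalog/widget",
}

def route_removed_url(path):
    """Return (status_code, location) for a request to an old URL."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]  # preserve link equity where possible
    return 410, None  # no equivalent page: signal permanent removal
```

For example, `route_removed_url("/products/old-widget")` gives `(301, "/catalog/widget")`, while an unmapped path gets `(410, None)`. Building that map for your top-traffic pages first is usually the best return on effort.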
Cheers