Send noindex, noarchive with 410?
-
My classifieds site returns a 410 along with an X-Robots-Tag HTTP header set to "noindex,noarchive" for vehicles that are no longer for sale. Google, however, apparently refuses to drop these vehicles from their index (at least as reported in GWT). By returning a "noindex,noarchive" directive, am I effectively telling the bots "yeah, this is a 410 but don't record the fact that this is a 410", thus effectively canceling out the intended effect of the 410?
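For reference, here's a minimal sketch of what the site is doing, using Python's stdlib http.server (the listing paths are made up for illustration):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of listing paths for vehicles no longer for sale.
GONE_LISTINGS = {"/vehicles/12345", "/vehicles/67890"}

class ListingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in GONE_LISTINGS:
            # 410 tells crawlers the page is permanently gone; the
            # X-Robots-Tag header additionally asks them not to index
            # or archive whatever body is served with the response.
            self.send_response(410)
            self.send_header("X-Robots-Tag", "noindex,noarchive")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"This vehicle is no longer for sale.")
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"OK")

    def log_message(self, format, *args):
        pass  # keep the example quiet
```

The header travels with the response itself, so it applies to whatever body is served alongside the 410.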
-
That sounds good. Let me know if you have further questions; I'm always glad to help!
-
Thanks for the info, mememax. I don't relish the thought of using the removal tool, but I suppose I can actually 301-redirect many of those 410s to category pages and then use the GWT removal tool for the rest.
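Concretely, the split I have in mind looks something like this rough Python sketch (the URL mapping and category paths are made up):

```python
# Hypothetical mapping from retired listing paths to their category pages.
CATEGORY_REDIRECTS = {
    "/vehicles/12345": "/category/sedans",
    "/vehicles/67890": "/category/trucks",
}

def response_for(path):
    """Return (status, headers) for a retired listing.

    Listings with a sensible category page get a 301 there; the rest
    keep the 410 + noindex,noarchive response and can be handled with
    the GWT removal tool.
    """
    target = CATEGORY_REDIRECTS.get(path)
    if target is not None:
        return 301, {"Location": target}
    return 410, {"X-Robots-Tag": "noindex,noarchive"}
```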
-
Hey Tony, you did it the right way: you returned the error code plus the noindex. However, Google won't drop your page from the index until it has crawled it several times.
You can do this: first of all, make sure you have no links pointing to that page, then:
- check in GWT whether the page shows up as a 404 and wait for it to disappear from the GWT crawl errors report;
- or go to GWT and ask Google to remove the page from the index. This is the fastest way, and since Google requires a noindex or a 404 before honoring the request, you're already set up for it. Depending on the volume of 404s you have, though, this can be a huge and repetitive task.
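Before asking for removal, it's worth double-checking what the bots actually see. A quick stdlib-only Python sketch (the URL is a placeholder):

```python
import urllib.request
import urllib.error

def crawler_view(url):
    """Fetch a URL and report the status code and X-Robots-Tag header,
    i.e. roughly what a bot sees before deciding to drop the page."""
    try:
        resp = urllib.request.urlopen(url)
        return resp.status, resp.headers.get("X-Robots-Tag")
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses land here but still carry headers.
        return e.code, e.headers.get("X-Robots-Tag")
```

For a retired listing you'd expect something like `(410, "noindex,noarchive")` back.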
Related Questions
-
301 or 302 or leave at 410
I have a client who manages vacation rental properties, and those properties get links. If an owner pulls their property off the rental market, the current status given is a 410, which I instinctively want turned into a 301. The problem is, often those properties come back online with the same URL, so the question is: when a 301 is turned back into a 200, has anyone noticed a significant delay in the time it takes for that page to rank? I know technically it should probably be a 410 or maybe a 302 but ... you know ... the link weight. 🙂
Technical SEO | BeanstalkIM1
-
Noindex PPC landing pages or optimise for SEO?
Organic seems to be down YoY on one of the categories of a large ecommerce website that I work on. This particular category has multiple landing pages set up for PPC, consisting of filtered products, so these landing pages are prone to duplicate content due to the products listed: e.g. Blue Thingamajigs, White Thingamajigs, Black Thingamajigs, High Gloss Thingamajigs, Oak Thingamajigs, Glass Thingamajigs, etc. These landing pages do well for PPC, but are nowhere to be seen in organic (51+). The main category page, however, ranks quite well for quite a variety of root and longtail keywords, though not as well as it used to. For example, it ranks for "thingamajigs", "white thingamajigs", "white gloss thingamajigs" and "white gloss thingamajigs with cherries on top". Would it benefit the main category page if the PPC landing pages were noindexed? Or, despite Google's preference for the main category, should we work on further optimising the landing pages for SEO? Or is there another solution that I'm completely overlooking? (It is a Friday afternoon, after all...)
Technical SEO | Ria_0
-
Noindex large productpages on webshop to counter Panda
A Dutch webshop with 10,000 product pages is experiencing lower rankings and indexation. Problems started last October, a little while after the Panda and Penguin updates. One of the problems diagnosed is a lack of unique content: many of the product pages lack a description, and some are variants of each other (color, size, etc.). So a solution could be to write unique descriptions and use rel=canonical to concentrate the color/size variations onto one product page. There is, however, no capacity to do this on short notice. So now I'm wondering if the following is effective. Exclude all product pages via noindex and robots.txt, in the same way as you can do with search pages. The only pages left for indexation are the homepage and 200-300 category pages. We then write unique content and work on the ranking of the category pages. When this works, the product pages are rewritten and slowly re-included, category by category. My worry is the loss of ranking for the product pages, although their ranking is minimal currently. My second worry is the high number of links on category pages that lead to product pages that will be excluded from Google. Thirdly, I am wondering if this works at all: using noindex on 10,000 product pages consumes crawl budget and dilutes the internal link structure. What do you think?
Technical SEO | oeroek0
-
Noindex nofollow issue
Hi, For some reason, two pages on my website from time to time get noindex/nofollow tags and disappear from the search engine. I have to log in to my Thesis WP theme, uncheck the boxes for "noindex" and "nofollow", and then update; in a couple of days my website is back up. Here is a screenshot: http://screencast.com/t/A6V6tIr2Cb6 Is there something in the Thesis theme that causes the problem? Even though I unchecked the box and updated, it still stays checked: http://screencast.com/t/TnjDcYfsH4sq Appreciate your help!
Technical SEO | tonyklu0
-
Timely use of robots.txt and meta noindex
Hi, I have been checking every possible resource for content removal, but I am still unsure how to remove already-indexed content. When I use robots.txt alone, the URLs remain in the index; no crawl budget is wasted on them, but still, having e.g. 100,000+ completely identical login pages within the omitted results can't mean anything good. When I use meta noindex alone, I keep my index clean, but I also keep Googlebot busy crawling these no-value pages. When I use robots.txt and meta noindex together for existing content, I ask Google to ignore my content, but at the same time I block it from crawling the pages and seeing the noindex tag. Robots.txt together with URL removal is still not a good solution, as I have failed to remove directories this way; it seems that only exact URLs can be removed like this. I need a clear solution which solves both issues (index and crawling). What I am trying now is the following: I remove these directories (one at a time, to test the theory) from the robots.txt file, and at the same time I add the meta noindex tag to all pages within the directory. The number of indexed pages should start decreasing (while useless page crawling increases), and once it is low or zero, I would put the directory back into robots.txt and keep the noindex on all pages within it. Can this work the way I imagine, or do you have a better way of doing so? Thank you in advance for all your help.
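To make the staged plan concrete, here's a rough Python sketch of the decision logic (the directory names, and wherever the indexed-page count comes from, are hypothetical):

```python
def robots_txt(blocked_dirs):
    """Build a robots.txt that disallows the given directories."""
    lines = ["User-agent: *"]
    lines += [f"Disallow: {d}" for d in sorted(blocked_dirs)]
    return "\n".join(lines) + "\n"

def next_step(directory, blocked_dirs, indexed_pages):
    """Decide the next move for one directory.

    Phase 1: unblock it so crawlers can see the meta noindex.
    Phase 2: once (almost) nothing is left in the index, block it
    again to stop wasting crawl budget, keeping the noindex in place.
    """
    if directory in blocked_dirs:
        return "unblock and add meta noindex to every page"
    if indexed_pages == 0:
        return "re-add to robots.txt, keep noindex in place"
    return "wait for crawlers to process the noindex"
```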
Technical SEO | Dilbak0
-
NoIndex user generated pages?
Hi, I have a site, downorisitjustme (dot) com. It has over 30,000 pages in Google, which have been generated by people searching to check whether a specific site is working or not, and then possibly posting a deep link to the results page on a message board or similar, which is why the pages have been picked up. Am I best to noindex the res.php page, where all the auto-generated content shows up, and have only the main static pages available to be indexed?
Technical SEO | Wardy0
-
Sitemaps and "noindex" pages
Experimenting a little bit to recover from Panda and added "noindex" tag for quite a few pages. Obviously now we need Google to re-crawl them ASAP and de-index. Should we leave these pages in sitemaps (with updated "lastmod") for that? Or just patiently wait? 🙂 What's the common/best way?
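In case it helps to see it concretely, here's a rough stdlib-only Python sketch of keeping the noindexed pages in the sitemap with a bumped lastmod, which hints to crawlers that the page changed and is worth a re-crawl (the URLs and date are made up):

```python
from datetime import date

def sitemap_entry(loc, lastmod):
    """One <url> entry; a fresh lastmod signals the page changed
    (here: it gained a noindex) and should be re-crawled."""
    return (
        "  <url>\n"
        f"    <loc>{loc}</loc>\n"
        f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"
        "  </url>"
    )

def sitemap(urls, lastmod):
    """Assemble a minimal XML sitemap for the given URLs."""
    body = "\n".join(sitemap_entry(u, lastmod) for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{body}\n"
        "</urlset>"
    )
```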
Technical SEO | LocalLocal0
-
Can I noindex most of my site?
A large number of the pages on my site are pages that contain things like photos and maps that are useful to my visitors, but would make poor landing pages and have very little written content. My site is huge. Would it be beneficial to noindex all of these?
Technical SEO | mascotmike0