Why are "noindex" pages access denied errors in GWT and should I worry about it?
-
GWT calls pages that have "noindex, follow" tags "access denied errors."
How is it an "error" to say, "hey, don't include these in your index, but go ahead and crawl them"?
These pages are thin content/duplicate content/overly templated pages I inherited, and the noindex, follow tags are an effort not to crap up Google's view of this site.
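For reference, the tag in question is a robots meta tag in the page head. Here's a rough sketch, purely my own illustration (the parser class is a toy, not anything Google publishes), of what the tag looks like and how a crawler-side parser reads it:

```python
# Toy illustration: extract robots meta directives from a page,
# the way a crawler decides "noindex, follow".
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the comma-separated tokens from <meta name="robots" content="...">."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attrs arrives as a list of (name, value) pairs
        if tag == "meta" and a.get("name", "").lower() == "robots":
            for token in a.get("content", "").split(","):
                self.directives.add(token.strip().lower())

html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(html)
print(sorted(parser.directives))  # ['follow', 'noindex']
```

"noindex" asks the engine to leave the page out of the index; "follow" still lets it crawl the links on it, which is exactly the behavior described above.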
The reason I ask is that GWT's detection of a rash of these access denied errors coincides with a drop in organic traffic. Of course, coincidence is not necessarily cause.
Should I worry about it and do something or not?
Thanks... Darcy
-
I am a little surprised, because having those pages as "noindex, follow" should not cause GWT to flag them as errors.
Monica is correct that Google flags anything other than 200 as an error, but... your page with "noindex, follow" should return an HTTP status code of 200. If it is returning anything else, something is probably wrong, and you should analyze why.
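To make the distinction concrete, here's a minimal sketch of the rule this answer describes (the helper function is my own, not a Google API): a correctly served noindex page is a 200 and shouldn't be an "error", while anything else gets flagged.

```python
# Sketch of the rule described above (my own helper, not a Google API):
# GWT-style reporting counts any non-200 fetch as an "error".
def gwt_flags_as_error(status_code: int) -> bool:
    """True if error reporting, as described in this thread, would flag the fetch."""
    return status_code != 200

# A properly served "noindex, follow" page is still a 200 and should be fine:
print(gwt_flags_as_error(200))  # False

# Responses that genuinely deserve a look:
print(gwt_flags_as_error(403))  # True (access denied)
print(gwt_flags_as_error(404))  # True (not found)
```

So if your noindex pages are showing up as "access denied", it is worth fetching one by hand and confirming the server really answers 200.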
My religion has a law saying that GWT should return no errors, period. I have also witnessed, a few times, a correlation between bringing the GWT error count down to zero and an improvement in SERP ranking, but I have no proof that one is causing the other.
-
I had a similar issue where my sitemap and my robots.txt didn't match properly, and the mismatch was causing a slew of errors to show up. Everything falls under a crawler error, but it "should" clean itself up as the site is re-crawled and indexed. I resubmitted an updated sitemap that matched my robots.txt and got rid of the errors.
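A quick way to catch that kind of mismatch before resubmitting is to check every sitemap URL against the robots.txt rules. A minimal sketch with Python's standard-library robots parser (the URLs and rules here are made up for illustration):

```python
# Sketch: flag sitemap URLs that robots.txt blocks from crawling.
# The robots rules and URL list below are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

sitemap_urls = [
    "https://example.com/",
    "https://example.com/products/widget",
    "https://example.com/private/draft",  # this one conflicts with robots.txt
]

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

conflicts = [u for u in sitemap_urls if not rp.can_fetch("*", u)]
print(conflicts)  # the blocked /private/ URL
```

If that list isn't empty, your sitemap is advertising pages the crawler isn't allowed to fetch, which is exactly the kind of thing that shows up as a slew of errors.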
Google also states that these errors don't directly hurt your ranking, but they can hurt indirectly because of user experience. You can always double-check whether the pages are being indexed by doing a "site:" search in Google and seeing if those pages appear.
Now, the errors are somewhat of a blessing. We had a design firm redo our website, and they had contracted an SEO "expert" to optimize the site before launch. They launched our website, and the next day I opened up GWMT and our entire website was still under "noindex". They forgot to take the noindex tag from the dev site off of our main site.
Also, I would consider just redirecting the thin content altogether.
EDIT: And again Ryan sneaks in before me!!!!!!!!
-
Thumbs up to Monica's answer. I'd just add that you could redirect some of those pages to thin out the use of noindex if possible, but it sounds like you've kept them around because they're marginally useful. You can also click the 'Ignore' button for given error messages and they'll go away.
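The redirect option amounts to mapping each thin URL onto its closest substantive page and answering with a 301 instead of serving a noindex page. A minimal sketch of the idea (the paths and the `resolve` helper are hypothetical, not from any particular framework):

```python
# Sketch of "redirect instead of noindex": map thin/duplicate pages onto
# their closest substantive page and answer with a 301. Paths are made up.
REDIRECT_MAP = {
    "/old-thin-page": "/category/overview",
    "/duplicate-spec-sheet": "/products/spec-sheet",
}

def resolve(path: str):
    """Return (status_code, location) for a request path."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    return 200, path

print(resolve("/old-thin-page"))  # (301, '/category/overview')
print(resolve("/about"))          # (200, '/about')
```

The upside over noindex is that any link equity the thin page has collected passes to the target instead of being parked on a page you've told Google to ignore.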
-
No, I wouldn't worry about it. Google calls them errors, the same as a 404 error: to them, an error is anything that returns a code other than 200. I have hundreds of noindex pages on my site and it doesn't hurt. I believe it helps, because it removes duplicate content and eliminates bad user experiences.
I have always thought that it is Google's way of double-checking that the webmaster is aware those pages are blocked. There have been times when I found URLs in there that weren't supposed to be, and conversely found missing URLs as well. It's checks and balances, in my opinion.