Indexed, though blocked by robots.txt: Do we need to do anything?
-
Hi,
We have intentionally blocked, via robots.txt, some website files that had been indexed for years. Now we are receiving the "Indexed, though blocked by robots.txt" message in GSC. As far as I know we can ignore it, but are any actions required? We thought of blocking them with meta robots tags, but these are PDF files.
Thanks
-
Hi there!
What Google is telling you is that URLs you probably don't want indexed are being indexed anyway, or, the other way around, that important pages are blocked from crawling but remain indexed for other reasons (usually because other pages still link to them).
If I may ask, why did you block those files through robots.txt?
The two most likely answers are:
1- You wanted to remove those files from search results. If this is your case, you've only solved part of the problem. What you should do is first re-allow robots to crawl those URLs, then apply noindex rules (keep in mind that for non-HTML files, which can't carry a meta robots tag, noindex can be set in the HTTP header, as in the sketch below), and then, after enough time for Google to recrawl and drop the pages, block them in robots.txt again.
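Since the files are PDFs, the noindex rule has to be sent as an HTTP response header. A minimal sketch, assuming the site runs on Apache with mod_headers enabled (nginx would use add_header instead):

# Send a noindex header with every PDF the site serves
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>

Once Google has recrawled the PDFs and dropped them from the index, the robots.txt block can safely go back in.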
2- You wanted to optimize how GoogleBot spends its crawl time (crawl budget). If this is your case, then you've done it correctly and there is nothing to worry about; a sketch of that kind of robots.txt setup follows below.
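A minimal robots.txt sketch of the crawl-budget setup, assuming the PDFs live under a single directory (the path here is hypothetical):

# Keep crawlers out of the PDF directory. This limits crawling only;
# it does not remove already-indexed URLs from search results.
User-agent: *
Disallow: /pdfs/

That is exactly what the GSC message describes: the URLs stay indexed because robots.txt only stops Google from recrawling them, not from listing them.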
Hope this helps. Best of luck.
GR
Related Questions
-
PDFs With No Index Contribute To Page Ranks?
I have a question I'm hoping you can help me with. If I upload a PDF and apply a noindex rule so that the PDF doesn't appear in search results when I send people the link to it, does it still contribute to my site traffic/ranking etc.? Basically we are deciding whether to put some PDFs with pricing options etc. onto our website or on a Google Drive. We will be sending the links to potential clients. If visitors clicking on the link would still help increase traffic and improve our Google rank (without that PDF showing in results), we thought this might be the best solution.
Algorithm Updates | whiterabbitnz
-
Tens of duplicate homepages indexed and blocked later: How to remove from Google cache?
Hi community, Due to a WP plugin issue, many duplicate homepages were indexed in Google under anonymous URLs. We blocked them later, but they are still in the SERPs. I wonder whether these are causing trouble for our website, especially since exact copies of our homepage are indexed. How do we remove these pages from the Google cache? Is that the right approach? Thanks
Algorithm Updates | vtmoz
-
Need only tens of pages indexed out of hundreds: Is robots.txt okay for Google to proceed with?
Hi all, We have 2 subdomains with hundreds of pages, of which only 50 important pages need to be indexed. Unfortunately the CMS behind these subdomains is very old and does not support deploying a "noindex" tag at page level. So we are planning to block the entire sites in robots.txt and allow only the 50 pages we need (a sketch of this setup follows below). But we are not sure if this is the right approach, as Google suggests relying mostly on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file. Thanks
Algorithm Updates | vtmoz
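For what it's worth, Google does honor Allow directives, with the more specific (longer) rule taking precedence over Disallow. A hypothetical sketch of the allow-list approach described above (the paths are placeholders):

User-agent: *
Allow: /important-page-1
Allow: /important-page-2
Disallow: /

Keep in mind, though, that robots.txt controls crawling, not indexing, so blocked pages can still show up in results if other sites link to them.
-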
Will a page be indexed if it's published without being linked from anywhere?
Hi all, I have noticed a page on a competitor's website that is linked from only a single internal page. I would like to know: will a page that is barely linked to get indexed by Google? Will it be found by Google at all? And what about a page that is not linked internally but has some backlinks from other websites? Thanks
Algorithm Updates | vtmoz
-
Does using parent pages in WordPress help with SEO and/or indexing for SERPs?
I have a law office and we handle four different practice areas. I used to have multiple websites (one for each practice area) with keywords in the actual domain name, but based on the recommendation of SEO "experts" a few years ago, I consolidated all the webpages into one single website (based on the rumors at the time that Google was going to be focusing on authorship and branding in the future, rather than keywords in URLs or titles). Needless to say, Google authorship was dropped a year or two later and "branding" never took off. Overall, having one website is convenient and generally makes SEO easier, but there's been a huge drawback: when my site comes up in SERPs after searching for "attorney" or "lawyer" combined with a specific practice area, the practice-area landing pages don't typically come up in the SERPs; only the front page comes up. It's as if Google recognizes that I have some decent content, and Google knows that I specialize in multiple practice areas, but it directs everyone to the front page only. Prospective clients don't like this and it causes my bounce rate to be high. They like to land on a page focusing on the practice area they searched for. Two questions: (1) Would using parent pages (e.g. http://lawfirm.com/divorce/anytown-usa-attorney-lawyer/ vs. http://lawfirm.com/anytown-usa-divorce-attorney-lawyer/) be better for SEO? The research I've done up to this point appears to indicate "no" - it doesn't make much difference as long as the keywords are in the domain name and/or URL - but I'd be interested to hear contrary opinions. (2) Would using parent pages be better for indexing in Google SERPs? For example, would it make it more likely that someone searching for "anytown usa divorce attorney" would actually end up in the divorce section of the website rather than on the front page?
Algorithm Updates | micromano
-
Google has indexed a lot of test pages/junk from the development days.
With hindsight I understand that this could have been avoided if robots.txt had been configured properly. My website is www.clearvisas.com, and it is indexed both with the www subdomain and without. When I run site:clearvisas.com in Google I get 1,330 results - all junk from the development days. But when I run site:www.clearvisas.com in Google I get 66 - these results are all post-development and more in line with what I wanted indexed. Will 1,330 junk pages hurt my SEO? Is it possible to de-index them, and should I? If the answer is yes to any of these questions, how should I proceed? Kind regards, Fuad
Algorithm Updates | Fuad_YK
-
Panda Update: Need your expertise...
Hi all, After the Panda update our website lost about 45% of its traffic from Google. It wasn't an instant drop; mostly it happened gradually over the last 5 months. Our keywords (all of them except the domain name) started to lose positions from the top 10 to now 40+, and all the recovery attempts we have made so far haven't really helped. At this point it would be great to get some advice from top experts like those here. What we have done so far: we have gone through all the pages and removed the duplicate/redundant ones, we have refreshed the content on the main pages, and all pages now have canonical tags. Our website is www.PrintCountry.com. Thank you very much in advance for your time.
Algorithm Updates | gbssinc
-
Index Page lost rankings? Please Help!
This morning I ranked highly (page 1 of Google UK) for over 50 keyword search terms for my website http://www.careworx.co.uk. This afternoon my rankings have bottomed out and dropped several pages. I have not been de-indexed, it appears, and many of my sub-pages are still highly ranked. Would anybody know what has happened? I know of Google Panda, but I would have seen results drop before now, so I'm very concerned. I don't seem to have lost any links etc. and am careful to balance SEO with a mix of techniques to keep Google happy, and again, I have not been de-indexed. Can anybody offer advice please, or let me know how I can rectify this?
Algorithm Updates | andystep