Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages got blocked by my robots.txt file. I have made the required changes to robots.txt, but how can I get the list of blocked URLs?
My Webmaster Tools page (Health > Blocked URLs) shows only a count, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into search results?
One other interesting point I see is that the blocked pages are still showing up in searches. The title appears fine, but the description shows "blocked by robots.txt".
I need an urgent recommendation, as I do not want to see any further drop in my traffic.
-
"changing the lastmod of those pages to today".
How can I make these changes?
Right now the news is that I resubmitted the sitemap, and there are no warnings this time.
-
I imagine that since you've got a robots.txt error, you probably ended up closing off to bots a whole directory that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
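For reference, a minimal sitemap entry with an updated lastmod and a priority of 1 might look like the sketch below (the URL and date are hypothetical placeholders, not the poster's actual site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per previously blocked page you want recrawled -->
  <url>
    <loc>http://www.example.com/previously-blocked-page/</loc>
    <!-- Set lastmod to today's date -->
    <lastmod>2013-02-18</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>
```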
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can test it in GWT by entering, in the box at the bottom, the URL you want to keep in the index, and then checking whether any URLs are being blocked by your robots.txt.
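If you'd rather test this outside GWT, a quick local sketch using Python's standard-library `urllib.robotparser` can tell you whether a given rule set blocks a URL. The rules and URLs below are made-up examples, not the poster's actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- substitute your own file's rules.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether Googlebot is allowed to fetch each URL.
print(parser.can_fetch("Googlebot", "http://www.example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://www.example.com/public/page.html"))   # True
```

Running this against your real robots.txt lets you confirm, before resubmitting anything, that the pages you care about are no longer disallowed.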
If you want, you can post your robots.txt here along with the URIs you want indexed, without revealing the domain, so that it won't be public. Hope this helps.
-
OK, resubmitted it, but even with the updated file it gives a lot of errors: 20,016 warnings. I think it takes some time.
I have not added a noindex attribute in my header region; it was all messy stuff with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the rank will probably be the same, or was it deranked?
-
Go into GWMT and resubmit your sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also, reference those new sitemap files in your robots.txt file.
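Referencing sitemaps from robots.txt is done with the Sitemap directive. A minimal sketch (domain and filenames are placeholders) might look like:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
Sitemap: http://www.example.com/sitemap-news.xml
```

The empty Disallow line allows all crawling; each Sitemap line should be the full URL of a sitemap file you want crawlers to pick up.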
-
In Google Webmaster Tools, go to Health -> Fetch As Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get pages crawled & indexed, so it may be worth submitting your pages to social bookmarking sites, or get other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in SERPs if you haven't defined the noindex attribute in your head section.
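For completeness, the noindex attribute mentioned here is a robots meta tag placed inside the page's head section; a minimal sketch looks like this:

```html
<head>
  <!-- Tells crawlers not to include this page in their index -->
  <meta name="robots" content="noindex">
</head>
```

Note that a crawler can only see this tag if robots.txt allows it to fetch the page in the first place.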