How can I remove parameters from the GSC URL blocking tool?
-
Hello Mozzers
My client's previous SEO company went ahead and blindly blocked a number of parameters using the GSC URL blocking tool. This has now caused Google to stop crawling many pages on my client's website, and I am not sure how to remove these blocked parameters so that the pages can be crawled and reindexed by Google.
The crawl setting is set to "Let Googlebot decide", but there has still been a drop in the number of pages being crawled. Can someone please share their experience and help me delete these blocked parameters from GSC's URL blocking tool?
Thank you Mozzers!
-
Hi Vincent,
My short answer is: don't let Googlebot decide. Tell Googlebot which parameters should or should not create new pages. This is something you should do if you ever have indexation problems with parameters.
Do a site: search for a handful of these URLs with parameters (for example, a query like site:yourdomain.com inurl:parameter-name) to double-check whether the drop in the number of pages being crawled is because of these pages or because of something else. If it is because of these pages, you can quickly add them back to the index by using the "Fetch as Googlebot" tool. Once you have Google fetch something, you have the option of submitting it to the index.
(If it turns out the drop in crawled pages is from something else, a good way to figure out which pages are affected is to create multiple XML sitemaps and organize them by site section; then, when Google reports how many of your URLs are in its index, you can quickly tell which section of the site is affected. This post is really old, but still incredibly useful here.)
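To make the per-section sitemap idea concrete, here is a rough Python sketch that writes one sitemap per site section plus a sitemap index; the domain, URLs, section names, and file names are just placeholders, not anything from your actual site:

```python
from collections import defaultdict
from urllib.parse import urlparse
from xml.sax.saxutils import escape

# Placeholder URLs; in practice this list would come from a crawl or URL export.
urls = [
    "https://www.example.com/products/widget?color=blue",
    "https://www.example.com/products/widget?color=red",
    "https://www.example.com/blog/url-parameters-guide",
]

# Group URLs by their first path segment ("products", "blog", ...).
sections = defaultdict(list)
for url in urls:
    first_segment = urlparse(url).path.strip("/").split("/")[0] or "root"
    sections[first_segment].append(url)

# Write one sitemap file per section.
sitemap_files = []
for section, section_urls in sections.items():
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in section_urls
    )
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )
    filename = f"sitemap-{section}.xml"
    with open(filename, "w", encoding="utf-8") as f:
        f.write(xml)
    sitemap_files.append(filename)

# A sitemap index pointing at each per-section file, to submit in Search Console.
index_entries = "\n".join(
    f"  <sitemap><loc>https://www.example.com/{name}</loc></sitemap>"
    for name in sitemap_files
)
with open("sitemap-index.xml", "w", encoding="utf-8") as f:
    f.write(
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_entries}\n"
        "</sitemapindex>\n"
    )
```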
Double-check that these URLs with parameters are in the XML sitemap, and that you have a number of internal links on prominent pages pointing to them. Even if those links can only be temporary, they will really help the process.
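If it helps, a minimal Python sketch of that sitemap check could look like the following; again, the sitemap location and the URLs to verify are placeholders to swap for your own:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholders: the real sitemap location and the parameterized URLs to verify.
SITEMAP_URL = "https://www.example.com/sitemap.xml"
URLS_TO_CHECK = {
    "https://www.example.com/products/widget?color=blue",
    "https://www.example.com/products/widget?color=red",
}

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch and parse the sitemap, then collect every <loc> it lists.
with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)
listed = {
    loc.text.strip()
    for loc in tree.getroot().findall("sm:url/sm:loc", NS)
    if loc.text
}

# Report which of the URLs we care about are actually in the sitemap.
for url in sorted(URLS_TO_CHECK):
    status = "in sitemap" if url in listed else "MISSING from sitemap"
    print(f"{status}: {url}")
```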
Hope this helps!
Kristina
Related Questions
-
Search Keywords, Meta Keywords and Meta Descriptions; Keeping Webmaster Tools Current
What are Search Keywords, Meta Keywords and Meta Descriptions? What exactly is the difference between them, and which one is more important? In regards to Webmaster Tools, if we delete a page or a product, it still shows up in Search Analytics. How can we update Webmaster Tools so as to keep it current with our website? Lastly, again regarding Search Analytics in Webmaster Tools: at the moment we put relevant queries into the Meta Description of low-ranking pages in order to raise the position of the page. Is this the right way to handle queries? Should we be putting the queries into the Meta Description or the Meta Keywords?
Reporting & Analytics | CostumeD
-
Google Analytics shows most referrers as "Direct" -- What are some better tools?
Very often Google Analytics will show 50-90% of our referrers as (direct), which is not very helpful. Are there other tools out there that will provide a clearer breakdown of which other websites are sending us our traffic? Specifically, I want to be able to tell who the top traffic referrers are to my top-performing pages for the last 30 days. (I want to be able to study this on a per-page basis.) Thanks in advance!
Reporting & Analytics | Brand_Psychic
-
URL Parameters
Hi there, I have a Magento sort-by feature which has led to loads of pages being indexed in Google with URLs that have /shopby/ in them. Over 8k pages have been indexed like this. I cannot edit the robots meta tag within the page, but have now disallowed the URLs in robots.txt - I guess this will prevent new ones from being indexed but not deindex the current ones? So I looked into URL parameters: I added 'shopby' as a parameter in Webmaster Tools and told Google not to crawl any URLs with it in them. Will this deindex the pages that are already indexed? The only other way seems to be manually removing 8k URLs, which I do not want to do. Any advice is much appreciated. Obviously I do not want these URLs indexed, as they are weak/duplicate sort-by search pages, and I fear the Panda update would not be too kind to them long term.
Reporting & Analytics | tdigital
-
Google Analytics vs Webmaster Tools data
Hi, which is more accurate: Webmaster Tools or Google Analytics data? A keyword has very recently fallen from page 1 to page 3, so I'm looking into the data to find the cause. The Organic part of GA's Search tab reports the keyword as having generated just 1 visit over a month (hence I presumed the fall could be due to low visits from a page 1 result), whilst the Search Engine Optimisation tab (data sourced from Webmaster Tools, I think) reports 5 click-throughs from 150 impressions over the same period, a quite good 3.33% CTR, which I wouldn't expect to cause a fall, and which I would have thought GA would report as 5 visits instead of the 1 it shows. The reason I'm looking for an answer in the data is that nothing on-page has changed (the page still scores a grade A) and off-page metrics have all improved across the board (apart from a small drop in Majestic SEO's Trust Flow): increased links, referring domains, Citation Flow, referring subnets, etc. Cheers, Dan
Reporting & Analytics | Dan-Lawrence
-
Can't seem to rank for keyword "home care grand rapids" - need some advice
I am trying to rank for "home care grand rapids" and am having a really difficult time. My site, http://healthcareassociates.net, has better backlinks, keywords and other SEO markers than my competitors, but I still can't seem to rank. The keyword and associated keywords (home care grand rapids michigan, home health care grand rapids, etc.) are only 31-33% difficulty, and my site/page rank is better than the leading sites'. What gives? Todd
Reporting & Analytics | t1kuslik
-
Removing Bad Links, Bad?
Our situation is this: we recently went through a major redesign that included a significant change in content and URL structure. This of course caused a significant increase in 404s from our search traffic. In an effort to minimize those 404s, we requested that Google remove certain URL directories from its index. Doing this resulted in a 90% drop in the number of impressions we were getting from search. My question is this: is it better to allow the 404s with a good landing page that explains the changes, or better to remove the bad URLs? Much thanks...
Reporting & Analytics | SQE-SEOMoz
-
What to get from Google Webmaster Tools?
Hi everyone, I've been doing optimization for our websites and tracking the results regularly, but I don't really know what the results actually mean. I heard that I need to check the traffic and organic results in Google Analytics and Google Webmaster Tools. Everybody says something different and I'm not sure what to do. Can anyone clarify the SEO process for me - when to do what, and why - in the simplest way? Thanks in advance,
Reporting & Analytics | WTGEvents
-
SEOMoz & Google Webmaster Tools crawl error conflicting info
The site I'm working on has zero crawl errors according to SEOmoz (it did previously have lots, since ironed out), but Google Webmaster Tools is now saying 5,000 errors. The dates on those errors are not that recent, but the Webmaster Tools line graph of errors is still showing approximately 5,000 up to yesterday. There is an option to bulk-action/tick them all as fixed, so I'm thinking/hoping GWT is just keeping a historical record that can now be deleted since it is no longer applicable. However, I'm not confident this is the case, since the errors are still showing on the line graph. Any ideas about this anomalous info (can I delete and forget in GWT)? Also, a side question: I take it it's not possible to link a GA property with a GWT account if they were created with different logins/accounts? Many thanks, Dan
Reporting & Analytics | Dan-Lawrence