Removal request for an entire catalog: can it be done without blocking in robots.txt?
-
A bunch of thin-content (catalog) pages were modified with "follow, noindex" a few weeks ago. The site has been completely re-crawled, and the related cache shows that these pages were not indexed again. So far so good, I suppose.
But all of them are still in the main Google index and show up from time to time in the SERPs. Will they eventually disappear, or do we need to submit a removal request? The problem is we really don't want to add these pages to robots.txt (they are passing link juice down to the product pages below). Thanks!
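(For clarity, the change in question is the standard robots meta tag in each page's head; a minimal example:

    <meta name="robots" content="noindex, follow">

This asks engines to drop the page from their index while still following, and passing value through, its links.)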
-
If your intention is to keep the pages as "follow", I would not submit a removal request, since that will wipe them off the map. Google needs to crawl your page to know you don't want it indexed.
As I said, Google has already re-crawled them and they have not been re-indexed. The last cache dates show everything is correctly left out of the new indexation.
I guess my concern is how long these pages will stay in the index. This was done to make sure Panda will be able to recalculate the value of the site.
-
Things can take a while, sometimes many months, before a page properly drops out of the index. Google also has a disclaimer that it may keep a page indexed for a while.
Also, if you do submit a removal, make sure you follow up afterwards to ensure the page never comes back:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663419
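As a belt-and-braces measure that avoids robots.txt entirely, the same directive can also be sent as an HTTP header; a minimal sketch, assuming Apache with mod_headers and that the catalog pages live in their own directory (the path is an assumption):

    # .htaccess placed inside the catalog directory -- applies to everything served from it
    Header set X-Robots-Tag "noindex, follow"

Unlike a robots.txt disallow, this keeps the pages crawlable, so they go on passing link juice down to the product pages.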
Related Questions
-
Removing a canonical tag from Pagination pages
Hello, Currently on our site we have the rel=prev/next markup for pagination along with a self-pointing canonical via the Yoast Plugin. However, on page 2 of our paginated series (there are only 2 pages currently), the canonical points to page one, rather than page 2. My understanding is that if you use a canonical on paginated pages it should point to a view-all page as opposed to page one. I also believe that you don't need to use both a canonical and the rel=prev/next markup; one or the other will do. As we use the markup, I wanted to get rid of the canonical. Would this be correct? For those who use the Yoast Plugin, have you managed to get that to work? Thanks! (See the illustration of the setup below.)
Technical SEO | jessicarcf
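(To illustrate the setup being described, with hypothetical URLs: page 2 of the series would normally carry a self-referencing canonical alongside the pagination markup, e.g.

    <link rel="canonical" href="http://example.com/category/page/2/">
    <link rel="prev" href="http://example.com/category/">

whereas the Yoast-generated canonical here points at page one instead.)
-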
"Url blocked by robots.txt." on my Video Sitemap
I'm getting a warning about "Url blocked by robots.txt." on my video sitemap, but only for YouTube videos. Has anyone else encountered this issue, and how did you fix it if so?! Thanks, J
Technical SEO | Critical_Mass
-
How can my homepage have 2 meta descriptions?
Hi all, When googling our company, I see our main page pop up with 2 different meta descriptions, depending on the search query. The situation: the search query 'nha' (on google.nl) returns the main page with a meta description that looks like a random grab from the code by Google itself, starting with 'Ik volg een cursus bij de NHA...' ('I am taking a course with the NHA...'). The search query 'nha.nl' (on google.nl) returns the main page with the proper meta description, starting with 'Aanbieder van thuisstudies met onder meer MBO-opleidingen...' ('Provider of home-study courses, including MBO programmes...'). So yeah, I'd like to have the main page appear only with the proper meta description, the latter one. We did have a redirect issue (duplicate homepages) a few weeks ago and programming fixed it. Could this have something to do with a redirect? I'd love to hear your thoughts. Thanks!
Technical SEO | NHA_DistanceLearning
-
How can I do SEO for an e-commerce site?
I am doing SEO for my WP blog, but now I am starting my recently launched e-commerce site where I am selling electronics products. I want to know how I can do the SEO so I can at least reach a top-10 position on Google India. Second, how can I avoid duplicate content from copying manufacturer content? Please help.
Technical SEO | chandubaba
-
Remove a directory using htaccess
Hi, Can someone tell me if there's a way using htaccess to say that everything in a particular directory, let's call it "A", is gone (HTTP 410 code)? i.e. all the links should be de-indexed. Right now, I'm using the robots file to deny access. I'm not sure if it's the right thing to do, since Google Webmaster Tools is still showing me the links as indexed, with a 403 error code. Thanks.
Technical SEO | webtarget
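(A common way to do this, as a sketch assuming Apache's mod_alias and a directory literally named /A/ as in the example:

    # Return 410 Gone for every URL under /A/
    RedirectMatch 410 ^/A/

Unlike a robots.txt disallow, a 410 lets Google fetch the URLs and see that they are gone, which is what actually gets them de-indexed.)
-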
Un-Indexing a Page without robots.txt or access to HEAD
I am in a situation where a page was pushed live (went live for an hour and then was taken down) before it was supposed to go live. Now normally I would utilize robots.txt or the robots meta tag, but I do not have access to either, and putting in a request will not suffice as it is against protocol with the CMS. So basically I am left to just utilizing the [...] and I cannot seem to find a nice way to play with the SE to get this un-indexed. I know for this instance I could go to GWT and do it, but for clients that do not have GWT, and for all the other SEs, how could I do this? Here is the big question: what if I have a promotional page that I don't want indexed and am met with these same limitations? Is there anything to do here?
Technical SEO | DRSearchEngOpt
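(One avenue when the page's head is off limits: if the server configuration is reachable, the directive can be sent as an HTTP response header instead; a sketch assuming Apache with mod_headers and a hypothetical file name:

    <Files "promo-page.html">
        Header set X-Robots-Tag "noindex"
    </Files>

This works for Google and Bing without touching robots.txt or the markup.)
-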
How can I prevent sh404SEF Anti-flood control from blocking SEOMoz?
I'm using sh404SEF on my Joomla 1.5 website. Last week, I activated the security functions of the tool, which include an anti-flood control feature. This morning, when I looked at my new crawl statistics in SEOMoz, I noticed a significant drop in the number of webpages crawled, and I'm attributing that to the security configuration I made earlier in the week. I'm looking for a way to prevent this from happening so the next crawl is accurate. I was thinking of using sh404SEF's "UserAgent white list" feature. Does SEOMoz have a UserAgent string that I could try adding to my white list? Is this what you guys recommend as a solution to this problem?
Technical SEO | JBradySD
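(For reference, Moz's crawler identifies itself as rogerbot in its user-agent string, so whitelisting the substring "rogerbot" is the usual starting point; whether sh404SEF matches whole strings or substrings is worth confirming in its documentation.)
-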
Help removing duplicate content from the index?
Last week, after a significant drop in traffic, I noticed a subdomain in the index with duplicate content. The main site and subdomain can be found below. http://mobile17.com http://232315.mobile17.com/ I've 301'd everything on the subdomain to the appropriate location on the main site. Problem is, site: searches show me that if the subdomain content is being deindexed, it's happening really slowly. Traffic is still down about 50% in the last week or so... what's the best way to tackle this issue moving forward?
Technical SEO | ccorlando
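(A host-based 301 in .htaccess is the usual way to catch the whole subdomain, as a sketch assuming Apache with mod_rewrite and that the subdomain is served from the same document root:

    RewriteEngine On
    # Send every request for the stray subdomain to the same path on the main site
    RewriteCond %{HTTP_HOST} ^232315\.mobile17\.com$ [NC]
    RewriteRule ^(.*)$ http://mobile17.com/$1 [R=301,L]

Even with the 301s in place, de-indexation can take weeks, so a slow drop in site: results is normal.)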