Stuck trying to deindex pages from Google
-
Hi There,
We had developers add a lot of spammy markup to one of our websites. We've tried many ways to deindex the affected pages by fixing the markup and requesting recrawls. However, some of the URLs that carried the spammy markup were incorrect URLs that redirect to the right version (e.g., the same URL with or without a trailing slash).
So now all the regular URLs are updated and clean. However, the redirected URLs can't be found in crawls, so they were never updated and the spam couldn't be removed. They still show up in the SERPs.
I tried deindexing those spammed pages by making them no-index in the robots.txt file. This seemed to work for about a week, but now they are showing up in the SERPs again.
Can you help us get rid of these spammy URLs?
-
Ruchy,
A - Yes, it might have helped for a few weeks. But internal links from your site are not the only way crawlers reach your pages; remember that other sites may be linking to those pages.
B - Absolutely, adding noindex will help. There is no way to know for sure how long it will take; give it a few weeks. It could also help to manually remove all those pages in Google Search Console, as Logan said.
Hope it helps!
GR -
Hi Gaston,
Thanks so much for taking the time to answer my question.
Here are two points: A - My mistake; we disallowed it in the robots.txt (roughly as sketched below), and it was done correctly. Our devs did it for us, and I double-checked it in the Search Console robots.txt tester. Also, this did work for us for the first few weeks.
B - There is nowhere the crawlers can find these pages to recrawl, as they are no longer linked from anywhere on my site. Will adding the noindex help? If so, how long might it take?
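For reference, the disallow rule looked roughly like this (the path is just a placeholder, not one of our real URLs):

```
User-agent: *
Disallow: /spammy-page/
```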
-
I second what Gaston said. This use of robots.txt is one of the most common misconceptions in SEO, so don't feel bad. Google explicitly says not to use robots.txt for index prevention in its webmaster guidelines.
To add to Gaston's point, make sure you remove the robots.txt disallow when you add the meta noindex tag he provided. If you don't let them crawl the page, they won't see the tag.
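Roughly, you want robots.txt to go from blocking those URLs to leaving them crawlable (placeholder path below):

```
# Before (blocks crawling, so Googlebot never sees the noindex tag):
# User-agent: *
# Disallow: /spammy-page/

# After (nothing blocked; Googlebot can recrawl and honor the tag):
User-agent: *
Disallow:
```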
You can also remove these URLs temporarily in Search Console by going to the Google Index menu and selecting "Remove URLs". That removes them from search results; then, when Google comes back to crawl the page (as long as you're letting it), it will see your noindex tag and keep the page out of the index.
-
Hello Ruchy,
If by "making no-index" in the robots you are meaning _disallowing _them, you are making ir wrong.
Robots.txt are just signs to the robots and only tell them to NOT CRAWL them, it doesnt prevent from indexing those pages. (it can happen the case that there is a link pointing to that page and the crawler just passes by it).The most used way to remove certaing indexed pages is by adding the robots noindex meta tag, it should look like this:
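```
<!-- placed in the <head> of each page you want removed from the index -->
<meta name="robots" content="noindex">
```

Once the crawler can reach the page and sees this tag, the page will be dropped from the index on its next recrawl.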
Also, some useful links:
Robots meta directives - Moz
Robots meta tag - Google developers
Robots tag generator
Hope it helps.
GR