I have removed over 2,000 pages but Google still says I have 3,000+ pages indexed
-
Good Afternoon,
I run an office equipment website called top4office.co.uk.
My predecessor decided to make an exact copy of the content on our existing site, top4office.com, and place it on the top4office.co.uk domain, which included over 2,000 thin pages.
Since coming in, I have hired a copywriter who has rewritten all the important content, and I have removed over 2,000 thin pages.
I have set up 301 redirects, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which completed successfully.
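For anyone setting this up, the 301s can be as simple as a couple of server rules. A minimal sketch, assuming an Apache server (the paths here are hypothetical, not the site's real URLs):

```apache
# One-off redirect for a single removed thin page (hypothetical paths)
Redirect 301 /old-thin-page.html https://www.top4office.co.uk/rewritten-page/

# Or a pattern covering a whole removed section
RedirectMatch 301 ^/old-catalogue/(.*)$ https://www.top4office.co.uk/products/$1
```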
But, although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still have over 3,000 indexed pages (originally I had 3,700).
Does anyone have any idea why this is happening and, more importantly, how I can fix it?
Our rankings on this site are woeful compared to what they were in 2011. I have a deadline and was wondering how quickly, in your opinion, all these changes will impact my SERP rankings?
Look forward to your responses!
-
I agree with DrPete. You can't have the pages blocked in robots.txt; otherwise Google will not crawl them, "see" the 301s, and update its index.
Something else to consider: on the new pages, have them canonical to themselves. We had a site where Google was caching old URLs whose 301 redirects had been in place for two years. Google was finding the new pages, new titles, and new content, but was still referencing the old URLs. We were seeing this in the SERPs and also in GWT, which was reporting duplicate titles and descriptions for sets of pages that had been 301ed. Adding the self-referencing canonical helped get that cleaned up.
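For reference, a self-referencing canonical is just a link element in the head of each new page pointing at its own URL (the URL below is hypothetical):

```html
<!-- In the <head> of https://www.top4office.co.uk/office-chairs/ (hypothetical URL) -->
<link rel="canonical" href="https://www.top4office.co.uk/office-chairs/" />
```

This tells Google which URL is the authoritative version, so stale 301ed URLs stop being referenced in place of the live page.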
Cheers.
-
This process can take a painfully long time, even done right, but I do have a couple of concerns:
(1) Assuming I understand the situation, using robots.txt on top of 301 redirects is a bad idea. Robots.txt is good for preventing indexation, but bad for removal once something is already in the index: you're telling Google not to re-crawl these pages, and if Google doesn't re-crawl them, it won't process the 301s. So I'd drop the robots.txt blocking for now, honestly.
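You can see the conflict mechanically with Python's standard-library robots.txt parser. This is a sketch assuming a hypothetical `/old-thin-pages/` section; the point is that while the disallow rule is in place, Googlebot never fetches the URL, so the 301 sitting on it is never seen:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents, assuming the thin pages live under /old-thin-pages/
robots_txt = """User-agent: *
Disallow: /old-thin-pages/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot is blocked from this URL, so it can never re-crawl it
# and therefore never processes the 301 redirect on it
blocked = not rp.can_fetch("Googlebot", "https://www.top4office.co.uk/old-thin-pages/widget.html")
print(blocked)
```

Removing the `Disallow` line flips `can_fetch` back to allowed, which is exactly what lets Google re-crawl the pages and process the redirects.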
(2) What's your internationalization strategy? You could potentially try rel="alternate"/hreflang to specify US vs. UK English, target each domain in Webmaster Tools, and leave the duplicates alone. If you 301-redirect, you're not giving the UK site a chance to rank properly on Google.co.uk (if that's your objective).
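A rough sketch of that hreflang markup, assuming the two domains mirror each other's URL paths (the paths here are hypothetical):

```html
<!-- On each page of both domains, annotate both language/region versions -->
<link rel="alternate" hreflang="en-us" href="https://www.top4office.com/office-chairs/" />
<link rel="alternate" hreflang="en-gb" href="https://www.top4office.co.uk/office-chairs/" />
```

Each page should carry the full set of alternates, including one pointing at itself, so the annotations are reciprocal; without the return link, Google ignores the hreflang pair.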
-
It sounds like you have done pretty much everything you could do to remove those pages from Google, and that Google has removed them.
There are two possibilities that I can think of. First, Google is finding new pages, or at least new URLs. These may be old pages with some sort of parameter appended, which Google treats as new URLs even though you're not adding any new pages.
Another possibility is that the site: search is not entirely accurate. Like most numbers Google gives us, it's an estimate of the actual figure. It's possible the original 3,700 was an undercount, and Google is now simply reporting more of the pages it already had in its index but wasn't showing before.
By the way, when I do a search for site:top4office.co.uk, I only get 2,600 results.
-
I no longer see the pages. There's no chance Google has found any additional pages, as we check newly indexed pages every day using the date filter and site:top4office.co.uk.
Any ideas?
-
Just a quick question, do you see the URLs you "removed" still in the index? Or is it possible that Google has found a different set of 3000 URLs on your site?