Reasons for a sharp decline in pages crawled
-
Hello!
I have a site I've been tracking with Moz since July. The site is mostly static, with occasional on-page content updates. Starting the first week of December, Moz's crawler diagnostics showed the number of pages crawled drop from 300 to 100 in a week.
The number of errors dropped too, though: crawler issues went from 275 to 50, and total pages crawled went from 190 to 125 in a week. These numbers have stayed the same for the last 5 weeks.
Are the drops a red flag? Or is it ok since errors decreased also? Has anyone else experienced this and found an issue?
FYI: a sitemap exists and is submitted via Webmaster Tools. GWT shows no crawl errors and no blocked URLs.
-
Google is indexing just over 80 URLs, although about 40% of them are developer test URLs (they do lead to live pages on the site, though). Nothing in robots.txt. No errors.
Googlebot is still crawling, but it's crawling half the pages. What would make page crawls decrease? I'm wondering if there's a broken link or something on the home page that's pointing away from the site... although it's unlikely, I'll check.
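Checking the home page for links that point off-site is easy to script. A minimal sketch using only the Python standard library (the sample HTML and example.com URLs below are hypothetical placeholders):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collect every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def external_links(html, base_url):
    """Return links that resolve to a different host than base_url."""
    collector = LinkCollector()
    collector.feed(html)
    host = urlparse(base_url).netloc
    resolved = [urljoin(base_url, link) for link in collector.links]
    return [u for u in resolved if urlparse(u).netloc != host]
```

Fetch the home page (e.g. with `urllib.request`), feed its HTML in, and anything returned is a candidate off-site link to review.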
-
If you fixed a problem such as duplicate content, that would explain fewer errors and fewer pages crawled, since that problem is fixed. Might that be the case?
-
How many URLs are indexed in Google if you use site:yourdomain.com? Has that figure dropped too?
Have you got anything in your robots.txt that could be blocking?
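Beyond eyeballing robots.txt, you can test specific URLs against it programmatically. A rough sketch with Python's stdlib `urllib.robotparser` (the example rules and URLs are made up):

```python
from urllib import robotparser

def is_blocked(robots_txt, url, agent="Googlebot"):
    """Return True if the given robots.txt body would block `agent` from `url`."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(agent, url)

# Hypothetical rules for illustration:
rules = """User-agent: *
Disallow: /private/
"""
print(is_blocked(rules, "https://example.com/private/page"))  # True
print(is_blocked(rules, "https://example.com/"))              # False
```

Run your real robots.txt body through this with a sample of the URLs that dropped out of the crawl to rule out an accidental Disallow.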
Related Questions
-
Multiple links from same domain (different pages) considered in credibility of backlinks?
Hi, let's say there are multiple backlinks from different pages of the same domain to different pages of another domain, like below: Website A: Page 1 -----------> Website B: Page 1; Website A: Page 2 -----------> Website B: Page 2. Will the pages of Website B get backlink authority equally, or do they get less impact because the links come from the same domain? There were old-school stories that Google ignores a second link from the same domain, etc., so please advise. Thank you. Note: the question is NOT about content relevancy or the domain authority score of the backlinks.
Algorithm Updates | | vtmoz1 -
Increase of non-relevant back-links drop page ranking?
Hi community, let's say there is a page with 50 backlinks, where 40 are non-relevant and only 10 are relevant in terms of the content around the link. Will these non-relevant backlinks impact the page's ranking by diluting the backlink profile? Thanks
Algorithm Updates | | vtmoz0 -
What happens when most of the website visitors end up at a "noindex" log-in page?
Hi all, as most of our users visit the website to log in, we are planning to deindex the login page. Since they can't find it on the SERP, they will visit our website and log in from there. I just wonder what happens when most visitors end up browsing into a "noindex" page. Obviously it increases bounce rate and exit rate, as they just disappear. Is this going to push us down in rankings? What other concerns should we check? Thanks
Algorithm Updates | | vtmoz0 -
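Before and after a rollout like this, it's worth verifying that the noindex directive is actually present in the served page source. A small sketch, assuming the directive is delivered via a robots meta tag rather than an X-Robots-Tag header:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            if (attr.get("name") or "").lower() == "robots":
                self.directives.append(attr.get("content") or "")

def is_noindexed(html):
    """Return True if the page carries a robots meta tag containing noindex."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in c.lower() for c in parser.directives)
```

Note that a noindex only works if the page is crawlable: if robots.txt blocks the login URL, Google never sees the meta tag.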
Do sub-directory pages need to be well optimised?
Hi all, we have help pages in a sub-directory, linked from our website pages (3 clicks deep). But these pages are not well optimised, with minor issues like header tags, image alts, etc. Moreover, some of them are dead-end pages. Will these things hurt us? Thanks
Algorithm Updates | | vtmoz0 -
Duplicate Content on Product Pages with Canonical Tags
Hi, I'm an SEO intern for a third-party wine delivery company and I'm trying to fix a duplicate content issue on our product pages. To give you a picture of what I'm dealing with: the duplicate product pages being flagged have URLs with different geo-variations and product-key variations. This is what Moz's site crawler is seeing as duplicate content for the URL www.example.com/wines/dry-red/: www.example.com/wines/dry-red/_/N-g123456, www.example.com/wines/dry-red/_/N-g456789, www.example.com/wines/California/_/N-0. We have loads of product pages with dozens of duplicates, and I'm coming to the conclusion that it's the product keys that are confusing Google. So we had the web development team put the canonical tag on the pages, but they were still being flagged. I checked the source of the pages and found that they all had 2 canonical tags. I understand we should only have one canonical tag, so I wanted to know: can I just remove the second canonical tag, and will that solve the duplicate content issue we're currently having? Any suggestions? Thanks -Drew
Algorithm Updates | | drewstorys0 -
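A quick way to catch pages with duplicate canonical tags like those described above is to count the `<link rel="canonical">` elements in each page's HTML. A minimal stdlib sketch (the example URLs in the test are placeholders):

```python
from html.parser import HTMLParser

class CanonicalCollector(HTMLParser):
    """Record the href of every <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attr = dict(attrs)
            if (attr.get("rel") or "").lower() == "canonical":
                self.canonicals.append(attr.get("href"))

def canonical_hrefs(html):
    """Return all canonical hrefs found on a page; more than one is a problem."""
    collector = CanonicalCollector()
    collector.feed(html)
    return collector.canonicals
```

Run it over a sample of flagged URLs: any page returning two or more hrefs (or hrefs that disagree with each other) is one where Google will likely ignore the canonical signal entirely.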
Why do we have so many pages scanned by bots (over 250,000) when our biggest competitors have about 70,000? Seems like something is very wrong.
We are trying to figure out why last year we had a huge (80%) and sudden (within two days) drop in our Google search traffic. The only outlier we can find on our site is the huge number of pages reported in Moz as scanned by search engines. Is this a problem? How did we get so many pages reported? What can we do to bring the number of scanned pages back to a "normal" level? BT
Algorithm Updates | | achituv0 -
A Serious drop in Pages crawled per day
On 21st April I spotted a sudden decrease in pages crawled per day. Previously it was about 5,000, but after the drop it fell to 225, and since then the crawl rate has never spiked. Here is my website URL: http://www.wpstuffs.com/ 8fQHW2G.png
Algorithm Updates | | vividvilla0 -
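Search Console's crawl-stats chart aside, you can measure Googlebot's crawl rate yourself from server access logs. A rough sketch assuming Common Log Format lines; a real audit should also verify the client IP via reverse DNS, since the user-agent string can be spoofed:

```python
import re
from collections import Counter

# Matches the date part of a Common Log Format timestamp,
# e.g. [21/Apr/2014:10:15:32 +0000]
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(log_lines):
    """Count log lines whose user-agent mentions Googlebot, grouped by date."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        match = DATE_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts
```

Plotting these daily counts around the drop date shows whether Googlebot genuinely backed off or whether only the reporting changed.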
18 years later, Page Rank 6 Drops to 0, All +1s disappear, Scrapers outrank us
18 years ago I put up our first website at http://oz.vc/6. Traffic grew and our forums reached hundreds of thousands of posts; our home page had a PageRank of 6, our forums and other content areas ranked 5-6, and the rest usually 4-6. Then Panda 2.2 came along and whacked it. None of the measures recommended by SEO experts and the Matt Cutts videos made a dent, including some pretty severe ones that were supposed to make a difference. Bing and Yahoo traffic both grew since Panda 2.2, and only Google kept dropping every few updates without recovery.

Several weeks ago Google delivered the ultimate whack. It seems every page other than the home page has either a PR of 0 or is not generating any PR at all. Every +1 disappeared from the site. Now three pages have their +1s back, but the entire guide section (hundreds of articles) is still missing all +1s.

I discovered two scrapers, one of which was copying all of our forum posts and ranking PR 2 for it (while we have a zero). They were taken down, but I still can't imagine how this result could happen. I am also going to have an RSS feed aggregator taken down that is ranking a 2, though we know we can't prevent them from taking our WordPress feeds (we use them for areas on the site). How can Google give us a zero PageRank and give obvious scrapers PageRank?

What should have been years' worth of awesome rich added content and new features was wasted chasing Google ghosts. I've had two SEO people look at the site and neither could point to any major issue that would explain what we've seen, especially the latest PageRank death penalty. We haven't sold paid links. We have received no warnings from Google (nor should we have). The large "thin" area you may see in a directory was removed entirely from Google (and it made no difference: a drop for doing the "right" thing!). Most think we have been stuck for a very long time in a rare Google glitch. I'd be interested in your insights.
Algorithm Updates | | seoagnostic0