Googlebot takes 5 times longer to crawl each page
-
Hello All
Since about mid-September, my Google Webmaster Tools (GWMT) has shown that the average time to crawl a page on my site has shot up from around 130ms to around 700ms, with peaks at 4,000ms.
I have checked my server error logs and found nothing there. I have also checked with the hosting company, and there are no issues with the server or with other sites on the same server.
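For anyone wanting to cross-check GWMT's figures against their own access logs, here is a minimal sketch that averages response times for Googlebot requests. It assumes an Apache-style log with the `%D` response-time field (microseconds) appended at the end of each line; the field position, the sample log lines, and the page names are illustrative assumptions, so adjust the regex to your own `LogFormat`.

```python
import re
from statistics import mean

# Sample Apache access-log lines with %D (response time in microseconds)
# as the last field -- hypothetical data; adapt to your own log format.
SAMPLE_LOG = """\
66.249.66.1 - - [15/Sep/2012:10:00:01 +0000] "GET /carrots.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" 130000
66.249.66.1 - - [15/Sep/2012:10:00:05 +0000] "GET /beans.html HTTP/1.1" 200 6200 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" 700000
192.0.2.10 - - [15/Sep/2012:10:00:09 +0000] "GET /peas.html HTTP/1.1" 200 4100 "-" "Mozilla/5.0" 90000
"""

def googlebot_avg_ms(log_text):
    """Average response time in ms for requests whose UA contains 'Googlebot'."""
    times = []
    for line in log_text.splitlines():
        if "Googlebot" not in line:
            continue
        m = re.search(r'(\d+)\s*$', line)  # trailing %D field, in microseconds
        if m:
            times.append(int(m.group(1)) / 1000.0)  # microseconds -> ms
    return mean(times) if times else None

print("Googlebot average response: %.0f ms" % googlebot_avg_ms(SAMPLE_LOG))
# -> Googlebot average response: 415 ms
```

If the number this reports stays low while GWMT's graph is high, the slowdown is more likely somewhere between your server and Google (network, DNS, a proxy) than in the server itself.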
Two weeks after this, my rankings fell by about 950 places for most of my keywords. I am really just trying to eliminate the crawl time as a possible cause of these ranking drops. Or was it the Panda / EMD algorithm update that did it?
Many Thanks
Si
-
Thank you for having a look
I made no structural changes around the time the issues started.
On the third graph in GWMT, yes, there was a spike in the time spent downloading a page, and it is still a lot higher than before. I have added an image of it below.
There were two Google updates about two weeks later: the latest Panda refresh and the new EMD update.
Most of the content I have written myself from my own experience. There are some pages, which I am in the process of removing or rewriting, that duplicate content on other sites.
Until four months ago the layout was built with fixed-size nested tables. I am just getting my head around CSS, to try to drag the site into the 21st century.
-
Hi.
Based on the site size (number of pages) and format (code, elements, and structure), plus two speed tests I just ran on it and a traceroute (from Austria), it looks like you don't have any issues from a technical point of view.
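For readers who want to run a similar download-speed check themselves, here is a minimal sketch that times repeated page fetches, the same quantity GWMT's "time spent downloading a page" graph reports. The demo spins up a throwaway local server so the snippet is self-contained; in practice you would point `url` at your own pages instead.

```python
import threading
import time
import urllib.request
from http.server import HTTPServer, BaseHTTPRequestHandler

class DemoHandler(BaseHTTPRequestHandler):
    """Serves a tiny fixed page, standing in for the real site."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>demo page</body></html>")

    def log_message(self, *args):
        pass  # keep the demo quiet

def time_download(url, samples=5):
    """Fetch url `samples` times; return the average download time in ms."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        timings.append((time.perf_counter() - start) * 1000)
    return sum(timings) / len(timings)

# Throwaway local server on a random free port; replace with your own URL.
server = HTTPServer(("127.0.0.1", 0), DemoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_port
avg_ms = time_download(url)
print("average download time: %.1f ms" % avg_ms)
server.shutdown()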
One thing you still need to check is the "time spent downloading a page" graph (the third one) within GWMT. Did it spike at the same time that pages crawled per day went down?
A few other questions you should consider:
-
Did you make any changes, especially structural changes, around the same time you noticed the issues?
-
Were there any public Google updates in the same timeframe as the changes you noticed? (You can check them here: http://www.seomoz.org/google-algorithm-change)
-
Is your content duplicated? (I mean against external sources, not internally.)
Please don't get me wrong: I would be OK with the format of the site if it were very old, from before 2000. But the domain is from 2008, so you should get on track with current trends in layout, content format, and website structure in general.
Hope it helps.
-
-
Hi
I am as sure as I can be, but not being a full expert on these things, I may have missed something technical.
I have been making changes to the site since then, mainly to the CSS layout.
The site is www.growingyourownveg.com
Thanks
-
Hi,
As far as I know, a low crawl rate won't cause bad rankings, but bad rankings will result in a lower crawl rate.
If you are sure, and I mean really sure, that you don't have any technical issues on your side that could influence the crawl rate (and possibly also rankings), then you should consider that you may actually have a -950 filter causing your rankings to drop. If Google doesn't consider your site an authority, it won't crawl your site as often as it used to.
Can you share the URL of the site? Just to have a look and see if, at first glance, there is any obvious reason for Google to dislike your site.
Cheers!