Moz Crawl shows over 100 times more pages than my site has?
-
The latest crawl stats are attached. My site has just over 300 pages.
Wondering what I have done wrong?
-
Total pages is higher, you're right Keri, but still only 581.
-
I believe this image shows what's indexed, which is a subset of the sitemap you submitted. You may want to look at Google Index -> Index Status in GWT to see what it shows there.
-
latest Moz crawl
-
latest webmaster tools crawl
-
I will definitely be paying attention to those numbers Keri. Webmaster Tools is showing the right number of pages (just over 300, with 90% of those indexed).
-
It's not going to be a penalty, but it'll be good to have a bit less of a load on your server (bots no longer crawling thousands of pages) and just have your real pages in the index.
Places to look for interesting changes in site metrics would be your organic traffic in Analytics and your Google Webmaster Tools account (impressions, pages crawled, etc.).
-
Thanks Keri, I will update ASAP.
Could you let me know how big an issue this would be? (When you have the time, of course. ;))
-
You're welcome! I may have opened a can of worms, however. That sitemap is generated by an automated tool (based on the footer at the bottom), so somehow it's finding that page 28 as well.
You may also want to ask the developer if you should be indexing the categories in the blog archives. There are resources on Moz about the best way to set that up in WordPress, but I don't have them at my fingertips at the moment (I have a snuggly baby sleeping on my lap instead that's slowing me down a tad).
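One common way to keep category and tag archives out of the index while still letting bots follow the links on them is a robots meta tag emitted only on those archive pages; in WordPress this is usually done through an SEO plugin's taxonomy settings rather than by hand-editing templates. A hedged sketch of the tag itself (the conditional placement is the assumption here, not a confirmed detail of this site):

```html
<!-- Output only on category/tag archive pages, e.g. via an SEO
     plugin's taxonomy settings or a conditional in the theme: -->
<meta name="robots" content="noindex, follow">
```

The `follow` part matters: the archive page stays out of the index, but link equity still flows through to the posts it lists.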
To answer your next question, after you figure out where the page 28 is being linked from and cure that, yes, you can do a one-time crawl from Research Tools. It won't overwrite your campaign info, but you can at least see if Moz is seeing thousands of pages or just a few hundred to see if stuff was fixed. Again, happy to provide more detail if/when you need it (and others will likely jump in with help on the thread, too).
I'd love to also see a little update a few weeks down the line of any changes you've noticed on your site metrics after getting this fixed.
-
You rock:)
-
And I found it. The sitemap at http://www.nineclouds.ca/sitemap includes a page /28, which is where the crawlers are finding the non-existent pages.
-
If you look at http://www.nineclouds.ca/blog/page/23, you'll see that there's a double arrow in the pagination at the right that goes to page 24, even though the last page is page 21. Google has somehow found the pages greater than 21, and once it found one of those, it keeps seeing the double-arrow link to yet another page. The same happened with Rogerbot. I'm not sure where the bad originating link is (what legit page on your site is linking to something over page 21), but that's the loop that's happening and causing a ton of pages to be indexed. Get rid of those, and you'll also get rid of most of your errors.
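The template-side fix is to stop emitting a "next" link once the last real page is reached, and to return a 404 for anything past it. A minimal sketch of that logic (hypothetical helper names, not the site's actual template code):

```python
def pagination_links(current_page, total_pages):
    """Build prev/next archive links, never pointing past the last
    real page (or before page 1)."""
    links = {}
    if current_page > 1:
        links["prev"] = f"/blog/page/{current_page - 1}"
    if current_page < total_pages:
        links["next"] = f"/blog/page/{current_page + 1}"
    return links

def archive_status(page, total_pages):
    """Phantom pages like /blog/page/24 should return 404, not render
    an empty archive that links onward to yet another phantom page."""
    return 200 if 1 <= page <= total_pages else 404

# On the real last page (21 of 21) there is no "next" link to follow:
print(pagination_links(21, 21))   # {'prev': '/blog/page/20'}
print(archive_status(24, 21))     # 404
```

Once out-of-range pages 404 and the last page stops linking forward, the crawl loop has nowhere to go, and the phantom URLs drop out of the index over time.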
-
Not shy about that at all, thanks Keri.
Any help you can provide is greatly appreciated.
-
Hi Bill,
Using my admin powers, I took a peek at your account. I'm still trying to figure out where it's coming from, but you have thousands of empty pages of your blog indexed. I'll dig around a little more and see if I can figure out what's up.
If you're comfortable with sharing your URL here in a public forum, other people can come take a look too. Otherwise, I'm happy to send you a private message with part of what's up and give your developer a place to start looking.
-
Thanks Keri. I am the owner of the site, not the programmer, so I am looking up the terms you are using as I write this response. If I am using pagination, is there a way to stop Moz from crawling it? If I understand your question about the calendar correctly, I do have one as part of my blog that dates each post. Can I get the bot to not crawl this calendar?
-
My first guess would be that parameters or something similar are being crawled. Do you have pagination? Sorting ascending and descending? A calendar that's getting crawled through the year 2525?
Your next step would be to look into what those duplicate pages are and see if something is amiss that's generating a ton of URLs.
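If a calendar or sort parameters turn out to be the culprit, one blunt option is to block those URL patterns in robots.txt. The paths below are illustrative assumptions, not paths confirmed from this site (the real patterns depend on how the blog software builds its URLs):

```
User-agent: *
# Hypothetical calendar archive path:
Disallow: /blog/calendar/
# Hypothetical sort parameters (wildcards are honored by Googlebot):
Disallow: /*?sort=
Disallow: /*&sort=
```

Note that robots.txt only stops future crawling; it doesn't remove URLs already in the index. For that, a noindex tag or a 404 on the unwanted pages is the cleaner fix.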