Can't generate a sitemap with all my pages
-
I am trying to generate a sitemap for my site nationalcurrencyvalues.com, but none of the tools I have tried pick up all of my 70,000 HTML pages. I found that the one at check-domains.com crawls all of my pages, but when it writes the XML file most of them are gone, seemingly at random.
I have used this same site before and it worked without a problem. Can anyone help me understand why this is or point me to a utility that will map all of the pages?
Kindly,
Greg
-
Thank you all for the responses... I found them all helpful. I will look into creating my own sitemap with the IIS tool.
I can't help the 70k pages, but the URLs are totally static. I guess I can make one sitemap for all the .aspx pages and another for all the lowest-level .html pages.
Thanks everyone!
-
I definitely agree with Logan. The max for an XML sitemap for Search Console is 50,000 URLs, so you won't be able to fit all of yours into one.
That being the case, divide them into separate sitemaps by category or page type, then list all of those in one sitemap index file and submit that. That way you can also see indexation by page type on your website.
Finally, I have to ask why you are doing this with a third-party tool and creating a static sitemap, as opposed to a dynamic one that updates automatically when you publish new content? If your site is static and you're not creating new pages, your approach might be OK, but otherwise I'd recommend investigating how to build a dynamic XML sitemap that updates with new content.
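To illustrate the split, here is a minimal sketch that chunks a URL list into 50,000-URL sitemap files and writes a sitemap index, per the sitemaps.org protocol. The domain, file names, and generated URL list are assumptions for illustration, not taken from your site:

```python
# Sketch: split a URL list into sitemap files of at most 50,000 URLs
# each, plus a sitemap index that lists them (sitemaps.org protocol).
# BASE and the file names below are hypothetical placeholders.
import math
from xml.sax.saxutils import escape

MAX_URLS = 50000  # protocol limit per sitemap file
BASE = "https://www.example.com"  # hypothetical domain

def write_sitemaps(urls):
    n_files = math.ceil(len(urls) / MAX_URLS) or 1
    for i in range(n_files):
        chunk = urls[i * MAX_URLS:(i + 1) * MAX_URLS]
        with open(f"sitemap-{i + 1}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for u in chunk:
                f.write(f"  <url><loc>{escape(u)}</loc></url>\n")
            f.write("</urlset>\n")
    # The index file points Search Console at every child sitemap.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for i in range(n_files):
            f.write(f"  <sitemap><loc>{BASE}/sitemap-{i + 1}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
    return n_files

if __name__ == "__main__":
    # 70,000 URLs end up in two sitemap files plus one index.
    urls = [f"{BASE}/page-{i}.html" for i in range(70000)]
    print(write_sitemaps(urls))
```

With 70,000 URLs this produces two sitemap files and one index; you would then submit only sitemap-index.xml.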
Cheers!
-
Looking at your site, how sure are you that you need 70,000 pages?
For the sitemap, I would stop trying to use a web-based tool and do it yourself. It looks like you are running IIS, which has a sitemap generator you can easily install on the server and run there. It also looks like you are hosted with GoDaddy; they catch a lot of crap, but I have always found their technical support to be top-notch. If you can't figure out how to do it on the server, I would give them a call.
-
Greg,
Have you tried creating multiple XML sitemaps by section of the site, such as by folder or by product detail pages? 70,000 is a huge number of URLs, and even if you could get them all into one sitemap, I wouldn't recommend it. Nesting sitemaps under an index sitemap can help Google understand your site structure and makes it easier for you to troubleshoot indexing problems should they arise.
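For reference, a sitemap index is just a small XML file listing the child sitemaps; a sketch of the format (the domain and file names here are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child sitemap stays under the 50,000-URL limit, and you submit only the index file in Search Console.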
Related Questions
-
How can I identify links that point to a specific landing page with a specific anchor text on my own website?
I am trying to identify buttons and links that point to a specific landing page on our website and that have a certain word in the anchor text, and I would also like to know the referring page URL. Does anybody have an idea how to do this? We have over a hundred landing pages and I would rather not go through them one by one 😄 Thanks for the help!
Intermediate & Advanced SEO | 10to8-Moz
For a sitemap.html page, does the URL slug have to be /sitemap?
Also, do you have to use anchor text in your sitemap.html, or are naked URLs used as links okay?
Intermediate & Advanced SEO | imjonny123
Can I use two sitemaps?
I have a Magento website and am going to add a WordPress blog under /blog. If I set up each with its own Webmaster Tools profile to submit a sitemap, does it hurt anything?
Intermediate & Advanced SEO | Tylerj
Silo not ranking for main silo page - what can I do?
Hi everyone, I set up a silo for my page http://werkzeug-kasten.com/. Unfortunately, only the silo's inner pages rank well. These are, for example, http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/keyword-analyse/ for "Keywordanalyse SEO Freiburg" and http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/onpage-seo/ for "Onpage SEO Freiburg", but the silo's main page http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/ does not rank for "SEO Freiburg". Do you have any idea why that might be? Cheers, Marc
Intermediate & Advanced SEO | RWW
Can someone help me understand why this page is ranking so well?
Hi everyone, EDIT: I'm going to link to the actual page; please remove if there are any issues with confidentiality. Here is the page: https://www.legalzoom.com/knowledge/llc/topic/advantages-and-disadvantages-overview. It's ranking #2 on Google for "LLC".
This page is a couple of months old and is substantially heavy in content, but not much more so than the dozens of other pages online that are competing with it. This is a highly competitive industry, and this particular domain is a huge player in it. This new page is suddenly ranking #2 for an extremely competitive head term, arguably the most important, highest-volume keyword targeted by the entire site. It is outranking the home page as well as the service page that exactly targets the query, the one you would think would be the ranking page for this head term. However, this new page is somewhat of a spin-off with some additional related content about the subject: some videos, resources, a lot of internal links, etc.
The first word of the title tag exactly matches the head term. I did observe that almost no other pages on the site have the exact keyword as the first word of the title tag, but that couldn't be sufficient to bring it up so high in the ranks, could it?
Another bizarre thing that is happening is that Google is ignoring the title tag in the actual HTML (which is a specific question accurate to the content on the page) and re-assigning a title that basically looks like this: "Head Term | Brand." Why would it do this on this page? Doesn't it usually prefer more descriptive title tags?
There are no external links showing in Moz or Majestic pointing to this page. It has just a couple of social shares, and it's not linked from the home page or the top nav bar of the main site.
Can anyone explain how this particular page would outrank the main service page targeting this keyword, as well as other highly authoritative, older pages online targeting the same keyword? Thanks for your help!
Intermediate & Advanced SEO | FPD_NYC
Google isn't seeing the content but it is still indexing the webpage
When I fetch my website page using GWT, this is what I receive:
HTTP/1.1 301 Moved Permanently
X-Pantheon-Styx-Hostname: styx1560bba9.chios.panth.io
server: nginx
content-type: text/html
location: https://www.inscopix.com/
x-pantheon-endpoint: 4ac0249e-9a7a-4fd6-81fc-a7170812c4d6
Cache-Control: public, max-age=86400
Content-Length: 0
Accept-Ranges: bytes
Date: Fri, 14 Mar 2014 16:29:38 GMT
X-Varnish: 2640682369 2640432361
Age: 326
Via: 1.1 varnish
Connection: keep-alive
What I used to get is this:
HTTP/1.1 200 OK
Date: Thu, 11 Apr 2013 16:00:24 GMT
Server: Apache/2.2.23 (Amazon)
X-Powered-By: PHP/5.3.18
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Last-Modified: Thu, 11 Apr 2013 16:00:24 +0000
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0
ETag: "1365696024"
Content-Language: en
Link: ; rel="canonical",; rel="shortlink"
X-Generator: Drupal 7 (http://drupal.org)
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=utf-8
<title>Inscopix | In vivo rodent brain imaging</title>
Intermediate & Advanced SEO | jacobfy
How can I tell if a website is a 'NoFollow'?
I've been link building for a long time but have recently discovered that most of my links are NoFollow links, such as those from Twitter and YouTube. How can I tell if a website is 'NoFollow'?
Intermediate & Advanced SEO | Paul_Tovey
Google swapped our website's long-standing ranking home page for a less authoritative product page?
Our website has ranked at #1 and #2 in Google for two variations of a keyword, one singular and the other plural, for over a year. Keep in mind that both links in the SERPs pointed to our home page. This year we targeted both variations of the keyword in PPC to a product landing page (still relevant to the keywords) within our website. After about six weeks, Google swapped out the long-standing ranked home page links (PA 55, ranked #1 and #2) for the PPC-directed product page links (PA 01) and dropped us to #2 and #8 respectively in search results for the singular and plural versions of the keyword. Would you consider this swapping of pages temporary if the volume of traffic to our product page slowed?
Intermediate & Advanced SEO | JingShack