Sitemap Help!
-
Hi Guys,
Quick question regarding sitemaps. I am currently working on a huge site that has masses of pages.
I am looking to create a sitemap. How would you guys do this? I have looked at some tools, but they say they will only handle up to 30,000 pages or so. The site is so large it would be impossible to do this myself... any suggestions?
Also, how do I find out how many of my site's pages are indexed and how many are not?
Thank You all
Wayne
-
The problem I have with CMS-side sitemap generators is that they often build entries from whatever pages are currently linked. If some of those links point to pages that no longer exist, as is often the case with dynamic content, you'll be imposing 404s on yourself like crazy.
Just something to watch out for, but it's probably your best solution.
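If you do go the generator route, one way to guard against those 404s is to check that each URL still responds before it goes into the sitemap. Here's a minimal PHP sketch of the idea, using PHP's built-in get_headers(); urlIsLive() and the example.com URLs are just hypothetical names for illustration:
=== CODE STARTS HERE ===
<?php
// Keep only URLs that answer with an HTTP 200, so dead pages never
// reach the sitemap. urlIsLive() is a hypothetical helper name.
function urlIsLive($url) {
    $headers = @get_headers($url); // fetches the response headers; $headers[0] is the status line, e.g. "HTTP/1.1 200 OK"
    if ($headers === false) {
        return false;              // connection failure, bad hostname, etc.
    }
    return strpos($headers[0], '200') !== false;
}

$candidates = array(
    'http://www.example.com/live-page.html',
    'http://www.example.com/deleted-page.html',
);
$liveUrls = array_filter($candidates, 'urlIsLive');
?>
=== CODE ENDS HERE ===
On a site this size you'd want to throttle or sample the checks rather than hit every single URL, but it beats feeding Google a pile of 404s.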
-
Hi! With this file, you can create a Google-friendly sitemap for any given folder almost automatically, with no limit on the number of files. Please note that the code is courtesy of @frkandris, who generously helped me out when I had a similar problem. I hope it will be as helpful to you as it was to me.
- Copy / paste the code below into a text editor.
- Edit the beginning of the file: where you see seomoz.com, put your own domain name there
- Save the file as getsitemap.php and ftp it to the appropriate folder.
- Open the full URL in your browser: http://www.yourdomain.com/getsitemap.php
- The moment you do it, a sitemap.xml will be generated in your folder
- Refresh your ftp client and download the sitemap. Make further changes to it if you wish.
=== CODE STARTS HERE ===
<?php
define('DIRBASE', './');
define('URLBASE', 'http://www.seomoz.com/');

$isoLastModifiedSite = "";
$newLine = "\n";
$indent  = "  ";

$xmlHeader  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>$newLine";
$urlsetOpen = "<urlset xmlns=\"http://www.google.com/schemas/sitemap/0.84\""
    . " xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\""
    . " xsi:schemaLocation=\"http://www.google.com/schemas/sitemap/0.84"
    . " http://www.google.com/schemas/sitemap/0.84/sitemap.xsd\">$newLine";
$urlsetValue = "";
$urlsetClose = "</urlset>$newLine";

// Escape a URL so it is safe inside the XML.
function makeUrlString($urlString) {
    return htmlentities($urlString, ENT_QUOTES, 'UTF-8');
}

// Turn "Y-m-d H:i:s" (or a bare "Y-m-d") into an ISO 8601 timestamp.
function makeIso8601TimeStamp($dateTime) {
    if (!$dateTime) {
        $dateTime = date('Y-m-d H:i:s');
    }
    if (is_numeric(substr($dateTime, 11, 1))) {
        $isoTS = substr($dateTime, 0, 10) . "T" . substr($dateTime, 11, 8) . "+00:00";
    } else {
        $isoTS = substr($dateTime, 0, 10);
    }
    return $isoTS;
}

// Build a single <url> entry for the sitemap.
function makeUrlTag($url, $modifiedDateTime, $changeFrequency, $priority) {
    global $newLine, $indent, $isoLastModifiedSite;

    $urlTag  = "$indent<url>$newLine";
    $urlTag .= "$indent$indent<loc>" . makeUrlString($url) . "</loc>$newLine";
    if ($modifiedDateTime) {
        $urlTag .= "$indent$indent<lastmod>" . makeIso8601TimeStamp($modifiedDateTime) . "</lastmod>$newLine";
        if (!$isoLastModifiedSite) {
            // remember the last modification date of the whole site
            $isoLastModifiedSite = makeIso8601TimeStamp($modifiedDateTime);
        }
    }
    if ($changeFrequency) {
        $urlTag .= "$indent$indent<changefreq>" . $changeFrequency . "</changefreq>$newLine";
    }
    if ($priority) {
        $urlTag .= "$indent$indent<priority>" . $priority . "</priority>$newLine";
    }
    $urlTag .= "$indent</url>$newLine";
    return $urlTag;
}

// Recursively collect every file (and folder) under $base.
function rscandir($base = '', &$data = array()) {
    $array = array_diff(scandir($base), array('.', '..')); // drop . and ..
    foreach ($array as $value) {
        if (is_dir($base . $value)) {
            $data[] = $base . $value . '/';                // include the folder itself
            $data = rscandir($base . $value . '/', $data); // recurse into it
        } elseif (is_file($base . $value)) {
            $data[] = $base . $value;
        }
    }
    return $data;
}

// Map a local path onto its public URL.
function kill_base($t) {
    return URLBASE . substr($t, strlen(DIRBASE));
}

$dir = rscandir(DIRBASE);
$a   = array_map("kill_base", $dir);

foreach ($a as $key => $pageUrl) {
    $pageLastModified    = date("Y-m-d", filemtime($dir[$key]));
    $pageChangeFrequency = "monthly";
    $pagePriority        = 0.8;
    $urlsetValue .= makeUrlTag($pageUrl, $pageLastModified, $pageChangeFrequency, $pagePriority);
}

file_put_contents('sitemap.xml', "$xmlHeader$urlsetOpen$urlsetValue$urlsetClose");
?>
=== CODE ENDS HERE ===
-
HTML sitemaps are good for users; having 100,000 links on a page, though, not so much.
If you can (and certainly with a site this large) add video and image sitemaps, you'll help Google get around your site.
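For reference, an image sitemap is just a normal XML sitemap with an extra namespace. A minimal sketch (the example.com URLs are placeholders):
=== CODE STARTS HERE ===
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/some-page.html</loc>
    <image:image>
      <image:loc>http://www.example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
=== CODE ENDS HERE ===
Video sitemaps work the same way, with a video: namespace and a few extra required tags for things like title and description.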
-
Is there any way I can see pages that have not been indexed?
Not that I can tell, and using the site: operator isn't going to be feasible on a site this large, I guess.
Is it more beneficial to include various sitemaps or just the one?
Well, the maximum per sitemap file is 50,000 URLs or 10MB uncompressed (you can gzip them), so if you have more than 50,000 URLs you'll have to split them across multiple sitemaps.
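Splitting is mechanical once you have the URL list. Here's a rough PHP sketch of the idea, assuming your generator hands you the full list as an array (the example.com entries are placeholders) and that the zlib extension is available for gzencode():
=== CODE STARTS HERE ===
<?php
// Split a large URL list into sitemap parts of at most 50,000 URLs
// each, and gzip every part to keep it under the size cap.
// $allUrls would come from your generator; these two are placeholders.
$allUrls = array(
    'http://www.example.com/page-1.html',
    'http://www.example.com/page-2.html',
);

foreach (array_chunk($allUrls, 50000) as $i => $chunk) {
    $xml  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach ($chunk as $url) {
        $xml .= "  <url><loc>" . htmlentities($url, ENT_QUOTES, 'UTF-8') . "</loc></url>\n";
    }
    $xml .= "</urlset>\n";
    // writes sitemap-1.xml.gz, sitemap-2.xml.gz, ...
    file_put_contents('sitemap-' . ($i + 1) . '.xml.gz', gzencode($xml));
}
?>
=== CODE ENDS HERE ===
You then list each part in a sitemap index file; there's an example of one further down the thread.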
-
Is there any way I can see pages that have not been indexed?
Is it more beneficial to include various sitemaps or just the one?
Thanks for your help!!
-
Thanks for your help
Do you feel it is important to have HTML + video sitemaps as well? How does this make a difference?
-
How big we talking?
Probably best grabbing something server side if your CMS can't do it. Check out - http://code.google.com/p/sitemap-generators/wiki/SitemapGenerators - I know Google says they've not tested any (and neither have I) but they must have looked at them at some point.
Secondly you'll need to know how to submit multiple sitemap parts and how to break them up.
Looking at it, Amazon seems to cap theirs at 50,000 and eBay at 40,000, so I think you'll be fine with numbers around there.
Here's how to set up multiple sitemaps in the same directory - http://googlewebmastercentral.blogspot.com/2006/10/multiple-sitemaps-in-same-directory.html
Once you've submitted your sitemaps Webmaster Tools will tell you how many URLs you've submitted vs. how many they've indexed.
-
Hey,
I'm assuming you mean XML sitemaps here: you can create a sitemap index file, which essentially lists a number of sitemaps in one file (a sitemap of sitemap files, if that makes sense). See http://www.google.com/support/webmasters/bin/answer.py?answer=71453
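For anyone who hasn't seen one, a sitemap index is only a few lines of XML. A minimal sketch (example.com and the file names are placeholders):
=== CODE STARTS HERE ===
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-1.xml.gz</loc>
    <lastmod>2011-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-2.xml.gz</loc>
  </sitemap>
</sitemapindex>
=== CODE ENDS HERE ===
You submit just the index file and Google picks up the individual sitemaps from there.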
There are automatic sitemap generators out there - if your site has categories with thousands of pages, I'd split them up and have a sitemap per category.
DD
-
To extract URLs, you can use Xenu Link Sleuth. Then build a hierarchy of sitemaps so that all of them are efficiently crawled by Google.