Sitemap Help!
-
Hi Guys,
Quick question regarding sitemaps. I am currently working on a huge site that has masses of pages.
I am looking to create a sitemap. How would you guys do this? I have looked at some tools, but they say they will only handle up to 30,000 pages or so. The site is so large it would be impossible to do this myself... any suggestions?
Also, how do I find out how many pages my site actually has indexed and not indexed?
Thank You all
Wayne
-
The problem I have with CMS-side sitemap generators is that they often pull content from existing pages and add entries based on that information. If you link to pages that no longer exist, as can happen with dynamic content, you'll be imposing 404s on yourself like crazy.
Just something to watch out for, but it's probably your best solution.
-
Hi! With this file, you can create a Google-friendly sitemap for any given folder almost automatically, with no limit on the number of files. Please note that the code is courtesy of @frkandris, who generously helped me out when I had a similar problem. I hope it will be as helpful to you as it was to me.
- Copy / paste the code below into a text editor.
- Edit the beginning of the file: where you see seomoz.com, put your own domain name there
- Save the file as getsitemap.php and FTP it to the appropriate folder.
- Write the full URL in your browser: http://www.yourdomain.com/getsitemap.php
- The moment you do it, a sitemap.xml will be generated in your folder
- Refresh your FTP client and download the sitemap. Make further changes to it if you wish.
=== CODE STARTS HERE ===
<?php
define('DIRBASE', './');
define('URLBASE', 'http://www.seomoz.com/');

$isoLastModifiedSite = "";
$newLine = "\n";
$indent = "  ";
if (!isset($rootUrl)) $rootUrl = "http://www.seomoz.com";

$xmlHeader  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>$newLine";
$urlsetOpen = "<urlset xmlns=\"http://www.google.com/schemas/sitemap/0.84\" "
            . "xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" "
            . "xsi:schemaLocation=\"http://www.google.com/schemas/sitemap/0.84 "
            . "http://www.google.com/schemas/sitemap/0.84/sitemap.xsd\">$newLine";
$urlsetValue = "";
$urlsetClose = "</urlset>$newLine";

function makeUrlString($urlString) {
    return htmlentities($urlString, ENT_QUOTES, 'UTF-8');
}

function makeIso8601TimeStamp($dateTime) {
    if (!$dateTime) {
        $dateTime = date('Y-m-d H:i:s');
    }
    if (is_numeric(substr($dateTime, 11, 1))) {
        $isoTS = substr($dateTime, 0, 10) . "T" . substr($dateTime, 11, 8) . "+00:00";
    } else {
        $isoTS = substr($dateTime, 0, 10);
    }
    return $isoTS;
}

function makeUrlTag($url, $modifiedDateTime, $changeFrequency, $priority) {
    global $newLine, $indent, $isoLastModifiedSite;
    $urlTag  = "$indent<url>$newLine";
    $urlTag .= "$indent$indent<loc>" . makeUrlString($url) . "</loc>$newLine";
    if ($modifiedDateTime) {
        $urlTag .= "$indent$indent<lastmod>" . makeIso8601TimeStamp($modifiedDateTime) . "</lastmod>$newLine";
        if (!$isoLastModifiedSite) { // remember the last modification of the whole site
            $isoLastModifiedSite = makeIso8601TimeStamp($modifiedDateTime);
        }
    }
    if ($changeFrequency) {
        $urlTag .= "$indent$indent<changefreq>" . $changeFrequency . "</changefreq>$newLine";
    }
    if ($priority) {
        $urlTag .= "$indent$indent<priority>" . $priority . "</priority>$newLine";
    }
    $urlTag .= "$indent</url>$newLine";
    return $urlTag;
}

// Recursively scan a directory, collecting every file and folder path into $data.
function rscandir($base = '', &$data = array()) {
    $array = array_diff(scandir($base), array('.', '..')); // drop . and ..
    foreach ($array as $value) {
        if (is_dir($base . $value)) {
            $data[] = $base . $value . '/';
            $data = rscandir($base . $value . '/', $data); // recurse into the subdirectory
        } elseif (is_file($base . $value)) {
            $data[] = $base . $value;
        }
    }
    return $data;
}

// Replace the local directory prefix with the site URL.
function kill_base($t) {
    return URLBASE . substr($t, strlen(DIRBASE));
}

$dir = rscandir(DIRBASE);
$a = array_map("kill_base", $dir);

foreach ($a as $key => $pageUrl) {
    $pageLastModified    = date("Y-m-d", filemtime($dir[$key]));
    $pageChangeFrequency = "monthly";
    $pagePriority        = 0.8;
    $urlsetValue .= makeUrlTag($pageUrl, $pageLastModified, $pageChangeFrequency, $pagePriority);
}

file_put_contents('sitemap.xml', "$xmlHeader$urlsetOpen$urlsetValue$urlsetClose");
?>
=== CODE ENDS HERE ===
-
HTML sitemaps are good for users; having 100,000 links on a page, though, not so much.
If you can (and with a site this large you certainly should), add video and image sitemaps too; they'll help Google get around your site.
-
Is there any way I can see pages that have not been indexed?
Not that I can tell, and using site: searches isn't going to be feasible on a site this large, I guess.
Is it more beneficial to include various sitemaps or just the one?
Well, the maximum per sitemap file is 50,000 URLs or 10 MB uncompressed (you can gzip them), so if you've more than 50,000 URLs you'll have to split them.
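To make those limits concrete, here's a quick Python sketch (the file names and helper are just placeholders, not from any particular tool) that checks a sitemap against the 10 MB uncompressed cap and then gzips it:

```python
import gzip

MAX_BYTES = 10 * 1024 * 1024  # 10 MB uncompressed cap per sitemap file

def check_and_compress(path):
    """Refuse sitemaps over the uncompressed size limit, then gzip the file."""
    with open(path, "rb") as f:
        data = f.read()
    if len(data) > MAX_BYTES:
        raise ValueError(f"{path} exceeds the 10 MB uncompressed limit; split it up")
    gz_path = path + ".gz"
    with gzip.open(gz_path, "wb") as out:
        out.write(data)
    return gz_path
```

Search engines accept the .gz file directly, so you only pay the 10 MB limit on the uncompressed side.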
-
Is there any way I can see pages that have not been indexed?
Is it more beneficial to include various sitemaps or just the one?
Thanks for your help!!
-
Thanks for your help
Do you feel it is important to have HTML and video sitemaps as well? How does this make a difference?
-
How big we talking?
Probably best grabbing something server-side if your CMS can't do it. Check out - http://code.google.com/p/sitemap-generators/wiki/SitemapGenerators - I know Google says they haven't tested any of them (and neither have I), but they must have looked at them at some point.
Secondly, you'll need to know how to submit multiple sitemap parts and how to break them up.
Looking at it, Amazon seems to cap theirs at 50,000 per file and eBay at 40,000, so I think you'll be fine with numbers around there.
Here's how to set up multiple sitemaps in the same directory - http://googlewebmastercentral.blogspot.com/2006/10/multiple-sitemaps-in-same-directory.html
Once you've submitted your sitemaps Webmaster Tools will tell you how many URLs you've submitted vs. how many they've indexed.
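If you end up breaking the URL list into parts yourself, the splitting is mechanical. A rough Python sketch (the file names are just placeholders) that caps each part at 50,000 URLs:

```python
from pathlib import Path

MAX_PER_FILE = 50_000  # protocol cap on <url> entries per sitemap file

def write_sitemap_chunks(urls, out_dir="."):
    """Split a flat list of URLs into sitemap-1.xml, sitemap-2.xml, ..."""
    paths = []
    for i in range(0, len(urls), MAX_PER_FILE):
        chunk = urls[i:i + MAX_PER_FILE]
        body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
        xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
               '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
               f"{body}\n"
               "</urlset>\n")
        path = Path(out_dir) / f"sitemap-{i // MAX_PER_FILE + 1}.xml"
        path.write_text(xml, encoding="utf-8")
        paths.append(str(path))
    return paths
```

You'd then list each part in a sitemap index and submit that single file in Webmaster Tools.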
-
Hey,
I'm assuming you mean XML sitemaps here: you can create a sitemap index file, which essentially lists a number of sitemaps in one file (a sitemap of sitemap files, if that makes sense). See http://www.google.com/support/webmasters/bin/answer.py?answer=71453
There are automatic sitemap generators out there. If your site has categories with thousands of pages, I'd split them up and have a sitemap per category.
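For reference, the index file itself is tiny; it just lists the child sitemaps (the domain and file names below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.yourdomain.com/sitemap-1.xml.gz</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.yourdomain.com/sitemap-2.xml.gz</loc>
  </sitemap>
</sitemapindex>
```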
DD
-
To extract URLs, you can use Xenu Link Sleuth. Then you must build a hierarchy of sitemaps so that all of them are efficiently crawled by Google.