Best XML Sitemap generator
-
Do you have any suggestions for a good XML sitemap generator? Hopefully free, but if it's good I'd consider paying.
I am using a Mac, so I would prefer an online or Mac version.
-
Hi James, I saw your reply on this thread and have a quick question. I was running GSiteCrawler; after selecting all the suitable options, it opens a "Crawl watch" page. I assume it is crawling the site, but the online instructions say to select the "Generate" tab in the main application window (I did not opt for auto FTP).
When should I select the Generate option: immediately, or after the crawl completes?
suparno
-
The only way to find out is to shoot them an e-mail. Either way, you will discover the answer.
-
I am wondering if they are talking about the paid version, because I ran it on my site, www.psbspeakers.com, and it came up with all kinds of duplicate content:
<loc>http://www.psbspeakers.com/products/image/Image-B6-Bookshelf</loc>
<loc>http://www.psbspeakers.com/products/bookshelf-speakers/Image-B6-Bookshelf</loc>
with this code sitting on both pages:
<link rel="canonical" href="http://www.psbspeakers.com/products/image/Image-B6-Bookshelf"/>
-
I e-mailed their support and they shared it does support canonical tags. Below is the response I received:
Hi,
The script will detect canonical tags. If you can provide a live example, we can look into it for you.
Regards,
Philip
XML-Sitemaps.com
-----------------------------
I would suggest ensuring your tags are valid. If they are, contact the site support and they can provide specific feedback.
-
Thanks Ryan.
That's the one I already use, but it does not take canonicals into account, so I end up with 2-3 links for the same page.
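For what it's worth, honoring canonicals is easy to bolt on as a post-processing step: read each crawled page's `<link rel="canonical">` and emit only one `<loc>` per canonical URL. A minimal stdlib-only Python sketch (the regex assumes the attribute order shown in this thread; a real crawler should use an HTML parser):

```python
import re

# Matches <link rel="canonical" href="..."> (attribute order is assumed;
# use an HTML parser for real pages)
CANONICAL_RE = re.compile(r'<link\s+rel="canonical"\s+href="([^"]+)"', re.IGNORECASE)

def dedupe_by_canonical(pages):
    """pages: dict mapping crawled URL -> raw HTML.
    Returns the unique canonical URLs to emit as <loc> entries."""
    seen = set()
    for url, html in pages.items():
        match = CANONICAL_RE.search(html)
        seen.add(match.group(1) if match else url)  # no tag: keep crawled URL
    return sorted(seen)

# The two crawl paths from earlier in the thread share one canonical tag,
# so only one <loc> survives.
canonical = 'http://www.psbspeakers.com/products/image/Image-B6-Bookshelf'
pages = {
    'http://www.psbspeakers.com/products/image/Image-B6-Bookshelf':
        '<link rel="canonical" href="%s"/>' % canonical,
    'http://www.psbspeakers.com/products/bookshelf-speakers/Image-B6-Bookshelf':
        '<link rel="canonical" href="%s"/>' % canonical,
}
locs = dedupe_by_canonical(pages)
```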
-
A popular sitemap generator: http://www.xml-sitemaps.com/
I cannot say it is the best, but it works fine. The free online version will scan 500 pages; for $20, you can have an unlimited number of pages.
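The file format itself is trivial, so if a tool's page cap gets in the way, generating the XML from any list of URLs takes only a few lines. A stdlib-only sketch (the example URLs are placeholders; the namespace is the real sitemap protocol one):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Serialize a list of page URLs into sitemap.xml markup."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
    return '<?xml version="1.0" encoding="UTF-8"?>' + ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://example.com/", "http://example.com/about"])
```

The hard part of any generator is the crawl itself (discovering the URLs), not the serialization.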
-
Sorry, I should have said: I am on a Mac ;(
Are there any online ones around that don't have a cap of 500 pages?
-
GSiteCrawler every time. It's free and it's an awesome tool: http://gsitecrawler.com/
Related Questions
-
Would a Search Engine treat a sitemap hosted in the cloud in the same way as if it was simply on /sitemap.htm?
Mainly to allow updates without the need for publishing. Would Google interpret it any differently? Thanks
Technical SEO | RichCMF0
-
New SEO manager needs help! Currently only about 15% of our live sitemap (a ~4 million URL e-commerce site) is actually indexed in Google. What are best practices for sitemaps on big sites with a lot of changing content?
In Google Search Console:
4,218,017 URLs submitted
402,035 URLs indexed
What is the best way to troubleshoot? What is the best guidance for sitemap indexation of large sites with a lot of changing content?
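At this scale the sitemap protocol itself forces a structure: each file is capped at 50,000 URLs (and 50MB uncompressed), so the usual pattern is many numbered sitemap files behind one sitemap index. A stdlib sketch of the split (example.com and the toy 50-URL limit are placeholders for illustration):

```python
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50000  # per-file limit in the sitemap protocol

def split_into_sitemaps(urls, base_url, max_urls=MAX_URLS):
    """Chunk a large URL list into numbered sitemap files plus one index."""
    files = {}
    for start in range(0, len(urls), max_urls):
        name = "sitemap-%d.xml" % (start // max_urls + 1)
        urlset = ET.Element("urlset", xmlns=NS)
        for url in urls[start:start + max_urls]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
        files[name] = ET.tostring(urlset, encoding="unicode")
    index = ET.Element("sitemapindex", xmlns=NS)
    for name in sorted(files):
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = base_url + "/" + name
    files["sitemap_index.xml"] = ET.tostring(index, encoding="unicode")
    return files

# 120 URLs with a toy limit of 50 per file -> 3 sitemaps + 1 index
files = split_into_sitemaps(
    ["http://example.com/p/%d" % n for n in range(120)],
    "http://example.com", max_urls=50)
```

Splitting by section (one sitemap per category) also lets Search Console show which parts of the site are underindexed, which helps the troubleshooting question above.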
Technical SEO | Hamish_TM1
-
Dynamic URL best approach
Hi,
We are currently making changes to our travel site whereby a search can be stored, and the user can paste the URL into their browser to find that search again. The URL will be dynamic for every search, so to avoid duplicate content I wanted to ask the best approach to creating the URLs.
An example of the URL is:
package-search/holidays/hotelFilters/?depart=LGW&arrival=BJV&sdate=20150812&edate=20150819&adult=2&child=0&infant=0&fsearch=first&directf=false&nights=7&tsdate=&rooms=1&r1a=2&r1c=0&r1i=0&&dest=3&desid=1&rating=&htype=all&btype=all&filter=no&page=1
I wanted to know if people have previous experience with something like this and what would be the best option for SEO:
Will we need to create the URL with a #? (As I read it, Google stops crawling after the #.)
Should we block the folder in robots.txt?
Are there any other areas I should be aware of in order to stop duplicate content and 404 pages once the URL/holiday search is no longer valid?
thanks
E
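One common approach for parameter-heavy search URLs is to point a canonical tag at a single normalized form (parameters sorted, presentation-only ones dropped) rather than blocking crawling outright. A hedged sketch of the normalization step, using parameter names from the example URL (which parameters count as presentation-only is an assumption and depends on the site):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Parameters assumed not to change the result set (illustration only)
IGNORED = {"fsearch", "filter", "page", "tsdate"}

def canonical_search_url(url):
    """Sort the query string and drop presentation-only parameters so
    every equivalent search maps to one canonical URL."""
    parts = urlsplit(url)
    params = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
              if k not in IGNORED]
    return urlunsplit(parts._replace(query=urlencode(sorted(params))))

# Same search, different parameter order and page -> identical canonical
a = canonical_search_url("/package-search/holidays/hotelFilters/?depart=LGW&arrival=BJV&page=1")
b = canonical_search_url("/package-search/holidays/hotelFilters/?arrival=BJV&depart=LGW&page=2")
```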
Technical SEO | Direct_Ram0
-
Robots.txt and Multiple Sitemaps
Hello, I have a hopefully simple question, but I wanted a second opinion on what to do in this situation. I am working on a client's robots.txt and we have multiple sitemaps. Using Yoast, I have my sitemap_index.xml and also a sitemap-image.xml. I submit them to Google and Bing by hand, but wanted them added to the robots.txt for insurance. So my question is: when calling out multiple sitemaps in a robots.txt file, does it matter if one is before the other? From my reading it looks like you can have multiple sitemaps called out, but I wasn't sure of the best practice when writing the file.
Example:
User-agent: *
Disallow:
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Sitemap: http://sitename.com/sitemap_index.xml
Sitemap: http://sitename.com/sitemap-image.xml
Thanks a ton for the feedback, I really appreciate it! :) J
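Per the sitemaps protocol, the `Sitemap:` directive is independent of any `User-agent` group, so file order should not matter: crawlers simply collect every such line. A small sketch of how a parser sees the example file above (simplified; real robots.txt parsers also handle comments and encoding):

```python
def sitemaps_from_robots(robots_txt):
    """Collect every Sitemap: line; the directive stands alone, outside
    any User-agent group, so line order is irrelevant to crawlers."""
    found = []
    for line in robots_txt.splitlines():
        if line.strip().lower().startswith("sitemap:"):
            # split only on the first colon so the URL's own ':' survives
            found.append(line.split(":", 1)[1].strip())
    return found

robots = """User-agent: *
Disallow: /cgi-bin/
Sitemap: http://sitename.com/sitemap_index.xml
Sitemap: http://sitename.com/sitemap-image.xml"""
sitemaps = sitemaps_from_robots(robots)
```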
Technical SEO | allstatetransmission0
-
How to best remove old pages for SEO
I run an accommodation web site where each listing has its own page. When a property is removed, what is the best way to handle this for SEO? The URL will no longer be valid and there will be a blank page.
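Common practice is to 301 a removed listing to the closest live substitute (an area or category page) when one exists, and return 410 Gone otherwise so engines drop the URL quickly, rather than serving a blank page with a 200 status. A sketch of that decision (the URL names and the replacement mapping are made up for illustration):

```python
def response_for_removed_listing(url, replacements):
    """Choose the HTTP response for a delisted property page.
    replacements maps old listing URLs to the closest live substitute
    (e.g. the area page); maintaining that mapping is up to the site."""
    if url in replacements:
        return 301, replacements[url]  # permanent redirect passes link equity
    return 410, None  # Gone: a stronger "drop this URL" signal than 404

status, target = response_for_removed_listing(
    "/property/seaview-cottage", {"/property/seaview-cottage": "/area/cornwall"})
```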
Technical SEO | JamieHibbert0
-
How best to optimise a website for more than one location?
I have a client who is an acupuncturist and operates clinics in both Chester and Knutsford in Cheshire. The site performs well for Chester-based terms such as "Chester acupuncture"; Chester is the primary location the client wishes to focus efforts on, but he would also like to improve rankings for the Knutsford clinic and area. I have set up local Places pages for each clinic and registered each on different local directories. Both clinic addresses appear on every page of the website, and the contact page has a map to each. Most of the on-page SEO elements, such as page titles, descriptions, and on-page keywords, mainly target "Chester" over "Knutsford". Is it advisable to target both locations in these page elements, or will local search take care of this, and would targeting both reduce/dilute overall rankings for the Chester clinic? I haven't set up a separate page for each clinic location; this might help SEO for both locations, but from a user point of view it would just duplicate the same content for a different location, and it would create duplicate content issues. Any advice/experience on this matter would be greatly appreciated.
Technical SEO | Bristolweb0
-
Best 404 Error Checker?
I have a client with a lot of 404 errors from Webmaster Tools, and I have to go through and check each of the links because:
Some redirect to the correct page
Some redirect to another URL, but it's a 404 error
Some are just 404 errors
Does anyone know of a tool where I can dump all of the URLs and it will tell me:
if the URL is redirected, and to where
if the page is a 404 or other error
Any tips or suggestions will be really appreciated! Thanks, SEO Moz'rs
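A crawler with a list mode (such as Screaming Frog) handles this, but the triage is also scriptable with the standard library alone: disable redirect-following so each 3xx is reported with its target instead of silently followed. A hedged sketch (`classify()` is pure offline logic; `check()` needs network access):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # returning None makes 3xx raise HTTPError instead of following

def classify(status, location=None):
    """Bucket one response the way the triage above requires."""
    if 300 <= status < 400:
        return "redirect -> %s" % location
    if status == 404:
        return "404"
    if status >= 400:
        return "error %d" % status
    return "ok"

def check(url):
    """Fetch one URL and classify it (requires network access)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return classify(opener.open(url, timeout=10).status)
    except urllib.error.HTTPError as err:
        return classify(err.code, err.headers.get("Location"))
    except urllib.error.URLError as err:
        return "unreachable: %s" % err.reason
```

Dump the exported URLs through `check()` in a loop and you get exactly the redirected-to-where / 404-or-other-error report described above.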
Technical SEO | anchorwave0