Why do some sites have several types of sitemap?
-
Hello Mozzers, I often seem to work on websites with several types of sitemap - e.g. an HTML sitemap and an XML sitemap - almost always with identical structure and content. Does anybody know the thinking behind this?
I'm currently looking at a site with a PHP and an XML sitemap sitting alongside one another. I'm guessing one is for site users to read (and also to aid indexing) and the other is for search engines, to further aid indexing.
Does Google have any preferences? Is there anything you should be wary of re: Google, if there are multiple sitemaps?
-
Thanks for the clarification Peter - appreciated! Luke
-
Hi Luke
PHP is a scripting language, not a sitemap format.
If you mean a PHP page, e.g. sitemap.php, showing a sitemap with clickable links, then that is effectively the same as an HTML sitemap - good for humans, but not ideal for search engines.
You can get PHP sitemap generators, but those use PHP scripting to generate a sitemap (typically XML or HTML) rather than creating a sitemap in some "PHP format".
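To illustrate the point about generators, here's what such a script boils down to - sketched in Python rather than PHP for brevity, with placeholder URLs: the script is just code that emits standard XML sitemap output.

```python
# A minimal sketch of what a sitemap "generator" does: build standard
# sitemaps.org XML output from a list of page URLs. The URLs below are
# hypothetical placeholders - a real generator would crawl or query the CMS.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return an XML sitemap document for the given page URLs."""
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n  </url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

sitemap = build_sitemap(["https://www.example.com/", "https://www.example.com/about"])
print(sitemap)
```

The output is a plain .xml file you'd save and submit to the search engines - the scripting language that produced it is irrelevant to Google.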
I hope that helps,
Peter
-
Thanks Peter - I'm used to seeing .html and .xml sitemaps but not .php - do you know whether a .php sitemap is a problem in any way?
-
Hi Luke
HTML sitemaps are suitable for humans because you can put them up as an HTML page that visitors can view and click through. XML sitemaps are really for search engines and are the standard that Google, Bing and Yahoo all prefer. See more info here:
http://en.wikipedia.org/wiki/Sitemap#XML_Sitemaps
I hope that helps,
Peter
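For reference, a minimal XML sitemap in that standard looks like this - the URL, date and values are placeholders, and only `<loc>` is actually required per entry:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```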
Related Questions
-
301 redirecting a site that currently links to the target site
I have a personal blog that has a good amount of backlinks pointing at it from high-quality, relevant, authoritative sites in my niche. I also run a company in the same niche. I link to a page on the company site from the personal blog article that has a bunch of relevant links pointing at it (as it's highly relevant to the content on the personal blog).
Overview: Relevant personal blog post has a bunch of relevant external links pointing at it (completely organic). Relevant personal blog post then links (externally) to a relevant company site page and is helping that page rank.
Question: If I do the work to 301 the personal blog to the company site, and then link internally from the blog page to the other relevant company page, will this kill that backlink, or will the internal link help as much as the external link does currently?
**For clarity:** External sites => External blog => External link to company page VS External sites => External blog 301 => Blog page (now on company blog) => Internal link to target page
I would love to hear from anyone that has performed this in the past 🙂
Intermediate & Advanced SEO | Keyword_NotProvided
-
On-site Search - Revisited (again, *zZz*)
Howdy Moz fans! Okay, so there's a mountain of information out there on the webernet about internal search results... but I'm finding some contradiction and a lot of pre-2014 stuff. I'd like to hear some 2016 opinion, specifically around a couple of thoughts of my own, as well as some I've deduced from other sources. For clarity, I work on a large retail site with over 4 million products (product pages), and my predicament is thus - I want Google to be able to find and rank my product pages. Yes, I can link to a number of the best ones by creating well-planned links via categorisation, silos, efficient menus etc. (done), but can I utilise site search for this purpose? It was my understanding that Google bots don't/can't/won't use a search function... how could they? It's like expecting them to find your members-only area - they can't log in! How can they find and index the millions of combinations of search results without typing in "XXXXL underpants" and all the other search combinations? Do I really need to robots.txt my search query parameter? How/why/when would Googlebot generate that query parameter? Site search is B.A.D. - I read this everywhere I go, but is it really? I've read: "It eats up all your crawl quota", "search results have no content and are classed as spam", "results pages have no value". I want to find a positive SEO output to having a search function on my website, not just try to stifle Mr Googlebot. What I am trying to learn here is what the options are, and what their outcomes are. So far I have:
- Robots.txt - remove the search pages from Google
- Noindex - allow the crawl but don't index the search pages
- Nofollow - I'm not sure this is even a valid idea, but I picked it up somewhere out there
- Just leave it alone - some of your search results might get ranked and bring traffic in
It appears that each and every option has its positive and negative connotations.
It'd be great to hear from this here community on their experiences in this practice.
Intermediate & Advanced SEO | Mark_Elton
-
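On the robots.txt option listed in the question above - it's usually just a disallow on the search URL pattern. A sketch, assuming a hypothetical /search/ path and q= parameter (the actual patterns depend entirely on how the site builds its search URLs):

```
User-agent: *
Disallow: /search/
Disallow: /*?q=
```

Worth noting: the `*` wildcard in Disallow rules is a Google/Bing extension, not part of the original robots.txt standard, so behaviour can vary across crawlers.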
Multiple Ecommerce sites, same products
We are a large catalog company with thousands of products across 2 different domains. Google clearly knows that the sites are connected. Both domains are fairly well-known brands - thousands of branded searches for each site per month. Roughly half of our products overlap - they appear on both sites. We have a known duplicate content issue - both sites have exactly the same product descriptions, and we are working on it. We've seen that when a product has different content on the 2 sites, frequently both pages get to page 2 of the SERPs, but that's as far as it goes, despite aggressive white hat link building tactics.
1. Is it possible to get the same product pages on page 1 of the SERPs for both sites? (I think I know the answer...)
2. Should we be canonicalizing (is that a word?) products across the sites? This would get tricky - both sites have roughly the same domain authority, but in different niches. Certain products and keywords naturally rank better on 1 site or the other depending on the niche.
Intermediate & Advanced SEO | AMHC
-
Migrating a very large site
Hello, we are thinking about changing imageworksstudio.com to imageworkscreative.com. We have TONS of pages and want to make sure that our PageRank and rankings are maintained. Are there any best practices or specific guides for redirecting such a large site (which already has a bunch of redirects in place)? Thanks!
Intermediate & Advanced SEO | imageworks-261290
-
Need a mobile XML Sitemap?
We're going to be running our mobile site on the same domain and generating content for users on mobile devices with style sheets (we will not have m.domain). The content on our URLs will be exactly the same. My question is whether we need to create a mobile XML Sitemap to submit to the search engines. Do we need to create a Sitemap that contains the exact same URLs as our non-mobile Sitemap, just with <mobile:mobile> tags added to the URL entries? Or do we need to create a mobile Sitemap at all to alert the search engines that we have mobile content? Thanks!
Intermediate & Advanced SEO | bonnierSEO
-
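For reference, the annotation the question above refers to is Google's mobile sitemap format, intended for feature-phone content: each URL entry carries an empty mobile:mobile element under a dedicated namespace, roughly like this (placeholder URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>https://www.example.com/page</loc>
    <mobile:mobile/>
  </url>
</urlset>
```

As far as I know, Google's guidance for sites that serve the same URLs to all devices (as described above) has generally been that no separate mobile sitemap is needed - this format is specifically for feature-phone pages.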
Reliable outsource site
Hi, which is the best-quality site for outsourcing my backlinks? Freelancer.com? oDesk.com? Any other? I was very disappointed with Elance. Thanks
Intermediate & Advanced SEO | nyanainc
-
Load balanced Site
Our client's ecommerce site loads from 3 different servers using load balancing. abc.com: IP 222.222.222. abc.com: IP 111.111.111. For testing purposes 111.111.111 also points to beta.abc.com. Now Google is crawling beta.abc.com. If we block beta.abc.com using robots.txt, it will block Googlebot too, since beta.abc.com is really abc.com. I know it's confusing, but I've been trying to figure it out. Of course I can ask my dev to remove beta.abc.com, make it a separate codebase, and block it using .htaccess.
Intermediate & Advanced SEO | tpt.com
-
Should the sitemap include just menu pages or all pages site wide?
I have a Drupal site that utilizes Solr, with 10 menu pages and about 4,000 pages of content. We're redoing a few things and will need to revamp the sitemap. Typically I'd jam all pages into a single sitemap and leave it at that, but post-Panda, should I do anything different?
Intermediate & Advanced SEO | EricPacifico
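On the single-sitemap question above: one common alternative for a site of that size is a sitemap index pointing at per-section files - a sketch with hypothetical filenames:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-menu.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-content.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index file once, and the search engines fetch each child sitemap from it; the sitemaps.org limits (50,000 URLs per file) are nowhere near an issue at 4,000 pages, so this is about organisation rather than necessity.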