What is the point of XML site maps?
-
Given how Google uses PageRank to pass link juice from one page to the next, a page Google can only find via an XML sitemap will have no link juice and will appear very low in search results, if at all.
The priority field in XML sitemaps also seems pretty much irrelevant to me. Google determines the priority of a page based on the number of inbound links to it. If your site is designed properly, the most important pages will have the most links.
The changefreq field could be useful if you have existing pages that are updated regularly, though Google seems to crawl most sites often enough that it adds little. Besides, for most of the web the significant content of an existing page doesn't change regularly; instead, new pages are added with new content.
That leaves the lastmod field as potentially useful. If Google started each crawl of your site by grabbing the sitemap and then crawled only the pages whose lastmod date is newer than its last visit, its crawling could be much more efficient. The sitemap would not need to contain every single page of the site, just the ones that have changed recently.
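For reference, here is what those fields look like in a sitemap entry per the sitemaps.org protocol; the URL and values are made up:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element -->
    <loc>http://www.example.com/widgets/</loc>
    <!-- the three optional fields discussed above -->
    <lastmod>2012-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```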
From what I've seen, most sitemap generation tools don't do a great job with any field other than loc. If Google can't trust the priority, changefreq, or lastmod fields, it won't put any weight on them.
It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it).
So, what's the point of XML site maps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
-
Thanks, Axial.
I'm not convinced it matters much whether Google crawls deep pages it wouldn't find through organic links. If the pages aren't linked to, they won't have any link juice and therefore won't rank well in SERPs.
The link about using sitemaps for canonical URLs says, or at least implies, that you should only put your most important URLs in the sitemap. The sitemap tools I've seen tend to take a kitchen-sink approach, which is what you need if you are using the sitemap to get a deeper crawl. Plus, there's no way (that I can see) in a sitemap to specify that page A is the canonical version of page B; Google simply suggests listing page A (and not page B) in the hope that page A will get more weight. A canonical link tag on page B pointing to page A is obviously a much better way to deal with canonicals.
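For illustration, that tag is a single line in the head of the duplicate page (the URL here is hypothetical):

```html
<!-- in the <head> of page B, pointing at the preferred page A -->
<link rel="canonical" href="http://www.example.com/page-a/" />
```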
Image and video sitemaps are potentially valuable, but I am asking specifically about sitemaps for pages.
Specifying related content for a given URL, such as versions in other languages, is indeed useful and not something I was aware of. But it isn't applicable to most sites and isn't used in most sitemaps.
-
Your sitemap.xml will help Googlebot crawl deep pages, but it also serves other purposes, such as:

- helping Google identify canonical pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139066#3
- creating sitemaps for video, images, etc. (first sketch below): "you can also use Sitemaps to provide Google with metadata about specific types of content on your site, including video, images, mobile, and News. For example, a video Sitemap entry can specify the running time, category, and family-friendly status of a video; an image Sitemap entry can provide information about an image's subject matter, type, and license." http://support.google.com/webmasters/bin/answer.py?hl=en&hlrm=fr&answer=156184
- specifying alternate content, such as the URL of a translated page (second sketch below): http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865
- and more.
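To make the second and third points concrete, here are two minimal sketches based on Google's documentation; all domains, titles, and values are made up. First, a video sitemap entry carrying the metadata mentioned in the quote:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/grilling-steaks.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Grilling steaks for summer</video:title>
      <video:description>How to grill steaks perfectly every time.</video:description>
      <video:content_loc>http://www.example.com/video123.flv</video:content_loc>
      <video:duration>600</video:duration>                <!-- running time, in seconds -->
      <video:category>Cooking</video:category>            <!-- category -->
      <video:family_friendly>yes</video:family_friendly>  <!-- family-friendly status -->
    </video:video>
  </url>
</urlset>
```

And second, alternate-language versions of a page declared per URL with xhtml:link elements:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/english/page.html</loc>
    <!-- each entry lists every language version, including itself -->
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/page.html"/>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/page.html"/>
  </url>
</urlset>
```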
Sometimes working with a sitemap is less risky and easier to maintain, especially when your CMS is limiting; the third point is a good example. You may also simply prefer the centralized approach.
There are good resources in Google's Webmaster Central documentation; check them out.
Hope this helps!