Should I submit a sitemap for a site with dynamic pages?
-
I have a coupon website (http://couponeasy.com)
Being a coupon website, my content is always changing (new coupons are added and expired deals are removed automatically). I wish to create a sitemap, but I realised that there is not much point in including all pages, as they will be removed sooner or later and/or are canonical.
I have about 8-9 pages which are static, so I can include them in the sitemap.
Now the question is....
If I create a sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages?
NOTE: I need to create the sitemap for getting expanded sitelinks.
-
Hi Anuj -
I think you are operating from a false assumption that is going to hurt your organic traffic (I suspect it already has).
The XML sitemap is one of the very best ways to tell the search engines about new content on your website. Therefore, by not putting your new coupons in the sitemap, you are withholding one of the strongest signals possible that new content is there.
Of course, you have to automate your sitemap and have it update as often as possible. Depending on the size of your site and therefore the processing time, you could regenerate it hourly, every 4 hours, or on a similar schedule. If you need recommendations for automated sitemap tools, let me know. I should also point out that you should declare how frequently each URL is updated (and keep static URLs, even for your coupons, if possible). This will be a big win for you.
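To make the automation idea concrete, here is a minimal sketch of generating a sitemap with per-URL lastmod and changefreq values. The URLs, dates, and frequencies are made-up examples, not your actual pages; any real generator would pull them from your coupon database.

```python
# Illustrative sketch only: build a minimal sitemap.xml from a
# hypothetical list of coupon pages. URLs and data are invented.
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(pages):
    """pages: iterable of (url, lastmod_iso, changefreq) tuples."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, lastmod, changefreq in pages:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(url)}</loc>")        # escape &, <, > in URLs
        lines.append(f"    <lastmod>{lastmod}</lastmod>")    # when the page last changed
        lines.append(f"    <changefreq>{changefreq}</changefreq>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

# Hypothetical pages: the homepage churns hourly, a store page daily.
pages = [
    ("http://couponeasy.com/", date.today().isoformat(), "hourly"),
    ("http://couponeasy.com/store/example-store", "2015-06-01", "daily"),
]
print(build_sitemap(pages))
```

A cron job could run something like this every hour and write the result to sitemap.xml; search engines treat changefreq as a hint, not a guarantee, so lastmod is the more important field to keep accurate.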
Finally, if you want to make sure your static pages are always indexed, or want to keep an eye on different types of coupons, you can create separate sitemaps under your main sitemap.xml and segment by type. So static-pages-sitemap.xml, type-1-sitemap.xml, etc. This way you can monitor indexation by type.
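A segmented setup like that uses a sitemap index file as the main sitemap.xml, pointing at the per-type sitemaps. A sketch of what that index could look like (file names follow the examples above; dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://couponeasy.com/static-pages-sitemap.xml</loc>
    <lastmod>2015-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://couponeasy.com/type-1-sitemap.xml</loc>
    <lastmod>2015-06-01</lastmod>
  </sitemap>
</sitemapindex>
```

You then submit the index file once in Webmaster Tools, and it reports indexation counts per child sitemap, which is what makes the per-type monitoring possible.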
Hope this helps! Let me know if you need an audit or something like that. Sounds like there are some easy wins!
John
-
Hello Anuj,
To answer your final question first:
Crawlers will not stop until they encounter something they cannot read or are told not to continue beyond a certain point (for example, via robots.txt). Submitting a sitemap with only your static pages does not block the rest, so your site will be updated in the index upon each crawl.
I did some quick browsing and it sounds like an automated sitemap might be your best option. Check out this link on Moz Q&A:
https://mza.seotoolninja.com/community/q/best-practices-for-adding-dynamic-url-s-to-xml-sitemap
There are tools out there that will help with the automation process, which will update hourly/daily to help crawlers find your dynamic pages. The tool suggested on this particular blog can be found at:
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
I have never used it, but it is worth looking into as a solution to your problem. Another good suggestion I saw was to place all removed deals on an archive page and make them unavailable for purchase/collection. This sounds like a solution that would minimize future issues surrounding 404s, etc.
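One way to think about that archive approach: keep expired-deal URLs serving a 200 response instead of a 404, and decide per page whether engines should still index it. This is only an illustrative sketch of the decision logic (the function, field names, and robots values are hypothetical, not from any tool mentioned above):

```python
# Illustrative logic only: decide how to serve a coupon page so that
# expired deals become archive pages rather than 404s.
from datetime import date

def coupon_response(expires_on, today=None):
    """Return (HTTP status, robots meta content) for a coupon page."""
    today = today or date.today()
    if expires_on >= today:
        return 200, "index,follow"    # active deal: fully indexable
    # Expired: keep the URL live as an archive page; optionally ask
    # engines not to index the stale offer while still following links.
    return 200, "noindex,follow"

status, robots = coupon_response(date(2015, 1, 1), today=date(2015, 6, 1))
```

Whether you noindex archived deals or leave them indexable is a judgment call; the main point is avoiding a steady stream of 404s as coupons expire.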
Hope this helps!
Rob