Deals that expire: what should I do?
-
Hey there Awesome team of Webmaster Forums,
Let's assume that I have a page with deals on it. After a certain period of time those deals expire. What should I do with the expired pages? My opinion is one of these two options: the page keeps the same URL, but the content says "Sorry, but this deal has expired..." and lists some relevant deals beneath, OR it redirects to a universal "expired" page. Kind regards
-
Thanks mate. That was a very informative answer.
It's not that I am making hand-made items, as in Matt's example, but I do have a very small number of deals that come and go every now and then.
I will not redirect or serve a 404. I think I will keep the page, explain that the deal is over, and point to more deals relevant to this one.
Forced redirecting, in my opinion, is the worst option in every situation, except when the intent of the destination page is exactly the same as the original, which 99% of the time it is not.
A 404 could be OK, but the deals I offer are hard won and I don't want that traffic to just hit a 404 wall.
Adding the relevant deals seems like the best way to go.
Thanks again
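The keep-the-URL approach described here can be sketched as a tiny routing decision. This is just an illustration: `deal_response`, `grace_days`, and the template names are hypothetical, and the long-term 410 is an optional extra for deals that will never return.

```python
from datetime import date, timedelta

def deal_response(deal, today, grace_days=30):
    """Decide how to serve a deal page once its offer has ended.

    Returns (http_status, template) for the URL, keeping the page live
    with an 'expired' notice instead of redirecting or 404ing.
    """
    if today <= deal["expires"]:
        return 200, "deal_active"  # normal deal page
    if today <= deal["expires"] + timedelta(days=grace_days):
        # keep the URL and its equity; explain the deal ended and
        # show related deals beneath the notice
        return 200, "deal_expired_with_related"
    # long-gone deal that will never come back: tell crawlers it's gone
    return 410, "gone"

deal = {"expires": date(2024, 1, 31)}
print(deal_response(deal, date(2024, 1, 15)))  # (200, 'deal_active')
print(deal_response(deal, date(2024, 2, 10)))  # (200, 'deal_expired_with_related')
print(deal_response(deal, date(2024, 6, 1)))   # (410, 'gone')
```

The grace period keeps the accumulated link equity and shows visitors the related deals, while the eventual 410 stops crawlers wasting budget on deals that are gone for good.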
-
Hi there
Here's a great resource from Matt Cutts on the subject. I like the idea of redirecting to a relevant page/category or creating a 404 page that assists the user in finding something related or gives them an opportunity to be notified when the offer/product is back.
You should also update internal links and your sitemap links so that this page isn't being constantly crawled if the offer is over and done, and not coming back.
Hope this helps! Good luck!
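The sitemap-cleanup step mentioned above can be automated. Here is a minimal sketch using only Python's standard library; `strip_expired` and the example URLs are hypothetical:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def strip_expired(sitemap_xml, expired_urls):
    """Remove <url> entries for expired deals from a sitemap string."""
    root = ET.fromstring(sitemap_xml)
    for url in list(root):
        loc = url.find(f"{{{SITEMAP_NS}}}loc")
        if loc is not None and loc.text in expired_urls:
            root.remove(url)
    return ET.tostring(root, encoding="unicode")

sitemap = f"""<urlset xmlns="{SITEMAP_NS}">
  <url><loc>https://example.com/deals/active</loc></url>
  <url><loc>https://example.com/deals/expired</loc></url>
</urlset>"""

cleaned = strip_expired(sitemap, {"https://example.com/deals/expired"})
print("https://example.com/deals/expired" in cleaned)  # False
```

Running something like this whenever a deal expires keeps crawlers from being pointed at dead offers, as the answer suggests.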
Related Questions
-
Best way to deal with 100 product pages
It feels good to be BACK. I missed Moz. I left for a long time but I'm happy to be back! 🙂 My client is a local HVAC company. They sell Lennox systems. Lennox provides a tool that we hooked up to, which allows visitors to their site to 'see' 120+ different kinds of air quality, furnace, and AC units. The problem is (I think it's a problem) that Google and other crawl tools are seeing these 100+ pages, which are not unique, helpful, or related to my client. There is a little bit of cookie-cutter text, images, and specs, and that's it. Are these pages potentially hurting my client? I can't imagine they are helping. What's the best way to deal with these? Thank you! Matthew
How to deal with 80 websites and duplicated content
Consider the following: a client of ours has a job-boards website. They also have 80 domains, all in different job sectors, which pull in jobs based on the sectors they were tagged with on the back end. Everything is identical across these websites apart from the brand name and some content. What's the best way to deal with this?
Expired domain 404 crawl error
I recently purchased an expired domain from an auction, and after I started my new site on it, I am noticing 500+ "not found" errors in Google Webmaster Tools, which are generated from the previous owner's content. Should I use a redirection plugin to redirect those non-existent posts to new post(s) on my site? Or should I use a 301 redirect? Or should I leave them as they are without taking further action? Please advise.
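One common pattern for this situation (a sketch, not a definitive answer): 301 only the old URLs that have a genuinely relevant counterpart on the new site, and return 410 for the rest so the errors age out of Webmaster Tools. All paths below are hypothetical:

```python
# Map the previous owner's dead URLs to relevant new pages where one
# exists; anything unmapped keeps returning 410 rather than being
# blanket-redirected to the homepage.
REDIRECTS = {
    "/old-post-about-widgets": "/blog/widgets-guide",
    "/old-contact-page": "/contact",
}

def resolve(path):
    """Return (status, location) for a request to a legacy URL."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 410, None  # gone for good; lets the crawl errors expire

print(resolve("/old-contact-page"))  # (301, '/contact')
print(resolve("/no-such-post"))      # (410, None)
```

Whether this lives in a plugin, .htaccess, or application code matters less than the policy: relevant redirect where one exists, an honest "gone" where one doesn't.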
Expires Header
We are considering adding expires header to our site. If we add this meta tag to expire for a certain date, but we do not make any changes to the site after it expires, can you be penalized for this?
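For reference, the Expires header (whether sent as an HTTP header or an http-equiv meta tag) only controls browser and proxy caching; nothing on the site changes when the date passes, and to my knowledge a stale date is not something search engines penalize, though it does forfeit the caching benefit. A minimal example of what such headers look like in a response (dates are illustrative):

```
HTTP/1.1 200 OK
Cache-Control: max-age=604800
Expires: Thu, 01 Jan 2026 00:00:00 GMT
```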
Best way to deal with over 1000 pages of duplicate content?
Hi. Using the Moz tools I have over 1,000 pages of duplicate content, which is a bit of an issue! 95% of the issues arise from our news and news archive, as it's been going for some time now. We upload around 5 full articles a day. The articles have a standalone page but can only be reached through a master archive. The master archive sits in a top-level section of the site and shows snippets of the articles, which, if a user clicks on them, take them to the full article page. When a news article is added, the snippets move onto the next page and through the pages as new articles are added. The problem is that the standalone articles can only be reached via the snippet on the master page, and Google is stating this is duplicate content, as the snippet is a duplicate of the article. What is the best way to solve this issue? From what I have read, using a 'meta noindex' seems to be the answer (not that I know what that is). I have also read that you can only use a canonical tag on a page-by-page basis, so that is going to take too long. Thanks, Ben
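For what it's worth, the 'meta noindex' mentioned in the question is a single tag placed in the `<head>` of a page. A minimal sketch of how it would be applied to this archive setup:

```html
<!-- In the <head> of each archive page whose snippets duplicate the
     articles: keep the page out of the index, but let crawlers follow
     its links through to the standalone articles -->
<meta name="robots" content="noindex, follow">
```

Because it goes in the archive template rather than on each article, it does not require the page-by-page work the asker is worried about with canonical tags.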
Dealing with 410 Errors in Google Webmaster Tools
Hey there! (Background) We are doing a content audit on a site with 1,000s of articles, some going back to the early 2000s. There is some content that was duplicated from other sites, does not have any external links to it and gets little or no traffic. As we weed these out we set them to 410 to let the Goog know that this is not an error, we are getting rid of them on purpose and so the Goog should too. As expected, we now see the 410 errors in the Crawl report in Google Webmaster Tools. (Question) I have been going through and "Marking as Fixed" in GWT to clear out my console of these pages, but I am wondering if it would be better to just ignore them and let them clear out of GWT on their own. They are "fixed" in the 410 way as I intended and I am betting Google means fixed as being they show a 200 (if that makes sense). Any opinions on the best way to handle this? Thx!
Dealing with closely related pages
I have a book with 8 pages which I offer for free on my site: http://www.pottytrainingchart4kids.com/free-potty-training-book/ For technical reasons, each of the 8 pages is on a separate URL. This might cause thin content/duplicate content, since most of the code is the same apart from the images, and there isn't much on each page. How would you suggest I deal with this? I remember once reading about rel="prev" or something like that, but I am not sure if it is applicable. I would like all PageRank to go to the main page. Should I add noindex to the other pages? I am not really sure what I should do to prevent a Panda penalty. Thanks in advance!
How best to redirect URL from expired classified ads?
We have a problem because our content is classified ads. Every ad expires after one or two months and then becomes inactive; we keep its page for one month longer as the same page, but we add a notice that the ad is inactive. After that we delete the ad and its page, but we need to redirect that URL to a search-results page which contains similar ads, because we don't want to lose the traffic from those pages. What is the best way to redirect the ad URLs? Our thinking was to redirect internally without a 301 redirect, because the .htaccess file would get very big after a while, and we are also thinking of trying canonicalization, because we don't want engines to think that we have too much duplicate content.
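One way to avoid a huge .htaccess file is to issue the 301 at the application level instead: look up the expired ad's category and redirect to the matching search page, so there is one rule per category rather than one per ad. A minimal sketch with hypothetical names:

```python
# One entry per category, not one rewrite rule per expired ad.
CATEGORY_SEARCH = {
    "cars": "/search?category=cars",
    "jobs": "/search?category=jobs",
}

def expired_ad_redirect(category):
    """Return (status, location) for a deleted ad in the given category."""
    target = CATEGORY_SEARCH.get(category, "/search")
    return 301, target  # a real 301, just issued by the app, not .htaccess

print(expired_ad_redirect("cars"))     # (301, '/search?category=cars')
print(expired_ad_redirect("rentals"))  # (301, '/search')
```

Note that rel="canonical" is meant for duplicate pages that still exist, not for pages that have been deleted, so a 301 status (however it is issued) is the usual tool here.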