Best way to handle an expired ad on a classifieds site
-
I don't think there is a definitive answer to this, but it's worth discussing:
How should an expired ad be handled on a classified / auction site?
Michael Gray mentioned you should 301 it to its category page, and I'm inclined to agree with him. But some analysts say you should return a "product/ad expired" page with a 404.
For the user I think the 404 approach is best, but from an SEO perspective that means I'm throwing link juice away.
What if I 301 the visitor from the ad to the category page, and show a message explaining why they're seeing the listing page instead of the product page?
Thoughts?
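A minimal sketch of that third idea, using only the standard library (the function names and URL layout here are made up for illustration, not from any real site): the expired ad 301s to its category page with a query flag, and the category page uses that flag to show the explanatory message.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def redirect_expired_ad(category_path):
    """Build the 301 response for an expired ad: send the visitor to the
    category page, with a flag explaining why they were redirected."""
    location = f"{category_path}?{urlencode({'from_expired_ad': '1'})}"
    return 301, {"Location": location}

def category_page_notice(request_url):
    """On the category page, show a message only when the flag is present."""
    qs = parse_qs(urlparse(request_url).query)
    if qs.get("from_expired_ad"):
        return "The ad you were looking for has expired; here are similar listings."
    return ""
```

Search engines see a clean 301 to the category page, while the user still gets an explanation instead of silently landing somewhere unexpected.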
-
I would do #3.
-
Great inputs!
But what if, for legal reasons (price, pictures, etc.), the ad has to be removed after it has expired? (That's the real case here.)
Ideas:
- Modify the ad page and return a 200 (remove the ad data and add a message saying it's expired)
- Serve a friendly 404 page saying the ad has expired, and show other options for the user to navigate to
- 301 to its parent page
(3) is my favourite, but (2) may be the best option for users.
Thoughts?
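The three options above map directly onto HTTP responses. Here's a framework-free sketch of all three side by side (the `(status, headers, body)` tuple shape and the `strategy` names are just for illustration):

```python
def respond_to_expired_ad(ad, strategy, category_url):
    """Return a (status, headers, body) tuple for an expired ad,
    under one of the three strategies discussed above."""
    if strategy == "stripped_200":
        # Option 1: keep the URL live but drop the legally sensitive data.
        return (200, {}, f"'{ad['title']}' has expired. Browse similar listings below.")
    if strategy == "friendly_404":
        # Option 2: honest status code, plus navigation help for the user.
        return (404, {}, "This ad has expired. Try browsing these categories instead.")
    if strategy == "redirect_301":
        # Option 3: pass the link equity to the parent category page.
        return (301, {"Location": category_url}, "")
    raise ValueError(f"unknown strategy: {strategy}")
```

Note that option 1 keeps any accumulated link equity on the ad's own URL, option 3 passes it to the category page, and option 2 lets it evaporate once search engines drop the 404'd page from the index.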
-
Interesting...
I don't know how "private" selling prices are in your area, but maybe add a couple of pages to your site like these...
WHAT YOU CAN BUY IN YOURCITY FOR $100,000
This would be a point of reference for buyers and sellers. Where I live there is a huge divergence between askin' and sellin' prices. They ask for the moon but get something a lot less.
RECENT SALES PRICES IN YOURCITY...
Nosy people would love this.
-
I do the same thing with our real estate site. If a listing has expired, I keep the page active, but I put a note at the top saying, "This listing has sold! Contact us and we can find you similar listings in the city."
My expired listings bring in a lot of search traffic.
-
Who is going to bet against Michael Gray? I think that you should listen to him.
I would give his answer one tweak. He says....
If the product goes out of stock forever, you have a couple choices. You can leave the page up with a discontinued notice on the page. IMHO that’s not the best way to go for search engines. Ideally I’d like to not lose any link equity and 301 the product page to a similar product, category/department page, or home page.
I would do exactly what he says 99% of the time. However, if that page is pulling a lot of search engine traffic and the same manufacturer has a replacement product or something close that substitutes, I would leave that page in place and use it to explain: "This product has been retired, but a new and improved widget is available..." (then give the sales pitch for the new model with a buy button). This approach would be especially valuable if the product is something like running shoes, where repeat customers with very high loyalty look to replace their favorite shoes up to several times per year.
When this shoe was replaced by the Addiction there was a mad scramble to buy up all existing stock... (I am probably the only person posting here old enough to have worn out a couple dozen pairs.)
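That hybrid approach can be sketched as a simple decision rule (a sketch only; the `product` field names and thresholds are hypothetical, not from any real catalog): keep the retired product's page live with a pitch for its successor when there's traffic and a replacement to sell, otherwise 301 the equity elsewhere as Gray suggests.

```python
def discontinued_product_response(product, redirect_target):
    """Decide how to serve a discontinued product's page.

    Keep the page live (200) with a replacement pitch when it still earns
    search traffic and a successor exists; otherwise 301 to a related page.
    """
    if product.get("replacement") and product.get("high_search_traffic"):
        pitch = (f"{product['name']} has been retired, but the new "
                 f"{product['replacement']} is available now.")
        return (200, {}, pitch)
    # No successor or no traffic worth keeping: pass the link equity along.
    return (301, {"Location": redirect_target}, "")
```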