Best practices for repetitive job postings
-
I have a client who is a recruiter for skilled trades jobs. They post a lot of jobs to their job board on a regular basis, and many of their postings are very similar, either to older jobs or to other current postings.
Looking at their webmaster tools and a site: search on Google, it does appear they have some duplicate content issues. We're thinking it's because of the similar job posts.
What is the best practice for dealing with this? And is there any way to correct the situation so that the number of "omitted due to similarity" results declines?
Thanks for your help!
-
OK, if the previous job posts are causing your concern, you can fix this by setting an expiry in the page's meta data (for example, Google's `unavailable_after` robots directive):
_It will automatically remove the page from the search engine's index as soon as the job becomes unavailable._
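Google does support an `unavailable_after` value in the robots meta tag for exactly this case. Here's a minimal sketch of stamping each posting with its closing date; the helper name and date format are illustrative (Google accepts several widely used date formats), not something from the original answer:

```python
from datetime import datetime, timezone

def unavailable_after_tag(closing: datetime) -> str:
    """Build a robots meta tag asking search engines to drop the job
    posting from their index once the application deadline passes."""
    # e.g. "25-Aug-2025 15:00:00 UTC" -- one of the date formats Google accepts
    stamp = closing.strftime("%d-%b-%Y %H:%M:%S %Z")
    return f'<meta name="robots" content="unavailable_after: {stamp}">'

# Example: a job that closes on 25 Aug 2025 at 15:00 UTC
deadline = datetime(2025, 8, 25, 15, 0, 0, tzinfo=timezone.utc)
print(unavailable_after_tag(deadline))
# → <meta name="robots" content="unavailable_after: 25-Aug-2025 15:00:00 UTC">
```

The tag goes in the `<head>` of each posting page, so it only works if the job board template lets you inject per-page meta data.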
-
It could be worth posting the question in the GWT forum; there's at least a chance a Google employee takes note and may (or may not) be able to do something about any penalties given to the site.
-
Hmmm... This is an interesting situation for sure!
My first thought was adding a canonical tag to the postings, but I'm sure you don't have that kind of access. My assumption is that this kind of duplicate content isn't going to hurt you, mainly because this is not a new situation for Google. It's like how a /blog page shows a snippet of the actual blog post. Would you consider that duplicate content? Technically yes, but Google isn't going to see it that way.
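If that kind of access ever does open up, the canonical-tag fix is simply: for each group of near-duplicate postings, pick one preferred URL and have every page in the group carry a `rel="canonical"` link to it. A quick sketch (the URLs and the "first posting wins" rule are just assumptions for illustration):

```python
def canonical_tags(duplicate_group: list[str]) -> dict[str, str]:
    """Given a group of near-duplicate posting URLs, pick one as the
    canonical and return the <link> tag each page should carry."""
    canonical = duplicate_group[0]  # e.g. the oldest or most complete posting
    tag = f'<link rel="canonical" href="{canonical}">'
    # Every page in the group, including the canonical itself, gets the same tag
    return {url: tag for url in duplicate_group}

tags = canonical_tags([
    "https://example.com/jobs/electrician-chicago",
    "https://example.com/jobs/electrician-chicago-2",
])
print(tags["https://example.com/jobs/electrician-chicago-2"])
# → <link rel="canonical" href="https://example.com/jobs/electrician-chicago">
```

This consolidates the group's ranking signals onto the one preferred posting instead of letting Google pick (or filter) arbitrarily.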
If you're really concerned about this, you could always have two job descriptions for the same job: one on the corporate site, and another that you submit to Indeed, Monster, etc. This doesn't need to take much time. You could just have some generic copy, then say "...to see more about this job posting, visit http://www.yoursite.com".
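Generating those generic teasers can even be automated: truncate the full description and append a link back to the full posting. A minimal sketch, assuming you're free to cut at a word boundary (the helper and the 160-character limit are hypothetical):

```python
def teaser(description: str, job_url: str, limit: int = 160) -> str:
    """Cut a job description down to a short teaser that points
    readers back to the full posting on the corporate site."""
    if len(description) <= limit:
        short = description
    else:
        # Truncate, then drop the (possibly cut-off) trailing word
        short = description[:limit].rsplit(" ", 1)[0]
    return f"{short} ...to see more about this job posting, visit {job_url}"

print(teaser("Hiring a journeyman plumber.", "http://www.yoursite.com"))
# → Hiring a journeyman plumber. ...to see more about this job posting, visit http://www.yoursite.com
```

The shared "...to see more" boilerplate is fine; it's the identical job-specific copy across sites that risks getting filtered.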
I'm still going to be surprised if Google is seeing this as duplicate content, though... Also, Google may filter it out of their SERPs, but do you have any indication that your potential applicants are finding it in the SERPs anyway?
Was that helpful?
Kevin Phelps
http://www.linkedin.com/in/kevinwphelps