Honeypot Captcha - rated as "cloaked content"?
-
Hi guys,
In order to get rid of the very old-school captcha on our contact form at troteclaser.com, we would like to use a honeypot captcha.
The idea is to add a field that is hidden from human visitors but likely to be filled in by spam bots. That way we can sort out all those spam contact requests.
More details on "honeypot captchas":
http://haacked.com/archive/2007/09/11/honeypot-captcha.aspx
Any idea if this single cloaked field will have negative SEO impacts? Or is there another alternative to keep out those spam bots?
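For anyone skimming, the technique boils down to a hidden field plus a server-side check. Here's a minimal sketch (the "website" field name, the CSS hiding, and the check logic are illustrative assumptions, not taken from Haack's post):

```javascript
// The form includes an extra field that is hidden from humans with CSS, e.g.:
//   <input type="text" name="website" style="display:none" autocomplete="off">
// Humans never see it and leave it empty; naive bots fill in every field they find.
function isSpam(formData) {
  // Any non-whitespace value in the honeypot field marks the submission as a bot.
  return Boolean((formData.website || "").trim());
}
```

On the server, submissions flagged by this check can simply be discarded instead of emailed on.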
Greets from Austria,
Thomas
-
Just in case anyone stumbles across this topic:
We started using honeypot captchas in 2011 and it has really paid off. Not only did we get rid of the old captchas, but they also keep out 99.99% of all bot inquiries and spam.
-
Hey Casey,
Thanks for the reply. Will have this tested soon. Really looking forward to getting rid of that captcha.
Regards,
Thomas
-
Hi Thomas,
I've done some research on this, and you'll be fine using this technique; Google won't give you any problems for it. Check out my post on the honeypot technique: http://www.seomoz.org/blog/captchas-affect-on-conversion-rates. The technique works quite well, blocking about 98% of spam.
Casey
-
Hi Keri,
Those are users without Java support.
Does that mean that JavaScript is no issue then?
-
Thomas, double-check whether that stat is for users without Java or users without JavaScript.
-
Good point, thanks.
As 15% of our visitors don't have Java, this won't work out.
Actually, we're trying to get rid of the captcha to increase our conversion rate; that's why the honeypot version is very appealing.
-
You won't see any SEO impact; think of all the forms with JS interaction on big sites.
One easy solution is to post the form via AJAX only. It's very effective, but you won't be able to get contact requests from visitors without JavaScript enabled. Maybe a good alternative.
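A rough sketch of the AJAX-only approach, in case it helps (the "/contact" endpoint and field names are made-up examples):

```javascript
// Serialize form fields into a URL-encoded request body.
function encodeForm(fields) {
  return Object.entries(fields)
    .map(([key, value]) => encodeURIComponent(key) + "=" + encodeURIComponent(value))
    .join("&");
}

// Post the contact form via JavaScript only, so the endpoint is never
// reachable through a plain HTML form submission that bots can replay.
function submitContact(fields) {
  return fetch("/contact", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: encodeForm(fields),
  });
}
```

The trade-off mentioned above still applies: visitors with JavaScript disabled can't submit the form at all.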
Otherwise, you can use reCAPTCHA: http://www.google.com/recaptcha
It's free and easy to set up, works well against bots, and is accessible to everyone!