Cloudflare - Should I be concerned about false positives and bad neighbourhood IP problems
I am considering using Cloudflare for a couple of my sites. What is your experience? I researched a bit, and there are three issues I am concerned about:

1. Bad neighbourhood: Google may treat a site as part of a bad neighbourhood if other sites on the same DNS/IP are spammy. Is there any way to prevent this? Has anybody had a problem with it?

2. Shared DDoS exposure: a DDoS attack on another site behind the same DNS/IP could affect our site's stability.

3. False positives: legitimate users may be forced to answer CAPTCHAs before they can see the page. Another Moz member reported that 1-2% of their legitimate visitors were flagged as false positives. Can I effectively prevent this by reducing Cloudflare's basic security level?

Also, did you find that Cloudflare really helped with site uptime? In our case, whenever our server was down even for a few seconds, Cloudflare showed an error page as well, and sometimes it showed a "could not connect" error even when our server was merely slow to respond and pages on other domains were still loading fine.
-
Thanks Cyrus.
-
You may be interested in this post titled "Cloudflare and SEO": https://blog.cloudflare.com/cloudflare-and-seo/
"We did a couple things. First, we invented a new technology that, when it detects a problem on a site, automatically changes the site's CloudFlare IP addresses to isolate it from other sites. (Think of it like quarantining a sick patient.) Second, we worked directly with the crawl teams at the big search engines to make them aware of how CloudFlare worked. All the search engines had special rules for CDNs like Akamai already in place. CloudFlare worked a bit differently, but fell into the same general category. With the cooperation of these search teams we were able to get CloudFlare's IP ranges are listed in a special category within search crawlers. Not only does this keep sites behind them from being clustered to a least performant denominator, or incorrectly geo-tagged based on the DNS resolution IP, it also allows the search engines to crawl at their maximum velocity since CloudFlare can handle the load without overburdening the origin."
-
Thanks Tom.
I will now move one of my main domains over and use their Pro plan. I noticed they have quite a number of settings to address the false positives. Our earlier problem with Cloudflare error pages may have been a temporary one while they were building the cache of the site. Anyway, it is easy to enable/disable the Cloudflare protection, so there is not much risk here. It could save us a lot of potential headaches in the future if it works as advertised.
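In case it is useful to anyone else reading this: the enable/disable toggle can also be scripted against CloudFlare's v4 API by flipping the zone's "paused" flag. A rough sketch, assuming an API token with zone edit permission; the zone ID and token below are placeholders, and older accounts authenticate with X-Auth-Email/X-Auth-Key headers instead of a bearer token:

```python
import json
import urllib.request

API = "https://api.cloudflare.com/client/v4"
ZONE_ID = "YOUR_ZONE_ID"      # placeholder
API_TOKEN = "YOUR_API_TOKEN"  # placeholder

def set_paused(paused):
    """Toggle the zone's 'paused' flag; a paused zone serves traffic straight from the origin."""
    req = urllib.request.Request(
        f"{API}/zones/{ZONE_ID}",
        data=json.dumps({"paused": paused}).encode(),
        headers={"Authorization": f"Bearer {API_TOKEN}",
                 "Content-Type": "application/json"},
        method="PATCH",
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["success"])

set_paused(True)     # take CloudFlare out of the loop
# set_paused(False)  # ...and put it back
```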
-
Hi,

1. Bad neighbourhood: I have used CloudFlare for a few sites and never had an issue with this. It is a risk/concern with all shared hosting, but CloudFlare are very proactive about addressing anything that impacts their customers, so I would not be concerned about this at all.
2. DDoS on a shared IP: Again, I wouldn't have concerns here. CloudFlare are very adept at handling large-scale DDoS attacks. Having read some of their post-attack analysis reports, they usually mitigate any impact on customers very quickly. They have loads of customers, and if this sort of thing were a common problem, I think we'd hear about it fairly often.
3. False positives: I can't speak to the percentage of users that might get falsely identified as a risk and presented with a CAPTCHA, but I'd be very surprised if it were as high as 1-2%; I've rarely seen that CAPTCHA screen myself. You should check what CloudFlare have to say on this issue, but I would have no concern here either.
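On reducing the basic security level specifically: that is the setting that controls how aggressively visitors get challenged, and it can be changed in the dashboard or per zone via the API. A rough sketch, with placeholder credentials (and the same caveat that older accounts use X-Auth-Email/X-Auth-Key headers):

```python
import json
import urllib.request

API = "https://api.cloudflare.com/client/v4"
ZONE_ID = "YOUR_ZONE_ID"      # placeholder
API_TOKEN = "YOUR_API_TOKEN"  # placeholder

def set_security_level(level):
    """Set the zone's basic security level.

    Documented values include 'essentially_off', 'low', 'medium',
    'high' and 'under_attack'; lower levels challenge fewer visitors.
    """
    req = urllib.request.Request(
        f"{API}/zones/{ZONE_ID}/settings/security_level",
        data=json.dumps({"value": level}).encode(),
        headers={"Authorization": f"Bearer {API_TOKEN}",
                 "Content-Type": "application/json"},
        method="PATCH",
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["result"]["value"])

set_security_level("low")  # fewer challenges, at the cost of letting more bots through
```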
I have never had an issue with CloudFlare hurting SEO performance or the user experience. It has generally performed well for me. The biggest issue I see is people hoping it is a 'cure-all' that lets them avoid properly addressing issues affecting their site's performance. If your database performance is very poor and dynamic pages take a long time to load, then CloudFlare is not the answer (it may help, but you should fix the underlying issue).
I am unsure about the issue with CloudFlare showing error pages when your server is merely slow; I'd imagine CloudFlare support could help you with this, and there may be a configuration option somewhere.
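One way to narrow down whether those error pages are CloudFlare's doing or your origin being slow is to time the origin directly, bypassing the proxy, and compare that with the proxied response; CloudFlare reports origin trouble with 52x status codes. A rough sketch using the third-party requests library; the IP and hostname are placeholders:

```python
import time
import requests  # third-party: pip install requests

ORIGIN_IP = "203.0.113.10"    # placeholder: your origin server's real IP
HOSTNAME = "www.example.com"  # placeholder: the site's hostname

def timed_get(url, host=None):
    """GET a URL, optionally forcing the Host header, and report status + elapsed time."""
    headers = {"Host": host} if host else {}
    start = time.monotonic()
    resp = requests.get(url, headers=headers, timeout=30)
    print(f"{url}: HTTP {resp.status_code} in {time.monotonic() - start:.2f}s")
    return resp

# Hit the origin directly, bypassing CloudFlare entirely.
timed_get(f"http://{ORIGIN_IP}/", host=HOSTNAME)

# Hit the same page through CloudFlare; 52x codes here mean
# CloudFlare gave up waiting on the origin.
timed_get(f"http://{HOSTNAME}/")
```

If the direct request is slow too, the origin is the problem and CloudFlare is just the messenger.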
Overall, my suggestion would be to go for it.