Are there SEO implications to blocking foreign IP addresses?
-
We are dealing with a foreign company that has completely ripped off our entire site template, design, and branding. This is a major inconvenience, and we've had similar things happen often enough in the past that we're considering blocking large ranges of IP addresses from accessing our site via .htaccess.
Is this something that could potentially cause problems with search engine bots crawling or indexing our site? We are in the US and our site is hosted in the US, but I'm not sure whether the major search engines might be using foreign-based bots.
Looking for any insight on this, or on any other potential SEO problems to consider.
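For reference, the kind of range block being considered in .htaccess would look something like this (Apache 2.2 syntax; the CIDR ranges below are placeholder documentation ranges, not real country allocations — you'd substitute the actual ranges you want to deny):

```apache
# Deny placeholder ranges (substitute the real allocations to block)
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24
Deny from 198.51.100.0/22
```

On Apache 2.4+ the equivalent is `Require not ip 203.0.113.0/24` inside a `<RequireAll>` block.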
Thanks
-
Zee, did you implement this? Outcomes?
-
If the bot is in another country and you have blocked the range, it's pretty obvious what happens... What kind of "backup" are you looking for?
If you are asking me whether I have a geographical list of bots for each search engine, then no, I don't. But this might be of some use to you: http://productforums.google.com/forum/#!topic/webmasters/TbpNyFiJvjs
Good luck with the whole site design / copyright issue. Any chance you could PM me a link? I would like to see what they have done (just curious).
-
Thanks for the reply SEOKeith, but focusing "on making our site more authoritative" does not solve the problem.
The problem we have is not an SEO problem, it's a design, copyright, trademark and ethical problem. When you spend months developing and designing a site only to have it ripped off, it's not something we want to just ignore.
The damage has been done in this particular instance. However, we've had enough problems in the past from foreign visitors and our business doesn't come from foreign countries. Because of that, blocking actual humans from accessing our site from countries we've had problems with is a potential solution.
The solution we're considering could potentially impact the way search engines view our site and that's the question. Do you have anything to back up your comment about "blocking large ranges of IP addresses you could end up restricting access to legitimate...bots"?
-
By blocking large ranges of IP addresses you could end up restricting access to legitimate users, bots, etc.
For a start, how do you even know that the site harvesting your data is in said country? Sure, they might be hosting there, but the boxes that are ripping your content might be in the US; they could then have some web heads in other random countries serving up the content.
People copying / stealing / cloning your content is pretty common; it happens to a lot of my sites. It's just the way it is, and you're not going to be able to stop it, so you might as well just focus on making your site more authoritative.
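On the "legitimate bots" point: rather than trusting IP range lists, the safer check is the one Google itself documents, forward-confirmed reverse DNS. A rough Python sketch of the idea (the `.googlebot.com` / `.google.com` suffixes are the ones Google publishes for Googlebot; everything else here is illustrative):

```python
import socket

# Hostname suffixes Google publishes for Googlebot reverse DNS
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(host: str) -> bool:
    """Check whether a reverse-DNS hostname falls under Google's crawl domains."""
    return host.endswith(GOOGLE_SUFFIXES)

def is_verified_googlebot(ip: str) -> bool:
    """Forward-confirmed reverse DNS: resolve the IP to a hostname,
    check the hostname's domain, then resolve the hostname forward
    again and confirm it maps back to the original IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        return False
    if not hostname_is_google(host):
        return False
    try:
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```

A crawler spoofing Googlebot's user agent from a blocked range fails this check, while a real Googlebot passes it regardless of which country its IP is allocated to.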