Geotargeting a folder in GWT & IP targeting
-
I am currently managing a .com that targets Canada, and we will soon be launching a .com/us/ folder that will target the US. Once we launch the /us/ folder, we want to display the /us/ content to any US IP. My concern is that Google will then only index the /us/ content, since Googlebot crawls from US IPs.
So, if I set up .com and .com/us/ as two different sites in GWT and geotarget each to the country it is targeting, will this take care of the problem and ensure that Google indexes the .com for Canada and the /us/ for the US?
Is there any alternative method (one that does not involve using the .ca domain)? I am concerned that Google would not be able to see the .com content if we are redirecting all US traffic to .com/us/.
Any examples of this online anywhere?
-
Geotargeting is not uncommon, and the search engines are pretty good at negotiating sites that use it; it depends on your implementation, though.
We use geotargeting on our site to send users to /country/ based on their IP, with a few conditions.
-
Users are only redirected when coming to the homepage, either directly or from search.
If a user from France is coming to the main dot com we'll send them to the /fr/ site automatically.
If the user tries to get to the homepage via a search engine we redirect them to the appropriate language site. This is because the main dot com often shows up above or near the /country/ sites in search, especially for brand terms.
If a user lands on a subpage from a search (e.g. site/cat/page/), they won't be redirected, as we assume the search engines have done a decent enough job of matching them to a page they want.
Users/bots can override the routing.
Users can internally navigate to other languages through the language menu.
If we send somebody in France to the French site but they only speak English, instantly forcing them back to /fr/ whenever they switch to the English version will just frustrate them.
Link to the other language sites on your pages and check referral headers; if the referral is internal, do not redirect.
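The homepage-only, bot-aware routing described above can be sketched as a pure decision function. This is a minimal illustration, not the poster's actual code: the folder mapping, field names, and how you detect bots or do the geo-IP lookup are all assumptions.

```typescript
// Illustrative mapping from geo-IP country codes to locale folders.
const LOCALE_FOLDERS: Record<string, string> = {
  FR: "/fr/",
  US: "/us/",
};

interface Visit {
  path: string;     // requested path, e.g. "/" or "/cat/page/"
  country: string;  // from a geo-IP lookup, e.g. "FR"
  referrer: string; // Referer header, may be ""
  isBot: boolean;   // from user-agent detection
}

// Returns the folder to redirect to, or null to serve the page as-is.
function geoRedirect(visit: Visit, siteHost: string): string | null {
  // 1. Bots are never redirected, so both versions get crawled.
  if (visit.isBot) return null;
  // 2. Only the homepage redirects; deep links from search stay put.
  if (visit.path !== "/") return null;
  // 3. An internal referral means the user chose a language deliberately.
  if (visit.referrer.includes(siteHost)) return null;
  // 4. Otherwise, route by geo-IP if a matching folder exists.
  return LOCALE_FOLDERS[visit.country] ?? null;
}
```

A French visitor hitting the homepage directly gets `/fr/`, while a bot, a deep link, or an internal language switch is left alone.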
All language folders are specified in WMT and it definitely works.
So Google comes in to the dot com; in the header there is a link to /us/, so it will crawl that link and get both versions of the site into its index.
Your biggest challenge here is getting the /us/ ranking above the dot com in the USA, which will require some creative link building.
Make sense?
-
Hi Bobby,
What eBay used to do when you reached www.ebay.com from Québec, Canada, was display a lightbox saying something like: "Hey, we have a French-Canadian version of our website, would you like to use it? Click here", and then you would be redirected to www.cafr.ebay.ca.
So, my suggestion is: instead of using a 301 redirect to send the user to the appropriate geotargeted portion of the website, let them choose by showing a friendly JavaScript lightbox.
As web spiders won't be shown the JavaScript lightbox, they will be able to crawl and index both versions without a problem.
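The decision behind such a lightbox can be sketched as follows. This is a hypothetical example, assuming a made-up locale-to-URL map (the hostnames are only stand-ins modeled on the eBay example above); since crawlers don't interact with the prompt, both versions stay indexable.

```typescript
// Illustrative mapping from browser locales to localized site URLs.
const LOCALE_SITES: Record<string, string> = {
  "fr-CA": "https://cafr.example.ca/",
  "en-US": "https://www.example.com/us/",
};

// Returns the URL to offer in a lightbox, or null if no prompt applies.
// The user is never force-redirected; they click through if they want to.
function suggestLocaleUrl(browserLocale: string, currentHost: string): string | null {
  const target = LOCALE_SITES[browserLocale];
  if (!target) return null;
  // Don't prompt users who are already on the suggested site.
  if (target.includes(currentHost)) return null;
  return target;
}
```

The client-side script would call this with `navigator.language` and `location.hostname`, and only build the lightbox markup when a non-null URL comes back.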
Best regards,
Guillaume Voyer.