Non-US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US pages being indexed by Google and served in US search results. Conversely, we have US English pages showing in Japanese Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that lets users manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing the other regional pages as US pages because of our URL structure?
Below are examples of two of our URLs for reference - one from Canada, the other from the US
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem?
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international that often so the full list of methods isn't always in my conscious awareness!
-
Here are the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/), and geotarget it (see here).
- Set meta tags or http headers on each page to let Bing know the language and country (see here).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try out hreflang tags to clue Google in that they're the same page, geotargeted at users in different locations. I haven't implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here.
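To illustrate the hreflang and Bing meta-tag points above, here's a rough sketch using the two example URLs from the question. The example.com domain is a placeholder, and as I said, I haven't implemented hreflang myself, so treat this as a starting point rather than a tested recipe:

```html
<!-- In the <head> of BOTH the US and the Canadian version of the page. -->
<!-- Hints to Google that these are alternate versions of the same page,
     each targeting English speakers in a different country. -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/en/prod4130078/2500058/catalog20038/" />
<link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/en/prod4130078/2500058/catalog50008/" />

<!-- Language/country meta tag that Bing reads (this is the US page's version): -->
<meta http-equiv="content-language" content="en-us" />
```

Each regional page lists every alternate version, including itself, and the set of tags should match across all the alternates.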
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
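For clarity, this is what nofollowing the region drop-down links would look like, using the URLs from the question - and again, I'd avoid it, because it doesn't actually keep bots out:

```html
<!-- rel="nofollow" stops PageRank from flowing through these links,
     but Googlebot can still discover and crawl the target URLs. -->
<a href="/us/en/prod4130078/2500058/catalog20038/" rel="nofollow">United States</a>
<a href="/ca/en/prod4130078/2500058/catalog50008/" rel="nofollow">Canada</a>
```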
-
It's absolutely possible that's what's happening. You can't rely on keeping Googlebot barred from any part of your site, no matter how well you code it. Even if you marked the URLs nofollow, that wouldn't stop the bot from crawling them.
Another factor is whether all your content is in English (as your URL structure suggests it is). Google does a terrible job of separating international content when it's all in the same language on the same root domain.
Proper separation, in a way Google can't confuse, is vital. Since I expect you don't intend to change the language across sites, your best option would be to migrate the international content to a completely different domain. At the very least you can then use GWT to tell Google "this domain is for this country"; to be even better off, you'd also host that content on a server in that country.
Related Questions
-
My Website Not Showing In Google English Search Results
My website is not visible in Google's English search results. When I set Google's language to Hindi, Spanish, etc., my pages do appear in the results.
International SEO | Jude_Wix
-
Multiregional / Multilingual SEO - What do you do when there is no equivalent page?
Hello, we're building out a small number of pages for the US in a subfolder, .com/us. The idea is to show US-specific pages to users in that location. However, there are also a number of pages which we will not be creating for the US, as they're not relevant there. I'm planning to geotarget the US folder to tell the search engines that this subfolder should appear in US SERPs, but since geotargeting isn't an exact science, there's a chance that US visitors may land on the non-US pages, which could give them a bad user experience. What should we do when a US user lands on a non-US page that has no US equivalent? Any help would be much appreciated!
International SEO | SEOCT
-
International Sites and Duplicate Content
Hello, I am working on a project where I have some doubts about the structure of international sites and multiple languages. The website is in the fashion industry; I think this is a common problem for the industry. The site is translated into 5 languages and sells in 21 countries. As you can imagine, this creates a huge number of URLs - so many that I can't even complete a crawl with ScreamingFrog.
For example, the UK site is visible in all of these versions:
http://www.MyDomain.com/en/GB/
http://www.MyDomain.com/it/GB/
http://www.MyDomain.com/fr/GB/
http://www.MyDomain.com/de/GB/
http://www.MyDomain.com/es/GB/
Obviously, for SEO only the first version is important.
Another example: the French site is also available in 5 languages:
http://www.MyDomain.com/fr/FR/
http://www.MyDomain.com/en/FR/
http://www.MyDomain.com/it/FR/
http://www.MyDomain.com/de/FR/
http://www.MyDomain.com/es/FR/
And so on. This is creating 3 main issues:
1. Endless crawling, with crawlers not focusing on the most important pages
2. Duplication of content
3. The wrong geo URLs ranking in Google
I have already implemented hreflang but haven't noticed any improvement. My question is therefore: should I exclude the inappropriate targeting with robots.txt and "noindex"? For example, for the UK leave crawlable just the English version, i.e. http://www.MyDomain.com/en/GB/, for France just the French version, http://www.MyDomain.com/fr/FR/, and so on. What I would like to achieve by doing this is crawlers more focused on the important SEO pages, no content duplication, and no wrong URLs ranking in local Google. Please comment.
International SEO | guidoampollini
-
US traffic falsely inflating traffic figures and bounce rate.
Hi fellow Mozzers! We're handling the digital marketing for a UK-based franchise of a Canadian SaaS company, and I've noticed that a large proportion of their traffic has been coming from the US (not the majority, but enough to skew the figures). The Canadian arm of the business deals with the US market, but the majority, if not all, of this traffic is direct, which suggests they've seen the web address somewhere (though I'm not sure where). Is there a search-friendly way to move this traffic to the Canadian site? I know I can set up a filter for US traffic so it stops distorting the stats we're seeing (which I have now done), but my worry is that this traffic is causing a high bounce rate that may be hurting Google's perception of the site's quality. The traffic has a 100% bounce rate (not surprisingly), so if we could find a best-practice way of sending it to the Canadian site, that would be great. My first thought was a screen that appears for US traffic prompting them to go to the Canadian site, but presumably this would still count as a bounce, since they'd only see one page? Any help much appreciated! Cheers guys,
Nick
International SEO | themegroup
-
Why has there been Massive increase in traffic to my clients .eu site after redirects were initiated?
Hi guys, this is a strange one that's really bugging me. I have a client that redirected their domain to a brand-new domain that had already been live for the previous two months. I've been trying to analyse the data, but I can't quite understand why there was a massive increase in visitors from the United States when the old site was redirected. The redirection took place at the beginning of July. It was badly managed in terms of the mapping of 301 redirects, but that's not the issue here. The level of traffic is gradually decreasing, I imagine due to the high level of bounces. The site in question is an EU-funded website for education. In the first 2 weeks of June the old site received around 500 visits from the USA, while in the first 2 weeks of July (2 weeks into the redirects) the new site received around 3,000 visits from the USA. For comparison, the new site had received only 300 visits in that same first-2-weeks-of-June period. Any idea why this might be? Thanks, Rob
International SEO | daracreative
-
Should product-pages with different currencies have different URLs?
Here is a question that should be of interest to small online merchants selling internationally in multiple currencies. When, based on geolocation, a product page is served with different currencies, should the product page have a different URL for each currency? Thanks.
International SEO | AdrienOLeary
-
SEO Audit "Hybrid Site"
Hi everyone! I'm trying to analyze a website which is regional in scope. The site for every market has been built out like this: http://subdomain.rootdomain.com/market (e.g. http://asiapacific.thisismybrandname.com/ph) or http://subdomain.rootdomain.com/language (e.g. http://asiapacific.thisismybrandname.com/en). Since this is the first time I'm working on these kinds of sites, I'd like to ask for any guidance / tips on how to go about an SEO site and technical audit. FYI, the owner of the sites is not giving me access to their webmaster account or their analytics data. Thanks everyone! Steve
International SEO | sjcbayona-41218
-
What countries does Google crawl from? Is it only US or do they crawl from Europe and Asia, etc.?
Where does Google crawl the web from? Is it US-only, or do they also crawl from a European base? The reason for asking is GeoIP redirection. For example, if a website uses GeoIP redirection to send all US traffic to a .com site and all EU traffic to a .co.uk site, will Google ever see the .co.uk site?
International SEO | Envoke-Marketing