Non-US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in Japan's Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that allows users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US pages, failing to detect the region due to our URL structure?
Below are two example URLs for reference - one from Canada, the other from the US:
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to nofollow address the problem?
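For reference, a minimal sketch of what I mean by setting the drop-down links to nofollow - the URLs are the examples above, but the markup is illustrative, not our actual template:

```html
<!-- Hypothetical region-selector menu; only the rel="nofollow"
     attribute would be the change in question -->
<ul class="region-menu">
  <li><a href="/us/en/prod4130078/2500058/catalog20038/" rel="nofollow">United States</a></li>
  <li><a href="/ca/en/prod4130078/2500058/catalog50008/" rel="nofollow">Canada</a></li>
</ul>
```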
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international that often so the full list of methods isn't always in my conscious awareness!
-
Here are the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/), and geotarget it (see here).
- Set meta tags or HTTP headers on each page to let Bing know the language and country (see here).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try out hreflang tags to clue Google in that they're the same page, geotargeted to users in different locations. I haven't implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here.
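As a sketch, hreflang annotations for the two URLs from the question might look like this (example.com is a placeholder domain - check Google's hreflang documentation for the current requirements before deploying):

```html
<!-- In the <head> of the US page
     (https://www.example.com/us/en/prod4130078/2500058/catalog20038/) -->
<link rel="alternate" hreflang="en-us"
      href="https://www.example.com/us/en/prod4130078/2500058/catalog20038/" />
<link rel="alternate" hreflang="en-ca"
      href="https://www.example.com/ca/en/prod4130078/2500058/catalog50008/" />

<!-- The Canadian page needs the same pair of tags: hreflang
     annotations must be reciprocal or Google will ignore them. -->
```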
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
-
It's absolutely possible that's what's happening. You cannot rely on keeping Google's crawler away from anything on your site, no matter how well you code it. Even if you marked those links nofollow, that would not stop the bot.
Another factor is that all your content is in English (as your URL structure suggests). Google does a poor job of separating international content when it is all in the same language on the same root domain.
Proper separation in a way Google can't confuse is vital. Since I expect you do not intend to change the language across sites, your best option would be to migrate international content to a completely different domain. At the very least you can then use GWT to tell Google "this domain is for this country", and to go further still, you could host that content on a server in that country.