Best SEO practices for multiple languages on a website
-
Hi,
We would like to include multiple languages on our global website. What's the best practice, both for UI and for SEO? Can the site automatically choose a language based on the visitor's location? Or should we have dedicated pages for the important languages, e.g. www.website.com/de for German? If we go for the latter, what happens when users browse beyond the language landing page, since the rest of the site will usually be in English?
-
Hi,
Thanks for the reply.
We are more interested in folders than in separate TLDs. In that case, how do we serve the rest of the site to visitors in other languages?
French visitors will land on example.com/fr/, but if they browse to other pages, must all of those pages be in French as well? If so, what's the way to present the French content: just an auto-translation, or content written in French? Is there a best-practice example site you can refer us to?
-
There is more than one possible answer to this; I believe it depends on your business needs.
You should let Google know about all the versions of the site via hreflang: that's a line of code you must insert within the <head> of each page. Let me give you an example:
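A minimal sketch of those hreflang lines, assuming a folder-per-language setup on example.com (the URLs are placeholders):

```html
<!-- Inside the <head> of each page; one link element per language version. -->
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="pt" href="https://example.com/pt/" />
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
```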
Those lines tell Google that a specific page has equivalents in Spanish, French, Portuguese, and English.
All the versions should be included, even the one the visitor is currently reading.
That's the basic step. Beyond it, you can choose between a folder per language (example.com/de) or a country-specific TLD (example.de). A country-specific domain is great for targeting individual countries, while the folder structure is more suitable for targeting languages that span many countries: Spanish may target only Argentina if you use example.com.ar, or all Spanish-speaking countries if you choose example.com/es.
I'm sure the community will help you dig deeper on this; let me add some links where you can find more info.
Related Questions
-
SEO on dynamic website
Hi. I am hoping you can advise. I have a client in one of my training groups whose site is a golf booking engine where all pages are dynamically created based on the parameters used in the site search. They want to know what is best to do for SEO. They have some landing pages that Google can see, but there is only a small amount of text at the top and the rest of the page is dynamically created. I have advised that they should create landing pages for each of their locations and clubs and use canonicals to control what Google indexes. Is this the right advice, or should they noindex? Thanks, S
Intermediate & Advanced SEO | bedynamic
-
Robots.txt Blocking - Best Practices
Hi all, we have a web provider who's not willing to remove the wildcard rule blocking all agents from crawling our client's site (User-agent: *, Disallow: /). They have other groups allowing certain bots to crawl the site, but we're wondering if the client is missing out on organic traffic because of that blanket block. It's also a pain because we're unable to set up Moz Pro, potentially because of that first rule. We've researched and haven't found many best practices on blocking all bots and then allowing certain ones. What do you think is best practice for these files? Thanks! The current file:
User-agent: *
Disallow: /

User-agent: Googlebot
Disallow:
Crawl-delay: 5

User-agent: Yahoo-slurp
Disallow:

User-agent: bingbot
Disallow:

User-agent: rogerbot
Disallow:

User-agent: *
Crawl-delay: 5
Disallow: /new_vehicle_detail.asp
Disallow: /new_vehicle_compare.asp
Disallow: /news_article.asp
Disallow: /new_model_detail_print.asp
Disallow: /used_bikes/
Disallow: /default.asp?page=xCompareModels
Disallow: /fiche_section_detail.asp
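A quick way to sanity-check who a file like that actually blocks is Python's standard-library robots.txt parser. This is a sketch with a simplified, illustrative version of the rules; crawlers obey only the most specific group matching their user-agent, and the `*` group acts as a fallback:

```python
from urllib import robotparser

# Simplified rules: block everyone by default, but allow Googlebot.
rules = """\
User-agent: *
Disallow: /

User-agent: Googlebot
Disallow:
Crawl-delay: 5
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot matches its own group, whose empty Disallow allows everything.
print(rp.can_fetch("Googlebot", "https://example.com/page"))     # True
# A crawler without a named group falls back to the blanket "*" block.
print(rp.can_fetch("SomeOtherBot", "https://example.com/page"))  # False
```

So the named bots in the file above can still crawl; only crawlers without their own group (which may include tools like Moz's rogerbot if its group were missing) hit the Disallow: / wall.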
Intermediate & Advanced SEO | ReunionMarketing
-
Advanced: SEO best practice for a large forum to minimise risk...?
Hi, hope someone can offer some insight here. We have a site with an active forum. The transactional side of the site is about 300 pages total, and the forum is well over 100,000 pages (and growing daily), meaning the 'important' pages account for less than 0.5% of all pages on the site. Rankings are pretty good and we're ticking lots of boxes with the main site: good natural links, logical architecture, appropriate keyword targeting. I'm worried about the following: crawl budget, PR flow, Panda. We actively moderate the forum for spam and generally the content is good (for a forum, anyway), so I'm just looking for any best-practice tips for minimising risk. I've contemplated moving the forum to a subdomain so there's that separation, or even noindexing the forum completely, although it does pull in traffic. Has anyone been in a similar situation? Thanks!
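For reference, "noindexing the forum completely" usually means serving a robots meta tag like this on every forum page (a sketch; using "follow" rather than "nofollow" is an assumption, chosen so link equity can still flow through forum links):

```html
<!-- In the <head> of each forum page: remove the page from the index,
     but still let crawlers follow its links. -->
<meta name="robots" content="noindex, follow" />
```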
Intermediate & Advanced SEO | iProspect_Manchester
-
Best way to link 150 websites together
Fellow Mozzers, today I got an interesting question from an entrepreneur who plans to start about 100-200 webshops on a variety of subjects. His question was how he should link them together. He was scared that if he simply made a page on every website, such as www.domain.com/our-webshops/, listing all of the webshops, he would get penalised because it would look like a link farm. I wasn't 100% sure which advice to give him, so I told him I needed to do some research on the subject to make sure I'm right. I had a couple of suggestions myself:
1. Split the sites into three columns: column A links to B, B links to C, and C links to A. I realise this is far from ideal, but it was one of the thoughts that came up.
2. Divide all the webshops into different categories, for example webshops aimed at different holidays, webshops aimed at mobile devices, etcetera. This way you link the relevant webshops together instead of all of them. Still not perfect.
3. Create a page on a separate website (such as a company website) where the /our-webshops/ page lives, and place a link back from each webshop to that page. I've seen lots of webshops using this technique and I can see why they choose to do so. Still not ideal in my opinion.
That's basically my first thoughts on the subject. I would appreciate any feedback on the methods described above or, even better, a completely different strategy for handling this. For some reason I keep thinking that I'm missing the most obvious and best method. 🙂
Intermediate & Advanced SEO | WesleySmits
-
Best Practice for ALT tags of flags to interlink multinational site
For a partial keyword-match domain name, what would you recommend as the ALT text for the flag images that interlink the country domains (different ccTLDs)?
Option 1) DOMAIN.com, DOMAIN.de, DOMAIN.co.uk => I am a bit concerned about this option in terms of a potential penalty for keywords in the ALT text (since these are partial-match domains).
Option 2) UK, DE, FR, ...
Option 3) English UK, Deutsch Deutschland, Deutsch Österreich, Français France => concerned here about mixing lots of languages in the ALT text on each page, which may confuse Google's language detection.
Intermediate & Advanced SEO | lcourse
-
Website layout for a new website [Over 50 Pages & targeting Long Tail Keywords]
Hey everyone, we are designing a new website with over 50 pages and I have a question regarding the layout. Should I target my long-tail keywords via blog pages? It will be easier to manage, list, and link out to similar articles related to my long-tail keywords using a WordPress blog. For this example, let's suppose the website is www.orange.com and we sell oranges. Am I going about this in the right way?
Main Section 1: Home page - keyword targeted - 'orange'
Main Section 2: Important conversion page - 'buy oranges'
Long Tail Keyword (LTK) 1: www.orange.com/blog/LTK1
Subsections (SS): www.orange.com/blog/LTK1/SS1, www.orange.com/blog/LTK1/SS1a, www.orange.com/blog/LTK1/SS1b
Long Tail Keyword (LTK) 2: www.orange.com/blog/LTK2
Long Tail Keyword (LTK) 3: www.orange.com/blog/LTK3
Subsections (SS): www.orange.com/blog/LTK1/SS3, www.orange.com/blog/LTK1/SS3a, www.orange.com/blog/LTK1/SS3b
All these long-tail pages and the subsections under them are built specifically to host content targeting these specific long-tail keywords. Most of my traffic will initially come via the subsection pages, and it is important for me to rank well for these terms early on. E.g. if someone searches for the keyword 'SS3b' on Google, my corresponding page www.orange.com/blog/LTK1/SS3b should rank well on the results page. For ranking purposes, will using this blog/category structure hurt or benefit me, or should I build static pages instead? Also, we are targeting more than 50 long-tail keywords, building quality content for each of them, and I assume that we will be doing this continuously. So which is more beneficial in the long term? Do you have any suggestions on whether I am going about this the right way? Apologies for using these placeholder terms (oranges, LTK, SS, etc.) in the example; I hope the question is clear. Looking forward to some interesting answers! Please feel free to share your thoughts. Thank you!
Natasha
Intermediate & Advanced SEO | Natashadogres
-
US version of our existing website - SEO copy duplication?
Hi, we are planning to launch a dedicated US version of our existing e-commerce website. The US version will sit on a separate domain and will be completely standalone from our existing website. My question is: since a lot of the pages are product pages that will exist on each respective site, does the copy have to be completely re-written (or just slightly) on the new US site in order to avoid duplicate-content issues with the existing site? Any advice would be much appreciated.
Intermediate & Advanced SEO | MJMarketing
-
Local + National seo for a new website
Hi, looking for ideas for a website owner selling training courses on painting. He wants to rank locally first, but also nationally in the Google SERPs (only one physical address is available). His domain name is a branded one (no keyword in it), and the website is recent (1 year old). An audit will focus on competitor rankings to find niche keywords. I'd advise:
For local: an optimised Google address listing + local business directories + an optimised page with local emphasis (schema.org, ...).
For national: unique, relevant content pages with geographically targeted keywords (according to the audit); for instance, for a specific town, including terms related to that particular local market, etc.
1/ What else could I suggest to start ranking nationally?
2/ Have you heard of a tool for making remote queries on Google? I mean, I live in Madrid (Spain) and want to see the Google SERP as if I were in Barcelona (this seems difficult, as Google uses the IP). Thanks in advance for your advice...
Intermediate & Advanced SEO | mlc