Urgent: Any point having /au version of the website for Australia?
-
Hi,
We just migrated our website from /uk to the global one (but we still kept /us). We are expanding our business to Australia. Is there any point having the global .com site duplicated as .com/au provided the content will be identical?
What's the impact of /au on domain strength and rankings in Australia compared with having just the .com?
Is there any point? Does anyone have direct experience? What's the best practice?
Many thanks for the answers.
Katarina
-
Duplicate Content
I have experience with problems like these. In the past, I've worked with multilingual, multi-region, and even multi-location (same country, different cities) websites, mostly for hotels, restaurants, and tourism-related businesses.
First of all (you've probably already done this, but it's worth keeping in mind): add every domain and every variation of your domains in Search Console:
- http://yoursite.com
- http://www.yoursite.com
- https://yoursite.com
- https://www.yoursite.com
Now, to your questions:
It's common for websites to serve similar or identical content in different languages, or to different regions, under different URLs. Google is okay with this as long as each version targets a different audience. Your website will not be penalized when any translation is manual and accurate. Google still prefers unique content for each version, but it understands that producing unique content for every region can be quite tough. Google clearly states that you don't need to hide such content from crawlers with a robots.txt file or a noindex robots meta tag.
The circumstances are entirely different if you're serving the same content to the same audience through two URLs. Let me explain with an example. Imagine you've created yourbusiness.com and yourbusiness.com.au, targeting the USA and Australia respectively. Since both are in English, this will cause duplicate content. Luckily, it can be handled with hreflang tags, which are supported by Google and Yandex (Bing, notably, relies on the content-language meta tag instead).
The hreflang tag protects international SEO campaigns from duplicate-content issues. It's typically needed by businesses that serve different languages or countries through subdomains, subfolders, or ccTLDs. The hreflang tag is also important if you target a single country in multiple languages.
Here's how I implement it:
Step 1: First, handle language targeting. List out the URLs that have equivalents in other languages or regions. Stand-alone URLs with no equivalent don't need the hreflang tag, so don't list them.
Step 2: Now set up the tags. This is what a general hreflang tag looks like:
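As a sketch, using the hypothetical yourbusiness.com domains from the example above, the tags go in the `<head>` of each page:

```html
<!-- Each tag pairs a language-country code with the URL of that version -->
<link rel="alternate" hreflang="en-us" href="https://yourbusiness.com/" />
<link rel="alternate" hreflang="en-au" href="https://yourbusiness.com.au/" />
```

The same full set of tags should appear on every version of the page.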
All you need are the right language and country codes; there's a handy reference list here:
http://www.mathguide.de/info/tools/languagecode.html
For a site that targets different countries in the same language, you'll combine a language code with a country code (for example, en-us and en-au).

Step 3: The hreflang="x-default" value creates a default page for users who don't match any of the listed languages or countries. This is generally the homepage or another neutral page.
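A minimal sketch of a tag set with an x-default fallback, again using the hypothetical domains from above:

```html
<link rel="alternate" hreflang="en-us" href="https://yourbusiness.com/" />
<link rel="alternate" hreflang="en-au" href="https://yourbusiness.com.au/" />
<!-- x-default catches visitors who match none of the versions above -->
<link rel="alternate" hreflang="x-default" href="https://yourbusiness.com/" />
```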
After implementation, you can check that everything works properly by logging into Google Search Console (formerly Google Webmaster Tools). Go to "Search Traffic" and then "International Targeting." If the hreflang tags were placed properly, you'll be able to test them using the feature there. If you run into problems, a hreflang tag generator tool can make things easier.
Common Mistakes to Avoid
- Incorrect use of language codes: all tags should use language codes per ISO 639-1 and, where needed, region codes per ISO 3166-1 Alpha 2. A common mistake is en-uk instead of en-gb. Incorrect codes will negatively impact your international SEO.
- Missing confirmation (return) links: if page A lists page B as an alternate, page B must link back to page A with the corresponding hreflang tag, or the tags may be ignored.
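To illustrate the return-link rule with the same hypothetical domains: each page lists itself and every alternate, and the two sets must mirror each other:

```html
<!-- In the <head> of https://yourbusiness.com/ -->
<link rel="alternate" hreflang="en-us" href="https://yourbusiness.com/" />
<link rel="alternate" hreflang="en-au" href="https://yourbusiness.com.au/" />

<!-- In the <head> of https://yourbusiness.com.au/ -->
<link rel="alternate" hreflang="en-au" href="https://yourbusiness.com.au/" />
<link rel="alternate" hreflang="en-us" href="https://yourbusiness.com/" />
```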