Hreflang implementation issue
We currently handle search for a global brand, www.example.com, which has a presence in many countries worldwide. To help Google understand that alternate versions of the website are available in other languages, we have used hreflang tags. There is also a mother website (www.example.com/global), which is given the "x-default" attribute in the hreflang annotations. For Malaysia as a geolocation, the mother website is ranking instead of the local website (www.example.com/my) for the majority of products.
The hreflang markup is implemented directly in the source of each product page.
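For reference, a product-page hreflang set matching the configuration described above would typically look like the following. This is a sketch, not the site's actual markup: the product path and the language-region codes (e.g. `en-my`) are illustrative assumptions.

```html
<!-- Illustrative hreflang set for one product; every URL in the set
     must appear on every page of the set, including a self-reference. -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/global/product_name" />
<link rel="alternate" hreflang="en-my" href="http://www.example.com/my/product_name" />
<link rel="alternate" hreflang="en-sg" href="http://www.example.com/sg/product_name" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/product_name" />
```

A common cause of the symptom described is a missing self-referencing entry or non-reciprocal annotations, which can cause Google to fall back to the x-default page.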
These hreflang annotations are also present in the website's XML sitemap, for example:
```xml
<loc>http://www.example.com/my/product_name</loc>
<lastmod>2017-06-20</lastmod>
```
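For sitemap-based hreflang to work, each `<url>` entry must also carry `xhtml:link` alternates for every language version, including a self-referencing one; `<loc>` and `<lastmod>` alone do not convey hreflang. A sketch of a complete entry (the alternate set shown is an assumption based on the markets mentioned):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/my/product_name</loc>
    <lastmod>2017-06-20</lastmod>
    <!-- self-referencing alternate -->
    <xhtml:link rel="alternate" hreflang="en-my"
                href="http://www.example.com/my/product_name" />
    <!-- x-default pointing at the mother site -->
    <xhtml:link rel="alternate" hreflang="x-default"
                href="http://www.example.com/global/product_name" />
  </url>
</urlset>
```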
Is this implementation of hreflang tags correct? The same implementation is used across all geolocations, yet the mother website outranks the local site only in the Malaysian market.
If the implementation is correct, what other reasons could explain the ranking issue? All other SEO elements have been thoroughly verified and appear fine.
Related Questions
Poor Load Balancer Implementation, now the site is indexed 4 times
I was brought onto a project where the network admin set up a load balancer to distribute traffic but deployed it incorrectly. The site is now listed four times in Search Console as links to the primary domain. How can I remove these from the index? I have already asked him to noindex them, but they still remain in Search Console. What else can I do to ensure Google sees this as a single site?
Intermediate & Advanced SEO | DonFerrari2169
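One common way to consolidate duplicate hostnames created by a load balancer is a canonical link element on every page, pointing at the primary domain. A minimal sketch (the hostname and path are illustrative, not from the question):

```html
<!-- Placed in the <head> of every page served from any of the
     load-balanced hostnames, always pointing at the primary domain. -->
<link rel="canonical" href="https://www.primarydomain.com/current-page/" />
```

Where the infrastructure allows it, a 301 redirect from the secondary hostnames to the primary one is generally the stronger fix.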
AU and US site needs Hreflang?
Hi guys, just want to confirm if we need to take SEO action on two of our sites. Example:
https://www.example.com.au/collections/dresses
https://exampleamerica.com/collections/dresses
Will the domain naming fix the issue of possible duplication? Do we still need to implement hreflang markup?
Intermediate & Advanced SEO | brandonegroup
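Different domains alone do not stop the two pages from competing for the same queries; a reciprocal hreflang pair makes the country targeting explicit. A minimal sketch for the two URLs above (the `en-au`/`en-us` codes are an assumption; both tags go on both pages):

```html
<!-- Identical pair on both the AU and US dress-collection pages -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/collections/dresses" />
<link rel="alternate" hreflang="en-us" href="https://exampleamerica.com/collections/dresses" />
```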
Confusing mixture of cross-domain and multi-language - HREFLANG
Hi Mozzers, I am working for an international client in a highly regulated industry. As such, their international set-up is slightly confusing. They currently operate websites across multiple countries (with ccTLDs), as well as a global .com, e.g.:
domain.co.uk
domain.it
domain.es
domain.com
etc.
Additionally, they offer multiple languages across each of these domains, which often cross over, e.g.:
domain.co.uk/en/, domain.co.uk/fr/, domain.co.uk/de/
domain.es/en/, domain.es/es/
domain.it/en/, domain.it/it/
domain.com/en/, domain.com/es/, domain.com/fr/, domain.com/de/
They are not currently using hreflang of any sort. Using EN as an example, this results in 6 URLs showing the same content, albeit for different languages/locations:
Main URL: domain.co.uk/en/category-A/ hreflang="en-GB"
Multi-lingual variants from the same domain: domain.co.uk/fr/category-A/ hreflang="fr-GB", domain.co.uk/de/category-A/ hreflang="de-GB"
Cross-domain variants from other ccTLDs: domain.es/en/category-A/ hreflang="en-ES", domain.it/en/category-A/ hreflang="en-IT", domain.com/en/category-A/ hreflang="en"
Can anyone cleverer than myself confirm that the above would be the most effective set-up for this scenario, with each URL referencing each other in this way?
Intermediate & Advanced SEO | Pan1234
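Cross-domain hreflang is valid as long as every URL in the cluster lists every other URL plus itself. For the category-A example, each of the six pages would carry the same set. A sketch using the hypothetical domains from the question (the https scheme is an assumption):

```html
<!-- Identical block on all six category-A URLs, across all domains -->
<link rel="alternate" hreflang="en-GB" href="https://domain.co.uk/en/category-A/" />
<link rel="alternate" hreflang="fr-GB" href="https://domain.co.uk/fr/category-A/" />
<link rel="alternate" hreflang="de-GB" href="https://domain.co.uk/de/category-A/" />
<link rel="alternate" hreflang="en-ES" href="https://domain.es/en/category-A/" />
<link rel="alternate" hreflang="en-IT" href="https://domain.it/en/category-A/" />
<link rel="alternate" hreflang="en" href="https://domain.com/en/category-A/" />
```

Note that hreflang values combine a language with an optional region, so `fr-GB` (French for users in Great Britain) is legal even though it looks unusual.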
X Default on hreflang tags
Hi guys, I would like to clarify something about hreflang markup and, most importantly, x-default. Sample URLs:
http://www.example.com/au/collection/dresses (Australia)
http://www.example.com/us/collection/dresses (United States)
http://www.example.com/uk/collection/dresses (United Kingdom)
Sample markups:
Questions:
1. Can I use my AU page as x-default? I noticed that some x-defaults are US pages. Note that my biggest market is AU, though.
2. If I do use the AU page as x-default and a user searches from China, does that mean Google will return my AU page?
3. Can you spot any issues with the markups I made? Anything I need to correct? Keen to hear from you! Cheers,
Chris
Intermediate & Advanced SEO | geekyseotools
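Any page can serve as x-default; it simply appears in the set alongside the regional entries, and the same URL may be listed twice. A sketch with the AU page doubling as x-default (illustrative only, not the asker's actual markup):

```html
<!-- Same block on all three pages; the AU URL appears both as the
     en-AU alternate and as the x-default fallback. -->
<link rel="alternate" hreflang="en-AU" href="http://www.example.com/au/collection/dresses" />
<link rel="alternate" hreflang="en-US" href="http://www.example.com/us/collection/dresses" />
<link rel="alternate" hreflang="en-GB" href="http://www.example.com/uk/collection/dresses" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/au/collection/dresses" />
```

With this set-up, a searcher in a market with no dedicated alternate (such as China) would be served the x-default URL, i.e. the AU page.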
HTTPS Homepage Redirect & Issue with Googlebot Access
Hi all, I have a question about Google correctly accessing a site that has a 301 redirect to HTTPS on the homepage. Here's an overview of the situation, and I'd really appreciate any insight from the community on what the issue might be.
Background info: My homepage is set up as a 301 redirect to an HTTPS version of the homepage (some users log in, so we need the SSL). Only 2 pages on the site are under SSL; the rest of the site is HTTP. We switched to SSL in July but have not seen any change in our rankings despite efforts increasing backlinks and output of content. Even though Google has indexed the SSL page of the site, it appears that it is not linking up the SSL page with the rest of the site in its search and tracking. Why do we think this is the case?
The diagnosis:
1) When we do a Google Fetch on our HTTP homepage, it appears that Google is only reading the 301 redirect instructions (as shown below) and is not finding its way over to the SSL page, which has all the correct page title and meta information.
```
HTTP/1.1 301 Moved Permanently
Date: Fri, 08 Nov 2013 17:26:24 GMT
Server: Apache/2.2.16 (Debian)
Location: https://mysite.com/
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 242
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=iso-8859-1

<title>301 Moved Permanently</title>
<h1>Moved Permanently</h1>
The document has moved <a href="https://mysite.com/">here</a>.
<address>Apache/2.2.16 (Debian) Server at mysite.com</address>
```
2) When we view a list of external backlinks to our homepage, it appears that the backlinks built after we switched to the SSL homepage have been separated from the backlinks built before the switch. Even in Open Site Explorer, we only see the backlinks achieved before we switched to SSL and cannot track any backlinks added afterwards. This leads us to believe that the new links are not adding any value to our search rankings.
3) In Google Webmaster Tools, we receive no information about our homepage, only about the non-HTTPS pages. I added an HTTPS account to Google Webmaster Tools, and in that version we ONLY receive information about our homepage (and the other SSL page on the site).
What is the problem? My concern is that we need to do something specific with our sitemap or with the 301 redirect itself in order for Google to read the whole site as one entity and report the backlinks as one site. Again, Google is indexing all of our pages, but it seems to be doing so in a disjointed way that is breaking down the link juice and value being built up by our SSL homepage. Can anybody help? Thank you for any advice or input you might be able to offer.
-Greg
Intermediate & Advanced SEO | G.Anderson
Hey guys, I have these issues in my crawl report. What should I do to exclude the pages?
Overly-Dynamic URL: Although search engines can crawl dynamic URLs, search engine representatives have warned against using more than 2 parameters in a given URL. Search engines may also see dynamic versions of the same URL as unique URLs, creating duplicate content.
Intermediate & Advanced SEO | adulter
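One common way to consolidate the parameterized variants that trigger this warning is a canonical link element on each dynamic URL, pointing at the clean version of the page. A minimal sketch (the URL is illustrative, not from the report):

```html
<!-- On e.g. /products?id=42&sort=price&page=2, point search engines
     at the canonical, parameter-free version of the page. -->
<link rel="canonical" href="https://www.example.com/products/42/" />
```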
Duplicate Page Title/Content Issues on Product Review Submission Pages
Hi everyone, I'm very green to SEO. I have a Volusion-based storefront and recently decided to dedicate more time and effort to improving my online presence. Admittedly, I'm mostly a lurker in the Q&A forum, but I couldn't find any pre-existing info regarding my situation. It could be out there. But again, I'm a noob...
So, in my recent SEOmoz report I noticed that over 1,000 Duplicate Content errors and Duplicate Page Title errors have been found since my last crawl. I can see that every error is tied to a product in my inventory; specifically, each product page has an option to write a review, and the subsequent page where a visitor fills out their review appears to be the stem of the problem. All of my products show the same issue:
Duplicate Page Title: "Review:New"
Duplicate Page Content: the form is already partially filled out with the corresponding product
My first question: it makes sense that pages containing a submission form would have the same title and content, but why is each one being indexed, or crawled (or both, for that matter), under every parameter through which it can be accessed (product A, B, C, etc.)?
My second question (an obvious one): what can I do to begin to resolve this? As far as I know, I haven't touched this option in Volusion other than to simply implement it. If I'm missing any key information, please point me in the right direction and I'll respond with any additional relevant details. Many thanks in advance!
Intermediate & Advanced SEO | DakotahW
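Review-submission pages like these generally should not be indexed at all. A robots meta tag in the `<head>` of each review page addresses both the duplicate-title and duplicate-content warnings; this is a common general approach, not Volusion-specific guidance:

```html
<!-- Keep review-form pages out of the index while still letting
     crawlers follow any links on them. -->
<meta name="robots" content="noindex, follow" />
```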
Would having a bilingual sitemap listed in Google cause issues?
For a bilingual site, do we submit the sitemap to Google Webmaster Tools in both languages? Would this cause any issues?
Intermediate & Advanced SEO | Francis_GlobalMediaInsight
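Submitting separate sitemaps per language is a supported setup, and a sitemap index file can group them under a single submission. A sketch, assuming one sitemap per language (file names are illustrative):

```xml
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-en.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-fr.xml</loc>
  </sitemap>
</sitemapindex>
```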