Solved: hreflang href: Should Japanese URL characters be encoded?
-
Hi all,
I have searched in vain for a concrete answer to this question.
If you're handling the hreflang tags yourself (i.e. not using automation plugins etc.), is it okay if the URLs (e.g. in Japanese) remain unencoded?
Example (not encoded):
<link rel="alternate" hreflang="ja" href="https://domain.com/エグザイルリンク/" />
The same, encoded:
<link rel="alternate" hreflang="ja" href="https://domain.com/%e3%82%a8%e3%82%b0%e3%82%b6%e3%82%a4%e3%83%ab%e3%83%aa%e3%83%b3%e3%82%af" />
When I run the unencoded tags through hreflang checkers, they don't flag any issues.
Also on other websites I see both approaches with unencoded and encoded hreflang variants.
What is your opinion on this? Could there be conflicts, and is there a best practice?
Thanks all
-
@Hermski If you're adding hreflang tags manually rather than through a plugin, unencoded URLs are acceptable. Browsers and crawlers percent-encode non-ASCII characters in an href (as UTF-8) before requesting it, so both forms resolve to the same resource; that is why hreflang checkers don't flag the unencoded variant, and why you see both approaches on other websites.
Percent-encoded URLs are the safer choice if any tool in your pipeline might mishandle raw Unicode, and whichever form you pick, it helps to use it consistently across your hreflang tags, canonical tags, and sitemaps. Beyond that, there is no inherent conflict and no rule that mandates one approach over the other.
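A quick way to confirm the two forms are equivalent is to percent-encode the path yourself. The sketch below uses Python's standard library and the path from the example above; note that `urllib.parse.quote` emits uppercase hex digits while the example shows lowercase, and both are valid since they decode to the same bytes.

```python
from urllib.parse import quote, unquote

# Japanese path segment from the question's example tag.
path = "/エグザイルリンク/"

# Percent-encode everything outside the unreserved set; keep "/" as a
# path separator via the safe parameter. quote() uses UTF-8 by default.
encoded = quote(path, safe="/")
print(encoded)  # /%E3%82%A8%E3%82%B0%E3%82%B6%E3%82%A4%E3%83%AB%E3%83%AA%E3%83%B3%E3%82%AF/

# Decoding recovers the original characters, which is why crawlers treat
# the two hrefs as the same resource.
assert unquote(encoded) == path
```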
Related Questions
-
href lang questions - please help
Hi, I have a few questions about hreflang implementation, and I was hoping for some guidance and opinions. An international website mostly uses a folder structure, but some locations may have standalone subdomains. Some folders target a location and a language, while others target only a language. See the list below:
domain.com/es-mx [Language: Spanish - Location: Mexico]
International SEO | MarkCanning
domain.com/pt-br [Language: Portuguese - Location: Brazil]
domain.com/ja-jp [Language: Japanese - Location: Japan]
domain.com/en-jp [Language: English - Location: Japan]
domain.com/fr-ca [Language: French - Location: Canada]
domain.com/en-ca [Language: English - Location: Canada]
domain.com/en-ie [Language: English - Location: Ireland]
domain.com/ar [Language: Arabic]
domain.com/ph [Language: Tagalog]
domain.com/it [Language: Italian]
domain.com/tr [Language: Turkish]
domain.com/kr [Language: Korean]
domain.com/fr [Language: French]
domain.com/ru [Language: Russian]
domain.com/vn [Language: Vietnamese]
domain.in/en [Language: English - Location: India]
domain.in/hi [Language: Hindi - Location: India]
My questions are: Is an hreflang sitemap as effective as hreflang meta tags? I know the sitemap is easier to maintain, but I don't know which one is better, since Google recommends both. Also, how do you mix your listings when some target a language and a country while others target only language speakers (not tied to any specific country)? Take the list above: there would be a general site for French speakers and then one for French speakers in Canada. Thanks for your advice in advance.
-
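On the sitemap question: Google documents both the in-page link tags and the sitemap method and doesn't rank one above the other, so the sitemap mainly wins on maintainability. As a rough illustration of how a mixed language/language-region cluster maps onto sitemap entries, here is a minimal Python sketch; the three-page cluster and URLs are hypothetical, and the key point is that every `<url>` entry must list all alternates, including itself.

```python
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML = "http://www.w3.org/1999/xhtml"

# Hypothetical cluster mirroring the question's mix: a general French page,
# a French-Canadian page, and an English-Canadian page.
ALTERNATES = {
    "https://domain.com/fr/": "fr",
    "https://domain.com/fr-ca/": "fr-CA",
    "https://domain.com/en-ca/": "en-CA",
}

def build_sitemap(alternates):
    """Serialize one <url> entry per page, each listing every alternate
    (including itself), as the sitemap hreflang method requires."""
    ET.register_namespace("", SM)
    ET.register_namespace("xhtml", XHTML)
    urlset = ET.Element(f"{{{SM}}}urlset")
    for url in alternates:
        entry = ET.SubElement(urlset, f"{{{SM}}}url")
        ET.SubElement(entry, f"{{{SM}}}loc").text = url
        for href, lang in alternates.items():
            ET.SubElement(entry, f"{{{XHTML}}}link",
                          rel="alternate", hreflang=lang, href=href)
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(ALTERNATES))
```

Mixing language-only and language-region targets is fine: each alternate simply carries its own hreflang value (`fr` vs `fr-CA`), and Google picks the most specific match for the searcher.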
Moving from single domain to multiple CCTLDs
Hi, I have a website targeting 3 markets (and therefore 3 languages). I was using a single domain, with each market targeted in the following format: www.website.com/pl
International SEO | cellydy
www.website.com/de
www.website.com/hu
It's clear to me from the organic results that in my industry (real estate) Google puts a large emphasis on local businesses and local domains: the top 10 organic results for all my keywords in all markets have country-specific ccTLDs. So I decided to migrate from a single-domain strategy to a multi-domain strategy. I own the domains. The new structure is:
www.website.com/pl -> www.website.pl
www.website.com/de -> www.website.de
www.website.com/hu -> www.website.hu
All the websites have been added to Google Search Console, and 301 redirects are in place and working correctly. The pages are all interlinked and have rel=alternate tags pointing to each other. The sitemaps are all done correctly. My question is how to tell Google about this: the change-of-address feature only works for changing one domain to one other domain. It's been a week, and the old www.website.com domain is still showing up (despite the 301 redirects). Or do I just need to be patient and wait it out? Any tips?
-
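Since Search Console's change-of-address tool can't split one domain into three, the 301 mapping itself is the main signal, and the old domain usually lingers in results for a while. A hedged sketch of the folder-to-ccTLD redirect logic is below; the helper name and URL set are hypothetical, not the poster's actual stack.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical mapping from the old language folders to the new ccTLD hosts.
FOLDER_TO_DOMAIN = {
    "pl": "www.website.pl",
    "de": "www.website.de",
    "hu": "www.website.hu",
}

def redirect_target(old_url):
    """Return the 301 target for an old single-domain URL, or None if the
    URL does not live under one of the migrated language folders."""
    parts = urlsplit(old_url)
    segments = parts.path.lstrip("/").split("/", 1)
    lang = segments[0]
    if lang not in FOLDER_TO_DOMAIN:
        return None
    # Preserve the deeper path and the query string on the new domain.
    rest = "/" + (segments[1] if len(segments) > 1 else "")
    return urlunsplit((parts.scheme, FOLDER_TO_DOMAIN[lang], rest,
                       parts.query, parts.fragment))
```

The one-to-one path mapping matters: redirecting every old URL to its exact new counterpart (rather than to the new homepage) is what lets the 301s pass signals page by page.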
Three versions of English pages: EN-US, EN-GB and EN as x-default
We want to address the search markets for the USA and the UK, so all English pages have small regional variations with similar content. For some time (since a relaunch), Google has had problems identifying the right page (/en-gb/) for the right search market (UK), although we have used hreflang and sitemaps from the beginning. We monitor the /en-gb/ pages in Moz for our UK campaign via jumps in the ranking of individual keywords (>-50 and >+50). A -50 does not mean our website's ranking is lost; in that case, Google substitutes the ranking of the /en-gb/ page with the /en/ variant. One example:
Technical SEO | PeterGolze
https://www.openmind-tech.com/en-gb/industries/cam-software-for-motor-sports/
This page lost its ranking, and the other language variant is ranking at position 2:
https://www.openmind-tech.com/en/industries/cam-software-for-motorsport/
At the moment I have no idea what we could change in our HTML code.
-
Duplicate content on URL trailing slash
Hello. Some time ago, we accidentally made changes to our site that modified the way URLs in links are generated. Suddenly, trailing slashes were added to many URLs (only in links). Links that used to point to
Intermediate & Advanced SEO | yacpro13
example.com/webpage.html
were now linking to
example.com/webpage.html/
URLs in the XML sitemap remained unchanged (no trailing slash). We started noticing duplicate content, because our site renders the same page with or without the trailing slash. We corrected the problematic PHP URL function so that all links on the site now point to URLs without a trailing slash. However, Google had time to index the slashed pages. Are 301 redirects required in this case?
-
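Yes, 301 redirects are the standard fix here: they consolidate the indexed slashed URLs onto the canonical form instead of waiting for Google to drop them on its own. A minimal sketch of the normalization rule, assuming the slash-less version is canonical (the helper name is hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_no_slash(url):
    """Return the 301 target without the trailing slash, or None when the
    URL is already canonical and no redirect is needed."""
    parts = urlsplit(url)
    path = parts.path
    # Leave the bare root "/" alone; only strip a redundant trailing slash.
    if len(path) > 1 and path.endswith("/"):
        return urlunsplit(parts._replace(path=path.rstrip("/")))
    return None
```

Pairing the redirect with a rel=canonical on the slash-less version gives Google two consistent signals, which tends to clear the duplicates out of the index faster.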
Removing UpperCase URLs from Indexing
This search: site:www.qjamba.com/online-savings/automotix gives me this result from Google:
Automotix online coupons and shopping - Qjamba
Intermediate & Advanced SEO | friendoffood
https://www.qjamba.com/online-savings/automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products.
Google also tells me there is another one, which is 'very similar'. When I click to see it, I get:
Automotix online coupons and shopping - Qjamba
https://www.qjamba.com/online-savings/Automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products.
This is because I recently changed my program to redirect all URLs containing uppercase characters to lowercase, as all-lowercase appears to be strongly recommended. I assume that having two indexed URLs for the same content dilutes link juice. Can I safely remove all of my uppercase indexed pages from Google without affecting the indexing of the lowercase URLs? And if so, what is the best way? There are thousands.
-
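Rather than manually removing thousands of uppercase URLs, the 301s you already put in place do the safe consolidation: each uppercase URL passes its signals to its lowercase twin, and the uppercase entries drop out of the index as they are recrawled, without touching the lowercase pages. A small sketch of that redirect decision, assuming only the path should be lowercased (query strings can be case-sensitive):

```python
from urllib.parse import urlsplit, urlunsplit

def lowercase_redirect(url):
    """Return a 301 target with the path lowercased, or None if the URL is
    already all-lowercase. Only the path is touched: query strings may be
    case-sensitive, so they are left as-is."""
    parts = urlsplit(url)
    lowered = parts.path.lower()
    if lowered == parts.path:
        return None
    return urlunsplit(parts._replace(path=lowered))
```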
URL Keyword Structure and Importance
Hey guys, I've done quite a bit of research on this but still can't decide what the correct answer is, so I was hoping the Moz community might be able to give some clarification. Say I have the URL **www.yourdomain.com/product/domain-names**: is there any benefit in changing my site's backend structure (a relatively lengthy process) so the URL reads **www.yourdomain.com/domain-names** without the 'product' slug? I understand keywords in the URL can have a small impact on SEO, but does positioning to this degree play any part? Any advice would be great.
Intermediate & Advanced SEO | paragongroup
Cheers.
-
Overly-Dynamic URL
Hi, we have over 5,000 pages showing under the Overly-Dynamic URL error. Our ecommerce site uses Ajax, and we have several different filters (size, color, brand), so we have many different URLs like:
http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y
http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
http://www.dellamoda.com/designer-handbags.html?use_selected_filter=Y&option=manufacturer%3A&page3
Could we use the robots.txt file to disallow these from showing as duplicate content? And do we need to put the whole URL in there, like:
Disallow: /*?sort=price&sort_direction=1&use_selected_filter=Y
If not, how far into the URL should be disallowed? So far we have added the following to our robots.txt:
Disallow: /?sort=title
Disallow: /?use_selected_filter=Y
Disallow: /?sort=price
Disallow: /?clearall=Y
Just not sure if they are correct. Any help would be greatly appreciated. Thank you, Kami
Intermediate & Advanced SEO | dellamoda
-
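Two notes on those rules: a plain Disallow line is a prefix match against the path-plus-query, so `Disallow: /?sort=title` only blocks URLs that literally start with `/?sort=title` and will not catch `/Designer-Pumps.html?sort=title`. To match a parameter anywhere in the URL, you need the `*` wildcard that Google supports, e.g. `Disallow: /*?sort=`. Below is a small sketch of Google-style rule matching that you can use to test rules against your URLs; it re-implements the matching locally and is not an official parser.

```python
import re

def rule_matches(pattern, path_and_query):
    """Google-style robots.txt matching: '*' matches any run of characters,
    '$' anchors the end of the URL; otherwise rules are prefix matches.
    Local re-implementation for testing rules, not an official library."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path_and_query) is not None

# The wildcard rule catches the filter URL; the prefix-only rule does not,
# because the path does not literally start with "/?sort=title".
print(rule_matches("/*?sort=price", "/Designer-Pumps.html?sort=price&sort_direction=1"))  # True
print(rule_matches("/?sort=title", "/Designer-Pumps.html?sort=title"))  # False
```

Also note that blocking crawling via robots.txt does not remove already-indexed duplicates; for filter URLs, a rel=canonical to the unfiltered page is usually the cleaner duplicate-content fix.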
Expiring URL seo
A buddy of mine runs a niche job board and is having issues with expiring URLs. We ruled out 301s, because a 301 is meant for content that has moved to another page or been replaced; we'd just be stacking duplicate content onto old URLs that will never be replaced. Rather, the postings have been removed and will never come back. So a 410 seems appropriate, but maybe we overlooked something. Any ideas?
Intermediate & Advanced SEO | malachiii
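The reasoning above holds: a 410 signals a deliberate, permanent removal, and Google tends to drop 410 URLs from the index somewhat faster than 404s, while a 301 to an unrelated page risks being treated as a soft redirect. A tiny sketch of the status decision for a job board; the dictionary shape and field names are hypothetical, not the actual board's data model.

```python
def status_for_job(job):
    """Pick an HTTP status for a job-posting page (dict shape hypothetical)."""
    if job.get("replacement_url"):
        # The posting was replaced or moved: a 301 passes signals along.
        return 301
    if job.get("expired"):
        # Gone for good with no successor: 410 tells crawlers the removal
        # is deliberate, so the URL is dropped faster than with a 404.
        return 410
    return 200
```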