Which URL should I use? Thanks!
-
I have a question about which URL to use. We are a Swedish website with the URL http://interimslösning.se/ (it contains the Swedish letter “ö”), so the URL can also be written as http://xn--interimslsning-3pb.se/.
Which of these URLs should I use for my backlinks: http://interimslösning.se/ or http://xn--interimslsning-3pb.se/? What is the difference between them for SEO? And is it good or bad to use a letter like “ö”, or other characters like that, in your URL?
I was thinking that maybe it is good to use the letter “ö” for local search optimization in Sweden, but I don't know.
Thanks in advance!
Greetings,
Paul Linderoth -
That makes sense to me.
-
Hi Guys,
Thanks for the great answers!
After reading your answers, I guess it is best that I change my URL from http://interimslösning.se/ to, for example, http://interimslosning.se/, and then 301 redirect http://interimslösning.se/ to the new URL without the letter “ö”?
That way I eliminate the letter “ö” and don't need to use an unfriendly URL like http://xn--interimslsning-3pb.se/
/Paul
-
Thanks for that point Russ, that is very relevant. I would then say find a better URL than either of them. I am not sure that a URL with seemingly random combinations of dashes is much better than the URL with the “ö”. It may make sense to get a new domain - unless the domain with all the dashes is well established. My impression from the post is that it was not.
-
This IS a question for search engine optimization. Many web services out there will not properly handle UTF-8 characters and will muck up the URL, causing it not to work at all and preventing users and Google from reaching your site via the link placed on another site. You might have already experienced this when trying to sign up somewhere with an email address that uses international characters. My recommendation is that you link build with the version not including the “ö” and make sure your site has the correct canonical tag pointing to the version with the “ö”.
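To make the equivalence concrete: the “ö” form and the “xn--” form are two spellings of the same hostname, and you can convert between them with Python's built-in idna codec (a quick illustration, not part of the answer above):

```python
# The Unicode hostname and its punycode ("xn--") form are the same domain;
# browsers convert the first into the second before doing the DNS lookup.
unicode_host = "interimslösning.se"

ascii_host = unicode_host.encode("idna").decode("ascii")
print(ascii_host)  # xn--interimslsning-3pb.se

# Decoding goes the other way, back to the readable form.
print(ascii_host.encode("ascii").decode("idna"))  # interimslösning.se
```

Both forms resolve to the same site, which is why the canonical tag is what tells Google which spelling to index.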
-
This is not so much a question of search optimization as of usability. You need to consider how easy your URL will be for users to remember, so that if they need to type it in from memory, or see it in an ad, it is easy to recall.
Generally, the shorter the better: less for people to remember. The second URL, with the double dash plus a single dash, is not very user friendly. I would say that the Swedish letter “ö” may be an issue for non-Swedish customers who do not regularly use that letter, but if your audience does, it should not be a problem, and I would go with http://interimslösning.se/
As far as localization, I have not seen anything about Google looking at the letters used in a URL for location targeting. Normally it gathers that from the domain extension, the language you use on your site, where the hosting server is located and, most importantly, what you set as the website location in Google Search Console. The Search Console setting is the most important one for localization.
Most importantly, once you settle on a domain name, do not use any other versions of it in anything that you publish. Set all pages on the alternate domain to 301 redirect to the proper page on the primary domain, just in case the other domain leaks out somewhere.
Cheers!
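The alternate-to-primary redirect described above is just a host swap that keeps the path and query string intact. A minimal sketch in Python (the domain names come from this thread; the helper function itself is hypothetical, not from any library):

```python
from urllib.parse import urlsplit, urlunsplit

# Hosts that should 301 to the primary domain (assumed set for this sketch).
PRIMARY = "interimslosning.se"
ALTERNATES = {"interimslösning.se", "xn--interimslsning-3pb.se"}

def redirect_target(url):
    """Return the primary-domain URL to 301 to, or None if no redirect applies."""
    parts = urlsplit(url)
    if parts.hostname in ALTERNATES:
        # Same path and query string, primary host.
        return urlunsplit(("http", PRIMARY, parts.path, parts.query, ""))
    return None

print(redirect_target("http://xn--interimslsning-3pb.se/kontakt?x=1"))
# http://interimslosning.se/kontakt?x=1
```

In practice this mapping would live in the web server config (e.g. an Apache RewriteRule) rather than application code; the point is that every alternate-domain page has exactly one primary-domain counterpart.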
Related Questions
-
Google is indexing bad URLS
Hi All, The site I am working on is built on WordPress. The plugin Revolution Slider was downloaded. While no longer utilized, it remained on the site for some time. This plugin began creating hundreds of URLs containing nothing but code on the page. I noticed these URLs were being indexed by Google. The URLs follow the structure: www.mysite.com/wp-content/uploads/revslider/templates/this-part-changes/
I have done the following to prevent these URLs from being created and indexed:
1. Added a directive in my .htaccess to 404 all of these URLs
2. Blocked /wp-content/uploads/revslider/ in my robots.txt
3. Manually de-indexed each URL using the GSC tool
4. Deleted the plugin
However, new URLs still appear in Google's index, despite being blocked by robots.txt and resolving to a 404. Can anyone suggest any next steps? Thanks!
Technical SEO | | Tom3_150 -
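For anyone in a similar spot, a robots.txt rule can be sanity-checked against a URL before filing removals with Python's standard robotparser (the paths below mirror the question; nothing site-specific is assumed):

```python
from urllib.robotparser import RobotFileParser

# Parse the rule from the question and test candidate URLs against it.
rules = """\
User-agent: *
Disallow: /wp-content/uploads/revslider/
"""
rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://www.mysite.com/wp-content/uploads/revslider/templates/foo/"))  # False
print(rp.can_fetch("*", "http://www.mysite.com/blog/"))  # True
```

Worth remembering that robots.txt blocks crawling, not indexing: Google can still index a blocked URL it discovers via links, which is why the 404s and removal requests matter here.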
The use of Markup language
Hi, We were thinking of adding schema markup to our site. We have been reading about it to understand the actual benefits of doing so (we have seen many brands not using it, including moz.com). So I have two questions:
1. Would you recommend using it for our site, www.memoq.com?
2. If yes, would it be better to create a snippet of code for our home page as an "organization" and then different snippets for our product pages as "products"?
Looking forward to your comments,
Technical SEO | | Kilgray0 -
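An “organization” snippet of the kind the poster describes is usually emitted as JSON-LD. A minimal sketch built with Python (the name and URL are guesses taken from the question, so verify them before use):

```python
import json

# Minimal schema.org Organization markup; a product page would use
# "@type": "Product" with its own fields instead.
org = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "memoQ",  # assumed from the domain in the question
    "url": "https://www.memoq.com",
}

# This JSON goes inside a <script type="application/ld+json"> tag on the page.
print(json.dumps(org, indent=2))
```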
Using the Google Remove URL Tool to remove https pages
I have found a way to get a list of 'some' of my 180,000+ garbage URLs now, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that, my robots.txt file and the URL Parameters tool, I'm hoping to see some change each week.
I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL at the front. For example, I add to the removal tool:
https://www.mydomain.com/blah.html?search_garbage_url_addition
On the confirmation page, the URL actually shows as:
http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition
I don't want to accidentally remove my main URL or cause problems. Is this the right way this should look?
And part 2 of my question: if the search description in Google for a page you want removed says the following in the SERP results, should I still go to the trouble of putting in the removal request?
www.domain.com/url.html?xsearch_...
A description for this result is not available because of this site's robots.txt – learn more.
Technical SEO | | sparrowdog1 -
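One way to read what that confirmation page is showing: the https URL appears to have been treated as a path under the http property. Parsing the displayed string makes the difference visible (a diagnostic sketch using the made-up domain from the question; no claim about how the tool works internally):

```python
from urllib.parse import urlsplit

# The string shown on the removal tool's confirmation page.
shown = "http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition"

parts = urlsplit(shown)
print(parts.netloc)  # www.mydomain.com
print(parts.path)    # /https://www.mydomain.com/blah.html
print(parts.query)   # search_garbage_url_addition
```

If that is what gets removed, it targets a nonsense path rather than the real https page, which is the poster's worry.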
Using RewriteRule - SEO Implications
Hi There, My client has a website (www.activeadventures.com) which they relaunched in April 2013. The company sells inbound tourism trips to New Zealand, South America and the Himalayas. Previously, the websites for these destinations were on their own domains (activenewzealand.com, activehimalayas.com, activesouthamerica.com). With the launch of the new website those domains were all retired (with 301 redirects put in place to the new site) and moved into subdirectories of the activeadventures.com domain (eg: activeadventures.com/new-zealand).
There has been no indication that this strategy has improved organic search results (based on analytics), and in my opinion this structure has been detrimental to their results. My opinion is based on the following: visitors come to the site with a specific destination in mind that they want to travel to. Thus, having the destination in the URL provides more immediate relevancy and should result in a higher CTR. I also feel that having the sites on their own URLs would provide a more concentrated theme for the destination-based search phrases.
The new site is a custom Joomla build and I want to find the easiest way to keep the current Joomla setup AND move the country-specific sections of the site back onto their original URLs. On the face of it, the easiest way to get this done is to use the htaccess file and "RewriteRule" to push all the relevant pages back onto their original domains. Obviously we will also make sure the existing 301s from the new site and the old sites point to this new structure.
My question is: are there any potential negative SEO implications of using RewriteRule in the htaccess file to achieve this? Many thanks in advance. Kind Regards
Technical SEO | | activenz
Conrad Cranfield0 -
Examples of sites using hreflang
Hi all, I'll soon be doing some work for a worldwide company who are launching a new site. The new site is a near clone of another of their sites in another country. Obviously I'll need to make use of rel="alternate" hreflang="x" on both sites. I've read all the Google documentation etc but was wondering if you guys could point me in the direction of a few sites which are currently implementing the tag successfully. Thanks in advance,
Technical SEO | | iProspect-Ireland0 -
Cyrillic letter in URL - Encoding
Hi all, We are launching our site in Russia. As far as I can see by searching Google, all sites have URLs in Latin letters. Is there a special reason for this? It seems that Cyrillic letters also work. My technical staff say that they might cause encoding problems. Can anyone give me some insight into this? Thanks in advance. / Kenneth
Technical SEO | | Kennethskonto0 -
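A concrete illustration for this one: Cyrillic behaves like any other non-ASCII text in URLs; the hostname gets a punycode form and the path gets percent-encoded UTF-8. A sketch with a generic example domain (пример means “example”; it is not the poster's site):

```python
from urllib.parse import quote, unquote

# Hostname: converted to its punycode form via the idna codec.
print("пример.ru".encode("idna").decode("ascii"))  # xn--e1afmkfd.ru

# Path: percent-encoded as UTF-8 bytes when sent over the wire.
path = quote("/статья")
print(path)           # /%D1%81%D1%82%D0%B0%D1%82%D1%8C%D1%8F
print(unquote(path))  # /статья
```

The "encoding problems" the technical staff worry about usually come from services that fail to apply these two conversions consistently, not from the URLs being invalid.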
URL rewriting from subcategory to category
Hello everybody! I have quite a simple question about URL rewriting from a subcategory to a category, yet I can't find any solution to this problem (due to my lack of deeper Apache knowledge). Here is my problem: we have two URL structures that cause duplicate-content problems:
1. www.website.lt/language/category/
2. www.website.lt/language/category/1/
Pages 1 and 2 are absolutely the same (both also return 200 OK). What we need is a 301 redirect from 2 to 1, without redirecting any deeper categories (like www.website.com/language/category/1/169/ redirecting to .../category/1/ or .../category/). Here are our .htaccess URL rewrite rules:
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&par3=$5&par4=$6&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&par3=$5&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/$ /index.php?lang=$1&idr=$2&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/$ /index.php?lang=$1&%{QUERY_STRING} [L]
There are other redirects that handle non-www to www and related issues:
RedirectMatch 301 ^/lt/$ http://www.domain.lt/
RewriteCond %{HTTP_HOST} ^domain.lt
RewriteRule (.*) http://www.domain.lt/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.domain.lt/$1/ [R=301,L]
At this moment we cannot solve this problem with rel canonical (due to our CMS limits). Thanks for your help, guys! If you need any other details on our code, just let me know.
Technical SEO | | jkundrotas0 -
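The pattern the poster needs has to match exactly one trailing "1/" after the category and nothing deeper. The expression can be checked outside Apache; here is a Python sketch of just the match logic (mod_rewrite directive syntax differs, but the regex is the same idea):

```python
import re

# Match /xx/category/1/ but not /xx/category/ or /xx/category/1/169/.
pattern = re.compile(r"^(/[^/]{1,3}/[^/]+/)1/$")

def canonical(path):
    """Strip the redundant trailing '1/' page marker, if present."""
    m = pattern.match(path)
    return m.group(1) if m else path

print(canonical("/lt/category/1/"))      # /lt/category/
print(canonical("/lt/category/1/169/"))  # /lt/category/1/169/
print(canonical("/lt/category/"))        # /lt/category/
```

Anchoring with both ^ and $ is what keeps the deeper .../1/169/ URLs untouched, which is the constraint the poster asked about.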
Special characters in URL
Hello everybody, my question focuses on special characters in URLs. I am working for a website that uses a lot of special entities in its URLs. For instance:
www.mydomain.com/mykeyword1-mykeyword2%2C-1%2Cpage1.html
I am about to make 301 redirect rules for all these URLs to clean ones, i.e.:
www.mydomain.com/mykeyword1-mykeyword2%2C-1%2Cpage1
would become:
www.mydomain.com/mykeyword1-mykeyword.html
I just wanted to know if anybody has already done this kind of "cleanup" and if I could expect a positive boost or not. Thanks0
Technical SEO | | objectif-mars
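As a footnote to this last question: the "special entities" here are just percent-encoded commas, which Python's urllib makes easy to confirm (a quick check; nothing about the poster's CMS is assumed):

```python
from urllib.parse import quote, unquote

# %2C is the percent-encoded comma, so the "entity" is an ordinary character.
print(unquote("mykeyword1-mykeyword2%2C-1%2Cpage1"))  # mykeyword1-mykeyword2,-1,page1
print(quote(",", safe=""))  # %2C
```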