Geographical targeting with Magento
-
We have a Magento store with multiple stores/domains set up. There is really only one reason we have the multiple domains: we use an automatic GeoIP store switcher to send each customer to the right store, so that they pay the correct shipping, see the correct pricing, etc., plus a couple of small differences in the design templates. But all the content is identical.
So we have:
domain.com (main website)
domain.ca (where most other countries are directed to based on GEOIP)
domain.eu
Since the content is the same, what is the best strategy here? I looked at several options:
1. Custom canonical urls, making each page on the .ca and .eu use canonical url of the .com
2. Completely block the .ca and .eu from robots.
3. Leave it the way it is
-
Hello Maarten,
I think you should consider using rel="alternate" hreflang annotations, as discussed here. If you need more background, here is another great resource. And start following Aleyda Solis.
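For the setup described in the question (identical content on .com, .ca, and .eu), hreflang annotations tell Google which store to show in which market. A minimal sketch; the paths and language-region codes below are assumptions, since hreflang targets languages/countries rather than continents, so the .eu store would need one entry per country or language it actually serves:

```html
<!-- Placed in the <head> of every variant; each page lists all variants, itself included. -->
<!-- URLs and region codes are hypothetical examples, not from the question. -->
<link rel="alternate" hreflang="en-us" href="http://www.domain.com/some-page" />
<link rel="alternate" hreflang="en-ca" href="http://www.domain.ca/some-page" />
<link rel="alternate" hreflang="en" href="http://www.domain.eu/some-page" />
<link rel="alternate" hreflang="x-default" href="http://www.domain.com/some-page" />
```

Note that the annotations must be reciprocal: every variant has to reference every other variant, or Google may ignore them.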
Related Questions
-
Targeting bad bounce-rate pages in search console
We are seeing a sharp increase in Bounce Rate on the website via Google Search Console. Is it possible to drill down and find out which pages are causing this? And if so, is it possible to find out why?
Technical SEO | | abisti20 -
Duplicate content issue on Magento platform
We have a lot of duplicate pages (600 URLs) on our site (800 URLs total), built on the Magento e-commerce platform. We have the same products in a number of different categories to make it easy for people to choose which product suits their needs. If we enable the canonical fix in Magento, will it dramatically reduce the number of pages that are indexed? Surely with more pages indexed (even though they are duplicates) we get more search results visibility. I'm new to this particular SEO issue. What does the SEO community have to say on this matter? Do we go ahead with the canonical fix or leave it?
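The canonical fix the question refers to works by emitting a rel="canonical" link element on each product page that points at the product's primary URL, so the category-path duplicates consolidate their signals instead of competing. A sketch of the resulting head markup, with hypothetical URLs:

```html
<!-- Emitted on every variant of the product page, -->
<!-- e.g. on both /category-a/widget.html and /category-b/widget.html -->
<link rel="canonical" href="http://www.example.com/widget.html" />
```

The duplicate URLs typically drop out of the index over time, but the pages themselves remain reachable for shoppers, so category navigation is unaffected.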
Technical SEO | | PeterDavies0 -
Robots.txt and Magento
Hi, I am working on getting my robots.txt up and running and I'm having lots of problems with the robots.txt my developers generated: www.plasticplace.com/robots.txt. I ran it through a syntax-checking tool (http://www.sxw.org.uk/computing/robots/check.html) and this is what the tool came back with: http://www.dcs.ed.ac.uk/cgi/sxw/parserobots.pl?site=plasticplace.com. There seem to be many errors in the file. Additionally, I looked at our robots.txt in WMT and it said the crawl was postponed because the robots.txt is inaccessible. What does that mean? A few questions: 1. Is there a need for all the lines of code that have the "#" before them? I don't think it's necessary, but correct me if I'm wrong. 2. Furthermore, why are we blocking so many things on our website? The robots can't get past anything that requires a password anyhow, but again, correct me if I'm wrong. 3. Is there a reason it can't just look like this: User-agent: * Disallow: /onepagecheckout/ Disallow: /checkout/cart/ I do understand that Magento has certain folders that you don't want crawled, but is all of this necessary, and why are there so many errors?
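A minimal robots.txt along the lines the question proposes could look like the following. The two checkout paths come from the question itself; the extra Disallow lines are typical Magento additions, not a definitive list for this site, and the "#" lines are just comments, which crawlers ignore:

```
# Comments like this line are optional; they have no effect on crawling.
User-agent: *
Disallow: /onepagecheckout/
Disallow: /checkout/cart/
# Commonly blocked Magento paths (assumed; adjust per site):
Disallow: /customer/
Disallow: /catalogsearch/
```

Whether the longer developer-generated file is necessary depends on which of those paths actually exist and generate crawlable URLs on the site.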
Technical SEO | | EcomLkwd0 -
Should I make a new URL just so it can include a target keyword, then 301 redirect the old URL?
This is for an ecommerce site, and the company I'm working with has started selling a new line of products they want to promote. Should I make a new URL just so it can include a target keyword, then 301 redirect the old URL? One of my concerns is losing a little bit of link value from the redirect. Thank you for reading!
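If a new keyword-rich URL is created, the old one can be 301-redirected at the web-server level so that link signals pass to the new page. A minimal sketch in Apache .htaccess syntax, with hypothetical paths (Magento can also manage this through its built-in URL Rewrite tool):

```apache
# Permanent (301) redirect from the old product-line URL to the new one.
# Both paths are placeholders, not actual URLs from the question.
Redirect 301 /old-product-line http://www.example.com/new-target-keyword-line
```

Worth noting: a 301 passes most, though historically not quite all, link equity, so the concern in the question is reasonable but usually minor.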
Technical SEO | | DA20130 -
How many keywords should I target?
Hi there, I'm looking for advice from the community on how many keywords to target. What are the pros and cons of:
1. Focusing on the 40 keywords that we rank for already, with specific attention paid to those where we are on pages 2-5.
2. Spreading our link building / onsite optimisation work a little further, and continuing to target all 280 keywords on our list as and when it is appropriate to target them.
I'd love to hear what strategies people recommend. Thanks
Technical SEO | | HeatherBakerTopLine0 -
Targeting multiple keywords with index page
Quick keyword question... I just started working with a client who is ranking fairly well for a number of keywords with his index page. Right now he has a bunch of duplicate titles, descriptions, etc. across the entire site. There are 5 different keywords in the title of the index page alone. I am wondering if it is OK to target 3 different keywords with the index page, or if I should cut it down to 1. Think blue widgets, red widgets, and widget-making machines. I want each of the individual keywords to improve, but I don't want to lose what I have either. Any ideas? Thanks!
Technical SEO | | SixTwoInteractive0 -
GWT, URL Parameters, and Magento
I'm getting into the URL parameters in Google Webmaster Tools and I was wondering if anyone who uses Magento has used this functionality to make sure filter pages aren't being indexed. Basically, I know what the different parameters (manufacturer, price, etc.) are doing to the content: narrowing. I was just wondering what you choose after you tell Google what the parameter's function is. For narrowing, it asks "Which URLs with this parameter should Googlebot crawl?" and gives the following options:
1. Let Googlebot decide (default)
2. Every URL (the page content changes for each value)
3. Only URLs with value (may hide content from Googlebot)
4. No URLs
I'm not sure which one I want. Something tells me probably "No URLs", as this content isn't something a user will see unless they filter the results (and, therefore, it should not come up in a search for this page). However, the page content does change for each value. I want to make sure I don't exclude the wrong thing and end up with a bunch of pages disappearing from Google. Any help with this is greatly appreciated!
Technical SEO | | Marketing.SCG0 -
Global SEO Targeting
Hi, I have a website currently on the domain example.co.uk (.com is not available). I'm looking to enter other markets such as Brazil and Russia; obviously the content will need to change to suit the desired market/language. I'm looking for some information on best practice for entering foreign markets. I was thinking maybe to create individual sites for each location, e.g. example.br and example.ru. This way I could localise each site in terms of business directories, content, language, etc. Or I could have my example.co.uk site with various languages on it. Experience and suggestions are welcomed. Thanks.
Technical SEO | | Socialdude0