Google doesn't rank the homepage for the main keyword in 2 countries with the same language; the rest of the pages have no problem
-
Hello,
Two identical websites, two countries, same language:
http://www.lavistarelatiegeschenken.nl / http://www.lavistarelatiegeschenken.be
The main keyword "relatiegeschenken" has been in the top 10 in the Netherlands (steady position for 2 years), but in Belgium it is not even in the top 150 for that keyword, while other keywords have good positions. That's so strange.
I didn't understand it, and now everything has suddenly turned around: the main keyword "relatiegeschenken" is no longer in the top 10 in the Netherlands (it's gone, while other keywords still have good positions), and the main keyword is suddenly in the top 10 in Belgium, where it wasn't for 2 years. Other pages are still OK.
They are exactly the same websites in the same language, so duplicate content.
But my programmer told me the settings in Google Webmaster Tools are right, so no problem with duplicate content?
I really don't understand it: first the main keyword was in the top 10 in the Netherlands and not in Belgium; now it has changed, and it is in the top 10 in Belgium and not findable in the Netherlands.
Maybe a problem in the code, because the websites are identical and active in two different countries with the same language?
There is no penalty message in WMT, and no spam links.
This week I deleted two strong links that were, according to Link Detox, bad links.
I can't find a solution, but it's a really important keyword that my customer wants back in the top 10 in the Netherlands, like it was. All other positions and visitor numbers are the same.
I had this before with the Belgian site: Google also didn't rank the homepage for the main keyword there. But now Google suddenly shows it in the top 10 in Belgium.
It has turned around.
Kind regards,
Marcel
-
Hi Marcel,
Content duplication shouldn't be an issue if you are using ccTLDs targeted to each country (as you are already doing with the Netherlands and Belgium). The geotargeting feature in Google Search Console is disabled when you use ccTLDs, because they are automatically geotargeted to the country of the extension. In fact, this is the ideal way to internationalize your site if you're targeting countries (rather than languages), and it gives Google the strongest possible signal that you are targeting different countries, as in this case.
Nonetheless, you should additionally implement hreflang annotations (https://support.google.com/webmasters/answer/189077?hl=en) to avoid the situation where, due to higher authority (a stronger link profile), one of the domains ranks instead of the other in the non-relevant country. I've seen that you are already including hreflang annotations on your sites' pages; however, they are not correctly implemented, because you are only pointing to the other country's URL and specifying its language and country.
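To make that concrete, a sketch of what the annotations on both homepages could look like, assuming Dutch (`nl`) as the language for both countries; note that each page carries a self-referencing tag plus its alternate:

```html
<!-- On http://www.lavistarelatiegeschenken.nl/ -->
<link rel="alternate" hreflang="nl-nl" href="http://www.lavistarelatiegeschenken.nl/" />
<link rel="alternate" hreflang="nl-be" href="http://www.lavistarelatiegeschenken.be/" />

<!-- And the same pair on http://www.lavistarelatiegeschenken.be/ -->
<link rel="alternate" hreflang="nl-nl" href="http://www.lavistarelatiegeschenken.nl/" />
<link rel="alternate" hreflang="nl-be" href="http://www.lavistarelatiegeschenken.be/" />
```

Every indexable page pair needs its own equivalent set, each page listing both its own URL and its alternate.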
You always need to specify the language and country of the page carrying the annotation as well as of its alternative (check the specification I linked above). Please read https://mza.seotoolninja.com/blog/using-the-correct-hreflang-tag-a-new-generator-tool and use this tool to see the type of tags you need to add: http://www.aleydasolis.com/en/international-seo-tools/hreflang-tags-generator/. This is how you're going to avoid showing a result for the NL in BE and vice versa.
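Beyond generating the tags, you can sanity-check that the annotations are reciprocal (every alternate must link back, and every page must reference itself). A rough sketch of such a check, purely hypothetical helper code, not any library's API, using regex extraction that assumes the attributes appear in rel / hreflang / href order:

```python
import re

# Extract (hreflang, href) pairs from <link rel="alternate" ...> tags.
# Simplified: assumes attribute order rel, hreflang, href within each tag.
HREFLANG_RE = re.compile(
    r'<link[^>]+rel=["\']alternate["\'][^>]+'
    r'hreflang=["\']([^"\']+)["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def hreflang_entries(html):
    """Return a {hreflang_value: href} dict for one page's HTML source."""
    return dict(HREFLANG_RE.findall(html))

def check_reciprocal(page_url, html_by_url):
    """Flag a missing self-reference or missing return tags for one page.

    html_by_url maps each URL to its fetched HTML source.
    """
    own = hreflang_entries(html_by_url[page_url])
    problems = []
    if page_url not in own.values():
        problems.append(f"{page_url}: no self-referencing hreflang tag")
    for alt_url in own.values():
        if alt_url == page_url:
            continue
        back = hreflang_entries(html_by_url.get(alt_url, ""))
        if page_url not in back.values():
            problems.append(f"{alt_url}: no return tag to {page_url}")
    return problems
```

Run it over each URL pair on both domains; an empty list means the annotations on that pair are consistent.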
However, the problem you describe sounds like a loss of organic search visibility caused by something other than internationalization: being affected by an update, a configuration change, an issue on your site, etc. I checked your SEO visibility profile on SEMrush for NL (https://www.semrush.com/nl/info/lavistarelatiegeschenken.nl+(by+organic_positions)?positions=lost), which shows a few keywords with a negative trend and an overall negative visibility trend for the site (https://www.semrush.com/nl/info/lavistarelatiegeschenken.nl+(by+organic)), specifically since January.
My recommendation is to run a full SEO audit instead of focusing only on the international configuration, which is very unlikely to have caused this (unless you had released a new international version that wasn't correctly targeted, which is not your case). In particular, verify whether the drop coincides with a recent update (in January there were a few, which you can see here: https://mza.seotoolninja.com/google-algorithm-change and https://www.seroundtable.com/google-fluctuations-continue-likely-not-penguin-21489.html) and whether you are suffering from content optimization issues, for example, that could have triggered this.
If you have any other questions, just let me know!
-
Two identical websites in different countries with the same content in the same language; everything is OK, only the homepage has a problem.
We did the settings in Google Webmaster Tools and added the rel="alternate" hreflang="x" tags.
So only 2 solutions:
-
We can create a URL, http://www.lavistarelatiegeschenken.nl/relatiegeschenken, and build a link profile with good links to it. In the end Belgium will find this page for the main keyword, and in the Netherlands the homepage will come back.
-
Create unique content on the category pages and also in the navigation, so Google will see these as two different companies. Product pages can be the same in both countries.
That was the solution, I hope.
-
-
Hello Chris,
Thanks for the answer !
In both Holland and Belgium all subpages have good rankings, on both .nl and .be, except for the homepage, which only ranks in one country at a time.
We wanted people in Belgium to see our Belgian site and people in Holland the Dutch website, so we made those settings in Google Webmaster Tools.
But then we had the duplicate content issue, which we solved by implementing the rel="alternate" hreflang="x" tag for both countries.
In my opinion it must be possible; I also see that a competitor has an identical website but ranks in the top 10 in both countries.
As said, the subpages rank the same in both countries; it's only the homepage:
https://www.google.be/?gws_rd=ssl#q=kaasprikkers
https://www.google.nl/?gws_rd=ssl#q=kaasprikkers
We have also tried to separate both sites so there is no connection between the websites: their own virtual address and phone number, Google Places, no links to each other, their own social media, no dofollow outgoing backlinks, etc., so that Google will see them as different companies.
I have deleted some toxic links on the .nl domain with Link Research Tools; maybe that has something to do with it.
Answer 1: .be ranks better on google.be and .nl ranks better on google.nl.
Answer 2: I could make the navigation content unique, but there are so many products that this is impossible. As said, a competitor also has an identical website and ranks in the top 10 in both countries on the main keyword "relatiegeschenken", with different domains.
Good-quality links to the .nl domain are on the way.
It's strange that it's only the homepage and not the subpages, since the homepage has different, unique content while the subpages all have duplicate content.
I hope somebody knows the answer; I have never had this before.
Grtz
Marcel
-
Hi Marcel,
I don't know Dutch at all, so I may have missed a few things, but from what I can see there are a few elements that may be causing this.
Your .nl site has the stronger link profile with 197 referring domains and a Domain Rating of 48 compared to 36 referring domains and a Domain Rating of 37 for the .be site.
As you pointed out, the two sites are almost exact duplicates of each other as well. Unless I'm missing something, there is nothing to mitigate this duplication so Google is typically going to rank the strongest of the two duplicates and almost ignore the other.
So, as to why you were seeing the .nl site ranking before and not the .be one: I would expect this is because they're both duplicates and .nl was the strongest.
As for why this has reversed now: I can see you've dropped a lot of links and referring domains from the .be domain recently. Perhaps this was an intentional removal of low-quality backlinks? If so, it will have improved the strength of that profile.
On the other hand, the .nl site has picked up a large volume of links recently but only a handful of new referring domains, which suggests these new linking domains are very low quality. Since you seem to have lowered the quality of the .nl profile and improved .be's, I would expect the situation to reverse, which is exactly what has happened.
Suggestions to Fix the Problem
Assuming I am correct, there are a couple of ways to go about it:
1. Use only one website instead of two. I realise they're for different countries, but the two locations are only 90 minutes apart. This could be dealt with by having two location pages on the one site. It removes the duplication you're currently struggling with and means you only have to build a backlink profile and content for one site instead of two.
2. Alternatively, if you have a large budget or a lot of time available, you could clean up those low-quality new links to your .nl site and start writing unique content for one of the domains so they're no longer duplicates.
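If you went the single-site route, the retired domain could 301-redirect page-for-page to the one you keep, so its link equity isn't lost. A minimal sketch, assuming an Apache server with mod_rewrite and redirecting .be to .nl purely as an illustration (you could equally keep .be):

```apache
# Hypothetical .htaccess on the .be domain:
# send every URL to its equivalent path on the .nl site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?lavistarelatiegeschenken\.be$ [NC]
RewriteRule ^(.*)$ http://www.lavistarelatiegeschenken.nl/$1 [R=301,L]
```

The path-preserving capture means each .be subpage redirects to the matching .nl subpage rather than dumping everything on the homepage.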
So long as Google is seeing 2 identical websites, you're never likely to see them both rank well at the same time. Quality signals may go up and down over time which sees them trade rankings, but you're never going to see them both at #1 while they're duplicates of each other.
I hope that helps!
-
Sorry, I meant two identical websites in two countries.