Concerned
-
Hi SEOmoz fans,
Hang on a minute, I sound like Rand; I've been watching too many WBFs.
OK, let me start. I'm currently doing all the marketing for a website which has both a .co.uk and a .ie domain, for legal reasons. The two sites have exactly the same content, and whenever I write new pages (which is on a regular basis) I make sure both are updated. From all my research there shouldn't be a duplicate content issue with this, as Google should recognize that it's the same site. However, I was ranking really well (no. 5) for a specific, very competitive keyword, and now the page being ranked for that keyword is a .pdf on the site, sitting at no. 68.
Now, I thought this was very strange, as you can imagine. I never do any black hat link building or anything like that; that's a NO NO for me. Anyway, I put the URL which was ranking well into the Google search box, and yes, it appeared, so no sign the URL has been banned. However, when I paste the first few paragraphs of that page (the one that was ranking well in Google.co.uk) into the Google search box, it's the .ie website that appears, not the .co.uk.
Can anyone help me out with some advice?
Kind Regards
-
Sorry Peter, my previous reply looks strange; I was using the iPad, not sure what happened. Anyway, what I meant to say was:
So just to clarify, the homepage of the website would show:
And product page would be:
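To illustrate (the tags themselves didn't survive the paste, so this is purely hypothetical; example.co.uk and example.ie are placeholder domains):

```html
<!-- Hypothetical illustration only; example.co.uk / example.ie are placeholder domains -->

<!-- Homepage of the .co.uk site: -->
<link rel="alternate" hreflang="en-ie" href="http://www.example.ie/" />

<!-- A product page on the .co.uk site (same tag, page-level URL): -->
<link rel="alternate" hreflang="en-ie" href="http://www.example.ie/products/widget/" />
```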
-
So just to clarify, the homepage of the website would show: And product page would be:
-
Argh - I'm sorry, yes. The hreflang="" value stays the same, but the href URL should be the cross-site version of that particular page's URL. As long as the URL structure stays the same, this shouldn't be too hard, but if you use different structures, it could be a pain. I'm editing my previous reply.
-
Just one more question
Example
If this is on a particular product page, does it have to be:
-
Thanks Peter, you have been a great help so far.
I will make these changes and let you know how I get on.
-
No - I'll be perfectly honest: I don't do a ton of international. The international SEOs I trust seem to think positively about the new tags, but we don't have a ton of data. The upside is that they're relatively easy to implement and they don't carry any real risk. The worst that happens is that it doesn't work.
My gut reaction is that there's regional confusion and Google is having a tough time reconciling duplicates. That's more in line with the inconsistent ranking you describe than a full-blown penalty would be.
-
Ah! fantastic.
Have you tried this before? Do you recommend putting this across the whole site?
Another thing I noticed: when I paste the first paragraph from a .co.uk webpage into Google.co.uk, it's the .ie webpage that appears. However, for another webpage on the .co.uk site, it's the .co.uk version that appears in Google.co.uk (hope that makes sense?). What I would say is that if I paste the URL of the page in question, the one that's not ranking, into Google.co.uk, it still appears.
-
The two sites should point at each other and use the region codes, so...
(1) The English site should have this tag:
(2) The Irish site should have this tag:
That way, whichever site Google hits, they're aware of the other site(s).
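The tag markup itself was stripped from the post, but based on the description it would look something like this (sketch only; example.co.uk and example.ie are placeholder domains):

```html
<!-- Sketch only; example.co.uk / example.ie are placeholder domains -->

<!-- (1) On the UK (.co.uk) site's pages: -->
<link rel="alternate" hreflang="en-ie" href="http://www.example.ie/" />

<!-- (2) On the Irish (.ie) site's pages: -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
```

Many implementations also add a self-referencing hreflang tag on each page, but the key point here is that the two sites cross-reference each other.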
-
Hi Peter,
The .ie website is not shown in Google.co.uk for the target keyphrase. However, what I did in Google.co.uk was paste in the first paragraph of the page that was ranked on page 1 for that target keyphrase, and it's the .ie website that appears; the .co.uk website is nowhere to be seen.
I have been doing some link building, but nothing excessive, and only on authoritative, industry-specific websites. I just don't feel it could be this, so the only thing left is that this webpage has been penalized for duplicate content, even though the .co.uk page was indexed before the .ie webpage.
The strange thing is, I am still ranking really well (top 5) for about 30 or so keywords, very competitive keywords at that. So why would Google penalize just that specific webpage and not the others? Arrrgh, this is really getting to me.
Do you recommend that I place this code on the .ie webpage:
Pointing to the .co.uk website?
-
This can get tricky. Rel=canonical passes link juice, but it could also prevent the .ie pages from ranking. Google is a bit inconsistent with this internationally; sometimes a non-canonical version will still rank if it's more relevant to the country/language of the query, but I'd hate to rely on that.
-
No. You use robots.txt to restrict pages from being crawled; rel=canonical passes link juice. However, I would also look into what Dr. Pete is suggesting.
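For comparison, a robots.txt block (which stops crawling outright, rather than consolidating signals the way rel=canonical does) would be a sketch like this; the path is just an example:

```
# robots.txt on the site whose pages you want kept out of the crawl
# (illustrative only; blocking here does NOT pass link juice)
User-agent: *
Disallow: /duplicate-section/
```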
-
Unfortunately, while you should theoretically be able to target .co.uk and .ie separately, Google can screw it up on occasion and treat them as duplicates. If pasting the copy brings up the .ie site on Google.co.uk, that's definitely a possibility. You could try the new hreflang approach; see this Google resource:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
It's basically for regional content where the language is the same (there are other variants, but that's a big one), since Google knows they don't always get it right.
It is also possible that the .co.uk page has been penalized and other content is just being brought in to fill the spot - since the PDF is at #68, that's also possible. Have you done any recent link-building pushes to this particular page?
-
Hi Donnie,
If I use a rel="canonical" on the .ie webpage, is this not telling Google that you do not want this page to rank?
-
Hmm, your link juice will flow through with a canonical tag. However, I don't think this is the problem in your case. I would experiment: try adding the tag and see if your results are back up in a few days. If not, take it down.
-
I don't want to use a rel="canonical" because I want the .ie website to rank well for all keywords in Google.ie, and at the moment this seems to be the case.
-
Yes, maybe the .pdf was always there.
All optimization tests were done before the pages went live.
Changes were made first and foremost for the user, and from the results I gave you, this has clearly been a success.
It was the main body text and structure that changed; header tags etc. all remained the same.
I checked Bing, and the URL in question is still on page 1 for the keyword.
-
Maybe the .pdf was always there, just unnoticed?
Perhaps it's something you changed on the page. Did you run an SEOmoz on-page optimization test?
What did you change on the page? Also, did you change any internal links pointing at that page? If it's none of these factors, it could also be an external linking factor.
-
Hi Donnie,
Thanks for your reply.
Yes, it was ranked no. 5, and now it's gone, replaced by the .pdf at position 68 for that phrase.
I have rewritten this page and the pages related to it. Since making these changes, Analytics shows that bounce rate has improved from 70% to 30%, average time on site has increased by 2 minutes, and page views have also increased. So from a user-experience standpoint it has worked as I imagined; with Google, it has not.
I have checked Google webmaster tools, no messages.
-
Hi Gary,
Do you have a rel="canonical" on the .ie version of the site that points to the .co.uk pages? It basically tells the bot that this site is a direct copy and the .co.uk one is the version to crawl. It may be that because the content is duplicated across two different domains, you are getting penalised for it.
More about rel="canonical" here: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
Also a WBF about cross-domain canonical links: http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
Hope this helps
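As a sketch of the cross-domain version (placeholder domain and path), the tag on a .ie page would look like:

```html
<!-- Placed in the <head> of http://www.example.ie/some-page/ -->
<link rel="canonical" href="http://www.example.co.uk/some-page/" />
```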
-
I am a little confused. What made you lose your spot-5 ranking? Did you move your page to a .pdf? How is the .pdf relevant to your keyword? Or was one page ranking at 5 and now it's gone, but you found your .pdf in spot 68 for that phrase?
Did you change anything on the page that was ranking, or on your site? Usually something causes a loss in rankings, especially when you go from spot 5 to nowhere to be found. Have you checked Google Webmaster Tools? There may be a message there.