Competitor has the same site in multiple languages
-
Hey Moz,
I am working with a dating review website, and we have noticed that one of our competitors is basically making duplicates of their site on .com, .de, .co.uk, etc.
My first thought is this is basically a way to game the system but I could be wrong.
They are tapping into Google's geo results by including major cities in each state, i.e. "dating in texas", "dating in atlanta", but the content itself doesn't really change. I can't figure out exactly why they are ranking so much higher. For example, some third-party SEO tools give them a traffic value estimate of $500,000 monthly, whereas we are sitting around $2,000. So either the traffic estimates are grossly misrepresenting traffic volume, or they really are crushing it.
TLDR: Is geo-locating/translating sites a valid way to create backlinks? It seems a lot like a PBN.
-
1. Good!
2. You are confused for good reason. There has not been clear direction here for some time. If you use hreflang between the two sites, then (for the last several years at least) the content would not be seen as duplicative. You are inherently telling Google that the content is the same, just in different languages.
3. There is so much that goes into this, but I can tell you with years of experience under my belt that the numbers never tell the whole story.
-
Thank you!
-
We have decided that localizing is not worth it; it appears spammy, and we cannot offer curated content at that level.
-
I am still a little confused about hreflang. For example, let's say we have a .com and a .de website. Both are nearly identical, with the exception of the hreflang tags and a handful of product/pricing descriptions (for example, the US website might not list Germany-specific dating sites, whereas the German site will most likely include most major dating sites simply because they have the reach). Does Google see this as duplicate content? Or does hreflang indicate to Google that these should be treated as two different websites?
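For what it's worth, here is a minimal sketch of what reciprocal hreflang annotations for that hypothetical .com/.de pair might look like (example.com, example.de, and the /pricing path are placeholders, not your actual URLs). The same set of tags would go in the `<head>` of both versions of the page:

```html
<!-- Placed in the <head> of BOTH https://www.example.com/pricing
     and https://www.example.de/pricing -->
<link rel="alternate" hreflang="en" href="https://www.example.com/pricing" />
<link rel="alternate" hreflang="de" href="https://www.example.de/pricing" />
<!-- Optional fallback for visitors who match neither language -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/pricing" />
```

Each page references itself as well as every alternate, and the annotations must be reciprocal: if the .de page omits the tag pointing back to the .com page, Google may ignore the relationship and treat the pages independently.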
-
To sum this up, we are trying to determine how our competition has such a large keyword footprint when, as far as the numbers are concerned (page count, word count, etc.), we are basically on the same level.
-
This is a tough one, because there are arguments on both sides for why this kind of duplicate content should or should not be allowed. Think about an ERP SaaS company that needs to change its content just a bit between countries. They might build what looks like duplicate content, but with some pages present in one country's version and not another's.
In your example it doesn't seem like a great experience, but as logged-in users, visitors might be getting different content depending on their location.
To my second point: those outside tools are shit. Total shit. Don't trust those numbers. Do your own numbers line up with theirs?
Final point: Do not build and maintain duplicate content in hopes of getting links. It won't work over time. Anything can work for a short period of time, but in the end, they will figure it out. Trust me.