Strange strategy from a competitor. Is this "Google Friendly"?
-
Hi all, We have a client from a very competitive industry (car insurance) who ranks first for almost every important and relevant keyword related to car insurance.
But there is always room to do better. A few days ago I found this: http://logo.force.com/
The competitor website is: http://www.logo.pt/
The competitor name is: Logo
What I found strange is that both websites are identical, except that the first is on a subdomain and has important links pointing to the original website (www.logo.pt).
So my question is: is this a "Google-friendly" (and fair) technique? And why does this competitor get such good results?
Thanks in advance!!
I look forward to hearing from you guys
-
Be very careful about making assumptions regarding competitors.
Just because you see one thing does not mean that one thing is either helping or hurting a site. SEO is a vast, complex environment. If a site has enough very strong signals across many areas, one or even a few very poorly executed things may not hurt the site. Or they may not hurt the site "until Google catches up with them".
Duplicate content, regardless of method (within a single site, across multiple domains, or across a mix of domains and subdomains), is never a true best practice. Ever. It's artificial, and Google most definitely takes the position that if you are attempting to "artificially" (in their view) manipulate rankings in a non-best-practices manner, that's a violation of their guidelines, policies, or TOS.
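If the mirror really is sanctioned, the usual remedy for cross-domain duplication is a rel="canonical" on the copy pointing back at the original (or a 301). A quick way to check whether the subdomain copy declares one is to parse its <head>. A minimal sketch using only Python's standard library; the function names are mine, and logo.pt is just this thread's example:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class CanonicalFinder(HTMLParser):
    """Collects href values of <link rel="canonical"> tags."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if a.get("rel", "").lower() == "canonical" and a.get("href"):
                self.canonicals.append(a["href"])


def canonical_points_home(mirror_html, original_host):
    """True if the mirrored page declares a canonical URL on the original host."""
    finder = CanonicalFinder()
    finder.feed(mirror_html)
    return any(urlparse(href).netloc.endswith(original_host)
               for href in finder.canonicals)
```

If a call like this returns False for the mirror, the duplicate is competing with the original rather than consolidating signals to it.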
-
Hi Lesley!
Thanks for your response.
The robots.txt file is exactly the same as the "original" website.
I thought the strategy might be to gain some benefit from sitting under a domain with strong DA (79).
And I still find it strange that it always ranks first for the most important keywords in this very competitive industry, but maybe it has something to do with the backlinks.
Thanks again!
-
Is there a chance that you could have found their dev site? Look at the source and the robots.txt: is it set to noindex and to disallow?
edit: Actually, in looking it up, it is something that Salesforce is doing. I think it would be considered bad; it's duplicated content. Another one that is hosted on the same server is
which is also
It looks like Salesforce is copying the websites for some reason.
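One quick way to test the dev-site theory is to run the subdomain's robots.txt through Python's standard-library parser and ask whether Googlebot is blocked. A rough sketch; the URL is just this thread's example, and a meta robots noindex in the page source would still need a separate check:

```python
from urllib.robotparser import RobotFileParser


def googlebot_blocked(robots_txt, url="http://logo.force.com/"):
    """True if the given robots.txt rules disallow Googlebot from fetching url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch("Googlebot", url)
```

If this comes back False (crawling allowed) and the pages carry no noindex, the mirror is a live, indexable duplicate rather than a hidden staging site.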