Competitors and Duplicate Content
-
I'm curious to get people's opinion on this.
One of our clients (Company A) has a competitor that's using duplicate sites to rank. They're using "www.companyA.com" and "www.CompanyAIndustryTown.com" (actually, several variations). It's basically duplicate content, with maybe a town name inserted or changed somewhere on the page. I was always told that this is not a wise idea. They started doing this in the past month or so when they had a site redesign. So far, it's working pretty well for them. So, here are my questions:
-Would you address this directly (report to Google, etc.)?
-Would you ignore this?
-Do you think it's going to backfire soon?
There's another company (Company B) that's using another practice: using separate pages on their domain to address different towns, and using those as landing pages. Similar, in that a lot of the content is the same, just some town names and minor details changed. All on the same domain, though. Would the same apply to that?
Thanks for your insight!
-
The only long-lasting way to rank for location-specific pages is to offer truly unique content on those pages, and build unique links to those pages.
The two methods you mentioned here, using near-duplicate sites and pages, may work for a short time or in non-competitive niches. They may also work somewhat if a very strong link profile is backing them up... but in general these sorts of tricks usually result in a drop in rankings. If not now, then during an upcoming algorithm change.
Oftentimes, misguided webmasters think they are doing the right thing in launching these sites and pages, with no ill intent. Unless the pages are obviously spam or doorway pages, then in my opinion it's probably not worth reporting them to Google, but that decision is of course best left to each individual.
Read more about doorway pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Consider how Yelp has hundreds of pages about dentists, at least one page for every major city in America. Although the pages are similar, they are each filled with unique content and all have unique links pointing to them. Each delivers a similar message, but provides unique value based on that particular location.
Add unique value to each location specific page, and you're doing great.
-
Unfortunately, this isn't a method likely to work.
Most of the time, if you insert canonical tags on near-duplicate pages, and Google interprets those canonicals correctly, then it tends to index and rank the page that the canonical points to. So all of those other pages would have little or no search engine visibility whatsoever.
Not a good technique if you're trying to rank individual pages.
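For reference, the mechanics being described look like this: each near-duplicate city page carries a canonical tag in its `<head>` pointing at the one page you want indexed (the URLs here are made up purely for illustration):

```html
<!-- In the <head> of a near-duplicate city page, e.g. example.com/plumbing-springfield/ -->
<!-- Signals to search engines which version of the content is preferred -->
<link rel="canonical" href="https://www.example.com/plumbing/" />
```

Note that this is exactly why the city pages themselves stop ranking: the canonical hands their visibility to the target page.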
-
So ARE you suggesting that for local city pages that you add the canonical tag to point to the home page?
I guess I'm a little confused on this, as Adam is.
Can you explain your thoughts behind this?
-
So let me clarify then: if they have (on the same domain) multiple pages with near-duplicate content, mostly changing the names of cities, but use rel="canonical", they will still have the SEO benefit of ranking for different towns, but it won't be seen as duplicate content?
And then the multiple domain situation...that's just a wait and see.
-
The pages with city-specific information but similar content are pretty much the perfect case for a canonical tag. If they haven't been penalized, this is probably the method they are using to host the same content.
-
Here is an example of sites that have been using duplicate content with only a few words changed:
http://www.seomoz.org/q/duplicate-exact-match-domains-flagged-by-google-need-help-reinclusion
-
Having multiple sites with duplicate content is a bad idea, as it affects your search engine rankings. The company is likely using bad SEO practices, and soon Google's bots will pick this up and the domain will get penalized.
You can report to Google, but in most cases Google picks up sites that are using bad SEO techniques.
There is no harm in using separate pages on a domain to show that a business operates in different towns, as this helps the site get found in local searches. But if that content is again duplicated with only a few words changed, Google will pick this up.
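If you want a rough sense of how close two town pages really are, a quick similarity check can flag near-duplicates before Google does. A minimal sketch using Python's standard library (the page text here is invented for illustration; real pages would need to be fetched and stripped of HTML first):

```python
# Rough near-duplicate check between two hypothetical "city" landing pages.
from difflib import SequenceMatcher

page_a = "Acme Plumbing offers drain cleaning and water heater repair in Springfield."
page_b = "Acme Plumbing offers drain cleaning and water heater repair in Shelbyville."

# ratio() returns a 0..1 similarity score; values near 1.0 suggest
# the pages are near-duplicates with only a few words swapped.
ratio = SequenceMatcher(None, page_a, page_b).ratio()
print(f"similarity: {ratio:.2f}")
```

A score this high across a whole set of town pages is a good sign each page needs genuinely unique content, not just a swapped city name.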
Always remember Content is KING!