Syndicated content outranks my original article
-
I have a small site and write original blog content for my small audience.
There is a much larger, highly relevant site that is willing to accept guest blogs and they don't require original content. It is one of the largest sites within my niche and many potential customers of mine are there.
When I create a new article I first post to my blog, and then share it with G+, twitter, FB, linkedin.
I wait a day. By this time G has seen the links that point to my article and has indexed it.
Then I post a copy of the article on the much larger site. I have a rel=author tag within the article but the larger site adds "nofollow" to that tag. I have tried putting a link rel=canonical tag in the article but the larger site strips that tag out.
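One quick way to confirm what the larger site is doing is to fetch both copies of the article and compare their canonical tags. The sketch below is a minimal, hypothetical check using only Python's standard-library `html.parser`; the URLs are placeholders, not real addresses, and in practice you'd fetch the live HTML with `urllib.request` instead of using inline strings.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            if self.canonical is None:
                self.canonical = a.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

# Original article declares its own URL as canonical:
original = ('<html><head>'
            '<link rel="canonical" href="https://myblog.example/post">'
            '</head><body>Article text</body></html>')

# Republished copy after the larger site strips the tag:
republished = ('<html><head><title>Same article</title>'
               '</head><body>Article text</body></html>')

print(find_canonical(original))     # https://myblog.example/post
print(find_canonical(republished))  # None
```

If the second call returns `None`, the canonical hint really is gone from the republished copy, and Google has to pick a version on its own.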
So G sees a copy of my content on this larger site. I'm hoping they realize it was posted a day later than the original version on my blog. But if not will my blog get labeled as a scraper?
Second: when I Google the exact blog title I see my article on the larger site shows up as the #1 search result but (1) there is no rich snippet with my author creds (maybe because the author tag was marked nofollow?), and (2) the original version of the article from my blog is not in the results (I'm guessing it was stripped out as duplicate).
There are benefits for my article being on the larger site, since many of my potential customers are there and the article does include a link back to my site (the link is nofollow). But I'm wondering if (1) I can fix things so my original article shows up in the search results, or (2) am I hurting myself with this strategy (having G possibly label me a scraper)? I do rank for other phrases in G, so I know my site hasn't had a wholesale penalty of some kind.
-
Thanks, Tommy. That confirms what I thought. I wouldn't mind so much if the bigger site didn't nofollow my author tag but since they do then I'm getting little benefit from them other than exposure to their audience. And that is worth something, to be sure.
Maybe I'll post on their site for a day or two and then delete the post on their site (I have that ability) so that I get some exposure there but then the only copy of the article will be on my site after a couple of days.
-
Thanks, Egol. For my next few postings I will keep them on my own site and see what kind of rankings and traffic they get for a month or so. Then compare that traffic to the traffic I've seen from articles I've posted on the larger site.
Appreciate the input. I do want to build equity for my own site, but it's a trade off with getting more exposure/customers on the bigger site. I am in this for the long haul, though, so I suppose tons of unique content on my own site will be valuable in the future.
-
Hi Mike,
I also had a similar experience and went back and forth for a while before settling on an approach.
If you are posting exactly the same content on your blog and on another blog, I believe that already creates a duplicate content situation, even if you posted on your blog first. Here's how duplicate content works: when someone searches for your article title (like you did), Google pulls up the websites that best match the search. If G sees that the bigger site has the exact same article as your blog, it will show the bigger site in the results because 1) it probably has more backlinks, 2) it probably has more authority, and 3) its domain is probably older than your blog's.
One way to solve this is the canonical tag, but that doesn't work here since they strip it out.
Here is where you will have to debate and decide what works better for you.
-
Don't repost on the bigger site, so that your article can actually be found via search engines instead of the bigger site's copy. However, with this approach you lose the article views from the bigger site, and you lose the opportunity to reach viewers who will never visit your site.
-
Continue to post on the bigger site so that you get more views on your articles, reach people you might never reach by posting only on your blog, grow your audience, and get your name out. However, with this approach your website won't appear in the search results, since the bigger site obviously has more authority than yours, and your site might get penalized for duplicate content. Alternatively, you could stop posting the article on your blog and post only on the bigger site to avoid duplicate content altogether.
You will have to decide which scenario benefits you more.
OR
- Post on your website, but also create NEW and UNIQUE articles for the bigger site to increase views and, hopefully, traffic.
To answer your questions
-
Yes, there's no rich snippet, probably because the larger site nofollows the author tag.
-
My explanation above of how duplicate content works probably answers that question.
Hope this helps.
-
-
I have experience republishing lots of content from universities and government agencies.
Their content on my site often outranks the same content on their site. It does not matter who publishes first. In most cases, the content I republish had been on those other sites for weeks, months, sometimes years before I republished it. What matters is which domain Google favors for that topic.
I get lots of links and traffic using their content.
As you get more and more duplicate content out there on other websites you increase your risk of getting hit with a Panda problem. For that reason, I have cut back on the amount of republishing that I do.
I never give my articles to other websites for republishing. In my opinion that feeds your competitors and creates new ones.
The only way that I would give one of my articles to another site is if that site has ENORMOUS traffic in comparison to mine and my goal is to "get the word out" about something. If you are republishing on other sites because you think you will get a link or a bit of temporary traffic, I believe that is a mistake and you would be better off building unique equity for your own site.