Syndicated content outranks my original article
-
I have a small site and write original blog content for my small audience.
There is a much larger, highly relevant site that is willing to accept guest blogs and they don't require original content. It is one of the largest sites within my niche and many potential customers of mine are there.
When I create a new article I first post it to my blog, and then share it on G+, Twitter, FB, and LinkedIn.
I wait a day. By this time G has seen the links that point to my article and has indexed it.
Then I post a copy of the article on the much larger site. I have a rel=author tag within the article but the larger site adds "nofollow" to that tag. I have tried putting a link rel=canonical tag in the article but the larger site strips that tag out.
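For reference, the markup being attempted on the syndicated copy looks roughly like this (a sketch only; the URLs and profile name are placeholders):

```html
<!-- In the <head> of the syndicated copy: tells search engines the original
     lives on my blog. The larger site strips this tag out. -->
<link rel="canonical" href="https://www.myblog.example/original-article/" />

<!-- Authorship link inside the article body; the larger site adds "nofollow". -->
<a href="https://plus.google.com/+AuthorName" rel="author nofollow">My Name</a>
```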
So G sees a copy of my content on this larger site. I'm hoping they realize it was posted a day later than the original version on my blog. But if not will my blog get labeled as a scraper?
Second: when I Google the exact blog title I see my article on the larger site shows up as the #1 search result but (1) there is no rich snippet with my author creds (maybe because the author tag was marked nofollow?), and (2) the original version of the article from my blog is not in the results (I'm guessing it was stripped out as duplicate).
There are benefits for my article being on the larger site, since many of my potential customers are there and the article does include a link back to my site (the link is nofollow). But I'm wondering if (1) I can fix things so my original article shows up in the search results, or (2) am I hurting myself with this strategy (having G possibly label me a scraper)? I do rank for other phrases in G, so I know my site hasn't had a wholesale penalty of some kind.
-
Thanks, Tommy. That confirms what I thought. I wouldn't mind so much if the bigger site didn't nofollow my author tag but since they do then I'm getting little benefit from them other than exposure to their audience. And that is worth something, to be sure.
Maybe I'll post on their site for a day or two and then delete the post on their site (I have that ability) so that I get some exposure there but then the only copy of the article will be on my site after a couple of days.
-
Thanks, Egol. For my next few postings I will keep them on my own site and see what kind of rankings and traffic they get for a month or so. Then compare that traffic to the traffic I've seen from articles I've posted on the larger site.
Appreciate the input. I do want to build equity for my own site, but it's a trade-off against getting more exposure/customers on the bigger site. I am in this for the long haul, though, so I suppose tons of unique content on my own site will be valuable in the future.
-
Hi Mike,
I also had a similar experience and debated for a while before finally coming up with a solution.
If you are posting exactly the same content on your blog and on another blog, I believe that already creates duplicate content, even if you posted on your blog first. Here's how duplicate content works: if someone searches for your article title (like you did), Google will pull up the pages that best match the search. If G sees that the bigger site has the exact same article as your blog, it will use the bigger site in the results because (1) it probably has more backlinks, (2) it probably has more authority, and (3) its domain is probably older than your blog's.
One way to solve this is to use a canonical tag, but that doesn't work in your case because they remove it.
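One way to verify whether the larger site is really stripping the tag is to fetch the published copy and check for it programmatically. A minimal sketch using only Python's standard library (the sample HTML below is made up for illustration):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "link" and attr_map.get("rel", "").lower() == "canonical":
            if self.canonical is None:
                self.canonical = attr_map.get("href")

def find_canonical(html_text):
    """Return the canonical URL declared in html_text, or None if absent."""
    parser = CanonicalFinder()
    parser.feed(html_text)
    return parser.canonical

# Syndicated copy that kept the tag:
kept = '<html><head><link rel="canonical" href="https://myblog.example/post"></head></html>'
print(find_canonical(kept))  # https://myblog.example/post

# Copy where the host stripped it:
stripped = "<html><head><title>Copy</title></head></html>"
print(find_canonical(stripped))  # None
```

Running this against the live syndicated URL (after fetching its HTML) would confirm whether the canonical tag survives publication.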
Here is where you will have to debate and decide what works better for you.
-
Don't repost on the bigger site, so that your article can actually be found via search engines instead of the bigger site's copy. However, with this approach you will lose the article views from the bigger site, and you will lose the opportunity to reach viewers who will never visit your site.
-
Continue to post on the bigger site so that you get more views on your articles, reach people you might not otherwise reach if you only post on your blog, grow your audience, and get your name out. However, with this approach your website won't appear in search results, since the bigger site obviously has more authority than yours, and your site might get penalized for duplicate content. Alternatively, you can stop posting the article on your blog and post only on the bigger site to avoid duplicate content.
You will have to decide which scenario benefits you more.
OR
- Post on your website, but also create NEW and UNIQUE articles for the bigger site to increase views and, hopefully, traffic.
To answer your questions:
-
Yes, no rich snippet, probably because the author tag is nofollowed.
-
My explanation above of how duplicate content works probably answers this question.
Hope this helps.
-
I have experience republishing lots of content from universities and government agencies.
Their content on my site often outranks the same content on their site. It does not matter who publishes first: in most cases the content that I republish was on those other sites for weeks and months, sometimes years, before I republished it. What matters is which domain Google favors for that topic.
I get lots of links and traffic using their content.
As you get more and more duplicate content out there on other websites you increase your risk of getting hit with a Panda problem. For that reason, I have cut back on the amount of republishing that I do.
I never give my articles to other websites for republishing. In my opinion that feeds your competitors and creates new ones.
The only way that I would give one of my articles to another site is if that site has ENORMOUS traffic in comparison to mine and my goal is to "get the word out" about something. If you are republishing on other sites because you think you will get a link or a bit of temporary traffic, I believe that is a mistake and you would be better off building unique equity for your own site.