Syndicated content outperforming our hard work!
-
Our company (FindMyAccident) is an accident news site. Our goal is to roll our reporting out to all 50 states; currently, we operate full-time in 7 states.
To date, the largest expenditure is our writing staff. We hire professional
journalists who work with police departments and other sources to develop written
content and video for our site. Our visitors also contribute stories and/or
tips that add to the content on our domain. In short, our content/media is 100% original.

A site that often appears alongside us in the SERPs in the markets where we work full-time is accidentin.com. It syndicates accident news and offers little original content. (It also allows users to submit their own accident stories, and those entries index quickly and are sometimes viewed by hundreds of people in the same day. What's perplexing is that these entries are isolated incidents with little to no news value, yet they do extremely well.)
(I don't put much stock in Quantcast figures, but accidentin does use their measurement pixel, and the figures indicate they receive up to 80k visitors a day in some instances.)
I understand that it's common to see news sites syndicate from the AP, etc., and traffic accident news is not going to have a lot of competition (in most instances), but the real shocker is that accidentin will sometimes appear as the first or second result, above the original sources.
The question: does anyone have a guess as to what is making it perform so well?
While looking at their model, I'm wondering whether we'd be silly not to syndicate news in the states where we don't have staff on the ground. It seems we could attract more traffic by setting up syndication in our vacant states.
OR
Is our competitor's site bound to fade away?
Thanks, gang, hope all of you have a great 2013!
Wayne
-
Basically, Google treats syndicated content and duplicate content differently. If the competitor you're describing is following the best practices for syndicated content, and Google sees their site or page as more prominent (because of more relevant/related content on the domain, better optimization, popularity, etc.) and more relevant to the keywords searched than the original creator or the other syndication partners, then Google will show the content from that partner's page rather than the original creator's page.

And no, as long as they follow the best practices for syndicated content, they won't have any problem. But it could happen that in the future some other syndication partner is given more prominence if that partner's page leverages the content better, or even that the original creator is given more prominence if they do a good job of optimizing their syndicated content strategy.
As far as syndicated content goes, Google says this:
“If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer.”
So, in a nutshell: there are no penalties for properly syndicated content; Google simply decides which page to display based on its prominence and adherence to best practices. But if they are not following the best practices for content syndication, then Google will start to see their pages as duplicates, and then it's a different story.
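For reference, those "best practices" mostly boil down to markup on the republished copy. A minimal sketch of what a syndication partner's page might carry (the URL is hypothetical; the original creator would want to confirm partners actually do this):

```html
<!-- On the syndication partner's copy of the article: point the canonical
     back to the original (cross-domain rel=canonical). URL is hypothetical. -->
<link rel="canonical" href="https://www.findmyaccident.com/ohio/some-accident-story" />

<!-- Or, if the partner prefers, keep the copy out of the index entirely
     while still letting crawlers follow its links: -->
<meta name="robots" content="noindex, follow" />
```

Note that a cross-domain canonical is a hint rather than a directive, which is consistent with the Google quote above: they'll still show whichever version they think is most appropriate.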
BTW, here is a post that will be of help to you which talks about how the original creators of the content can leverage it:
http://www.smashingmagazine.com/2012/06/28/content-creators-benefit-from-new-seo/
-
"The question: does anyone have a guess as to what is making it perform so well?"
Your hard work.
Stop allowing them to use your content and they should not appear in your SERPs.
-
The question: does anyone have a guess as to what is making it perform so well?
You have a stronger link profile, but I think they are winning the SERPs because they post "Recent" links on their homepage that point to the latest news and user submissions. This lets crawlers discover new submissions quicker, gets their homepage crawled more often, and helps them rank quicker/better because of the Query Deserves Freshness (QDF) factor.
I recommend you try doing the same thing and see if that helps you.
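If you try that, one complementary way to get fresh stories discovered quickly is a news-style sitemap (this is a general suggestion, not something the competitor necessarily does). A sketch with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.findmyaccident.com/ohio/example-crash-story</loc>
    <news:news>
      <news:publication>
        <news:name>FindMyAccident</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2013-01-04</news:publication_date>
      <news:title>Example accident headline</news:title>
    </news:news>
  </url>
</urlset>
```

Pinging this sitemap (or just listing new URLs in a frequently crawled sitemap) works toward the same goal as their homepage "Recent" links: getting fresh URLs in front of the crawler fast.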
--
I also found only 5 instances of your articles being sourced - https://www.google.com/search?q=site:accidentin.com+intext%3Afindmyaccident.com
What kinds of keywords are they outranking you for? Do you have an RSS feed, or how are they scraping your content?
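One way to answer the "how are they scraping you" question is to check which of your headlines show up on their pages. A rough sketch (the fetching of your RSS feed and their pages is left out, and the matching is deliberately naive; all example data is hypothetical):

```python
import re

def normalize(text):
    """Lowercase and strip punctuation/markup characters for fuzzy matching."""
    return re.sub(r"[^a-z0-9 ]+", "", text.lower()).strip()

def republished_titles(our_titles, their_page_text):
    """Return the subset of our article titles that appear (after
    normalization) in a page scraped from the other site."""
    haystack = normalize(their_page_text)
    return [t for t in our_titles if normalize(t) in haystack]

# Hypothetical example:
ours = ["Two-car crash closes I-75 near Dayton", "Pedestrian injured on Main St"]
page = "<h2>Two-Car Crash Closes I-75 Near Dayton</h2> syndicated from partner"
print(republished_titles(ours, page))  # → ['Two-car crash closes I-75 near Dayton']
```

In practice you'd feed this your RSS item titles and the HTML of their recent pages, then eyeball the matches to see how fast (and how completely) your stories are being picked up.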
--
In general, scraper sites are not supposed to do well and will probably lose value over time, but I've seen several examples where they perform really well.
Cheers & Good Luck,
Oleg