How to Stop Scraping Content but Still Be a Hub
-
Hello SEOmoz members. I'm relatively new to SEO, so please forgive me if my questions are a little basic.
One of the sites I manage is GoldSilver.com. We sell gold and silver coins and bars, but we also have a very important news aspect to our site.
For the past 2-3 years we have been a major hub as a gold and silver news aggregator. About 1.5 years ago (before we knew much about SEO), we switched from linking to the original news sites to scraping their content and putting it on our site. The chief reason was that users would click outbound to read an article, see an ad for a competitor, and then buy elsewhere. We were trying to avoid this (a relatively stupid decision in hindsight).
We have since realized that the search engines are penalizing us for having this scraped content on our site, which I don't blame them for.
So I'm trying to figure out how to move forward from here. We would like to remain a hub for news related to gold and silver without being penalized by the search engines, but we also need to sell bullion and would like to avoid losing clients to competitors through ads on the news articles.
One of the solutions we are considering is using an iFrame to display the original URL, but within our own experience. An example is how trap.it does this (see attached picture). This way we can still control the experience somewhat while remaining a hub.
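For illustration, here is a minimal sketch of what that iFrame approach might look like in markup. The URL, class names, and dimensions are all hypothetical, not anything from our actual site:

```html
<!-- Hypothetical wrapper page: our own branding stays visible
     while the original article loads in a frame instead of
     being copied into our page -->
<div class="gs-news-wrapper">
  <header class="gs-branding">GoldSilver.com News</header>
  <iframe src="https://example-news-site.com/some-gold-article"
          title="Original article"
          width="100%" height="800">
  </iframe>
</div>
```

One caveat with this approach: many publishers block framing of their pages via the `X-Frame-Options` response header, so it wouldn't work for every source even if the SEO concerns were resolved.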
Thoughts?
Thank you,
nick
-
I honestly can't offer any short-term suggestions; it's a big challenge to know what the best short-term path is. Ultimately, you'll need to remove all of the scraped content. If you do that without replacing it, you won't see any gains in the short term; you may even see some short-term losses, as it's possible you're not being purely penalized.
-
Alan,
Thank you for your thoughts. I agree we need to change our strategy and move away from scraped content. Any technical workarounds we try (like iFrames) may work now, but ultimately we would just be delaying the inevitable.
Since that strategy will take a while to implement, what would you recommend for the shorter term?
-
Nick,
You're in a difficult situation, to say the least. iFrames were a safe bet a couple of years ago; however, Google has gotten better and better at discovering content contained in previously "safe" environments within the code, and it's only going to get better at it over time.
The only truly safe solution for a long term view is to change strategy drastically. Find quality news elsewhere, and have content writers create unique articles built on the core information contained in those. Become your own news site with a unique voice.
The expense is significant given you'll need full-time writers; however, with a couple of entry-level writers right out of college, or just a year or two into the content-writing/journalism path, you've got a relatively low cost of entry. The key is picking really good talent.
I was able to replace an entire team of 12 poorly chosen writers with 3 very good writers, for example.
The other reality is that you'll need to lose all the scraped content. It's got to go. You can't salvage it, or back-date newly written content around it, not at the volume you're dealing with. So you're going to have to earn your rankings all over again, but for real, value-added reasons.