Best way to start a fresh site from a penalized one
-
Dear all,
I have been dealing with a penalized domain (Penguin and Panda): hundreds of spammy links (disavowed with no success), thin content partially resolved, and so on.
I think the best option is to start with a fresh new domain, but we want to reuse some of the well-written content from the old (penalized) site.
To do this, I will mark the source (penalized) pages as NOINDEX and move their content to the fresh new domain.
Question: do you think this is a safe approach, or do you know of a better strategy?
I'd appreciate your point of view.
Thank you
-
Hi Claudio,
To the question of "is it dangerous to start with similar content to the old site", I would say that it's very hard to tell. Some sites in some niches all have very similar content (think of real estate aggregator sites in the same cities: it's not as if they have access to different properties on the same market, so they generally list the same houses for sale and rent at any one time). However, Google is very adept at processing text to understand whether it has been recycled or "spun" from other content it has seen before. If the original content came from a severely penalised website, re-using it in this manner would definitely not be risk-free.
You would probably also want to take the old site offline completely as opposed to simply noindexing its pages if you were to do this.
Google is good at spotting very "similar" content, because content spinning was such a popular way to create content in previous years. If you can re-work your existing content to a fairly different length (shorter or longer), give it a different paragraph structure, and place it on a new site that is very dissimilar to the old one in terms of structure, this may work out well. I cannot say that this is risk-free, however, for all the reasons Casey has already brought up.
-
Dear Casey,
The new domain is on a different C-class IP, the WHOIS info is different and even locked as private, the Webmaster Tools and Google Analytics accounts will be different, the design will be different, I have planned to upload only a few products (pages) to start, and the site will be blocked by robots.txt at first.
But my question is: "is it dangerous to start with similar content to the old site?" (Some pages have great, well-written content.)
At this point I have been working on the old site for two years and traffic is recovering too slowly, so our time has run out. This is why we want to start a new domain using some of the old pages (previously marked as NOINDEX on the old site).
Thank you for your time and knowledge
-
Claudio, I've always been inclined to believe the following:
"If Google CAN know something about your site, assume they DO know something about your site."
So in your case, yes, there is always going to be a danger that Google will see you as the owner of both sites (the penalized domain and the new one) and eventually move any penalties from one site to the other. Now, you can certainly minimize this possibility by doing the following:
- Keeping the sites out of the same GW Tools account.
- Making sure the new domain has different WHOIS information.
- Keeping the sites off of the same C-Class Server
- Minimizing similarities between the two sites as much as possible (including NO 301 redirects, and different design choices).
Regardless, even doing the above may not be enough. I will say though that although Negative SEO does exist, I find it "questionable" that it is the main reason you are having problems. Google advises specifically that it's enough to just "drop those kind of links into a disavow." Most likely though, you have MUCH larger issues impacting the domain, especially if it's been 2+ years.
Definitely consider a professional audit. I really want you to consider exhausting all other methods before trying this strategy.
-
Dear Mates,
To clarify, I have been working for two years trying to resolve it. For example, the toxic links come from spam blogs created by competitors; take a look at these samples: there are 200 blogs containing exactly this page, and our site is linked there:
My plan is to create a fresh new site NO-301 no redirections, but I want to use some of the well performing contents (more than 500 words, well written), using these steps:
1. Mark the content as NOINDEX on the old site.
2. Wait 15 days.
3. Upload this content to the new site.
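Before step 3, it may be worth verifying that every old page really does carry the noindex directive. A minimal sketch of such a check (the parser class and function name here are hypothetical, not any particular tool; it inspects the page HTML for a `<meta name="robots">` tag containing `noindex`):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.robots_content = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                self.robots_content.append(a.get("content", "").lower())

def is_noindexed(html):
    """Return True if the page HTML carries a robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in c for c in parser.robots_content)
```

Running this over each old URL's HTML (fetched however you like) before the 15-day wait would catch pages that were missed during step 1.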
Do you think it could be dangerous?
Thank you for your responses
-
Hi Claudio,
I would echo the guys above in saying that it sounds like you could do more to revoke the penalty on the original site. If you begin anew, I would definitely not 301 the old domain (not that it sounds like you were going to), but I'd also invest in completely new content rather than duplicating the old content. Google's ability to track duplicate content is amazingly good, so even with a noindex on the old content, Google could still draw a connection between content it penalised in the past and the new site.
Moosa is absolutely correct that it is better (and unfortunately much harder) to remove bad links than it is to just disavow them. Google's spam team often appreciate genuine effort to remove links - disavowal appears to work best if you have been unsuccessful in your link removal and can prove that you got in touch with as many sites as possible (screenshots of emails unanswered or answered unfavourably, for instance).
The other very good thing about removing links is that they can never hurt you again in the future if Google one day decides to change the way it views previous disavowals... which we certainly can't count on it not doing.
-
Casey is right: Panda and Penguin are different penalties, and they should be resolved differently! Penguin has to do with links, so if there is a Penguin penalty then you must have some toxic links within your link profile.
My idea here is to collect all the links (from GWT, Moz, Ahrefs, and Majestic SEO) and then either manually check each link or use a tool like Link Detox or LinkRisk to identify all the links that are unhealthy. If the penalty is Penguin, you will receive a Google message either asking you to remove more links (with some example links) or giving you the positive news that the penalty has been revoked.
Note: it is better to remove as many bad links as you can before disavowing them.
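As a rough sketch of the last part of that workflow (the function name and example hosts below are hypothetical): once the exports from GWT, Moz, Ahrefs, and Majestic are merged and the toxic URLs identified, you can deduplicate them into disavow-file entries. Google's disavow format is one entry per line, either a full URL or `domain:example.com` to disavow every link from that host.

```python
from urllib.parse import urlparse

def build_disavow(bad_urls, disavow_whole_domains=True):
    """Turn a merged list of toxic link URLs into disavow-file lines.

    With disavow_whole_domains=True, each URL is collapsed to a
    'domain:host' entry; duplicates are dropped while preserving order.
    """
    lines = []
    seen = set()
    for url in bad_urls:
        host = urlparse(url).netloc.lower()
        entry = f"domain:{host}" if disavow_whole_domains else url
        if entry not in seen:
            seen.add(entry)
            lines.append(entry)
    return "\n".join(lines)
```

Disavowing at the domain level is usually the safer choice for spam blogs like the ones described above, since the same blog tends to link from many pages.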
In the case of Panda, the problem is within your site and content, so maybe the content you think is high quality isn't really high quality in the eyes of Google, and in that case you should consider redoing your content.
All in all, I believe the decision to go for a new domain is premature at this stage; my advice is to look into the penalty details and deal with them.
Hope this helps!
-
To be blunt, moving from one penalized domain to another to escape a penalty is most likely a complete waste of time. It has been a known fact for years that penalties follow 301 redirects, and Google recently clarified that "moving" your penalized site (and that includes its content) to another domain to escape a penalty is also a foolish choice. Google now reserves the right to move a penalty to any new domain (something we've suspected for a while, but can now confirm).
In your case, I'd strongly look at continuing to salvage the domain. If you weren't aware of the above, then I'm hesitant to believe you've done EVERYTHING you can to unwind your algorithmic penalty. Further, you reference both Penguin and Panda above, and yet BOTH have clearly different approaches to how they should be resolved. Maybe your "high-quality content" isn't really as high quality as you think? Maybe you haven't disavowed all the toxic links/domains affecting your site? Have you sought out a professional Google penalty site audit? I'm not convinced you've done all you can, just based on your question.
No judgments, but personally, no, I don't believe this is a "non-dangerous approach."