Penalty for all new sites on a domain?
-
Hi @all,
a friend has an interesting problem. He got a manual link penalty at the end of 2011. It is an old domain with a domain pop of >5,000, but with a lot of bad links (widgets, banners, and other SEO domains, though nothing like ScrapeBox etc.). He lost most of his traffic a few days after the notification in WMT (unnatural links), and again after the first Penguin update in April 2012. At the end of 2012, after deleting (or nofollowing) and disavowing a lot of links, Google lifted the manual penalty (WMT notification). But nothing happened after the lifting; the rankings didn't improve (it has been 4 months already!). Almost all money keywords aren't in the top 100, traffic hasn't increased, and he has good content on this domain. We built a handful of new trusted links to test some pages, but nothing improved.
In February we ran a test and built a completely new page on this domain; it's in the menu and got some internal links from content. We did it because some pages that weren't optimized before the penalty (no external backlinks) are still ranking on the first Google results page for small keywords. After a few days the new page started to rank for our keyword between positions 40-45. That was OK and what we expected. The page ranked there consistently for almost 6 weeks, and now it has been gone for ten days. We didn't change anything. It's the same phenomenon as with the old pages on this domain: the page doesn't even rank for its title! Could it still be a manual penalty on the whole domain, or what other reasons are possible? Looking forward to your ideas, and I hope you understand the problem!
Thanks!!!
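For reference, the disavow step mentioned above uses Google's plain-text disavow file format; a minimal sketch (the domains and URL below are placeholders, not the actual links involved):

```text
# Disavow file uploaded via Google's Disavow Links tool.
# Lines starting with # are comments and are ignored.

# Disavow every link from an entire domain:
domain:spammy-widget-directory.example

# Disavow a single linking URL:
http://seo-banner-network.example/links/page1.html
```

One line per entry; Google only accepts `domain:` prefixes or full URLs, so patterns and wildcards won't work.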
-
Hmmm... if your new page targets different keywords than the old ones, then I would think it is a matter of the manual penalty not having fully lifted yet.
But there could be other factors. Is the new page completely unique content? If it is duplicated from elsewhere, then the pattern of ranking for a while and then disappearing could fit with Panda.
It's also possible that the ranking you saw initially was a "honeymoon boost"; perhaps the new page has dropped out of the honeymoon phase into its normal ranking position and just needs to attract links to start ranking properly again.
-
Hi Marie,
thanks for your answer! I thought the same, and we're waiting for the next update, but I assumed that just the old pages were in the Penguin filter, not the new one. Isn't Penguin just for specific keywords (and pages), or is the whole domain impacted?
-
The most likely explanation is that the site is still under the effects of Penguin. Penguin is algorithmic as opposed to manual. What this means is that even if you have cleaned up the links to the point where Google is willing to remove the manual penalty, you may still be under the effects of Penguin. Most likely, once Penguin refreshes, the site will do better.
The other possibility is that the penalty is still there. Sometimes after getting the "manual penalty revoked" message the effects of the penalty will remain on the site for days, weeks or even months and then finally one day lift.
Related Questions
-
I've purchased a PR 6 domain; what will be the best use of it?
I've purchased a PR 6 domain; what will be the best use of it? Should I make a new site, or redirect it to my low-PR sites? Or did I waste my $100?
White Hat / Black Hat SEO | | IndiaFPS0 -
Do some sites get preference over others by Google just because? Grandfathered theory
So I have a theory that Google "grandfathers" in a handful of old websites in every niche, and that no matter what such a site does, it will always have the authority to rank high for the relevant keywords in that niche.

I have a website in the crafts/cards/printables niche. One of my competitors is http://printable-cards.gotfreecards.com/ This site ranks for everything... http://www.semrush.com/info/gotfreecards.com+(by+organic) Yet when I visit their site, I notice duplicate content all over the place (extremely thin content, if anything at all, on some pages that rank for highly searched keywords), paginated pages that should be noindexed, bad URL structure, and an overall unfriendly user experience. Also, the backlink profile isn't very impressive, as most of the good links come from their other site, www.got-free-ecards.com.

Can someone tell me why this site ranks for what it does, other than the fact that it's around 5 years old and potentially has some kind of preference from Google?
White Hat / Black Hat SEO | | WebServiceConsulting.com0 -
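As an aside, the noindexing of paginated pages mentioned above is done with a robots meta tag; a minimal sketch (the example URL is hypothetical):

```html
<!-- On a paginated listing page (e.g. /printable-cards?page=2):
     keep the page itself out of the index, but still let crawlers
     follow its links to the individual card pages. -->
<meta name="robots" content="noindex, follow" />
```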
Blog on 2 domains (.org/.com), Canonical to Solve?
I have a client that has moved a large majority of their content to their .org domain, including the blog. This is causing some issues for the .com domain. I want to retain the blog on the .org and have its content also show on the .com, and I would place the canonical tag on the .com. Is this possible? Is this recommended?
White Hat / Black Hat SEO | | Ngst0 -
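The cross-domain canonical described above would look like this on the .com copy of a post (the URLs are placeholders; cross-domain canonicals are supported, but treated by Google as a hint rather than a directive):

```html
<!-- In the <head> of the duplicate copy at
     https://www.example.com/blog/some-post, pointing to the
     original article on the .org domain: -->
<link rel="canonical" href="https://www.example.org/blog/some-post" />
```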
HELP - Site architecture of E-Commerce Mega Menu - Linkjuice flow
Hi everyone, I hope you have a couple of mins to give me your opinion. The e-commerce site has around 2,000 products, in English and Spanish, and only around 70 hits per day, if that. We have done a lot of optimisation on the site: page titles, URLs, content, H1s, etc. Everything on-page is pretty much under control, except I am starting to realise the site architecture could be harming our SEO efforts.

Once someone arrives on the site, they are language-detected and 302-redirected to either domain.com/EN or domain.com/ES, depending on their preferred language. Then on the homepage we have the big MEGA MENU, structured like this:

CAT 1
  SubCat 1
    SubsubCat 1
    SubsubCat 2
    SubsubCat 3

Overall, there are 145 "categories", plus links to some CMS pages like Home, Delivery terms, etc. Each main category contains the products of everything related to that category. So for example:

KITCHENWARE
  COOKWARE
    SAUCEPANS
    FRYING PANS
  BAKINGWARE
    BOWLS

Kitchenware contains ALL products of the subcategories below it (cookware items, saucepans, frying pans, bakingware, etc.), plus links to those categories through breadcrumbs and a left-hand nav, in addition to the mega menu above. So once the bots hit the site, this is the structure they immediately have to deal with. Here is what the stats look like:

Domain Authority: 18

www.domain.com/EN/ - PA: 27, mR: 3.99, mT: 4.90
www.domain.com/EN/CAT 1 - PA: 15, mR: 3.05, mT: 4.54
www.domain.com/EN/CAT 1/SUBCAT1 - PA: 15, mR: 3.05, mT: 4.54

Product pages themselves have a PA of 1 and no mR or mT. I really need some other opinions here. I am thinking of:

- Removing links in the nav menu so it only contains CAT 1 and SubCat 1 levels, and deleting the SubsubCat 1 level, which represents around 80 links.
- Removing products from the CAT 1 page, e.g. the CAT 1 page would show "tiled" graphical links to subcategories but not display the products themselves, so products are only available at the lowest level of the chain (which will be shortened).

But I am willing to hear any other ideas please. Maybe another alternative is to start building links to boost DA and link juice? Thanks all, Ben
White Hat / Black Hat SEO | | bjs2010 0 -
Finding out why Bing gave page-level penalty?
In the last couple of weeks Bing has gradually removed 5 pages of my website from their SERPs. The URLs are totally gone. They all had top-5 rankings and were removed out of nowhere. How can I investigate what went wrong with these pages? Are there perhaps experts who are willing to investigate this for a fee? How can I recover from a page-level penalty? I have no messages in my Bing Webmaster Tools account.
White Hat / Black Hat SEO | | wellnesswooz0 -
Owning multiple domains across similar fields
We have a printing website specializing in a single field, but we have added other products to it along the way. We are looking at adding a second site (redesigned from the ground up) that will cover everything (so it will inevitably cover some of the same keywords); is this OK? We are also considering things such as putting both sites on different IPs, and we wondered about the registration details (would it be OK to have them under the same company?). Just to make it clear: they will not be used for cross-linking or any other such things, so we just want to know whether they could both end up on page one for some of the same keywords and whether Google would be OK with this. Thank you for your time.
White Hat / Black Hat SEO | | BobAnderson0 -
Methods for getting links to my site indexed?
What are the best practices for getting links to my site indexed by search engines? We have been creating content and acquiring backlinks for the last few months, but they are not being found by backlink checkers or in Open Site Explorer. What are the tricks of the trade for improving the time to indexing of these links? I have read about some RSS methods using WordPress sites, but that seems a little shady, and I am sure Google is looking for that now. I look forward to your advice.
White Hat / Black Hat SEO | | devonkrusich0