Possible Penguin 2.1 fix - Anybody tested this?
-
This happened to a client's site - stay with me, this takes some explaining…
The client's home page is set as index.html,
which, in the domain settings, resolves to the root address: http://www.domain.com/
But this is just a domain/hosting setting - you can point any page at the root.
I always link directly to the root address (the second one).
So if you set a new root page, say http://www.domain.com/index.htm, then going to the root essentially hits a new page - any previous poor links would then be broken and would have no effect.
It would just be a matter of changing the domain settings to use the index.htm page (which would function exactly the same - the site's internal link structure points to the root).
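As a sketch of the mechanics (the filenames are from the thread; the helper function is hypothetical, not a real hosting API), changing the default document means the root URL starts serving a different file while old deep links keep pointing at the old one:

```python
# Hypothetical model of a server's default-document setting. When the
# default changes from index.html to index.htm, "/" serves a new file,
# while existing links to /index.html still hit the old one.
def resolve(path, default_doc):
    """Return the file a request for `path` would serve."""
    return default_doc if path == "/" else path.lstrip("/")

# Before the change:
print(resolve("/", "index.html"))           # index.html
# After the change:
print(resolve("/", "index.htm"))            # index.htm
print(resolve("/index.html", "index.htm"))  # index.html (old page, now separate)
```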
thoughts?
-
Thanks - we have seen unnatural links warnings come in as well (on other domains). Rankings dropped, we left it for a week, and it came back to the same place.
I'm undecided at this point what I will do - I might start with a disavow of links and go from there.
-
If I had to place my money on what would happen next, based on what I've seen in the past, I would put it on one of two things, assuming you are 301 redirecting to the new homepage:
1 - Nothing. The penalty passes instantly, or is already assigned to the domain, which was part of the focus of Penguin 2.1. Matt Cutts, discussing this update, said:
"The previous iteration of Penguin would essentially only look at the home page of a site. The newer generation of Penguin goes much deeper and has a really big impact in certain small areas."
or ...
2 - Google gets tricked and passes the weight but not the penalty. In this event the rankings will likely last somewhere between one and three weeks before the penalty follows. I've seen this happen as well, but mainly in cases where the penalty was an "unnatural links" warning.
Good luck, and if you go for it, I'd love to hear how it turns out.
-
I see one of two things happening with that... either your prior homepage 404s, which will hurt you even more, or your prior homepage redirects to the new one and nothing will have changed.
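Those two outcomes can be expressed as a tiny hypothetical check on the HTTP status the prior homepage returns after the swap (the status handling is an illustrative assumption, not an actual crawler):

```python
# Hypothetical classifier for what the prior homepage does after the swap,
# mirroring the two scenarios above: a 404 (worse than before), or a
# redirect to the new homepage (no effective change).
def old_homepage_outcome(status_code):
    if status_code == 404:
        return "404"
    if status_code in (301, 302, 308):
        return "redirect"
    return "unchanged"

print(old_homepage_outcome(404))  # 404
print(old_homepage_outcome(301))  # redirect
```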
Related Questions
-
Website gone from PR 2 to PR 0
Hi guys! We're looking at a site in the trade industry here in Australia, and it appears that around the Panda/Penguin updates in September and October last year, their Google PageRank was wiped back to zero. I know we shouldn't be focusing too much on PR these days, but I can't help but wonder what caused this. It's a local business website, and they aren't selling links etc. I'm thinking the backlinks pointing at the site that were giving them a boost have been discounted; however, they still have quite a number of quality links coming in. Would love to pick your brains! Regards.
Intermediate & Advanced SEO | WCR0
-
Is it possible to avoid passing penalties through 301 redirects?
We have been doing a good amount of competitive research lately and have noticed sites that change their TLD quite often to escape manual penalties / DMCA filings. An example evolution: brandterm.com -> brandterm.bz -> brandterm.me. These competitors are able to quickly rank for money keywords in the top 3 soon after another domain switch. What we have noticed is that while it's obvious they received Google penalties, they continue to 301 redirect the old domains to the new ones. We have experienced first-hand that penalties travel along domains with 301 redirects. Does anyone have an explanation of how these companies are able to quickly achieve a high volume of organic search while 301-redirecting from burnt domains? The only option I see is to disavow all previous domains in GWT, to be able to employ 301 redirects without risking carrying over the penalty. Are there other theories people can think of? T
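The disavow idea in the question can be sketched as generating the plain-text file Google Webmaster Tools accepts; `domain:` lines are the documented disavow syntax, and the domain list reuses the question's examples:

```python
# Sketch: build a disavow file covering the previously burnt domains
# before 301-redirecting them, as the poster suggests.
burnt_domains = ["brandterm.com", "brandterm.bz"]  # examples from the question

lines = ["# Disavow links from prior domains in the redirect chain"]
lines += [f"domain:{d}" for d in burnt_domains]
disavow_file = "\n".join(lines) + "\n"
print(disavow_file)
```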
Intermediate & Advanced SEO | petersocapro0
-
Traffic by Country: Is It Possible to Change it?
Let's say you have a .ng domain but you receive more traffic from the USA than from Nigeria. Let's say you want traffic only from Nigeria. How do you correct this?
Intermediate & Advanced SEO | YESdesign0
-
Best possible linking on site with 100K indexed pages
Hello All, First of all I would like to thank everybody here for sharing such great knowledge with such amazing and heartfelt passion. It really is good to see. Thank you.
My story / question: I recently sold a site with more than 100k pages indexed in Google. I was allowed to keep links on the site, these links being actual anchor-text links on both the home page as well as on the 100k news articles. On top of that, my site syndicates its RSS feed (just links and titles, no content) to this page. However, the new owner made a mess, and now the site could possibly be seen as bad linking to my site. Google tells me within Webmasters that this particular site gives me more than 400k backlinks. I have NEVER received one single notice from Google that I have bad links. That first. But I was worried that this page could have been the reason why MY site tanked as badly as it did. It's the only source linking so massively to me.
Just a few days ago, I got in contact with the new site owner, and he has taken my offer to help him 'better' his site. Although getting the site up to date for him is my main purpose, since I am there, I will also put effort into optimizing the links back to my site.
My question: what would be the best thing to do for the 'most SEO gain' out of this? The site is a newspaper type of site, catering for news within the exact niche my site is trying to rank for. The difference being, his is a news site; mine is not, it is commercial. Once I fix his site, there will be regular news updates, all within the niche we both are in. Regularly as in several times per day. It's news. In the niche.
Should I leave my RSS feed in the sidebars of all the content?
Should I leave an anchor-text link in the sidebar (on all news etc.)? If so: there can be just one keyword... 407k pages linking with just 1 keyword??
Should I keep it to just one link on the home page?
I would love to hear what you guys think. (My domain is from 2001. Like a quality wine. However, it still tanked like a submarine.)
All the SEO reports I got here are now Grade A. The site is finally fully optimized. Truly nice to have that confirmation. Now I hope someone will be able to tell me what is best to do in order to get the most SEO gain out of this for my site. Thank you.
Intermediate & Advanced SEO | richardo24hr0
-
Google Penguin penalty(s), please help
Hi MozFans, I have a question out of the field about www.coloringpagesabc.com.
The question is why the rankings and traffic have been going down, down, down for the last 4 months. The customer thinks he got hit by Google Penguin update(s). The site has about 600 pages/posts, all 'optimized' for old-school SEO:
Almost all posts are superbly optimized for one keyword combination (like "… coloring pages"), with a high keyword density on the keyword.
Titles and descriptions are all the same, like: "<keyword> and this is the rest of my title" and "This is my description <keyword> and I like it".
Internal linking is all done with a 'perfect' keyword anchor text.
There is an OK backlink profile, with not many links to inner pages.
There are social signals, but the content quality is low.
The site looks to me like an SEO over-optimized content farm.
Competition: when I look at the competition, most coloring pages websites don't offer a lot of content (text) on their pages. They offer a small text and the coloring pages (which is what it is about :-)).
How to get the rankings back - what I was thinking to do:
Rewrite the content to a smaller text with a low keyword density on the keyword, and put the coloring pages up front.
Rewrite all titles and descriptions to be unique.
Make some internal links to related posts with other anchor texts.
Get link building going on inner pages.
Get more social signals.
Am I on the right track? I could use some advice on what to do and where to start. Thanks!! Maarten
Intermediate & Advanced SEO | MaartenvandenBos
-
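One piece of the cleanup plan in the coloring-pages question above (spotting pages with too-high keyword density) can be sketched like this; whether a given density is "too high" is an illustrative judgment, not a known Google cutoff:

```python
# Rough keyword-density check to flag over-optimized titles/posts.
def keyword_density(text, keyword):
    """Fraction of words in `text` that exactly match `keyword`."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

title = "coloring pages free coloring pages printable coloring pages"
density = keyword_density(title, "coloring")
print(round(density, 2))  # 0.38 -> likely worth rewriting
```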
How to place two NADs on a site (one website, 2 locations)
Hello, for our site, nlpca(dot)com, we have 2 locations. One location is based out of a hotel in California, and one location is where we have our offices in Utah. Our site is about both locations, emphasizing California. Do we need to create a Utah page and put the Utah NAD on that page, with a separate address and phone number? What do we use as an address, since we only have a hotel room in California now? What do we need to do to rank for both in the natural results and also the Places listings? Right now we're #1 for NLP California and #4 for NLP Utah. Thanks!
Intermediate & Advanced SEO | BobGW0
-
2 sites or one site: 2 locations
Hello, I have a dog training client who offers services in 2 separate locations. We're looking to be first in the non-local search results and also to rank well in Google Places. Would it be better to go with 2 separate sites, or with one site and try to rank it for 2 different locations? There are both local and standard search results when we type in our keywords. Thanks!
Intermediate & Advanced SEO | BobGW0
-
Techniques to fix eCommerce faceted navigation
Hi everyone, I've read a lot about different techniques to fix duplicate content problems caused by eCommerce faceted navigation (e.g. redundant URL combinations of colors, sizes, etc.). From what I've seen, suggested methods include using AJAX or JavaScript to make the links functional for users only and to prevent bots from crawling through them. I was wondering if this technique would work instead: if we detect that the user is a robot, instead of displaying a link, we simply display its anchor text.
So what a human would see:
COLOR
<li><a href="red">red</a></li>
<li><a href="blue">blue</a></li>
And what a robot would see:
COLOR
<li>red</li>
<li>blue</li>
Any reason I shouldn't do this? Thanks!
*** edit: Another reason to fix this is crawl budget, since robots can waste their time going through every possible combination of facets. This is also something I'm looking to fix.
Intermediate & Advanced SEO | anthematic
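For reference only, the bot/human split the question describes would look something like the sketch below (serving crawlers different markup via user-agent sniffing is essentially cloaking, which carries its own risk; the bot token list is an illustrative assumption):

```python
# Sketch of rendering a facet item differently for crawlers vs. users,
# per the question: crawlers get plain anchor text, users get a working link.
BOT_TOKENS = ("googlebot", "bingbot")  # illustrative, not exhaustive

def facet_item(label, href, user_agent):
    if any(token in user_agent.lower() for token in BOT_TOKENS):
        return f"<li>{label}</li>"
    return f'<li><a href="{href}">{label}</a></li>'

print(facet_item("red", "/shirts?color=red", "Googlebot/2.1"))  # <li>red</li>
print(facet_item("red", "/shirts?color=red", "Mozilla/5.0"))
```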