Penguin 2.0 drop due to poor anchor text?
-
Hi,
My website experienced a 30% drop in organic traffic following the Penguin 2.0 update. After years of designing the site with SEO in mind, generating unique content for users, and focusing only on relevant websites in my link building strategy, I'm a bit disheartened by the drop.
Having rolled out a new design at the start of April, I suspect I've accidentally messed up the structure of the website, either making it difficult to crawl or making Google think it's spammy. Looking at Google Webmaster Tools, the number one anchor text on the site is "remove all filters" - which is clearly not what I want! The "remove all filters" link appears whenever my hotels page loads with filters, sorting, or availability dates in place; I included it to make it easy for users to get back to the complete hotel listing. You can see an example of this link towards the top right of this page:
http://www.concerthotels.com/venue-hotels/agganis-arena-hotels/300382?star=2
With over 6,000 venues on my website, this link has the potential to appear thousands of times, and while the anchor text is always "remove all filters", the destination URL differs depending on the venue the user is looking at. I'm guessing that to Google this looks VERY spammy indeed!?
I tried to make the filtering/sorting/availability pages less visible to Google's crawler when I designed the site, through the use of forms, jQuery, JavaScript, etc., but it does look like the crawler is managing to reach these pages and find the "remove all filters" link. What is the best approach to take when a standard "clear all..." type link is required on a listing page, without making the link appear spammy to Google? It's a link that exists purely to benefit the user - not to cause trouble!
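One common approach for this situation (a sketch, not a guaranteed Penguin fix) is to stop relying on JavaScript to hide the filtered pages and instead tell Google explicitly how to treat them: point filtered URLs back at the clean listing with rel="canonical", and mark the reset link itself nofollow so it passes no anchor-text signal. The URLs below use the Agganis Arena example page from the question purely for illustration:

```html
<!-- Served in the <head> of a filtered listing page such as
     /venue-hotels/agganis-arena-hotels/300382?star=2 -->
<!-- Declares the unfiltered listing as the canonical version, so the
     thousands of filter-parameter variants consolidate into one URL -->
<link rel="canonical"
      href="http://www.concerthotels.com/venue-hotels/agganis-arena-hotels/300382" />

<!-- The reset link itself, marked nofollow so crawlers don't treat
     "Remove all filters" as a sitewide anchor-text signal -->
<a href="/venue-hotels/agganis-arena-hotels/300382" rel="nofollow">Remove all filters</a>
```

Alternatively (or additionally), the filter parameters can be kept out of the crawl entirely with a robots.txt rule such as `Disallow: /*?star=` - though whether any of this alone reverses a Penguin drop is a separate question.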
My final question to you guys is: do you think this one sloppy piece of work could be enough to cause my site to drop significantly following the Penguin 2.0 update, or is the problem likely to be bigger than this? And if it probably is the cause, is fixing it likely to result in a prompt rise back up the rankings, or will there be a black mark against my website going forward that slows down recovery?
Any advice/suggestions will be greatly appreciated,
Thanks
Mike
-
Go to Majestic SEO and type in your URL. If the keywords you got penalized for make up over 10% of your anchor text profile, that is generally why you are being penalized - there are a few exceptions, but not many. I analyzed 440 sites and found that the highest ratio was 2.47% for a site that didn't have keywords in the URL.
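You don't need a paid tool to do this kind of sanity check on an export. The sketch below (hypothetical - it assumes you've exported your backlinks' anchor texts into a plain list, which both Majestic and Webmaster Tools can produce in some form) computes each anchor's share of the total, so you can spot one anchor dominating the profile:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Given a list of anchor-text strings from a backlink export,
    return each distinct anchor's share of the total as a percentage."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: round(100 * n / total, 2) for anchor, n in counts.items()}

# Hypothetical sample mimicking the situation in the question:
# one sitewide link text swamping everything else
anchors = (["remove all filters"] * 6
           + ["ConcertHotels"] * 3
           + ["hotels near venue"])
for anchor, pct in anchor_distribution(anchors).items():
    print(f"{pct:5.1f}%  {anchor}")
```

Here "remove all filters" accounts for 60% of the sample - far beyond the 10% rule of thumb mentioned above.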
Also, I suggest you read this http://dailyseotip.com/what-other-marketing-firms-want-you-to-believe-that-isnt-true/3356/ I see that you are really focused on Onpage SEO. I think this will help you understand more.
The next thing you may want to do is start contacting admins and getting low-quality links deleted, if you have them. Use OSE (Open Site Explorer) to identify the low-quality links. There are only a handful of directories I recommend out there. I have a message from Google telling one of my clients to get rid of their directory links - the example link Google cited came from a directory site, to be exact. Never use a keyword as the anchor text at a directory site; always use your brand name or your URL.
Make sure the disavow tool is your last resort, and I highly suggest you get someone with experience to handle it. Many have messed this up and really hurt their websites.
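For reference, if it does come to that, a disavow file is just a plain-text file uploaded through Google's disavow links tool: one entry per line, `domain:` to disavow an entire domain, a full URL to disavow a single page, and `#` for comments. The domains below are placeholders, not real examples from this site:

```
# Directory links we contacted admins about but could not get removed
domain:spammy-directory.example
domain:another-low-quality-dir.example
# A single bad URL rather than the whole domain
http://www.example.com/paid-links-page.html
```

Comments are optional but worth keeping - they document why each entry is there if you revisit the file later.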
Have a great day.
-
Hi Mat,
thanks for your reply. I'll definitely change the link, but I agree that it would be harsh if it was the sole reason for the 30% drop in organic traffic.
There are definitely some directories linking to ConcertHotels.com - at one stage I used the SEOmoz list of directories and got my website listed on some of its recommendations. But my strategy for the last two years has been to approach venues' own websites and ask if they'd be interested in linking to our nearby hotels page as a useful resource for their visitors. This strategy has worked quite well for me, and it sounds like a very natural, sensible link building approach. I'll certainly work through my list of backlinks, but I would hope that the majority are from very relevant websites, thanks to that strategy. I guess there could be a percentage I've had no control over, though - should I disavow those?
As for the directories, should I now be disavowing directory links? I didn't think the percentage of directory links to my site would be that high - I used the directory strategy in the past simply to increase the number of links to my homepage. The strategy I described above is one which achieves links to specific pages throughout my website, not my homepage, so I felt the need to grow the number of homepage links.
Thanks again for your help and advice
Mike
-
That link is not ideal, but I really do not believe that it would cause the sort of drop you are talking about.
If you think you have been hit by Penguin 2.0, then I'd start looking at your backlinks with a critical eye. I just put your domain into Majestic SEO and hit a lot of questionable directories pretty quickly. That might be unfair - I certainly haven't analysed them in any depth. However, I took 10 domains at random and 9 were sites that, at best, are not helping you much.
If you're looking for a cause of a drop I'd say you could do worse than going through your backlink profile.