Penalization.... please help me...
-
First of all, sorry for my English, but I'm an Italian girl SEO.
Before the Panda update, SEO was clear and easy: good quality, good natural backlinks and so on...
Now there is an update almost every week and it's a mess!
I work as an SEO for a big Italian e-commerce site, and (more or less) one month ago in Google Webmaster Tools I received a message from Google telling me that the site www.giordanoshop.com is penalized for unnatural backlinks.
But I've done nothing against Google's policies: no paid backlinks, no link farms and so on...
There are some strange links, but I can't remove them because I didn't create them.
I asked for reconsideration of the website, but Google still tells me it found unnatural links.
What should I do?
The PR of the site is the same, but all keywords have lost rankings: from page 1 to page 3, and from page 1 to page 6...
What can I do? I risk losing my job, sob.
-
Grazie, you are great! I'll try to apply all your advice.
Just one more question: if I find backlinks that aren't mine, what can I do to remove them?
-
Arianna-
Bummer. Anybody who calls themselves an "Italian girl SEO" deserves some help.
I have found a couple of helpful articles and information sources over the last few months, as I have had to help new and existing clients with some of these same issues.
First of all, don't complain about what Google is doing. This is where the marketplace is going, and it's like complaining about bad weather - just let it rain.
Second - you have to find those links. You need to do a site audit/backlink risk assessment. I have copied and pasted a great "down and dirty guide" on checking for low-quality links and basically doing a site audit. This is a blog post by another SEOmoz'er, Modesto Siotos. Here is the link:
http://www.seomoz.org/blog/how-to-check-which-links-can-harm-your-sites-rankings
Here is the copy and paste: "
The Right Time For a Backlinks Risk Assessment
Carrying out a backlinks audit in order to identify the percentage of low-quality backlinks would be a good starting point. A manual, thorough assessment would only be possible for relatively small websites as it is much easier to gather and analyse backlinks data – for bigger sites with thousands of backlinks that would be pointless. The following process expands on Richard Baxter's solution on 'How to check for low quality links', and I hope it makes it more complete.
- Identify as many linking root domains as possible using various backlinks data sources.
- Check the ToolBar PageRank (TBPR) for all linking root domains and pay attention to the TBPR distribution
- Work out the percentage of linking root domains that have been deindexed
- Check social metrics distribution (optional)
- Repeat steps 2, 3 and 4 periodically (e.g. weekly, monthly) and check for the following:
- A spike towards the low end of the TBPR distribution
- Increasing number of deindexed linking root domains on a weekly/monthly basis
- Unchanged social metrics numbers, remaining at very low levels"
END OF CUT AND PASTED INFO
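The number-crunching in steps 2-4 above can be sketched as a short script. This is only a sketch under an assumption: that you have already exported your linking root domains, their TBPR, and their index status to a CSV (the column names and sample domains here are made up for illustration) - the raw data still has to come from your backlink tools.

```python
# Sketch: summarize a backlink-audit export (assumed CSV columns:
# domain, tbpr, indexed). This only crunches the numbers for the
# TBPR distribution and the deindexed percentage from the steps above.
import csv
from collections import Counter
from io import StringIO

# In practice you would open("linking_root_domains.csv");
# a tiny made-up export is inlined here so the sketch runs as-is.
sample = StringIO(
    "domain,tbpr,indexed\n"
    "example-blog.com,3,yes\n"
    "spammy-directory.net,0,no\n"
    "news-site.org,5,yes\n"
    "link-farm.info,0,no\n"
)

rows = list(csv.DictReader(sample))

# Step 2: TBPR distribution across linking root domains.
tbpr_distribution = Counter(int(r["tbpr"]) for r in rows)

# Step 3: percentage of linking root domains that are deindexed.
deindexed_pct = 100 * sum(r["indexed"] == "no" for r in rows) / len(rows)

print(dict(sorted(tbpr_distribution.items())))  # {0: 2, 3: 1, 5: 1}
print(f"{deindexed_pct:.0f}% of linking root domains deindexed")  # 50%
```

Rerun it on a fresh export weekly or monthly (step 5) and watch for the warning signs listed above: a growing spike at TBPR 0 or a rising deindexed percentage.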
Third - here is another post from a fellow SEOmoz'er who does a good job of simplifying the process and providing the cold hard facts and options. It's entitled "6 Ways to Recover From Bad Links".
http://www.seomoz.org/blog/6-ways-to-recover-from-bad-links
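If removal requests go nowhere, one of the options this kind of post covers is Google's disavow links tool. As a rough sketch, the disavow file is a plain text file you upload through Webmaster Tools; the domains below are made-up examples, not real recommendations:

```
# Disavow file sketch - lines starting with # are comments.
# "domain:" disavows every link from that domain;
# a bare URL disavows just that one page.
domain:spammy-directory.example
domain:link-farm.example
http://some-blog.example/spun-article-page.html
```

Treat disavow as a last resort after genuine removal attempts, and keep a record of the outreach you did first - reconsideration requests go better when you can show it.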
I hope this information helps you. There are no quick and easy fixes. If you don't want to lose the business, then you need to spend the time and make it happen. If this has helped you, please make sure you show me some Italian love and give me the thumbs UP!
Ciao
Related Questions
-
A web audit for web traffic? Need answers please..
Hi, We are a PR agency based in Dubai and we produce a lot of web content. The website is built on Ruby on Rails and we have implemented keywords and SEO strategies, but sadly the traffic pattern has not changed over the past three years. What surprised us today is that we created a page 2-3 days ago for a client who is participating in Arab Health (a very prestigious healthcare event), and suddenly our page is in the top 3 on google.ae as well as google.com. We are kind of convinced that there is something wrong with our code. Do you think this could be a possibility, and could the lack of change in the traffic pattern be a code issue rather than an SEO issue? What could be the possible reasons for this pattern? In such a scenario, what would experts like you recommend we do? An SEO audit? Web audit? Code audit? Hire an SEO/web/code consultant? Thanks - helpful answers are really appreciated, and just btw, if anyone feels they could professionally help us out of this mess, we are willing to work with him/her. Thanks in advance
Algorithm Updates | | LaythDajani0 -
Getting listed in the Google local result - help!
Good day, I'm really struggling to get a client to appear in the Google Local map snapshot (on the right of the SERPs), even when their company name is Googled. I've tried everything including getting the main Google Local account verified, had some reviews put up, all the required and relevant info has been completed, yet their location and the map never appear. Any help out there as to how I can remedy this? Thanks
Algorithm Updates | | Martin_S1 -
Panda, Negative SEO and now Penguin - help needed
Hi, We are small business owners who've been running a website for 5 years that provides our income. We've done very little backlinking ourselves, and never did paid directories or anything like that - usually just occasional forum or blog responses. A few articles here and there with some of our keyword phrases for internal pages. Of course I admit we've done some keyword-phrase backlinks on some blogs, but our anchor text profile is largely brand names, our domain name and non-keywords (except for some "bad" backlinks). Our DA is 34, PA 45 for our home page. We were doing great until last Sept 27, when we got hit by Panda, and we have been working on deoptimizing our site for keywords. We made a new site in Wordpress for good architecture and ease of use for our customers, and we're deleting/repurposing low-quality pages and making our content more robust. We haven't yet recovered from this, and now it appears we got hit May 22 for Penguin... ARGH! I recently discovered (hard to have time to devote to everything with just two of us) that others can "negative SEO" a site now, and I feel this has happened based upon the results below.
I signed up for linkdetox.com yesterday and it gives a grim picture of our backlinks (says we are in "deadly risk" territory). We have 83 "toxic" links and 600-some "suspicious" links (many are on malware/malicious listed sites, many are .pl domains from Poland, others are, I believe, foreign domains, or domains that are a bunch of letters that make no sense, or spammy-sounding EMD domains) - this makes up 80% of our links. As this is our only business, our income is now 1/3 of what it has been, even with PPC ads going, as we've been hit hard by all of this, and we are wondering if we can survive fixing it. We do have an SEO firm minimally helping us along with guidance on recovering, but with income so low, we are doing the work ourselves and can't afford much.
Needless to say, we are quite distressed and, from reading around, not sure if we'll be able to recover, and that is deeply saddening, especially from negative SEO. We want to make sure we are on the right path for recovery if possible, hence my questions. We haven't been in contact with Google for reconsideration - again, no penalty messages from them.
- First of all, if we don't have a manual penalty, would you still contact all the toxic/malicious/possible porn-looking sites and ask for a link removal, wait, ask again, wait, and then disavow? Or just go straight to the Google disavow?
- For backlinks coming from sites that are "gone" (like a message saying the account has been suspended), or where there is no website there anymore, do I try to contact them too? Or go direct to disavow? Or do nothing?
- For the sites flagged as malicious (by linkdetox, my browser, or by Google), I don't want to try to open them in my browser to see if the site is legitimate. If linkdetox doesn't have the contact info for these, what are we supposed to do?
- For "suspicious" foreign sites where I can't read the webpage - would you still disavow them (I've seen many here say links from foreign sites should be disavowed)?
- How do you keep up with all this if someone is negative SEOing you? We're really frustrated that Google's change has made it possible for competitors to tank your business (arguably though, if we had a stronger backlink profile this may not have hurt, or not as much - not sure). When you are small biz owners and can't hire a group to constantly monitor backlinks, get quality backlinks, content, site optimization, etc., it seems an almost impossible task.
- Are Wordpress left-nav and footer link anchor texts an issue for Penguin? I would think Google would realize these internal links will be repetitive for the same anchor text on Wordpress (I know Matt Cutts said not to use the same anchor text more than once for internal linking, but obviously nav and footer menus will do this).
- What would you do if this was you? Try to fix it all? Start over with a new domain and 301 it (some say this has been working)? Just start over with a new domain and don't redirect?
Thanks for your input and advice. We appreciate it.
Algorithm Updates | | mlm120 -
Local Pages Help
Hi All, I have a client who is looking heavily at Google+ Local. He has a main business, with a number of locational franchises. He has created a local listing for each of these franchise pages. The question he has asked is 'How do I improve my rankings for these local listings?' Now some of them seem to rank well without any work performed to improve them, but some are not. My question is, What can we do to improve the rankings of Google+ Local listings? This has changed greatly since I last looked into it, so anyone who can say 'right, this is what you need to do to improve Google+Local listings' would be greatly appreciated!!!! Many thanks Guys!!
Algorithm Updates | | Webrevolve0 -
Need help with some duplicate content.
I have some duplicate content issues on my blog I'm trying to fix. I've read lots of different opinions online about the best way to correct it, but they all contradict each other. I was hoping I could ask this community and see what the consensus was. It looks like my category and page numbers are showing duplicate content. For instance when I run the report I see things like this: http://noahsdad.com/resources/ http://noahsdad.com/resources/page/2/ http://noahsdad.com/therapy/page/2/ I'm assuming that is just the categories that are being duplicated, since the page numbers only show on the report at the end of a category. What is the best way to correct this? I don't use tags at all on my blog, using categories instead. I also use the Yoast SEO plug in. I have a check mark in the box that disables tags. However it says, "If you're using categories as your only way of structure on your site, you would probably be better off when you prevent your tags from being indexed." There is a box that allows you to disable categories also, but the description above makes it seem like I don't want to block both tags and categories. Any ideas what I should do? Thanks.
Algorithm Updates | | NoahsDad0 -
Can you help with a few high-level mobile SEO questions?
Rolling out a mobile site for a client and I'm not positive about the following:
1. Do these mobile pages need to be optimized with the same / similar page titles? If we have a product page on the regular site with an optimized title like "Men's Sweaters, Shirts and Ties - Company XYZ", should the mobile version's page have the same title? What if the dev team simply named it "Company XYZ Clothes" and missed the targeted keywords? Does it matter?
2. Along the lines of question 1, isn't there truly just one index, and won't your regular desktop browser version be used for all ranking factors on both desktop and mobile SERPs? If that regular page indeed ranks well for "men's sweaters" and that term is searched on a mobile device, the visitor will be detected and served up the mobile page version, regardless of its meta tags and authority (say it's on a subdomain, m.example/.com/mens-department/ ), correct?
3. Are meta descriptions necessary for the mobile version? Will GoogleBot Mobile recognize them, or will just the regular version work? It looks like mobile meta descriptions have about 30 fewer characters.
Thanks in advance. Any advice is appreciated. AK
Algorithm Updates | | akim260 -
Are we penalized if our meta description is longer than 150-160 characters?
I've read on other SEO sites that the description can be 350 characters or 60 words long. Some of my descriptions are a little bit over those numbers. Will search engines stop crawling the description after a certain number of characters, or will they completely ignore it if it's too long, hence hurting my site's SEO performance?
Algorithm Updates | | jmbuytaert0 -
Index Page lost rankings? Please Help!
This morning I ranked highly (page 1 of UK Google) for over 50 keyword search terms for my website http://www.careworx.co.uk. This afternoon my rankings have bottomed out and dropped pages. I have not been de-indexed, it appears, and many of my sub-pages are still highly ranked. Would anybody know what has happened? I know of Google Panda, but I would've seen results drop before now, so I'm very concerned. I don't seem to have lost any links, etc., and am careful to balance SEO with a mix of techniques to keep Google happy - and again, I have not been de-indexed. Can anybody offer advice please, or let me know how I can rectify this?
Algorithm Updates | | andystep0