Penalized in Google?
-
Hello guys.
I'm terribly sad because we did an amazing SEO job for this client, and then the website was hacked.
Message from the hosting platform:
"It would appear that malicious individuals have found a way to upload spam pages as well as backdoors to your site(s). We have disabled the page(s) in question (via removing their permissions, e.g. chmod) until you are able to address this matter."
Result: we lost all our SERP positions.
Has anyone here been in a similar situation?
Notes:
-
I checked Google Webmaster Tools and everything seems normal.
-
The domain is relatively new; maybe a late sandbox effect?
Thanks a lot for your help.
Matias
-
-
Hi Alan.
Yes, our software is OK.
I will try moving to another host; maybe that brings back our SERP positions.
Thanks
-
This really sounds like the hosting company shifting the blame to you.
Why not change hosting? I would do it quickly.
Do you have an image upload feature, seeing that the spam pages are in your image folder?
If so, you can add some code to make sure that the files being uploaded actually are images.
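To illustrate Alan's suggestion, here is a minimal sketch (not your actual stack; the function names and the set of allowed types are illustrative assumptions) that checks a file's magic bytes, so an uploaded "image" that is really an HTML spam page gets rejected regardless of its extension:

```python
# Sketch: verify an "image" upload really is an image by checking its
# leading magic bytes, instead of trusting the file extension.
# The allowed-type table here is an illustrative assumption.

MAGIC_BYTES = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}

def detect_image_type(data: bytes):
    """Return the image type if the payload starts with a known signature."""
    for signature, kind in MAGIC_BYTES.items():
        if data.startswith(signature):
            return kind
    return None  # not a recognised image -- reject the upload

def is_safe_upload(data: bytes) -> bool:
    """True only when the uploaded bytes look like a real image file."""
    return detect_image_type(data) is not None
```

A spam page uploaded as `photo.jpg` would start with `<html>` rather than a JPEG signature, so this check would refuse it.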
-
Hi Alan.
I think they got in through weak folder permissions. They uploaded HTML content into our "image" folder.
The website is still in Google's index, but we lost more than 50 positions:
http://www.google.com/search?pws=0&gl=en&q=site%3Awww.medabcn.com
We were in the top 10 for these terms: medical travel spain, medica spain, etc.
Thanks
-
Sounds like your host has a problem, not you.
How are you to blame? What was on the pages that they blocked, forms that allow HTML input (an HTML editor)?
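If forms accepting HTML turn out to be the vector, one minimal mitigation (a sketch, not the site's actual code; the function name is illustrative) is to escape user-submitted markup before storing or rendering it, so injected tags display as text instead of executing:

```python
import html

def sanitize_comment(raw: str) -> str:
    """Escape user-submitted markup so it renders as text, not HTML."""
    return html.escape(raw, quote=True)
```

Sites that must allow some markup would instead use an allowlist-based sanitizer, but escaping everything is the safe default.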
Related Questions
-
Is there proof that disavowing backlinks in GSC helps to boost rankings in Google?
Hi guys, let's say you have a website and you've picked up some questionable or lower-quality backlinks. Does anyone have proof that disavowing backlinks helped rankings or had other positive effects? I am concerned that Google will place our website on their radar and instead possibly demote it or something. Lastly, if disavowing is the way to go, what criteria do you use to decide which backlinks to disavow? And if you keep getting questionable backlinks over time, should you disavow on an ongoing basis as well? If so, how often? Cheers, John
White Hat / Black Hat SEO | whiteboardwiz
Does a trademark symbol in a URL matter to Google?
Hello community! We are planning to clean up the ™ and ® symbols in the URLs on our website. Google has indexed these pages, but some of the URLs containing ™ display garbled characters in the SERPs. What are your thoughts on a "spring cleaning" effort to remove all ™, ®, and other unsafe characters from URLs? Will this impact indexed pages, rankings, etc.? Thank you! b.dig
White Hat / Black Hat SEO | b.digi
Is Google no longer penalizing on-page manipulation aggressively?
I wanted to throw this out there: with so much emphasis on Google cracking down on bad linking, have they let up enforcement against manipulative on-page tactics in recent years? I've been seeing hidden text popping up again and ranking. Here is an example: Google "landscaping Portsmouth NH" and find the #1 result, then search for "Portsmouth" on the page. What I find interesting is that the site has a clean backlink profile, but hiding those keywords is pretty blatant manipulation. I filled out a spam report on it a year ago (I'm not a big "fill out spam report" guy; I was curious whether Google would take action), and a year later it is still #1 for a competitive keyword. So I'm curious whether others have seen similar tactics, like font-size:0px or text the same color as the background, popping back up and ranking. I would love others' thoughts on it.
White Hat / Black Hat SEO | BCutrer
How does Google decide what content is "similar" or "duplicate"?
Hello all, I have a massive duplicate content issue at the moment with a load of old employer detail pages on my site. We have 18,000 pages that look like this: http://www.eteach.com/Employer.aspx?EmpNo=26626 http://www.eteach.com/Employer.aspx?EmpNo=36986 and Google is classing all of these pages as similar content which may result in a bunch of these pages being de-indexed. Now although they all look rubbish, some of them are ranking on search engines, and looking at the traffic on a couple of these, it's clear that people who find these pages are wanting to find out more information on the school (because everyone seems to click on the local information tab on the page). So I don't want to just get rid of all these pages, I want to add content to them. But my question is... If I were to make up say 5 templates of generic content with different fields being replaced with the schools name, location, headteachers name so that they vary with other pages, will this be enough for Google to realise that they are not similar pages and will no longer class them as duplicate pages? e.g. [School name] is a busy and dynamic school led by [headteachers name] who achieve excellence every year from ofsted. Located in [location], [school name] offers a wide range of experiences both in the classroom and through extra-curricular activities, we encourage all of our pupils to “Aim Higher". We value all our teachers and support staff and work hard to keep [school name]'s reputation to the highest standards. Something like that... Anyone know if Google would slap me if I did that across 18,000 pages (with 4 other templates to choose from)?
White Hat / Black Hat SEO | Eteach_Marketing
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site -- Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of: a single home page ~50 "category" pages, and ~425 "book list" pages. (That 50 number and 425 number both started out much smaller and grew over time but has been around 425 for the last year or so as I've focused my time elsewhere.) On Friday, June 15 we made a pretty big change to the site -- we added a page for every Author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an Author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ..but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116 Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (Ok, it peaked at 100 and dropped down to a steady 60 or so day within a few days, but still. And then it trailed off for the last week, dropping lower and lower every day as if they realized it was repurposed content from elsewhere on our site...) Here's the problem: For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. 
This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem... So: 1. Do you think the drop is related to my upping my pagecount 8-fold overnight? 2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that only list one author (which would be the vast majority)? 3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books" it's better they land on a page of ours that pulls together the 4 books we have than on a page that happens to have just one book on it among 5 or 6 others by other authors. What else? Thanks so much, help is very appreciated. Peter
White Hat / Black Hat SEO | petestein1
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉
Could a sitewide footer EXACT MATCH anchor text link hurt or potentially penalize a site?
I am pretty sure this would hurt rankings yet I just want another's opinion on it. Would a sitewide footer link with exact match keyword anchor text to the page you want to rank for your main keyword hurt you? Basically if it were a link to the homepage, yet you wanted to make the anchor text your main objective keyword, would it hurt to have this in the footer along with the logo link at the top of a page that is just "home" anchor text?
White Hat / Black Hat SEO | jbster13
Why did Google reject us from Google News?
I submitted our site, http://www.styleblueprint.com, to Google to potentially become a local news source in Nashville. I received the following note back: "We reviewed your site and are unable to include it in Google News at this time. We have certain guidelines in place regarding the quality of sites which are included in the Google News index. Please feel free to review these guidelines at the following link: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769#3" Clicking the link, it anchors to the section that says: "These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit." etc. Now, we have never intentionally tried to do anything deceptive for our rankings. I am new to SEOmoz and to SEO optimization in general. I am working through the errors report on our campaign site, but I cannot tell what they are dinging us for. Whatever it is, we will be happy to fix it. All thoughts greatly appreciated. Thanks in advance, Jay
White Hat / Black Hat SEO | styleblueprint
Banned from Google!
Hello, I realized this morning (via Google Analytics and the "link:" command) that my host domain (shared hosting), mlconseil.com, under which several websites are hosted, has been banned from Google. Here are the websites:
www.amvo.fr
www.apei-cpm.fr
www.armagnac-les-vieux-chenes.fr
www.centraledelexpertise.fr
www.cleaning-pc-33.com
www.internet-33.fr
www.territoires-et-ntic.fr
www.vin-le-taillou.com
www.maliflo.asso.fr
I don't know why. Since the end of January 2011 I have used IBP, only for some submissions to directories and for managing some lists of URLs. I submitted to about 30-40 directories, never all at the same time, but rather day after day, smoothly. On www.territoires-et-ntic.fr and www.amvo.fr, which are blogs, I had installed some external RSS feeds displayed as articles; I have decided to stop that, but I don't know if it's related to this blacklisting by Google. I don't use any nasty "black hat" programs or anything else. I'm really upset about this. This morning I requested, in the same words as now, a new indexation, but I don't know how long it will take. Any ideas? Which tools could help me scan for any malware on my hosting provider? Many thanks.
White Hat / Black Hat SEO | mozllo