Access Denied
-
Our website, which had ranked at number 1 on Google.co.uk for our two main search terms for over three years, was hacked last November. We rebuilt the site but slipped down to number 4. We were hacked again two weeks ago and are now at number 7.
I realise that this drop may not be solely a result of the hacking, but it can't have helped.
I've just accessed our Google Webmaster Tools account, and these are the current results:
940 Access Denied Errors
197 Not Found
The 940 Access Denied errors apply to WordPress blog pages.
Is it likely that the hacking caused the Access Denied errors, and is there a clear way to repair them?
Any advice would be very welcome.
Thanks,
Colin
-
Glad I could help!
-
Thanks so much Jason. I'll take a look right now and be back with a Thumbs Up I'm sure.
Colin
-
Hi Nile
I've found this for you: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2409441
It summarizes why Access Denied errors show up.
My initial thought was that your robots.txt file was maliciously updated to block many of your pages.
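One quick way to check whether a tampered robots.txt is the culprit is to run the live rules through a parser and test one of the affected URLs. A minimal sketch using Python's standard library; the rules and URL below are placeholder examples, not the actual site:

```python
# Check whether a robots.txt rule set blocks Googlebot from a given page.
# Standard library only; the rules and URL are placeholder examples.
from urllib.robotparser import RobotFileParser

# A maliciously edited robots.txt often ends up looking like this:
malicious_rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(malicious_rules)

# With "Disallow: /", every page is blocked for every crawler.
blocked = not parser.can_fetch("Googlebot", "https://example.com/blog/post-1/")
print(blocked)  # True
```

If the parser says pages are blocked but the robots.txt looks clean in your browser, the hack may be serving different content to crawlers than to visitors, which is worth checking with Fetch as Google.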
-
Hi Jason,
I can visit those pages in my browser (Google Chrome).
Colin
-
A little more information needed here... can you visit those pages in your browser? If not, what error shows up?
Related Questions
-
Captcha wall to access content and cloaking sanction
Hello, to protect our website against scraping, visitors are redirected to a reCAPTCHA page after visiting 2 pages. For SEO purposes Googlebot is exempt from that restriction, so it could be seen as cloaking. What is the best practice in SEO to avoid a penalty for cloaking in that case?
Intermediate & Advanced SEO | | clementjaunault
I'm thinking about adding a paywall JSON schema (NewsArticle), but the content is accessible for free, so it's not really a paywall, more of a captcha protection wall. What do you recommend?
Thanks!
-
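On the captcha-wall question above: Google's documented safe pattern is to identify Googlebot by reverse-and-forward DNS rather than by user-agent string alone, so ordinary visitors can't spoof the exemption. A minimal sketch using the standard library; the resolver arguments are made injectable here purely so the logic can be exercised offline, and all hostnames and IPs are placeholders:

```python
# Verify a claimed Googlebot the way Google documents: reverse-DNS the
# client IP, check the hostname is under googlebot.com or google.com,
# then forward-resolve that hostname and confirm it maps back to the
# same IP. Resolvers are injectable so the logic is testable offline.
import socket

def is_verified_googlebot(ip,
                          reverse_lookup=lambda ip: socket.gethostbyaddr(ip)[0],
                          forward_lookup=socket.gethostbyname):
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward_lookup(host) == ip
    except OSError:
        return False
```

Exempting only addresses that pass this check is generally considered safe, since the exemption is based on who the crawler actually is rather than on what it claims to be.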
Moving site to new domain without access to redirect from old to new. How can I do this with as little loss to SERP results as possible?
I've been hired to build a new site for a customer. They were duped by some shady characters at goglupe.com (If you can reach them, tell them they are rats--phone is disconnected, address is a comedy club on Mission in SF). Glupe owns the domain name and would not transfer or give FTP access prior to dropping off the face of the earth. The customer doesn't want to chase after them with lawyers, so we are moving on. New domain, new site with much of the same content as the previous site. All that I have access to is the old WordPress site. I plan to build the new site, then remove all pages/posts from the old site. Is there anything I can do to salvage the current page 1 ranking? Obviously, the new domain will take some time to get back there. Just hoping to avoid any pitfalls or penalties if I can. If I had complete access, I would follow all the standard guidelines. But I don't. Any thoughts? Thanks! Chris
Intermediate & Advanced SEO | | c_estep_tcbguy
-
Should my website be accessible by IP?
I have been doing some digging into this today, triggered by looking at the secure certificate on my site and comparing it to others, as I have been seeing some security warnings on a random basis. I noticed that none of the other sites' IP addresses redirect to the website, whereas on my site the IP does. Is redirecting the IP address to the website a big no-no?
Intermediate & Advanced SEO | | WAWKA
-
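On the IP-access question above: serving the same content on the bare IP creates a duplicate that can get indexed, so redirecting the IP to the canonical hostname is generally the right behaviour, not a problem. In practice this is a one-line rewrite in the Apache or nginx config; as a framework-agnostic sketch of the decision (the hostname and IP are placeholders):

```python
# Decide how to answer a request based on its Host header: anything that
# is not the canonical hostname (e.g. the bare server IP) gets a 301 to
# the canonical domain, so the duplicate never gets indexed.
CANONICAL_HOST = "www.example.com"

def canonical_redirect(host, path):
    """Return (status, location): a 301 target, or (200, None) to serve."""
    if host != CANONICAL_HOST:
        return 301, f"https://{CANONICAL_HOST}{path}"
    return 200, None
```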
Google can't access/crawl my site!
Hi, I've been dealing with this problem for a few days. In fact, I didn't realize it was this serious until today, when I saw most of my site de-indexed and losing most of its rankings. [URL Errors: 1st photo] On 8/21/14 there were only 42 errors, but on 8/22/14 this number went to 272, and it just keeps going up. The site I'm talking about is gazetaexpress.com (media news, custom CMS) with lots of pages. After some research I came to the conclusion that the problem is the firewall, which might have blocked Google bots from accessing the site. But the server administrator says this isn't true and no Google bots have been blocked. Also, when I go to WMT and try to Fetch as Google, this is what I get: [Fetch as Google: 2nd photo] Out of more than 60 tries, it showed Complete only 2-3 times (and only for the homepage, never for articles). What can the problem be? Can I get Google to crawl my site properly, and is there a chance I will regain my previous rankings? Thanks a lot
Granit
Intermediate & Advanced SEO | | granitgash
-
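On the crawl question above: one way to settle the "is the firewall blocking Googlebot?" argument is to look at the raw access log and count Googlebot requests per status code; a spike in 403s or connection resets would confirm the block. A sketch for common-log-format lines (the sample lines used in testing are invented, not from the real site):

```python
# Count HTTP status codes for requests whose user agent mentions
# Googlebot, from common-log-format access log lines.
import re
from collections import Counter

STATUS_RE = re.compile(r'" (\d{3}) ')  # status code follows the quoted request

def googlebot_status_counts(log_lines):
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        match = STATUS_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts
```

If the counts show mostly 200s, the problem is more likely intermittent (timeouts or rate limiting) than a hard firewall rule.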
After receiving a "Googlebot can't access your site" message, would this stop your site from being crawled?
Hi Everyone,
A few weeks ago I received a "Googlebot can't access your site... connection failure rate is 7.8%" message from Webmaster Tools. I have since fixed the majority of these issues, but I've noticed that all pages except the main home page now have a PageRank of N/A, while the home page still has a PageRank of 5. Have the connectivity issues reduced the PageRanks to N/A, or is it something else I'm missing? Thanks in advance.
Intermediate & Advanced SEO | | AMA-DataSet
-
Google Reconsideration - Denied for the Third Time
I have been trying to get past a "link scheme" penalty for just over a year. I took on the client in April 2012; they had received their penalty in February 2012, before I started. Since then we have been manually removing links, contacting webmasters for link removal, blocking over 40 different domains via the disavow tool, and requesting reconsideration multiple times. All I get in return is "Site violates Google's quality guidelines." So we regrouped and did some more research, and found that about 90% of the offending spam links pointed to only 3 pages of the website, so we decided to delete those pages, serve a 404 error in their place, and create new pages with new URLs. At first everything looked good: the new pages were ranking and receiving page authority, and the old pages were gone from the indexes. So we resubmitted for reconsideration for the third time, and we got the exact same response! I don't know what else to do. I've done everything I could think of, with the exception of deleting the whole site. Any advice would be greatly appreciated. Regards - Kyle
Intermediate & Advanced SEO | | kchandler
-
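For reference on the disavow tool mentioned above: the file Google accepts is a plain UTF-8 text file, one entry per line, where `domain:` entries disavow every link from a domain, bare URLs disavow a single page, and lines starting with `#` are comments. The domains below are invented examples, not ones from this case:

```
# Domains disavowed after the February 2012 link-scheme penalty (examples)
domain:spammy-directory-example.com
domain:paid-link-network-example.net
# A single offending page can also be listed by URL:
http://article-farm-example.org/page-with-paid-links.html
```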
Googlebot Can't Access My Sites After I Repaired My Robots.txt File
Hello Mozzers, A colleague and I have been managing about 12 brands for the past several months, and we have recently received a number of messages in the sites' Webmaster Tools telling us that 'Googlebot was not able to access our site due to some errors with our robots.txt file'. My colleague and I, in turn, created new robots.txt files with the intention of preventing the spider from crawling our 'cgi-bin' directory, as follows:
User-agent: *
Disallow: /cgi-bin/
After creating the robots.txt and manually re-submitting it in Webmaster Tools (and receiving the green checkbox), I received the same message about Googlebot not being able to access the site, the only difference being that this time it was for a different site that I manage. I repeated the process and everything looked correct, yet I continued receiving these messages daily for each of the other sites I manage, for roughly a 10-day period. Does anyone know why I may be receiving this error? Is it not possible for me to block Googlebot from crawling the 'cgi-bin'? Any and all advice/insight is very much welcome; I hope I'm being descriptive enough!
Intermediate & Advanced SEO | | NiallSmith
-
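As a sanity check on the robots.txt in the question above: the two rules shown should block only `/cgi-bin/`, and that can be confirmed offline with Python's standard-library parser (example.com stands in for any of the twelve brand sites):

```python
# Verify that the two-line robots.txt blocks /cgi-bin/ and nothing else.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /cgi-bin/",
]

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("Googlebot", "https://example.com/cgi-bin/form.cgi"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about/"))            # True
```

If the rules themselves are valid, recurring "can't access" messages usually point to the server intermittently failing to serve robots.txt at all (timeouts or 5xx responses) rather than to the rules.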
Can I Improve Organic Ranking by Restrict Website Access to Specific IP Address or Geo Location?
I am targeting my website at the US, so I need high organic rankings in US web search. One of my competitors restricts website access to specific IP addresses or geo locations. I have checked multiple categories to learn more. What's going on with this restriction, and why do they do it? One SEO forum also restricts website access by location. I can understand that; it may help them stop thread spamming with unnecessary sign-ups or Q&A. But why has Lamps Plus set this up? Is there any specific reason? Can I improve my organic ranking? The restriction may help me maintain user statistics in terms of bounce rate, average page views per visit, etc.
Intermediate & Advanced SEO | | CommercePundit
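Mechanically, the IP restriction described above boils down to checking the client address against a set of permitted ranges. A minimal sketch using Python's `ipaddress` module; real geo-targeting uses a GeoIP database, and the CIDR ranges here are documentation-reserved placeholders, not actual US address space:

```python
# Decide whether a client IP falls inside the permitted ranges.
# The ranges are placeholder examples (documentation-reserved blocks).
import ipaddress

ALLOWED_RANGES = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def is_allowed(client_ip):
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_RANGES)
```

Note that a blanket allowlist like this also risks blocking search-engine crawlers whose addresses fall outside the ranges, so any such restriction needs an explicit exemption for verified bots.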