403 forbidden error website
-
Hi Mozzers,
I have a question about a new website from a new customer: http://www.eindexamensite.nl/.
It returns a 403 Forbidden error, and I can't find what the problem is.
I have checked on: http://gsitecrawler.com/tools/Server-Status.aspx
result:
URL=http://www.eindexamensite.nl/ **Result code: 403 (Forbidden / Forbidden)**
When I delete the .htaccess from the server I get a 200 OK :-). So the problem is in the .htaccess.
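For anyone who wants to reproduce that check without an external tool, here is a minimal Python sketch (the helper name is just illustrative) that fetches only the status code of a URL:

```python
# Minimal status-code probe: the same check the server-status tool performs.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def status_of(url: str) -> int:
    """Return the HTTP status code for a HEAD request to `url`."""
    req = Request(url, method="HEAD")
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        # urllib raises on 4xx/5xx; the status code is still on the error.
        return err.code
```

Running it against the site with and without the .htaccess in place should show the 403 turning into a 200.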
.htaccess code:

```apache
ErrorDocument 404 /error.html
RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php

# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]

# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*.xml$
RewriteCond %{REQUEST_FILENAME} !^.*.css$
RewriteCond %{REQUEST_FILENAME} !^.*.php$

# Check for Ctrl-Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache

# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]

# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]

# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET

# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$

# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]
# End static file caching

DirectoryIndex index.html
```
The CMS is TYPO3.
Any ideas?
Thanks!
Maarten -
Hi everyone,
I know this thread hasn't been active for a while, but I'm looking for an answer to a similar issue. Our infrastructure team had problems a few weeks ago and was routing bot traffic to a slave server. This obviously flagged 403 errors in Webmaster Tools.
Having removed the traffic diversion, our site hasn't been reindexed in the three weeks since it served Googlebot a 403 response. Does anyone have experience of Google delaying reindexing of a site after a 403 response?
Thanks
-
Hi Alan,
OK, we'll start cutting down the .htaccess. I'll keep you posted.
Thanks!
-
Thanks Anthony!
That's the strange thing: the website is working, yet there is still a 403.
We will check the chmod status.
-
Hello Maarten
Those RewriteCond entries are cumulative, and it looks like there are missing commands.
Who edited that file last, and what did they change?
Conditionals work like this: you set one or more conditions, then add a command, then a line break. Multiple conditions on a rule act as a logical AND.
This file looks like it has too many conditions and not enough commands, but it could be OK.
Try adding a blank line between the Rule entries and the Cond entries (but not between the Cond entries and the Rule they apply to).
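To make that "conditions without a command" failure mode concrete, here is a rough sketch (not a real Apache parser, just a line scan) that counts RewriteCond lines left dangling at the end of a file with no RewriteRule to consume them:

```python
# Rough sketch, not a real Apache parser: in mod_rewrite, each
# RewriteRule consumes all RewriteCond lines immediately above it.
# Conditions with no rule after them are dead weight or a sign of
# a missing/merged command.
def dangling_conds(htaccess_text: str) -> int:
    pending = 0  # RewriteCond lines waiting for a RewriteRule
    for line in htaccess_text.splitlines():
        line = line.strip()
        if line.startswith("RewriteCond"):
            pending += 1
        elif line.startswith("RewriteRule"):
            pending = 0  # the rule consumes all pending conditions
    return pending
```

A nonzero result means the tail of the file has conditions that never reach a command, which is exactly the kind of mangling a bad edit (or a stripped comment) produces.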
Here is how to test anything like this: save a copy of the .htaccess, then start editing it. Delete everything below the "Start rewrites" comment and see if that fixes it. If not, the problem is above; if it does fix it, the problem is below. Keep cutting the file in half (or adding half back) until you discover the problem line.
It is harder with all those conditionals. I suspect the lower block is the problem, so remove those Cond entries from the bottom up.
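That halving procedure is just a manual binary search over the lines of the file. As a sketch (the `is_broken` probe is hypothetical; in practice it means "upload these lines as the .htaccess and check whether the site returns 403"):

```python
# Sketch of the manual halving procedure as a binary search.
# `is_broken(prefix)` is a hypothetical probe: upload `prefix` as the
# .htaccess and report whether the site 403s.
from typing import Callable, List

def first_bad_line(lines: List[str],
                   is_broken: Callable[[List[str]], bool]) -> int:
    """Index of the first line whose inclusion breaks the site.

    Assumes an empty file works (matches the 200-OK-without-.htaccess
    observation) and the full file is broken.
    """
    lo, hi = 0, len(lines)  # invariant: lines[:lo] works, lines[:hi] breaks
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if is_broken(lines[:mid]):
            hi = mid
        else:
            lo = mid
    return hi - 1
```

With N lines this takes about log2(N) uploads instead of N, which is why halving beats deleting one line at a time.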
-
Follow up:
I'm not seeing any errors when visiting the site (http://www.eindexamensite.nl/); it seems to be working perfectly. Could it be something client-side, with your caching or system time?
-
Hi Maarten,
I'm not extremely familiar with .htaccess or the TYPO3 CMS, but the issue could simply be the result of misconfigured file permissions on a specific directory or path.
I'd check the permissions on all of the paths affected by the .htaccess and make sure they're readable and executable (755).
That could explain why you get a 200 status without the .htaccess but a 403 error with it.
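If you want to sweep the document root for that instead of eyeballing `ls -l`, here is a rough sketch, assuming a Unix host, that flags paths the web server's "other" user cannot read (and, for directories, enter):

```python
# Rough sketch, assuming a Unix host: a path that is not world-readable
# (and, for a directory, world-executable) is the classic cause of an
# Apache 403 on an otherwise working site.
import os
import stat

def lacks_world_access(path: str) -> bool:
    """True if `path` is missing the permission bits Apache's user
    typically needs when it runs as an unprivileged "other" user."""
    mode = os.stat(path).st_mode
    need = stat.S_IROTH
    if stat.S_ISDIR(mode):
        need |= stat.S_IXOTH  # directories need execute to be entered
    return (mode & need) != need
```

Running it with `os.walk` over the affected paths would list every candidate in one pass.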
Good luck!