403 Forbidden Crawl report
-
Hi,
I am getting 403 Forbidden errors in the crawl report for some of my pages, but the pages themselves load fine. My web developer told me that reports sometimes show errors when nothing is actually wrong. Will these errors affect SEO/rankings?
Some of the links:
https://www.medistaff24.co.uk/contact-us/
https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/
-
I have a locksmith business website about locksmith Tampa Florida, and we are facing the same issue on the main page.
A 403 Forbidden error means that the server understood the request but refused to serve the page. This can happen for a few reasons, such as:
- The requesting user or client does not have permission to access the page.
- A firewall or security plugin is blocking certain visitors (crawlers are often blocked even when browsers are not).
- There is a misconfiguration on the server.
If you are seeing 403 Forbidden errors in a crawl report, first check whether the pages actually load for users. You can do this by visiting the pages yourself and by inspecting the affected URLs with the URL Inspection tool in Google Search Console, which shows the response Googlebot received.
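One quick way to compare what a browser sees with what a crawler sees is to request the page with different User-Agent headers, since servers often treat the two differently. Below is a minimal sketch using Python's standard library; the example URL is a placeholder, and a real check should point at your own affected pages:

```python
import urllib.request
import urllib.error

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def make_request(url, user_agent):
    """Build a request that identifies itself with the given User-Agent."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

def status_for(url, user_agent):
    """Return the HTTP status code the server sends for this User-Agent."""
    req = make_request(url, user_agent)
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 403 if the server refuses this User-Agent

# Example usage (not run here; substitute your own URL):
# status_for("https://www.example.com/", "Mozilla/5.0")  # browser-like UA
# status_for("https://www.example.com/", GOOGLEBOT_UA)   # crawler UA
```

If the browser-like User-Agent gets a 200 while the Googlebot User-Agent gets a 403, the block is almost certainly coming from a firewall or security rule rather than being a false positive.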
If the pages load fine for users, the errors may be false positives caused by a temporary problem when Googlebot crawled the site, in which case they should clear on a later crawl. Be aware, though, that a page can load normally in a browser while the server still returns 403 to Googlebot specifically, since firewalls and security plugins often treat bot traffic differently, so verify before ignoring the report.
If the pages are not loading for users, or Googlebot is consistently being refused, then the errors are real and you need to fix the underlying issue that is causing the 403 responses.
Here are some steps you can take to fix 403 Forbidden errors:
- Check the permissions on the files and folders behind the affected pages. The web server's own user account must be able to read them (on typical Linux hosts, 644 for files and 755 for directories).
- Check the robots.txt file to make sure Googlebot is not being explicitly denied access to the affected pages. Note that a robots.txt disallow is normally reported as "blocked by robots.txt" rather than as a 403, so this check mainly rules robots.txt out as the cause.
- Check the server configuration, including .htaccess rules, firewall settings, and any security plugins, for anything that could be denying requests from Googlebot's user agent or IP ranges.
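The robots.txt step above can be tested offline with Python's standard `urllib.robotparser`, without touching the live site. The rules below are invented for illustration; substitute the contents of your own robots.txt file:

```python
import urllib.robotparser

# Hypothetical robots.txt contents; replace with your site's actual rules.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Can these user agents fetch these paths under the rules above?
print(parser.can_fetch("Googlebot", "https://www.example.com/contact-us/"))  # True
print(parser.can_fetch("*", "https://www.example.com/private/page"))         # False
```

If `can_fetch` returns True for Googlebot on the affected URLs, robots.txt is not the problem and the 403 is being produced elsewhere.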
If you have tried all of these steps and you are still getting 403 Forbidden errors, then you may need to contact your web hosting provider for assistance.
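If you have access to your server's raw access logs, you can also confirm whether Googlebot itself is receiving 403s rather than relying on the report alone. Below is a rough sketch that pulls out paths that returned 403 to a Googlebot User-Agent; it assumes the common Apache/Nginx "combined" log format, and the sample lines are invented for illustration:

```python
import re

# Matches the request path, status code, and User-Agent in a
# combined-format access log line.
LINE_RE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_403s(log_lines):
    """Return the paths that returned 403 to a Googlebot User-Agent."""
    hits = []
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and m.group("status") == "403" and "Googlebot" in m.group("agent"):
            hits.append(m.group("path"))
    return hits

# Invented sample lines in combined log format.
SAMPLE = [
    '66.249.66.1 - - [01/Jan/2024:10:00:00 +0000] "GET /contact-us/ HTTP/1.1" 403 199 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [01/Jan/2024:10:00:01 +0000] "GET /contact-us/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

print(googlebot_403s(SAMPLE))  # ['/contact-us/']
```

Seeing the same page served 200 to browsers and 403 to Googlebot, as in the sample above, is strong evidence of a bot-blocking rule rather than a reporting glitch.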
Will the 403 errors affect SEO/ranking?
Whether 403 Forbidden errors affect your SEO and rankings depends on which pages are returning them.
If the affected pages are important to your website, the errors can have a negative impact: pages Googlebot cannot fetch can eventually be dropped from the index. If the affected pages are unimportant, the errors are unlikely to have a significant effect.
Either way, it is best to fix 403 Forbidden errors as soon as possible, so that Googlebot can reach all of the pages on your website and the site remains crawlable and indexable.
Warm Regards
Rahul Gupta
Suvidit Academy
Related Questions
-
Redirect and Redirect Error in Moz Crawl
Hello, We have a WordPress blog attached to our Magento website located at domain.co.uk/blog/. Moz was coming back showing we had multiple page versions (http and https), so I updated the .htaccess file to what is below. This has fixed most of the errors; however, the homepage is being a little tricky. Moz is now saying that the page is redirecting and redirecting again: http://www.domain.co.uk/blog to http://www.domain.co.uk/blog/ to https://www.domain.co.uk/blog/
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteCond %{HTTPS} !=on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
# END WordPress
Within the WordPress settings the URLs are set up as follows: WordPress Address URL: https://www.domain.co.uk/blog; Site Address URL: https://www.domain.co.uk/blog. I tried to add a trailing / to these but it gets automatically removed. So I am assuming that WordPress is serving up https://www.domain.co.uk/blog, RewriteBase /blog/ is redirecting it to /, and then my https rewrite is redirecting it again. I am not sure where exactly to fix this, could anybody advise? Many thanks
On-Page Optimization | | ATP
-
When making content pages for a specific page, should you request indexing straight away in GSC or let Google crawl it naturally?
On-Page Optimization | | Jacksons_Fencing
-
Why don't all my pages have On-Page Optimization Reports?
Apologies if this question has been asked a million times, but I can't find it. I have 35 pages, yet only 5 of them have generated On-Page Optimization Reports. I know I can create them manually, but wondered if I've done something incorrectly? Iain.
On-Page Optimization | | iainmoran
-
Questions About My Report
Hi, I have a website that aggregates NFL analysis (not news). I write 3-5 line summaries about each article I link, so there is a pretty good amount of daily content. Here's the site: http://www.profootballhotreads.com/ After I received my initial report, there were several issues, and I just wanted to get some thoughts on them. Some of these might be related to the aggregate nature, some might not be a concern, but I want to know which ones I should really worry about.
- Too many links. My main page is a continuously running scroll of links, so obviously this is going to be tough to accommodate. I know this makes each link less "valuable," but does it actually affect my site in any way? I don't really have links to my site on the page other than in the menu, which I assume would be scrolled first.
- Meta descriptions on tag pages. For site design reasons, I have several "pages" that are actually tag collection pages rather than unique pages. For example, each team's page is simply a collection of anything tagged with that team. So, I don't know if I can provide a meta description for those pages without making that the default meta description for any post with that tag.
- I supposedly have tons of duplicate pages, but when I go to those pages, I don't see it. Webmasters said I only had one duplicate. Not sure what's going on. I'm thinking anytime I update a post, it is reading it as two different posts even though only one post exists at a time on the site.
- I have tons of duplicate page titles. Basically, I have tons of pages on my main page because after a certain amount of posts, it goes to a "new" page, even though it's just a continuation. So, I have Main Page 1, Main Page 2, etc. with the same title and meta descriptions. I don't think this is a concern, is it?
Thanks for anyone who might be able to help. Let me know if there are more questions. Jason
On-Page Optimization | | JMay
-
On-Page Report Card refresh problem
I have noticed that refresh does not work at all on the On-Page Report Card. When I make a change after receiving a suggestion, the grade does not update at all. I remember that option worked almost instantly before, which helped me optimize a page quickly. Also, I have seen that meta keywords are flagged as undesirable, and when I try to remove them, I am left with empty quotation marks instead of meta keywords and an error is reported. These are the most important issues for now, especially the refreshing issue, since I have tried clearing the cache and all the other available options such as changing the browser, etc. I have also tried deleting keywords and repeating the grading on the On-Page Report Card but had no results. Exiting SEOmoz and re-entering also gave no results. Kasa
On-Page Optimization | | Kasa-Nenad
-
Is there a way to export the On-page Optimization report data to Excel?
I am preparing recommendations for my client's Webmaster from the On-page Report Card. I am integrating them into a larger Excel spreadsheet with other recommended changes. So many SEO Moz reports can be exported to Excel. Is this an exception, or am I missing something? It would really save me a lot of time and effort.
On-Page Optimization | | calalouf
-
On Page Optimisation Reports
Firstly, sorry if this has already been answered - I did look, I promise. Secondly, sorry if the answer to this is blatantly obvious! In the process of trying to optimise my landing pages, I am using On-Page Optimisation reports. I have several (OK, lots) with F grades, which is not surprising as the listed landing page is not the landing page optimised for a certain keyword. If I change the landing page to the one that I have for a certain keyword then hey presto, A or B grade (clever me)! Now here's the thing - presumably the landing page that is listed by default is the one that Google "sees" for a particular keyword. How do I change this if I can, or do I have to be patient, or am I just being plain daft?! Many thanks
On-Page Optimization | | Jock
-
Crawl error: duplicate title for home page
I'm seeing a duplicate title for the home page, for both the static file name and the domain, like: http://domain.com and http://domain.com/index.cfm. I know how to set this in Google Analytics, but how would I make sure this isn't seen as an error? It's accounting for both a duplicate title and duplicate content. Thanks!
On-Page Optimization | | joshcanhelp