403 Forbidden Crawl report
-
Hi,
I am getting 403 Forbidden errors in my crawl report for some of my pages, even though the pages load fine in a browser. When I asked my web developer, he said that reports sometimes show errors when nothing is actually wrong. Also, will these errors affect my SEO/rankings?
Some of the links:
https://www.medistaff24.co.uk/contact-us/
https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/
-
I have a locksmith business website serving Tampa, Florida.
We are facing the same issue on the main page.
A 403 Forbidden error means that the server understood the request but refused to serve the page. This can happen for a few reasons, such as:
- File or directory permissions on the server deny read access.
- A firewall, CDN rule, or security plugin is blocking the request.
- There is a misconfiguration on the server (for example, a deny rule in .htaccess).
If you are getting 403 Forbidden errors on your website, first check whether the pages actually load fine for users. You can do this by visiting the pages yourself and by running a live test with the URL Inspection tool in Google Search Console, which shows you exactly what Googlebot receives.
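One quick way to compare what users and crawlers see is to fetch a page twice, once with a browser user agent and once with Googlebot's, and compare the status codes. The sketch below is illustrative only: it simulates a misconfigured server with a local test server (the handler and URL are made up, not Moz or Google tooling). To check a live site, point `fetch_status` at your own page URLs instead.

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class UASensitiveHandler(BaseHTTPRequestHandler):
    """Simulates a server that returns 403 to crawler user agents only."""

    def do_GET(self):
        if "Googlebot" in self.headers.get("User-Agent", ""):
            self.send_response(403)  # blocked, as a misbehaving firewall rule would do
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

def fetch_status(url, user_agent):
    """Fetch `url` with the given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# Start the simulated server on a free local port.
server = HTTPServer(("127.0.0.1", 0), UASensitiveHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]

browser_status = fetch_status(url, "Mozilla/5.0")
bot_status = fetch_status(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")
server.shutdown()

# A mismatch means the server treats Googlebot differently from a browser.
print(browser_status, bot_status)  # 200 403
```

If the two status codes differ on your live site, the "pages load fine for me" observation and the 403 crawl errors are both accurate at the same time, and the fix belongs on the server side.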
If the pages load fine for users, the errors may be false positives caused by temporary server hiccups during crawling, in which case they should clear on a later recrawl. But be careful: a page that loads fine in a browser yet returns 403 to Googlebot often means a firewall, CDN, or security plugin is blocking the crawler's user agent or IP range, and that is a real problem even though users see nothing wrong.
However, if the pages are not loading fine for users, then the errors in the crawl report are likely real. In this case, you need to fix the underlying issue that is causing the 403 Forbidden errors.
Here are some steps you can take to fix 403 Forbidden errors:
- Check the permissions on the files and directories that serve the affected pages. The web server process must be able to read them; Googlebot does not log in with a user account, so it sees whatever an anonymous visitor is served.
- Check your firewall, CDN, and security plugins for rules that block Googlebot's user agent or IP ranges. Note that a robots.txt disallow does not cause a 403; Search Console reports robots.txt blocks separately from server errors.
- Check the server configuration (for example, deny rules in .htaccess or the virtual host config) for misconfigurations that could return 403.
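To rule out robots.txt as a factor (remembering that a robots.txt block is reported differently from a 403), you can test which paths Googlebot may fetch with Python's standard `urllib.robotparser`. The rules and URLs below are hypothetical examples; in practice you would paste in the contents of your own robots.txt file.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt; replace with the contents of https://yoursite/robots.txt
robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot is barred from /private/ but free to crawl everything else.
print(rp.can_fetch("Googlebot", "https://www.example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/contact-us/"))   # True
```

If `can_fetch` returns True for a URL that still shows a 403 in the crawl report, robots.txt is not the cause and the block is coming from permissions, a firewall, or server configuration instead.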
If you have tried all of these steps and you are still getting 403 Forbidden errors, then you may need to contact your web hosting provider for assistance.
Will the 403 errors affect SEO/ranking?
As for whether 403 Forbidden errors will affect your SEO and rankings, it depends on which pages are affected:
- If the pages returning 403 errors are important to your website, the errors can have a negative impact: pages Google cannot crawl will eventually be dropped from the index.
- If the affected pages are unimportant, the errors are unlikely to have a significant impact on your rankings.
Either way, it is best to fix 403 Forbidden errors as soon as possible, so that Googlebot can reach all of your pages and your website stays fully crawlable and indexable.
Warm Regards
Rahul Gupta
Suvidit Academy