403 Forbidden Crawl report
-
Hi,
I am getting a 403 Forbidden error in the crawl report for some of my pages, but the pages themselves load fine. When I asked my web developer, he said that reports sometimes show errors when nothing is actually wrong. Will these errors affect my SEO/rankings?
Some of the links:
https://www.medistaff24.co.uk/contact-us/
https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/
-
I have a locksmith business website (locksmith Tampa, Florida). We are facing the same issue on the main page.
A 403 Forbidden error means that the server understood the request but refused to serve the page. This can happen for a few reasons, such as:
- The requesting user (or crawler) does not have permission to access the page.
- The page is not published yet (some CMSs return 403 for private or unpublished content).
- There is a misconfiguration on the server, such as wrong file permissions or overly strict access rules.
If you are getting 403 Forbidden errors on your website, the first step is to confirm whether the pages actually load for users. You can do this by visiting the pages yourself, and by testing the URLs with the URL Inspection tool in Google Search Console to see what Googlebot receives.
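A quick way to test this from the command line is to request the page twice, once with a normal browser user-agent and once with a Googlebot-style user-agent, and compare the status codes. This is a sketch, using one of the URLs from the question; note that some firewalls block by IP range rather than user-agent, so a 200 here does not fully rule out a real block against Google's crawlers.

```shell
#!/bin/sh
# Print the HTTP status code a given user-agent receives for a URL.
status_as() {
  curl -s -o /dev/null -w "%{http_code}" -A "$1" "$2"
}

URL="https://www.medistaff24.co.uk/contact-us/"   # one of the affected pages
echo "browser:   $(status_as 'Mozilla/5.0' "$URL")"
echo "googlebot: $(status_as 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)' "$URL")"
```

If the browser fetch returns 200 but the Googlebot fetch returns 403, something on the server (a firewall rule or security plugin, typically) is blocking crawlers specifically.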
If the pages load fine for users, the errors in the crawl report may be false positives caused by a temporary error when Googlebot crawled your site; those should clear on a recrawl. Be aware, though, that servers and security plugins often block requests by user-agent or IP, so Googlebot can receive a 403 even while browsers get the page normally. If the errors persist across recrawls, treat them as real.
However, if the pages are not loading fine for users, then the errors in the crawl report are likely real. In this case, you need to fix the underlying issue that is causing the 403 Forbidden errors.
Here are some steps you can take to fix 403 Forbidden errors:
- Check the permissions on the files and folders that contain the pages returning 403 Forbidden errors. The web server process runs as an unprivileged user and needs read access to the files (and read plus execute access on the directories) to serve them.
- Check the robots.txt file to make sure Googlebot is not being denied access. (Note that a robots.txt Disallow shows up in Search Console as "Blocked by robots.txt" rather than as a 403, but it is still worth ruling out.)
- Check the server configuration, for example .htaccess rules, firewall rules, or security/anti-bot plugins, for anything that blocks Googlebot's user-agent or IP ranges.
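The first two checks above can be scripted. Here is a minimal sketch; the document root path and the robots.txt URL are placeholders to substitute with your own, and it assumes the web server reads files as the "other" permission class (on some setups group permissions matter instead).

```shell
#!/bin/sh
# List anything under a document root that an unprivileged web server
# user likely cannot read -- a common cause of 403s after a bad deploy.
check_readable() {
  find "$1" -type d ! -perm -o=rx   # directories missing read+execute for "other"
  find "$1" -type f ! -perm -o=r    # files missing read for "other"
}

# Placeholder path: substitute your site's actual document root.
check_readable /var/www/html 2>/dev/null

# Placeholder URL: confirm robots.txt is not singling out Googlebot.
curl -s "https://www.example.com/robots.txt" | grep -i -A 2 "googlebot"
```

Anything the permission check prints is a candidate for `chmod` (e.g. 644 for files, 755 for directories), though you should confirm with your host before loosening permissions.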
If you have tried all of these steps and you are still getting 403 Forbidden errors, then you may need to contact your web hosting provider for assistance.
Will the 403 errors affect SEO/ranking?
As for whether 403 Forbidden errors will affect your SEO and rankings, it depends on which pages are affected:
- If the pages returning 403 errors are important pages, the errors can have a negative impact: pages that consistently return 403 to Googlebot will eventually be dropped from the index.
- If the affected pages are unimportant, the errors are unlikely to have a significant impact on your rankings.
It is best to fix 403 Forbidden errors as soon as possible. This will help to ensure that Googlebot can access all of the pages on your website and that your website is crawlable and indexable.
Warm Regards
Rahul Gupta
Suvidit Academy