Error Code 612: Error response for robots.txt
-
Hi,
We are getting Error Code 612: Error response for robots.txt in our crawl, but everything looks to be OK with the robots.txt file.
Can you confirm what is wrong?
Thanks
-
Hi Wendy! Kristina from Moz's Help Team here. I wanted to chime in as I had a chance to look over your site, and it appears that your site is blocking AWS.
We are getting a "403 Forbidden" error when attempting to access your site from AWS: http://screencast.com/t/P858BVEQk
Additionally, hurl.it, a third-party tool that also runs on AWS, gets an Internal Server Error when trying to access your site: http://screencast.com/t/N5T822Zpdo
Please reconnect with your developers and make sure they address the issue of your site blocking AWS; that should resolve the error you're currently seeing in Moz.
I hope this helps, but please let us know if there's more we can assist with!
-Kristina
-
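(For anyone curious what "blocking AWS" looks like from the crawler's side, here is a minimal sketch in Python using only the standard library. It spins up a toy local server that refuses one class of clients; real AWS blocking is usually done by IP range at the firewall or CDN level, so the User-Agent check below is just a stand-in for that.)

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy server: requests identifying themselves as the "blocked" client get a
# 403 Forbidden, everyone else gets a 200 -- a stand-in for a site that
# blocks requests originating from AWS IP ranges.
BLOCKED_MARKER = "rogerbot"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "")
        self.send_response(403 if BLOCKED_MARKER in agent else 200)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/robots.txt"

def status_for(agent):
    """Return the HTTP status the toy server gives this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

print(status_for("Mozilla/5.0"), status_for(BLOCKED_MARKER))  # 200 403
```

From the crawler's side a 403 on robots.txt simply means the file is unreachable, which is why it surfaces in Moz as a robots.txt error even though the file itself is fine.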
Chiaryn, can you take a look at something? I am getting a 612 error on this website: www.seminolepowersports.com. The developer is telling me there is nothing wrong from what they can see, and they say Moz uses AWS servers, which they have blocked from the site. Questions for you:
- Does Moz use AWS servers, and could it be that the site is blocking them?
- Or is the site confusing the Moz bot, Roger?
Thank you for your input.
Wendy
-
Hey David, thanks for your question.
I took a look at your campaign, and it seems that this is a bit different from the case in the previous post that Thomas linked to in his reply.
It actually looks like you have a redirect loop in place which could be confusing our bot, Roger. The robots.txt page redirects to the www version of the homepage, which redirects to an /en/home subfolder, which redirects to /en/home?r=US. You can verify this using the third-party tool https://httpstatus.io/ (http://www.screencast.com/t/pk4fvGXJ1).
I can't say with complete certainty that this is causing the error message you are seeing, as I have never seen a redirect loop on a site's robots.txt file, but I do know that the crawler will only follow two redirects; any more than that will prevent us from accessing the page, which would likely be reported as a robots.txt error.
I would recommend fixing that so that you have either a single 301 pointing to a 200 page, or a robots.txt file that responds with a 200 status directly. This will need to be done by your site administrator or developer.
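The two-redirect limit described above can be sketched in a few lines of Python (the URLs and the dict-based "redirect map" are hypothetical; a real crawler would issue HTTP requests and read Location headers):

```python
MAX_REDIRECTS = 2  # the limit described in the reply above

def follow(url, redirects):
    """Follow a chain of redirects, giving up after MAX_REDIRECTS hops."""
    hops = 0
    while url in redirects:
        if hops == MAX_REDIRECTS:
            return None  # too many redirects: the page is unreachable
        url = redirects[url]
        hops += 1
    return url

# The chain from the post: robots.txt -> www homepage -> /en/home -> /en/home?r=US
chain = {
    "http://example.com/robots.txt": "http://www.example.com/",
    "http://www.example.com/": "http://www.example.com/en/home",
    "http://www.example.com/en/home": "http://www.example.com/en/home?r=US",
}

print(follow("http://example.com/robots.txt", chain))   # None: three hops needed
print(follow("http://www.example.com/en/home", chain))  # reachable in one hop
```

With a single 301 pointing to a 200 page, follow returns the final URL; with the three-hop chain above it gives up, which matches the robots.txt error being reported.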
-
I'm not sure if this is of any help to you: https://mza.seotoolninja.com/community/q/without-robots-txt-no-crawling
Related Questions
-
Unsolved Crawler was not able to access the robots.txt
I'm trying to set up a campaign for jessicamoraninteriors.com and I keep getting messages that Moz can't crawl the site because it can't access the robots.txt. Not sure why; other crawlers don't seem to have a problem, and I can access the robots.txt file from my browser. For some additional info, it's a Squarespace site and my DNS is handled through Cloudflare. Here are the contents of my robots.txt file:
# Squarespace Robots Txt
User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: CCBot
User-agent: anthropic-ai
User-agent: Google-Extended
User-agent: FacebookBot
User-agent: Claude-Web
User-agent: cohere-ai
User-agent: PerplexityBot
User-agent: Applebot-Extended
User-agent: AdsBot-Google
User-agent: AdsBot-Google-Mobile
User-agent: AdsBot-Google-Mobile-Apps
User-agent: *
Disallow: /config
Disallow: /search
Disallow: /account$
Disallow: /account/
Disallow: /commerce/digital-download/
Disallow: /api/
Allow: /api/ui-extensions/
Disallow: /static/
Disallow:/*?author=*
Disallow:/*&author=*
Disallow:/*?tag=*
Disallow:/*&tag=*
Disallow:/*?month=*
Disallow:/*&month=*
Disallow:/*?view=*
Disallow:/*&view=*
Disallow:/*?format=json
Disallow:/*&format=json
Disallow:/*?format=page-context
Disallow:/*&format=page-context
Disallow:/*?format=main-content
Disallow:/*&format=main-content
Disallow:/*?format=json-pretty
Disallow:/*&format=json-pretty
Disallow:/*?format=ical
Disallow:/*&format=ical
Disallow:/*?reversePaginate=*
Disallow:/*&reversePaginate=*
Any ideas?
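(Not an answer to the access problem itself, but one quick offline sanity check: Python's standard-library robots.txt parser can tell you whether Moz's crawler, rogerbot, would be allowed by those rules. The snippet uses an abbreviated version of the file above; note that Python's parser applies rules in order of appearance rather than Google's longest-match logic, so treat it as a rough check.)

```python
from urllib import robotparser

# Abbreviated version of the Squarespace robots.txt quoted above.
rules = """\
User-agent: *
Disallow: /config
Disallow: /search
Disallow: /static/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Nothing in the wildcard group blocks rogerbot from the homepage,
# so the rules themselves are not what stops the Moz crawl.
print(rp.can_fetch("rogerbot", "/"))        # True
print(rp.can_fetch("rogerbot", "/config"))  # False
```

Since the rules allow rogerbot, the failure is more likely at the network layer; with Cloudflare in front of Squarespace, a bot-protection or challenge setting is a plausible culprit, though that is speculation without seeing the response Moz actually receives.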
Getting Started | andrewrench0
-
Our crawler was not able to access the robots.txt file on your site
Hello Mozzers! I've received an error message saying the site can't be crawled because Moz is unable to access the robots.txt. I've spoken to the webmaster, and he can't understand why the robots.txt can't be accessed in Moz: https://www.thefurnshop.co.uk/robots.txt. Google isn't flagging anything up to us, either. Does anyone know how to solve this problem? Thanks
Getting Started | tigersohelll0
-
How Do I Scan My New Site & Grade My Work With The Robots Turned Off, For Pre-Inspection Before I Launch My Site?
I have a new site that has all the bots turned off so Google can't index it until I'm finished. I've been working on this site for a couple of months now, optimizing, and I was wondering if there is any way I can run a preliminary scan on the site for my titles, URLs, headers, alt tags, and pretty much anything else that will grade my work and tell me if I did anything wrong? Can Moz do this with the bots turned off? Thanks
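(While the bots are switched off, a rough local substitute is to check titles, headings, and alt tags yourself. Here is a minimal sketch using only Python's standard library; the HTML string stands in for a page you would fetch or save from the unlaunched site.)

```python
from html.parser import HTMLParser

class AuditParser(HTMLParser):
    """Collect a few of the on-page elements mentioned above:
    the title, headings, and a count of images missing alt text."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.headings = []
        self.missing_alts = 0
        self._current = None  # tag whose text we are currently inside

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2", "h3"):
            self._current = tag
        if tag == "img" and not dict(attrs).get("alt"):
            self.missing_alts += 1  # empty or absent alt attribute

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current in ("h1", "h2", "h3"):
            self.headings.append(data.strip())

# Stand-in for a page saved from the unlaunched site.
html = "<html><head><title>Demo</title></head><body><h1>Hi</h1><img src='a.png'></body></html>"
p = AuditParser()
p.feed(html)
print(p.title, p.headings, p.missing_alts)  # Demo ['Hi'] 1
```

This obviously won't replicate Moz's grading, but it catches the basics (missing titles, empty headings, images without alt text) before the site goes live.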
Getting Started | Inframan0
-
0 status codes
Hi, I am hoping someone could help explain: I have a Screaming Frog account and ran a crawl using it, and it returned a lot of 0 status codes ('connection timeout', 'DNS lookup failed' and 'connection refused'). However, the Moz crawl doesn't detect any of this. I am fairly new to these tools; do you know why this is? Thanks a lot! 🙂
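(A 0 status code means Screaming Frog never received an HTTP response at all, which is why those rows look different from Moz's report. A rough sketch of how the three failure modes differ, using only Python's standard library; the host name below is a deliberately unresolvable placeholder.)

```python
import socket

def diagnose(host, port=80, timeout=5):
    """Roughly mirror Screaming Frog's 0-status reasons for one host."""
    try:
        ip = socket.gethostbyname(host)  # DNS step
    except socket.gaierror:
        return "DNS lookup failed"
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return "connected"  # TCP step succeeded
    except ConnectionRefusedError:
        return "connection refused"
    except socket.timeout:
        return "connection timeout"
    except OSError as exc:
        return f"connection failed: {exc}"

# .invalid is a reserved TLD, so this always fails at the DNS step.
print(diagnose("does-not-exist.invalid"))  # DNS lookup failed
```

If the same URLs resolve and connect fine from your machine, the timeouts are likely specific to Screaming Frog's request rate or network, which would also explain why Moz's crawler (crawling from different infrastructure, at a different pace) doesn't see them.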
Getting Started | eLab_London0
-
Where do I enter a promotion code?
Hi, I've been referred from Mailchimp and I have a promo code. It said I could redeem it at checkout, but I didn't find the option there. How can I enter it now? (I am currently on the 30-day free trial.) Thanks!
Getting Started | Stukers0
-
Error help for newbie please
Hi, I signed up after seeing the videos on Udemy and YouTube (Whiteboard Friday), so I've started the free trial and am looking forward to getting my site ranking higher. I've crawled my site www.sussexchef.com and it's come back with the following errors (please see below).
608: I'm sure this information is very important, but I have no idea how to fix the 608. I found a robots.txt in my directory and deleted it, as I think that may be the problem? I crawled the site twice by accident, so I will have to wait till tomorrow to find out.
404: I found it quite hard to find the broken links at first, but once I realized all the information I needed was in the table, I think I got them all. Did I miss a tutorial, or am I just a little out of my depth here?
503: I have no idea how to fix these; I can click the links and they take me to that page or file, so how can the server be down? Or is this because they are links to PDFs? Should I convert them to JPEGs and give them meta data?
I'd be grateful for any help anyone has to offer, as I'm keen to learn how to promote my site better.
Summary of the pasted crawl report:
- 608 (Page not Decodable as Specified Content Encoding) on one or more pages; discovered Sep 2-8.
- 404 errors on 10% of site pages, e.g. /wedding-caterers.aspx, /dinner-party-catering.aspx, /christmas-party-catering.aspx, /wedding-cakes.aspx, /outdoor-catering-specialists.aspx, /hen-party-cupcake-classes.aspx, /funeral-caterers.aspx, /drinks-service.aspx, /private-party-catering.aspx, /corporate-catering.aspx, /caterers.aspx, /funeral-caterers-brighton.aspx.
- 5xx errors on more than 5% of site pages: a 500 on /?attachment_id=511 and 503s on roughly twenty PDF and image URLs, e.g. /wp-content/uploads/2013/08/Wedding-Packages-2014.pdf, /Finger%20Buffet.pdf, /BBQ.pdf, /HogRoast.pdf, /Canape%20Menu.pdf, /?attachment_id=628, /wp-content/uploads/2014/02/2014-02-01-11.12.00.jpg.
Getting Started | SussexChef830
-
'Not a valid URL' error in campaign set-up
I get the error 'not a valid URL' when I'm trying to set up a campaign, but I know it's a valid URL. I have tried with www, non-www, http://, and https://. When I use https:// it lets me start, but then I get an error that https is forwarding to http and that I need to use that instead. When I then put in the http version, I get the original error. Thanks in advance for your help.
Getting Started | HighVoltage0
-
Campaign.crawl-seed.bad-response
I am trying to set up a new campaign for a website, but I keep getting this error message: campaign.crawl-seed.bad-response 😞 I have no idea what the problem is. Can you tell me what I am supposed to do to fix this? The URL I am trying to set up is www.aboutplcs.com
Getting Started | ChadC0