20 x '400' errors on site, but URLs work fine in browser...
-
Hi, I have a new client set up in SEOmoz and the crawl completed this morning... I am picking up 20 x '400' errors, but the pages listed in the crawl report load fine... any ideas?
example -
-
Most major robots obey crawl-delay directives. You could check your errors in Google Webmaster Tools to see whether your site is serving a lot of error pages when Google crawls.
I suspect Google is pretty smart about slowing down its crawl rate when it encounters too many errors, so it's probably safe to not include a crawl delay for Google.
-
Sorry, one last question.
Do I need to add a similar delay for Googlebot, or is this issue specifically a rogerbot problem?
Thanks
-
Fantastic, thanks Cyrus and Tampa, you've saved me many more hours of head-scratching!!!
-
Hi Justin,
Sometimes when rogerbot crawls a site, the server and/or the content management system can get overwhelmed if roger is going too fast, and this causes your site to deliver error pages as roger crawls.
If the problem persists, you might consider adding a crawl delay for roger to your robots.txt file. It would look something like this:
User-agent: rogerbot
Crawl-delay: 5

This would cause the SEOmoz crawlers to wait 5 seconds before fetching each page. Then, if the problem still persists, feel free to contact the help team at [email protected]
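If you want to verify the directive once it's live, here is a minimal sketch using Python's standard library robotparser; the robots.txt URL is an assumption based on the site discussed in this thread:

```python
# A minimal check of what a well-behaved crawler would read from robots.txt,
# using only the Python standard library. The URL below is an assumption.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.morethansport.co.uk/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# crawl_delay() returns the delay in seconds for the given user agent,
# or None if no Crawl-delay directive applies to it.
print(rp.crawl_delay("rogerbot"))        # e.g. 5 once the directive is added
print(rp.can_fetch("rogerbot", "/brand/adidas"))
```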
Hope this helps! Best of luck with your SEO!
-
Thanks Tampa SEO, good advice.
Interestingly, the URL listed in SEOmoz is as follows:
www.morethansport.co.uk/brand/adidas?sortDirection=ascending&sortField=Price&category=sport and leisure
But when I look at the link on the referring page it is as follows:
/brand/adidas?sortDirection=ascending&sortField=Price&category=sport%20and%20leisure
Notice the "%20" encoding in place of the spaces.
The actual URL is the one listed in SEOmoz, but even if I copy and paste the %20 version, the browser decodes it back to spaces and the page loads fine.
I still can't get the site to throw up a 400.
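For what it's worth, %20 is just the percent-encoded form of a space, so both versions point at the same resource; here is a quick sketch of the round trip using only the Python standard library:

```python
# Demonstrates that '%20' is simply the percent-encoded form of a space.
from urllib.parse import quote, unquote

raw = "sport and leisure"
encoded = quote(raw)        # 'sport%20and%20leisure'
decoded = unquote(encoded)  # 'sport and leisure'

print(encoded)
print(decoded == raw)       # True: encoding and decoding is lossless
```

Strictly speaking, a literal space is not valid in a URL; browsers quietly encode it before sending the request, but a crawler that requests the raw-space version verbatim could get a 400 Bad Request from some servers, which may be one plausible explanation for what Roger is seeing.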
-
Just ran the example link that you provided through two independent HTTP response code checkers, and both are giving me a 200 response, i.e. the site is OK.
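If you want to run the same check yourself, here is a minimal sketch using only the Python standard library (the URL is the one quoted earlier in the thread):

```python
# A minimal HTTP status-code checker using only the standard library.
# urllib raises HTTPError for 4xx/5xx responses, so catch it to read
# the code either way.
import urllib.error
import urllib.request

url = ("http://www.morethansport.co.uk/brand/adidas"
       "?sortDirection=ascending&sortField=Price"
       "&category=sport%20and%20leisure")

try:
    with urllib.request.urlopen(url) as response:
        print(response.status)  # e.g. 200
except urllib.error.HTTPError as err:
    print(err.code)             # e.g. 400 if the server rejects the request
```

You could also set the request's User-Agent header to rogerbot's to see whether the server treats the crawler differently from a browser.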
This question has been asked before on here; you're definitely not the first person to run into the issue.
One way to diagnose what's going on is to dig a little deeper into the crawl report that SEOmoz generated. Download the CSV file and look at the referring link, i.e. the page on which Roger found the link. Then go to that page and check whether your CMS is doing anything weird with the way it outputs the links that you create. I recall someone back in December having the same issue; he eventually resolved it by noticing that his CMS put all sorts of weird slashes (i.e. /.../...) into the links.
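If there are many flagged URLs, you could also re-check them in bulk straight from the exported CSV; a rough sketch follows, where the file name and the 'URL' column header are guesses at the export format, so adjust them to match your download:

```python
# A rough sketch: re-check every URL flagged in the crawl export.
# The file name and 'URL' column header are guesses; adjust as needed.
import csv
import urllib.error
import urllib.request

def status_of(url):
    """Return the HTTP status code for a URL, or the error code on 4xx/5xx."""
    try:
        with urllib.request.urlopen(url) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code

with open("crawl_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        url = row["URL"]
        print(status_of(url), url)
```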
Good luck!