4xx Client Error
-
I have two pages showing as errors in my Crawl Diagnostics, but I have no idea where these pages have come from; they don't exist on my site.
I have done a site-wide search for them and they don't appear to be referenced or linked to from anywhere on my site, so where is SEOmoz pulling this info from?
The two links are:
http://www.adgenerator.co.uk/acessibility.asp
http://www.adgenerator.co.uk/reseller-application.asp
The first link has a spelling mistake, and the second link should have an "s" on the end of "application".
-
Is your site verified in Google? Try logging into Google Webmaster Tools and looking at the 404 report there; it will often list the incoming link for a 404 error.
-
The only other time I've seen this happen was with inbound links that Google found on other sites. You can try searching Google itself, though you may or may not be able to discover where they exist.
-
Nope, just checked the sitemap and no reference to them on there either.
-
Thanks, I will check the sitemap. I have done a find-all search and cannot find any reference to those URLs anywhere in my site.
Where does SEOmoz pull this info from?
-
Google picks up links in several ways. If you are 100% sure these links are not in your own site somewhere (even on a single page, either intentionally or accidentally), check your sitemap.xml file, but be aware that Google also picks up links from third-party websites.
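If you want to double-check the sitemap programmatically rather than by eye, a short script can confirm whether either URL appears as a `<loc>` entry. This is a minimal sketch; the inline sample sitemap here is a stand-in for your real sitemap.xml file:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_in_sitemap(sitemap_xml, suspects):
    """Return the subset of suspect URLs that appear as <loc> entries."""
    root = ET.fromstring(sitemap_xml)
    locs = {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}
    return sorted(set(suspects) & locs)

# Stand-in sitemap content; replace with the contents of your sitemap.xml.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.adgenerator.co.uk/index.asp</loc></url>
  <url><loc>http://www.adgenerator.co.uk/reseller-applications.asp</loc></url>
</urlset>"""

suspects = [
    "http://www.adgenerator.co.uk/acessibility.asp",
    "http://www.adgenerator.co.uk/reseller-application.asp",
]
print(urls_in_sitemap(sample, suspects))  # prints [] - neither misspelled URL is listed
```

An empty result tells you the sitemap is not the source, which points back at third-party inbound links.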
If you can't find the source, consider setting up a 301 redirect to resolve both of them; alternatively, don't worry about them if you only have two listed and your site is otherwise doing well.
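Since the site runs classic ASP, one common place to set up those 301s is an IIS rewrite rule. This is a minimal sketch, assuming IIS 7+ with the URL Rewrite module installed; the target filenames (/accessibility.asp and /reseller-applications.asp) are inferred from the question, so verify them against your actual pages before deploying:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 the misspelled URL to the correctly spelled page -->
        <rule name="Fix accessibility typo" stopProcessing="true">
          <match url="^acessibility\.asp$" />
          <action type="Redirect" url="/accessibility.asp" redirectType="Permanent" />
        </rule>
        <!-- 301 the singular URL to the plural page -->
        <rule name="Fix reseller applications" stopProcessing="true">
          <match url="^reseller-application\.asp$" />
          <action type="Redirect" url="/reseller-applications.asp" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

With these in place, any link equity from third-party sites pointing at the broken URLs passes to the real pages, and the 4xx errors should drop out of future crawls.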