4xx errors
-
Hi
I checked my campaign for errors on my site and got a report showing a lot of 404 (broken or dead link) errors. How can I view the source of each broken link in order to fix it?
Thank you!
-
I don't see any "drop down" on the broken links SEOmoz lists... I would also like to know where to find the source of each broken link! Why not show the source right inside the error report?
-
Google Webmaster Tools -- create an account there if you don't already have one -- is also a useful way to find 404 errors and track down their sources. Once your account is set up, go to Diagnostics > Crawl Errors > HTTP (this is the default tab for the "Crawl Errors" screen).
-
You want to see where the links to the broken page are coming from?
3 options:
-
Xenu - http://home.snafu.de/tilman/xenulink.html - run it on your site and it will tell you. It's not the prettiest solution, though.
-
Webmaster Tools - Diagnostics > Crawl Errors - click on the page that returns a 404 and it will tell you where the links are coming from.
-
SEOmoz - Set up a campaign in the Pro section and the crawl will give you 4xx errors. Click on that, then in each broken link's drop-down there is an 'Explore Links' option. That will open Open Site Explorer for that link and show you where the links are coming from.
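All three options boil down to the same idea: crawl the site, follow internal links, and remember which page each link was found on. A rough Python sketch of that idea, not what any of these tools literally run -- the start URL is a placeholder, and a real crawler would also need politeness delays and robots.txt handling:

```python
# Rough sketch of what a link checker like Xenu does: crawl internal
# pages and record the referring page for every link that returns 404.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def extract_links(html, base_url):
    """Return absolute URLs for every <a href> found in the HTML."""
    parser = LinkParser()
    parser.feed(html)
    return [urljoin(base_url, h) for h in parser.hrefs]

def find_broken_links(start_url, max_pages=100):
    """Crawl internal pages; return {broken_url: [pages linking to it]}."""
    site = urlparse(start_url).netloc
    queue, visited, broken = [start_url], {start_url}, {}
    while queue:
        page = queue.pop(0)
        try:
            req = Request(page, headers={"User-Agent": "broken-link-finder"})
            body = urlopen(req, timeout=10).read().decode("utf-8", "replace")
        except (HTTPError, URLError):
            continue
        for url in extract_links(body, page):
            if urlparse(url).netloc != site:
                continue  # only follow internal links
            try:
                urlopen(Request(url, headers={"User-Agent": "broken-link-finder"}),
                        timeout=10)
            except HTTPError as err:
                if err.code == 404:
                    # 'page' is the source of the broken link
                    broken.setdefault(url, []).append(page)
                continue
            except URLError:
                continue
            if url not in visited and len(visited) < max_pages:
                visited.add(url)
                queue.append(url)
    return broken
```

Calling `find_broken_links("https://www.example.com/")` (swap in your own domain) gives you exactly the mapping the tools above surface in their reports: each dead URL alongside the pages that still link to it.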
-
Related Questions
-
Are these Search Console crawl errors a major concern to new client site?
We recently (4/1) went live with a new site for a client of ours. The client's site was originally on Point2 before they switched to a template site with Real Estate Webmasters. Now when I look in Search Console I see the following crawl errors: 111 server errors (photos), 104 soft 404s (blogs, archives, tags), and 6,229 not found (listings). I have a few questions. I don't know much about the server errors, so I generally ignore them; my main concerns are the soft 404s and the not-found errors. The soft 404s are mostly tags and blog archives, and I wonder whether I should leave them alone or 301 each one to /blog. The not-found errors are all the previous listings from the IDX. My assumption is that these will naturally fall away after some time, since the new ones have already been indexed, but I wonder what I should be doing here and which of these is affecting me. When we launched the new site there was a large spike in clicks (a 250% increase), which has now tapered off to an average of ~85 clicks versus ~160 at launch. I'm not sure whether the crawl errors have any effect; I'm guessing not much right now. I'd appreciate your insights, Mozzers!
Reporting & Analytics | localwork0
Crawl errors for pages that no longer exist
Hey folks, I've been working on a site recently where I took a bunch of old, outdated pages down. In the Google Search Console "Crawl Errors" section, I've started seeing a bunch of "Not Found" errors for those pages. That makes perfect sense. The thing that I'm confused about is that the "Linked From" list only shows a sitemap that I ALSO took down. Alternatively, some of them list other old, removed pages in the "Linked From" list. Is there a reason that Google is trying to inform me that pages/sitemaps that don't exist are somehow still linking to other pages that don't exist? And is this ultimately something I should be concerned about? Thanks!
Reporting & Analytics | BrianAlpert780
Duplicate Title Errors on Product Category Pages - The best practice?
I'm getting quite a few 'Duplicate Title' errors on category pages that span 2-3 pages. E.g. http://www.partwell.com/cutting-punches http://www.partwell.com/cutting-punches?page=1 http://www.partwell.com/cutting-punches?page=2 http://www.partwell.com/cutting-punches?page=3 All 4 pages currently have the same title... <title>Steel Cutting Punches</title> I was thinking of adding page numbers to the title of each corresponding page, making them all unique and clearing the duplicate page title errors. E.g. <title>Steel Cutting Punches</title> <title>Steel Cutting Punches | Page 1 of 3</title> <title>Steel Cutting Punches | Page 2 of 3</title> <title>Steel Cutting Punches | Page 3 of 3</title> Is this the best way to go about it? Or is there another way that I'm not thinking of? Would I need to use the rel=canonical tag to show that the original page is the one I want to be found? Thanks
Reporting & Analytics | bricktech0
More than 1.8 lakh 404 errors, plus duplicate content, duplicate titles, and missing meta descriptions, increasing on a regular ticket-selling site (CRM) - kindly help
Site errors are increasing: more than 1.8 lakh (180,000) 404 errors, plus duplicate content, duplicate titles, and missing meta descriptions, growing day by day because the site is based on regular ticket selling (CRM). We have checked Webmaster Tools for the 404s, but it is not easy to delete 1.8 lakh entries. How can we resolve this issue for the future? Kindly help and suggest a solution.
Reporting & Analytics | 1akal0
Webmaster tools crawl errors
Hi there, I've been tracking my Webmaster Tools crawl errors for a while now (6 months), and I'm noticing that some pages that have long been 404 still pop up in the crawl errors. Those pages have no data for XML sitemap linking, and the external links pointing to them come from pages that are also long-gone 404s. Those pages return a 404 error page plus a redirect to the homepage, yet Google still notices them with old cached content. Does anyone have a clue why this is happening?
Reporting & Analytics | Or.Shvartz0
Google Webmasters DNS error
Hi, In my Webmaster Tools I have a yellow triangle stating that there is a DNS error preventing Google from crawling my site. The site is indexed, and I have checked Fetch as Google and that seems OK, but the triangle is still there every time I check. The whois sites all have the correct information and point to Hostgator, who I am using. I have contacted them and they said everything seems OK. Should I just carry on as normal with my link building, since the site is indexed, or investigate further? Cheers, Stuart
Reporting & Analytics | stuart420
Unexplained Crawl Diagnostic Errors & Opencart
Hi, I've been looking at the crawl diagnostics for my site and trying to fix the errors that show up, but SEOmoz is producing some strange results. It's saying pages are duplicated up to 16 times, but those pages don't exist. It's adding "page=3", "page=4" to the end of the product URL, but I don't see how it's finding those pages; nothing on the site (as far as I can tell) links to them. There is no "page=3", just the one product page. Again on the duplicate content, it lists URLs like "http:///product-a" under "other URLs", but I don't see where it's finding these URLs either, and obviously those URLs don't work. Those three slashes aren't a typo. So far I've reduced the number of errors from 2,005 to 543, but I can't make sense of the rest. Also, what does one do with two products, e.g. "product-a-white" and "product-a-black", to prevent SEOmoz from seeing duplicates? Canonical links won't work because there's no parent item, just those two. Google Webmaster Tools doesn't seem to have a problem, though. Using Opencart 1.5, if it helps. Cheers,
Reporting & Analytics | AsOneDesign0
Strange 404 Error URL
Can anyone help determine how a URL like "www.mycompany.com/lago_www.bad-nsfw-content.com" would appear on the "not found" crawl error list in Google Webmaster Tools? The "www.bad-nsfw-content" site has nothing to do with our company, and I don't know how it would get associated with our site.
Reporting & Analytics | pbhatt0