Seeking help correcting a large number of 404 errors; 95% traffic halt
-
Hi, the following GWT screenshot tells a bit of the story:
site: http://bit.ly/mrgdD0
http://www.diigo.com/item/image/1dbpl/wrbp
On about Feb 8 I decided to fix a large number of 'duplicate title' warnings reported under GWT "HTML Suggestions" -- these were URLs that differed only in parameter case and already carried canonical tags, yet were still reported as duplicates in GWT.
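For reference, parameter-case variants look identical to a human but are distinct URLs to a crawler, which is why they show up as duplicates. Here's a minimal Python sketch of the kind of normalization that collapses them (example.com and the lowercase-is-canonical convention are assumptions for illustration, not the actual site's setup):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def normalize_query_case(url):
    """Lowercase query-parameter names so case-only variants collapse to one URL."""
    parts = urlsplit(url)
    pairs = [(k.lower(), v) for k, v in parse_qsl(parts.query, keep_blank_values=True)]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(pairs), parts.fragment))

# Case-only variants collapse to the same canonical URL:
a = normalize_query_case("http://example.com/page?Cat=5&Sort=asc")
b = normalize_query_case("http://example.com/page?cat=5&sort=asc")
print(a == b)  # True
```

Serving a 301 from the non-canonical form to the normalized one (rather than relying on the canonical tag alone) is the stronger signal.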
My traffic had been steady at about 1000 clicks/day.
At midnight on 2/10, google traffic completely halted, down to 11 clicks/day.
I submitted a reconsideration request and was told there was 'no manual penalty'.
Also, the sitemap indexes in GWT began showing 'pending' around the clock from that point on.
By about the 18th, the 'duplicate titles' count had dropped to around 600. The next day, traffic hopped right back to about 800 clicks/day -- for a week -- then stopped again a week later, on the 26th, down to 10/day.
I then noticed that GWT was reporting 20K page-not-found errors -- this has now grown to 35K such errors!
I realized that bogus internal links were being generated because I had failed to disable PHP warning messages, whose text was being printed into the page output. I disabled PHP warnings and fixed what I thought was the source of the errors.
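For anyone hitting the same thing: when PHP's `display_errors` is on, warning text is injected straight into the rendered HTML, and if it lands inside a link or is itself crawled as a relative path, you get exactly this kind of bogus-URL explosion. The production-safe settings look roughly like this (the log path is a placeholder):

```ini
; php.ini -- keep warnings out of rendered HTML on production
; warnings go to the log, not into the page output
display_errors = Off
log_errors = On
error_log = /var/log/php_errors.log
```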
However, the not-found count continues to climb -- and I don't know where these bad internal links are coming from, because the GWT report lists these link sources as 'unavailable'.
I've been through a similar problem last year, and it took Google four months to digest all the bogus pages and recover. If I have to wait that long again I will lose a lot of $$.
Assuming that the large number of internal 404 errors is the reason for the sudden shutoff:
a) How can I verify the source of these internal links, given that Google says the source pages are 'unavailable'?
b) Most critically, how can I do a 'reset' and have Google re-spider my site -- or block the signature of these URLs -- in order to get rid of these errors ASAP?
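On (a): when GWT won't reveal the source pages, a self-run crawl can. A dedicated crawler (Xenu, Screaming Frog) is the easier route, but a small script gives you an exact map of which page emitted each broken href. This is a rough stdlib-only sketch, not the asker's actual setup -- the start URL, timeouts, and page cap are all placeholders:

```python
import urllib.request
import urllib.error
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit

class LinkParser(HTMLParser):
    """Collects href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Return absolute URLs of all anchors found in an HTML page."""
    parser = LinkParser()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

def find_404_sources(start_url, max_pages=500):
    """Breadth-first crawl of one host, recording which pages link to each 404."""
    host = urlsplit(start_url).netloc
    queue, seen, sources = [start_url], {start_url}, {}
    while queue and len(seen) <= max_pages:
        page = queue.pop(0)
        try:
            with urllib.request.urlopen(page, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue
        for link in extract_links(html, page):
            if urlsplit(link).netloc != host:
                continue  # internal links only
            try:
                urllib.request.urlopen(link, timeout=10)
            except urllib.error.HTTPError as e:
                if e.code == 404:
                    # record the page that emitted this broken link
                    sources.setdefault(link, []).append(page)
                continue
            except Exception:
                continue
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return sources
```

Running `find_404_sources("http://www.example.com/")` would return a dict of broken URL -> list of source pages, which is exactly what the GWT report is withholding.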
thanks
-
Hello Rand, I've been facing a similar problem with my site. I'd really appreciate your response here - http://www.seomoz.org/q/help-fixing-the-traffic-drop-that-started-on-4-september-2012.
-
I wouldn't feel too confident that the numbers and dates Google's showing you are precise or accurate. In fact, we've seen times when GWMT is considerably off. I'd watch how Google crawls your site and look at search traffic to your pages - those are likely leading indicators that things are/will be fixed.
-
Thanks for the replies, guys. I had run Xenu on the site and it found no broken links, but the GWT error count still continues to climb. As of today, Google has released a MUCH improved timeline view for the error count. The problem is, it's still showing 58K errors as of yesterday and climbing, long after I fixed them, and it won't show me where it thinks the source is.
These errors are all on internal pages, BTW.
Here's the new Google view:
http://awesomescreenshot.com/0ef1gy6c7
The new GUI also includes a way to mark errors 'fixed' -- one by one! I need to mark 60 thousand at once!
Also, I can see the date these errors started appearing, and it just doesn't make sense, given that that's the same day my traffic started reappearing as well.
-
I agree with Rand's suggestions. I just ran a Screaming Frog crawl of the whole site -- 10,233 links across 8,997 URLs -- and got no 404s. So I think it's pretty safe to assume you've fixed the 404 issue. Here's the output of the crawl in case you'd like it for reference: http://www.sendspace.com/file/7zui0v
I'd say:
- Definitely clean up and resubmit your XML sitemap
- Double check your backlink profile with Open Site Explorer and MajesticSEO to be sure that there aren't sites linking to URLs that no longer exist. If you find any of these, make sure to 301 redirect them. Just take all the target URLs and dump them into Screaming Frog in list mode. All the links from OSE point to your homepage, so they are not an issue; I don't have access to Majestic right now, so I couldn't run those for you.
- You can now submit pages in Google Webmaster Tools as well, in the Fetch as Googlebot section. So you may consider submitting some of the new pages the site generates, in addition to your reconsideration request, to help get Google to re-crawl and find that the 404s are gone.
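The backlink check above can be semi-automated if you'd rather not do it by hand: export the target URLs from OSE/Majestic, check each one's status, and flag the dead ones for 301s. A rough sketch, with placeholder URLs (urlopen follows redirects by default, so anything already 301ing cleanly will report its final status):

```python
import urllib.request
import urllib.error

def check_status(url, timeout=10):
    """Return the HTTP status of a URL via a HEAD request (sketch only)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def dead_backlink_targets(statuses):
    """From {url: status}, return the targets that need a 301 to a relevant live page."""
    return sorted(url for url, status in statuses.items() if status in (404, 410))
```

Each URL that comes back dead should get a 301 to its closest live equivalent (not blanket-redirected to the homepage, which Google tends to treat as a soft 404).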
Good luck man and please let us know if nothing changes after you implement these fixes.
-Mike
-
Hi Mark - wow, sounds really rough. I've got a few suggestions:
- First off, you need to make 100% sure that you've actually fixed the issue: that the internal links point to the right places, AND that any old URLs which may have had internal/external links either rel=canonical or 301 redirect to the correct, updated locations.
- You might try using a few tools to verify this, including the SEOmoz Crawl Test http://pro.seomoz.org/tools/crawl-test and Screaming Frog: http://www.screamingfrog.co.uk/seo-spider/
- When you are ready, submit new XML Sitemaps to Google with the proper URLs. Make sure you've deleted/removed your old ones.
- You can also send the reconsideration request again, indicating that while you're aware this isn't a penalty, you have realized some technical/navigation issues on the site and believe you've now fixed these.
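If the server is Apache, the 301s in the first point can live in .htaccess. These rules are a hypothetical sketch -- the paths are placeholders, not the actual site's URLs:

```apache
# Permanently redirect a single retired URL to its live replacement
Redirect 301 /old-page.php /new-page.php

# Pattern-based redirect for a whole retired section
RedirectMatch 301 ^/old-section/(.*)$ /new-section/$1
```

Once these are in place, a crawl with Screaming Frog (in list mode, fed the old URLs) should show every retired URL returning a 301 to a live 200 page.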
Hope this helps and wish you the best of luck!