Will a large percentage of 404 links negatively impact SERP performance?
-
We discovered a broken link and an issue with a dynamically generated sitemap that resulted in 9,000+ pages of duplicate content (namely, there was no actual 404 page; instead, the content for a 404 page populated on each broken URL).
We've corrected that issue, so the 404 page is working correctly now and there aren't any more broken links on the site. However, we just reviewed our Google crawl report and saw that there are now 9,000+ 404 URLs in the Google index.
We discovered the initial error when our SERP performance dropped 60% in a month.
Now that we've corrected all the duplicate content pages, will the vast number of 404 pages negatively impact SERP results? If so, do you recommend doing 301 redirects to the pages they should have gone to, and do you know of any automated tools for performing the 301s? (It's a standard HTML site, no CMS involved.)
Thanks for your help!
-
Thanks for your response. Does it seem probable that this issue caused the 60% drop in SERP performance? The only other variable around the same time was changing hosting providers. We have moved other clients to this provider and never saw this kind of change, but those moves were usually at the beginning of an SEO campaign, not in the middle.
-
Give it a bit of time and it should fix itself. Google will recrawl those URLs, see that they now return a proper 404, and drop them from the index.
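If you do want to 301 the old URLs rather than wait, there's no need for a special tool on a static HTML site: a short script can turn a crawl-report export into Apache redirect rules. A minimal sketch, assuming you can export the broken/correct URL pairs to a CSV and that the site runs on Apache with .htaccess enabled (the file name and CSV layout here are illustrative, not from the thread):

```python
import csv

def build_redirect_rules(mapping_csv_path):
    """Read rows of (broken_path, target_path) from a CSV and emit
    Apache 'Redirect 301' lines suitable for pasting into .htaccess."""
    rules = []
    with open(mapping_csv_path, newline="") as f:
        for broken, target in csv.reader(f):
            # One rule per broken URL, pointing at the page it should
            # have gone to.
            rules.append(f"Redirect 301 {broken} {target}")
    return "\n".join(rules)
```

Pasting the emitted `Redirect 301 /old /new` lines into the site's .htaccess is usually enough; the same mapping could just as easily be translated into nginx `rewrite` rules if that's the server in use.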
Related Questions
-
Google is showing erroneous results on SERPs page
Hello, All, In April, two months ago, we caught a hack on a client's website. It created about 40 pages in what looked to be a black-hat link tactic. We removed the pages, resubmitted the sitemap.xml (it reprocessed), and ran the site through Screaming Frog to confirm all the pages were gone, but the forty pages still show up in the search results for a site search. We have both the www and non-www versions of the site claimed and have set a preference. Nothing is awry with the robots.txt. We're not really sure what to do to resolve it. We asked Google to recrawl (fetch) the site. I'm not sure what's going on with it. The website's name is fortisitsolutions.com The site search bringing up the pages from the hack is below. site:www.fortisitsolutions.com Any ideas?
On-Page Optimization | Cazarin-Interactive
-
Product Page Links
I have a product category page at https://www.hurtlegear.com.au/s1000rr/ which currently has 38 products on it. The problem is, all the product titles start with the same text: "bmw s1000rr" (because that's what they are), so there are 38 anchored internal links on that page, all starting with the same keyword. You can see how that might look to the Google crawler. Recently that page dropped from around 15 to outside the top 100, and Moz tells me the page is keyword-stuffed with "bmw s1000rr" (no surprise), so I'm guessing that may be why the page has disappeared from the SERPs. I don't really want to change all the product titles (then they wouldn't make sense), so I'm just wondering if there is any way around this. Is there some way of telling Google that this is a product category page and therefore to ignore the anchor text in all of those product links? Can/should the links have some kind of markup on them? Or is the page beyond help? Basically I'm looking for a way of keeping the product titles as they are while avoiding a page penalty from Google. I'm a bit of a newbie; any suggestions would be most appreciated. Cheers, Graeme
On-Page Optimization | graeme72
-
Impact of multiple links on the same page to the same url (different anchor text) ?
Hi, On our category pages, every product has several links pointing to it: on the image, on the product name, on the short description, on "read more", and a JavaScript onclick on the entire div. Could this have a negative impact on link juice distribution, or is it counted as only one link with the first anchor text found on the page? Thanks,
On-Page Optimization | Strelok
-
Impact of number of outgoing links on Page Rank of an optimized page?
What is the current best practice on the preferred number of outbound links on a page you are trying to rank? According to online resources, from a pure PageRank perspective a high number of outbound follow links can have a negative impact not only on child pages but also on the page itself: http://pr.efactory.de/e-outbound-links.shtml Other resources suggest that placing high-quality outbound links on a page (nofollow) increases the trust and authority of a page. Are there any other elements to keep in mind? Is the best practice to avoid any follow links on a page you want to rank well in Google? Thanks /T
On-Page Optimization | thomaspro
-
404 Error to homepage
Is there any risk in forwarding 404 links directly to the homepage? I already have a 404 page, but now Google is showing that I have lots of 404 links, and some of them I can't control in order to fix this issue. Is there any problem for SEO if I do so?
On-Page Optimization | chandubaba
-
Which pages on my site should I back link to
The majority of the backlinks I have been creating link directly to our home page and to the store page. Is this the best approach, or should I be trying to spread the links throughout our site to include product categories, subcategories, etc.?
On-Page Optimization | Hardley
-
What is the rule of thumb for adding links to your blog posts?
I have started keeping detailed records of all my blog postings. Is it OK to link to my own URLs? I make sure to link to another blog post in each post, and I link to sources as well. Thanks in advance for the advice!
On-Page Optimization | rivercityransom
-
Alternatives for having less then 100 links per page
Guys, I'm aware of the recommendation to have <100 links per page. The thing is, I'm running a vacation rental website (my clients pay me to advertise their properties on my website). We use an AJAX interface with pagination to show the properties, so I have cities with 400+ properties on them. The pagination works fine, but Google can't crawl through it (there is a Google doc about making AJAX systems crawlable, but that would involve a huge rewrite of our code, and I don't understand how it helps SEO). So my question is: what do I do so that each property has at least one link pointing to it while keeping the number of links on each page under 100? Any suggestions?
On-Page Optimization | pqdbr
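For what it's worth, one common workaround (purely a sketch, not from any official Google guidance) is to generate plain paginated HTML index pages alongside the AJAX interface, so every property gets at least one crawlable `<a href>` link while each generated page stays under 100 links. The function name and page size below are illustrative:

```python
def build_link_pages(property_urls, per_page=99):
    """Chunk property URLs into groups of <100 and render each group
    as a block of plain <a href> links for a static index page."""
    pages = []
    for i in range(0, len(property_urls), per_page):
        chunk = property_urls[i:i + per_page]
        # Plain anchors need no JavaScript, so a crawler can follow them.
        pages.append("\n".join(f'<a href="{u}">{u}</a>' for u in chunk))
    return pages
```

Each generated block could be dropped into a simple "browse all properties in this city" page and linked from the category page, so crawlers reach every listing without executing the AJAX pagination.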