How to remove 404 pages in WordPress
-
I used the crawl tool, and it returned a 404 error for several pages that I no longer have published in WordPress. Are they still on the server somewhere?
Do you know how to remove them? I don't think they exist as files on the server (like an HTML file would), since WordPress stores its content in a database.
I figure that getting rid of the 404 errors will improve SEO. Is this correct?
Thanks,
David
-
Yeah...as others have noted, there is often a live link somewhere else that points to a page that is now gone...
So a 404 is really about the LINKING page....as long as that link is out there, it'll point to the non-existent page....so a 301 can help, or (this was fun) you can 301 the incoming 404 link BACK to the linking page itself....
teeHee...yeah, not such a good idea, but a tactic we did have to use about four years ago to get a spam directory to "buzz off!!!"
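For reference, on an Apache-hosted WordPress site a 301 is typically set up either with a redirection plugin or directly in the site's .htaccess file. A minimal sketch, with placeholder paths rather than anything from this thread:

```apache
# Send a removed page to its closest live equivalent (placeholder paths).
Redirect 301 /old-deleted-page/ /replacement-page/

# Equivalent rule using mod_rewrite, which WordPress's default rules already enable:
RewriteRule ^old-deleted-page/?$ /replacement-page/ [R=301,L]
```

Custom redirects like these should sit above the `# BEGIN WordPress` block in .htaccess so they are evaluated before the permalink rewrite rules.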
-
Hey David
Once you publish a page/post in WordPress and submit a sitemap, those URLs are out there. I've run into this problem a lot, as I use WordPress often. Once you trash a page and delete it permanently, it's not stored anywhere in the WordPress CMS; the URLs just return 404s because the pages used to exist and no longer do.
As stated above, just make sure you are not linking to your trashed page anywhere in your site.
I've done a couple things with 404 Pages on my WordPress sites:
1. Make an awesome 404 page, so that people who land on it by accident will stay on the site. Google will eventually stop crawling the 404s, so this is a good way to keep users engaged in the meantime.
2. 301 redirect the 404s to relevant pages. This helps preserve your link juice and also helps the user experience (since visitors reach a relevant page).
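The two fixes above amount to a simple lookup: removed URLs that have a relevant replacement get a 301, and everything else falls through to the custom 404 page. A minimal Python illustration of that logic, with made-up paths (in practice a redirection plugin or server config does this):

```python
# Removed URLs mapped to the most relevant live page (paths are placeholders).
REDIRECTS = {
    "/old-product/": "/products/",
    "/2012/summer-sale/": "/blog/",
}

def resolve(path):
    """Return (status, target) for a request to a removed or unknown path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]   # preserve link equity with a redirect
    return 404, "/404/"               # otherwise serve the custom 404 page
```

Anything worth redirecting goes in the map; everything else still 404s, but lands on a page that keeps the visitor on the site.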
Hope that helps!
-
404s are a natural part of websites, and Google understands that. As long as you don't have links on your own site pointing to pages that 404, you're fine. So basically, just make sure your website is not the source of your 404s.
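One way to check that your own pages aren't the source, sketched here with Python's standard library (the function names and example URL are my own, not from any particular tool): pull the internal links out of each page's HTML, then fetch each one and flag any that come back 404.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(html, base_url):
    """Return absolute URLs of links that stay on the same host as base_url."""
    parser = LinkExtractor()
    parser.feed(html)
    host = urlparse(base_url).netloc
    return [
        url for url in (urljoin(base_url, href) for href in parser.links)
        if urlparse(url).netloc == host
    ]

# Each URL returned here can then be fetched (e.g. with urllib.request),
# and any 404 responses traced back to the page that linked to them.
```

Run against your own pages, any 404 this surfaces is one your site itself is causing, which is exactly the kind worth fixing.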
-
Anything you type after your domain that isn't an actual page will return a not-found error; it doesn't mean the page exists somewhere. (Try entering yourdomain.com/anythingyouwant and you will get a 404.) Or am I misunderstanding the question? In any case, 404 errors are not necessarily bad for SEO, as long as they are not harming the user experience.
Related Questions
-
Reclaim lost links to old pages?
We recently moved our site to a new CMS and did a complete redesign, with new content. It's really hit our SEO. Open site explorer is telling me there are lots of 404 web pages which were basically old product pages that no longer exist, since the way we have structured the site has changed. So is it worth trying to reclaim these links? If so, how can I do that without building pages at the URL - will a 301 redirect be enough?
Moz Pro | jbritchford
-
Social pages not lining up with my website/listing
Hi! Newbie here... 🙂 On my Moz report I have been told that my social sites are not lining up with my web page. Any tips on getting this done? Also, on my Moz Analytics report I've added my social sites, but only Facebook is coming up... Any tips would be appreciated!!! Thanks!
Moz Pro | fullerton
-
1 page crawled - again
Just had to let you know that it happened again. So right now we are at 2 out of the last 4 crawls. Uptime here is 99.8% for the last 30 days, with a small downtime due to an update process on 18/5, from around 2:30 to 4:30 GMT. In relation to: http://moz.com/community/q/1-page-crawled-and-other-errors
Moz Pro | alsvik
-
Crawl Diagnostics 403 on home page...
In the crawl diagnostics it says oursite.com/ has a 403. It doesn't say what's causing it, but it mentions no robots.txt. There is a robots.txt, and I see no problems with it. How can I find out more information about this error?
Moz Pro | martJ
-
On-Page SEO Fixes - Are They Relative?
So, I'm implementing on-page fixes for a site that my company runs SEO services for (www.ShadeTreePowersports.com). However, I was wondering if there is a way to rate a page's SEO quality in general? As of now, it seems like the only way your recommendations can be consumed and acted on is on a per-keyword basis. However, this seems to be the reason for a good number of my F grades. Since my website sells powersports apparel and accessories, we cover a variety of applicable (but different) keywords like 'Motorcycle parts' or 'snow tubes', because we sell so many different types of products. But when I look at my F grades, SEOmoz is telling me my homepage is ranking poorly for a multitude of those pertinent keywords, but only because my page isn't catered specifically to each of them (i.e. 'Snowmobile Parts', 'Water Sport Apparel'). With so many different types of products, catering to a specific one is impossible and would be detrimental. Is there a way to see how a page ranks without factoring in those keywords? Or a better way to use these recommendations more efficiently? Thanks guys!
Moz Pro | BrandLabs
-
Adding canonical still returns duplicate pages
According to SEOmoz, several of my campaigns show that I have duplicate pages (SEOmoz errors). Upon reading more about how to resolve the issue, I followed SEOmoz's suggestion to add rel='canonical' links to each page. After the next SEOmoz crawl, the number of errors related to duplicate pages remained the same, and the number of notices shot up, indicating that SEOmoz recognized that I had added rel='canonical'. I'm still puzzled as to why the duplicate page errors did not go down after I added rel='canonical', especially since SEOmoz noticed that I added the tags. Can anyone explain this to me? Thanks,
Scott.
Moz Pro | MOZ
-
Only a few pages (308 of 1000-something) have been crawled and diagnosed in 4 days; how many days until the entire website is crawled?
I set up the campaign about 4-5 days ago, and yesterday rogerbot said 308 pages were crawled and the diagnostics were provided. This website has over 1000 pages, and I would like to know how long it would take for Roger to crawl the entire website and provide diagnostics. Thanks!
Moz Pro | TejaswiNaidu
-
My campaigns are not analyzing all my pages.
Hi, I created a campaign against http://www.universalpr.com, and this campaign reports that only one page has been crawled. This site uses a JavaScript redirect to the real page, which can be found through the following: www.universalpr.com/wps/portal/universal/univhome/!ut/p/c5/04_SB8K8xLLM9MSSzPy8xBz9CP0os_hQdwtfCydDRwN_Jw9LA0-LAOPQYCdDI_9QY_1wkA6zeAMcwNFA388jPzdVPzi1WL8gO68cANNcdLU!/dl3/d3/L2dBISEvZ0FBIS9nQSEh/ Now I also attempted to create a campaign against this page, in case the JavaScript redirect was breaking things, but that campaign also reported 1 page crawled. Can anyone instruct me as to what I'm doing wrong? Thank you
Moz Pro | jcmoreno