What's the best way to deal with deleted .php files showing as 404s in WMT?
-
Disclaimer: I am not a developer
During a recent site migration I have seen an increase in WMT of 404 errors on URLs ending in .php. Clicking a link in WMT just shows "File Not Found" with no custom 404 page. There are about 20 in total showing in Webmaster Tools, and I want to advise the IT department what to do. What is the best way to deal with this for on-page best practice?
Thanks
-
Those pages will eventually drop out of Google's index, but as long as pages (on your own site or on other sites) still link to those URLs, you will continue to see 404 errors. I'm working on fixing the same issue on a site I just started optimizing.
The best thing you can do is set up a 301 redirect from each of the old .php URLs to a similar, relevant page that currently exists on the site. This resolves the 404s and also passes most of the old page's authority to the page it is redirected to.
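If the site runs on Apache, your IT department can set this up in the site's `.htaccess`. A sketch, with placeholder paths rather than your actual URLs:

```apache
# One-off redirect for a deleted .php page (placeholder paths):
Redirect 301 /old-page.php /new-page/

# Or, if a whole folder of .php pages was removed, send them all
# to the closest relevant category page with mod_rewrite:
RewriteEngine On
RewriteRule ^old-folder/.*\.php$ /new-category/ [R=301,L]
```

One `Redirect 301` line per old URL is usually the safest approach for only ~20 pages, since each old page can go to its most relevant replacement.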
Here's some helpful info from Moz on 301 redirects: http://moz.com/learn/seo/redirection
Hope that helps!
-
File not found errors should return an HTTP status code of 404. Google naturally drops 404 pages from its index, so I wouldn't worry about them.
Do check, though, that none of those 404'ed pages should instead be 301'ed to a related page to preserve any authority they have.
You can use the Moz toolbar to check the HTTP status code too.
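If you'd rather check status codes in bulk than one at a time in the toolbar, a minimal Python sketch (the URL you pass in is whatever WMT is reporting):

```python
# Confirm the HTTP status code a URL actually returns.
from urllib.request import urlopen
from urllib.error import HTTPError

def http_status(url):
    """Return the HTTP status code for a URL; 4xx/5xx arrive as HTTPError."""
    try:
        with urlopen(url) as resp:
            return resp.status
    except HTTPError as err:
        return err.code
```

Run it over the list of URLs from the WMT crawl-errors export to confirm each one really returns a 404 (and, after IT adds the redirects, a 301).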