Dealing with 404 pages
-
I built a blog on my root domain while I worked on another part of the site at .....co.uk/alpha. I was really careful not to have any links point to /alpha, but it seems Google found and indexed it anyway. The problem is that part of /alpha was a copy of the blog, so now we have a lot of duplicate content. The /alpha part is now ready to be moved over to the root domain, and the initial plan was to then delete /alpha. But now that it's indexed, I'm worried that I'll have all these 404 pages, and I'm not sure what to do. I know I can just do a 301 redirect from all those pages to the new ones in case a link comes in, but I need to delete those pages because the server is already very slow. Or does a 301 redirect mean that I don't need those pages anymore? Will those pages still get indexed by Google as separate pages? Please assist.
-
after a 301 redirect can I delete the pages and the databases/folders associated with them?
Yes. Think of a 301 redirect like mail forwarding. If you live at 1000 Main Street and then move to a new address, you leave a forwarding order (the 301 redirect) with the post office. Once that is done, you can bulldoze the house (i.e. delete the web page/database) and the mail will still be forwarded properly.
How does one create a 301 redirect?
The method of creating a 301 redirect varies based on your server setup. If you have a LAMP setup with cPanel, there is a Redirect tool. Otherwise I would suggest contacting your host and asking how to create a redirect for your particular setup.
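On a plain Apache setup, one common approach is a rule in an .htaccess file in your document root. A minimal sketch, assuming mod_alias is enabled and that the paths under /alpha mirror the structure of the root site (adjust to your own setup):

```apache
# Permanently redirect everything under /alpha to the same path at the root.
# e.g. /alpha/blog/my-post  ->  /blog/my-post
RedirectMatch 301 ^/alpha/(.*)$ /$1
```

Because the rule is pattern-based, this single line covers every page in the folder; you don't need one redirect per URL.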
-
Ryan,
Two things.
First - after a 301 redirect can I delete the pages and the databases/folders associated with them?
Second - How does one create a 301 redirect?
-
Hi Ryan,
I agree with you, but I wanted to offer an alternate solution to the problem. I know it is the more difficult option and not the one usually chosen.
As I said, only if he gets no traffic from those pages should he delete them from the index. Also, he said in the question that the /alpha folder was indexed by mistake, so the line from your comment, "That tool was designed to remove content which is damaging to businesses such as when confidential or personal information is indexed by mistake," seems to contradict "The indexed content are pages you want in the index but simply have the wrong URL." The wrong URL means a different page.
Anyway, I will definitely go with your solution, but sometimes having two options helps you choose the better one.
Thanks
-
Semil, your answer is a working solution but I would like to share why it is not a best practice.
Once the /alpha pages were indexed you could have traffic on them. You cannot possibly know who has linked to those pages, e-mailed links, bookmarked them, etc. By providing a simple 301 the change will be completely seamless to users. All their links and bookmarks will still work. Additionally if any website did link to your /alpha pages, you will retain the link.
The site will also benefit because it is already indexed by Google. You will not have to wait for Google to index your pages. This means more traffic for the site.
The 301 is very quick and easy to implement. If you are simply moving from the /alpha directory to your main site then a single 301 redirect can cover your entire site.
I will offer a simple best practice of SEO (my belief which not everyone agrees with) which I do my best to follow. NEVER EVER EVER use the robots.txt file unless you have exhausted every other possibility. The robots.txt file is an inferior solution that many people latch on to because it is quick and easy. In your case, there is no need to adjust your robots.txt file at all. The original poster stated an intention to delete the /alpha pages. Those pages will no longer exist. Why block URLs which don't exist? It doesn't offer any benefit.
Also, it makes no sense to use the Google removal tool. That tool was designed to remove content which is damaging to businesses such as when confidential or personal information is indexed by mistake. The indexed content are pages you want in the index but simply have the wrong URL. The 301 redirect will allow your pages to remain in the index and for the URL to be properly updated. In order for the 301 to work correctly, you would need to NOT block the /alpha pages with robots.txt.
The solution you shared would work, but it is not as friendly all around.
-
Whoops! Thanks for correcting my answer...
-
The reason for not using a 301 is that /alpha is not a page or folder you created for your users, so I wouldn't put a 301 there. It got indexed, that's it. Are you getting any traffic from it?
No? Then why do you need to redirect? Remove the pages and ask the search engine to remove them from its index. That is all.
-
Thanks Dan,
Is there a way of blocking an entire folder or do I have to add each link?
-
How can I ask them to remove it in Webmaster Tools? And how can I ask for everything in the /alpha folder not to be indexed, or do I have to write out each URL?
Why do you think my case isn't good for 301 redirects?
-
You have to be very careful from the start, but Google has already indexed your /alpha folder, so don't worry about that now.
Using a 301 is not something I would do in your case. Ask Google to remove those URLs from the index in GWT, and add a robots.txt rule to prevent /alpha from being indexed.
Thanks,
-
You can perform the 301 redirect and you will not need those pages anymore. Using the redirect would be a superior SEO solution over using the robots.txt file. Since the content is already indexed, it will stay indexed and Google will update each page over the next 30 days as it crawls your site.
If you block /alpha with robots.txt, Google will still retain the pages in its index, users will experience 404s, and your new pages won't start to be properly indexed until Google drops the existing pages, which takes a while. The redirect is better for everyone.
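The behavior described above can be sketched end to end with Python's standard library (all URLs and content here are made up for illustration): the server keeps nothing at the old /alpha paths except a 301 pointing at the root-level URL, yet a client requesting an old URL still lands on the new page.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Pages now live only at their new root-level URLs (hypothetical content)
NEW_SITE = {"/blog/post-1": b"post one content"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/alpha/"):
            # Old /alpha URL: no content kept, just a permanent redirect
            self.send_response(301)
            self.send_header("Location", self.path[len("/alpha"):])
            self.end_headers()
        elif self.path in NEW_SITE:
            body = NEW_SITE[self.path]
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# A client (or a crawler) requesting the old /alpha URL is forwarded
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/alpha/blog/post-1")
print(resp.status, resp.url)  # 200, and the final URL is the new /blog/post-1 location
server.shutdown()
```

This is the "bulldoze the house" point in practice: the old page's content and database can be gone entirely, because the 301 response carries only a status code and the new Location.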
-
Hi
If you do not want them in the index you should block them in your robots.txt file like so:
User-agent: *
Allow: /
Disallow: /alpha
-Dan
PS - Some documentation on robots.txt
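If you do go the robots.txt route, you can sanity-check a rule before deploying it with Python's standard-library robotparser (the hostname is just an example). Note I've dropped the redundant `Allow: /` line here: allowing everything is the default, and Python's parser applies rules in file order, whereas Google uses longest-match.

```python
from urllib import robotparser

# The Disallow rule from the answer above; "Allow: /" is implicit
rules = """\
User-agent: *
Disallow: /alpha
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Everything under /alpha is blocked; the rest of the site is not
print(rp.can_fetch("*", "http://example.co.uk/alpha/blog/some-post"))  # False
print(rp.can_fetch("*", "http://example.co.uk/blog/some-post"))        # True
```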
EDIT: I left my answer up, but don't listen to it. Do what Ryan says.