How do I inform Google to remove 404 pages from my website?
-
Hi,
I want to remove more than 6,000 pages from my website because they target bad keywords. I am going to drop all of these pages and make them return a 404.
I want to know how I can inform Google that these pages no longer exist, so that it stops sending me traffic from those bad keywords.
I also want to know: can I use Google's disavow tool to exclude these 6,000 pages of my own website?
-
Do these pages share a common structure?
I had the same issue some time ago, but was lucky enough to have the pages grouped under a few subdirectories, so I could work at the directory level to inform Google about all of them.
I managed to do that by returning a 410 (Gone, i.e. permanently removed) status via the Redirect directive in my .htaccess:
Redirect 410 /category/
Redirect 410 /category2/
...and so on, so that every article in those categories went away.
After that, I also disallowed these categories in my robots.txt.
Hope it helps.
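If the removed pages don't share a handful of common directories, a short script can generate one rule per URL instead of writing 6,000 lines by hand. A minimal sketch in Python; the urls.txt input file and the output filename are hypothetical placeholders:

# Generate one "Redirect 410" rule per removed URL.
# urls.txt: one path per line, e.g. /old-page.html (hypothetical filename).
def make_410_rules(infile="urls.txt", outfile="410_rules.conf"):
    with open(infile) as src, open(outfile, "w") as dst:
        for line in src:
            path = line.strip()
            if path:  # skip blank lines
                dst.write(f"Redirect 410 {path}\n")

if __name__ == "__main__":
    make_410_rules()

One caveat worth double-checking: if robots.txt disallows those directories, Googlebot may never recrawl the URLs and see the 410, which can delay them dropping out of the index.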
-
On the other hand, 6,000 pages may take a little longer than 317 pages... there must be an easier solution to this.
-
What I did was:
Go to www.google.com/webmasters/tools, and make sure your domain is set up.
Go to Google Index > Remove URLs, paste in the URL, and tell Google the page has been removed completely. It will submit the request, and Google will approve the changes you've made.
One problem I had was that you have to do this manually, so I had to copy/paste 317 pages one by one. If anyone on this forum has a better solution, please do let me know!
Hope this will solve your problem
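One semi-automated follow-up for a large batch: before expecting Google to drop thousands of URLs, verify they all actually answer 404 or 410, then let normal recrawling remove them; the manual removal tool is really only needed for urgent cases. A minimal bulk status check using only the Python standard library (urls.txt is a hypothetical input file of absolute URLs):

# Print the HTTP status of each removed URL.
import urllib.request
import urllib.error

def check_status(url):
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 404/410 responses arrive as HTTPError

if __name__ == "__main__":
    with open("urls.txt") as f:  # one absolute URL per line
        for url in (u.strip() for u in f):
            if url:
                print(check_status(url), url)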
Related Questions
-
Will Google crawl and rank our ReactJS website content?
We have 250+ products dynamically inserted and sorted on our site daily (more specifically our homepage... yes, it's a long page). Our dev team would like to explore rendering the page server-side using ReactJS. We currently use a CDN to cache all the content, which of course we would like to continue using. SO... will Google be able to crawl that content? We've read some articles with different ideas (including prerendering): http://andrewhfarmer.com/react-seo/
http://www.seoskeptic.com/json-ld-big-day-at-google/ If we were to only load the schema important to the page (like product title, image, price, description, etc.) from the server and then let the client render the remaining content (comments, suggested products, etc.), would that go against best practices? It seems like that might be seen as showing the googlebot 1 version and showing the site visitor a different (more complete) version.
Technical SEO | | Jane.com
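A quick way to sanity-check any rendering approach here: fetch the page without executing JavaScript and see whether the content you care about is present in the raw HTML, since that approximates what a non-rendering crawler receives. A minimal sketch; the URL and the phrases to look for are hypothetical placeholders:

# Fetch a page the way a non-JS client would and check whether
# given content strings appear in the raw server response.
import urllib.request

def raw_html_contains(url, phrases):
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    return {p: (p in html) for p in phrases}

if __name__ == "__main__":
    result = raw_html_contains("https://www.example.com/",
                               ["Product Title A", "Product Title B"])
    for phrase, found in result.items():
        print("FOUND " if found else "MISSING", phrase)

-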
Does Google see the connection between our 2 websites?
We have a main website and we created a satellite site to support the original one with backlinks. Can I add both sites to the same Google Analytics profile? My programmer said that there is no reason to use a new GA profile for the satellite site, since Google will see the connection between the 2 websites via scripts, Google Plus buttons and other programmed solutions. So, is there a reason to use a new GA account for the satellite site (and later the new satellite sites) as well?
Technical SEO | | Romaine
-
404 errors in Webmaster Tools - should I 301 all pages?
Currently working on a retail site that shows over 1200 404 errors coming from URLs for products that were on the site but have now been removed, as they are seasonal/out of stock. What is the best way of dealing with this situation ongoing? I am aware that these 404s are being marked as URL errors in Google Webmaster Tools. Should I redirect these 404s to a more appropriate live page, or should I leave them as they are and not redirect them? I am concerned that Google may give the site a penalty as these 404s keep growing (the site is an online retail store and has products removed from its page results regularly). I thought Google was able to recognise 404s and, after a set period of time, would push them out of the error report. Also, is there a tool out there that I can run all the 404 URLs through en masse to see their individual page strength and the number of links that point at each one? Thanks.
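For the redirect option, a common pattern is to send each dead product URL to its parent category page rather than the homepage. A minimal sketch that turns an exported list of 404 URLs into 301 rules; the filenames are hypothetical, and it assumes paths like /category/product, so adjust to your own URL structure:

# Map each removed product URL to its parent category and emit
# "Redirect 301" rules for .htaccess.
from urllib.parse import urlparse

def make_301_rules(infile="404_urls.txt", outfile="301_rules.conf"):
    with open(infile) as src, open(outfile, "w") as dst:
        for line in src:
            path = urlparse(line.strip()).path.rstrip("/")
            if not path:
                continue
            category = path.rsplit("/", 1)[0]  # parent directory
            target = (category + "/") if category else "/"
            dst.write(f"Redirect 301 {path} {target}\n")

if __name__ == "__main__":
    make_301_rules()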
Technical SEO | | Oxfordcomma
-
Blank pages in Google's webcache
Hello all, Is anybody experiencing blank pages in Google's 'Cached' view? I'm seeing just the page background and none of the content for a couple of my pages, but when I click 'View Text Only' all of the content is there. Strange! I'd love to hear if anyone else is experiencing the same. Perhaps this is something to do with the roll-out of Google's updates last week?! Thanks,
Elias
Technical SEO | | A_Q
-
How can I tell Google that a page has not changed?
Hello, we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating way too much traffic; half of our page views are generated by Googlebot. We would like to tell Googlebot to stop crawling pages that never change. This one, for instance: http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html As you can see, there is almost no content on the page and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back. The following header fields might be relevant. Currently our webserver answers with the following headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate and Pragma: no-cache, and set Expires e.g. to 30 days in the future? I also read that a webpage that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us. Maybe Google would then also spend more time on pages that actually changed, instead of wasting it on unchanged pages. Do you have any other suggestions for how we can reduce Googlebot's traffic on irrelevant pages? Thanks for your help,
Cord
Technical SEO | | bimp
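On the 304 idea: the mechanism is that the server sends a Last-Modified header, and when Googlebot returns with If-Modified-Since, the server answers 304 with no body. A minimal stdlib-only sketch of that logic with a single hypothetical page and a fixed modification date; a real implementation would look the date up per URL:

# Answer 304 Not Modified when the client already has the current copy.
from wsgiref.simple_server import make_server
from wsgiref.handlers import format_date_time
import time

LAST_MODIFIED = time.mktime((2012, 1, 1, 0, 0, 0, 0, 0, -1))  # when the page last changed

def app(environ, start_response):
    lm_header = format_date_time(LAST_MODIFIED)
    if environ.get('HTTP_IF_MODIFIED_SINCE') == lm_header:
        # Client copy is current: status only, no body re-sent
        start_response('304 Not Modified', [('Last-Modified', lm_header)])
        return [b'']
    start_response('200 OK', [
        ('Content-Type', 'text/html'),
        ('Last-Modified', lm_header),
        ('Cache-Control', 'public, max-age=2592000'),  # 30 days
    ])
    return [b'<html>...page content...</html>']

if __name__ == '__main__':
    make_server('', 8000, app).serve_forever()

-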
Sitemap for dynamic website with over 10,000 pages
If I have a website with thousands of products, is it a good idea to create a sitemap where you show maybe 250 products per page, so it is easy for the search engine to find each part and each part also sits closer to the home page? It seems like Google favours pages that are closest to the home page (the fewer clicks, the better).
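If you build those listing pages, the paging logic is just chunking the product URLs. A minimal sketch that writes HTML sitemap pages of 250 links each; the input file, output names, and bare-bones template are hypothetical:

# Split product URLs into HTML sitemap pages of 250 links each.
PAGE_SIZE = 250

def write_sitemap_pages(infile="products.txt"):
    with open(infile) as f:
        urls = [u.strip() for u in f if u.strip()]
    for i in range(0, len(urls), PAGE_SIZE):
        page_num = i // PAGE_SIZE + 1
        links = "\n".join(f'<li><a href="{u}">{u}</a></li>'
                          for u in urls[i:i + PAGE_SIZE])
        with open(f"sitemap-{page_num}.html", "w") as out:
            out.write(f"<html><body><ul>\n{links}\n</ul></body></html>\n")

if __name__ == "__main__":
    write_sitemap_pages()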
Technical SEO | | roundbrix
-
Can I redirect when Google is showing these as 2 different pages?
Hi Guys, Google Webmaster is showing 1000 duplicate title tags because it's picking up our pages like this. How can I correct this? Please explain in detail. Thank You, Tim
/store/ICICLES_NO_7_CLEAR_WITH_PINK_NUBBY/
/store/ICICLES_NO_7_CLEAR_WITH_PINK_NUBBY
Technical SEO | | fasctimseo
-
Google indexing page with description
Hello, We rank fairly high for a lot of terms but Google is not indexing our descriptions properly. An example is with "arnold schwarzenegger net worth". http://www.google.ca/search?q=arnold+schwarzenegger+net+worth&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a When we add content, we throw up a placeholder page first. The content gets added with no body content and the page only contains the net worth amount of the celebrity. We then go back through and re-add the descriptions and profile bio shortly after. Will that affect how the pages are getting indexed and is there a way we can get Google to go back to the page and try to index the description so it doesn't just appear as a straight link? Thanks, Alex
Technical SEO | | Anti-Alex