What is the best and quickest way to remove URL(s) from the Google and Bing search engines?
-
"Remove URL", "Set Expiry in meta tag", "no index no follow " or some thing else.
-
Don't forget to submit a new sitemap to Google in GWT after you finish your removals and the other suggested changes. That should also push Google to recognize the changes a bit faster than it otherwise would.
http://googlewebmastercentral.blogspot.com/2011/08/submit-urls-to-google-with-fetch-as.html
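To make the resubmission step concrete, here is a minimal sketch (not from the thread; the URLs and function name are hypothetical placeholders) of regenerating a sitemap.xml that omits the removed pages before uploading it in GWT:

```python
# Sketch: rebuild sitemap.xml so it lists only URLs you still want indexed.
# Removed pages must not appear in the new sitemap you submit to GWT.
from xml.sax.saxutils import escape

def build_sitemap(urls, removed):
    """Return sitemap.xml content for urls, skipping anything in `removed`."""
    removed = set(removed)
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        if url in removed:
            continue  # dropped pages stay out of the resubmitted sitemap
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    all_urls = ["https://example.com/", "https://example.com/old-page"]
    print(build_sitemap(all_urls, removed=["https://example.com/old-page"]))
```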
-
Thanks for your response. I applied your method, but the number of indexed pages is not decreasing.
After that, I applied the "Expiry" meta tag to those pages.
-
Thanks for your interest. One more question; let me explain with an example.
The search engines crawled lakhs (hundreds of thousands) of pages. Because of an issue, I would like to remove some of my pages from the search results. As far as I know, I should set the "Expiry" meta tag on the pages I want removed from the results. Could you please let us know how much time this will take? Note: in this example, I set 23 August as the expiry date.
-
Thanks for your answer. Yes, that's the best way to do this, but I am confused between a 301 redirect and the Expiry meta tag. Which method gives the faster response?
-
Thanks for your response.
We want to remove around 100+ URLs daily. I think that's a bit tough to do with the Remove URL tool. If we did, would it impact our domain authority?
-
Another way we found helpful is:
-
adding the meta robots=noindex tag to these pages
-
submitting those URLs + sitemap.xml in Google Webmaster Tools
Hope this helps
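For concreteness, the noindex step above is a single tag in each page's `<head>` (a sketch; the surrounding markup is hypothetical):

```html
<!-- Add to the <head> of every page that should be dropped from the index -->
<meta name="robots" content="noindex">
```

The same directive can also be sent as an `X-Robots-Tag` HTTP response header, which is useful for non-HTML files such as PDFs.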
-
Two very good answers, IMO, by Moosa and Philipp. I am not sure why you are removing these pages, but remember there will be cached copies on the web for some indefinite period. If the pages were only live for a short time, there may be very few copies and they may be hard to find, but typically something survives somewhere.
Best
-
Actually, in my experience the quickest way is to 301 redirect the URL to another URL. This is even quicker than disabling the URL and letting it return a 404 (which would be my second choice).
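As an illustration of the 301 approach (a sketch using Apache's `Redirect` directive; the paths are hypothetical, and equivalents exist for nginx and IIS):

```apache
# .htaccess — permanently redirect the removed page to a live replacement
Redirect 301 /old-page https://www.example.com/new-page
```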
Also, keep in mind that using GWT alone has no impact on Bing!
-
The quickest way to remove a URL from the search index is Google's Webmaster Tools!
Go to GWT and request a URL removal; the URL will be gone for 90 days. Then set the noindex tag so it stays out of the search index for good.
Hope this helps!
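Since an earlier comment mentions handling 100+ URLs a day, checking them by hand doesn't scale. A small script can verify which pages already carry a noindex directive (a sketch shown on static HTML; fetching each live URL is left out, and the names are hypothetical):

```python
# Sketch: detect a robots noindex meta tag in a page's HTML.
# In practice you would fetch each of the 100+ URLs and feed the body in.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def has_noindex(html_text):
    """Return True if the page declares a robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html_text)
    return any("noindex" in d for d in parser.directives)

if __name__ == "__main__":
    page = '<head><meta name="robots" content="noindex, nofollow"></head>'
    print(has_noindex(page))  # True
```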
Related Questions
-
Google showing different links in SERPs
Google search results show my site's links under both URLs, "mydomain.com" and "https://mydomain.com". The https one shows a favicon and the other does not, so I want to keep the https one and remove the other. I went to GSC to submit "mydomain.com" for removal, and it said that the URL will be deleted in all of its variations. So how do I delete the "mydomain.com" links? Should I just index the https ones again? Would that work? Someone suggested doing a 301 redirect on all pages that are displayed twice, but I'm not sure I need to, since I'm using Squarespace and both links lead to the same page.
SERP Trends | | winter22330 -
What are the SEO challenges associated with private search engines, like DuckDuckGo?
I read recently that DuckDuckGo doubled in size in 2017. With their search engine, and other alternatives to Google, taking part of the search market away, how can SEO/Marketing/Web pros keep their websites optimized and get traffic from these private search engines? (Also, do any of you have experience with this? What portion of your search traffic is coming from private search engines?)
SERP Trends | | searchencrypt1 -
URL Parameter for Limiting Results
We have a category page that lists products. We have parameters, and the default is to limit the page to displaying 9 products. If the user wishes, they can view 15 or 30 products on the same page; the parameter is ?limit=9, ?limit=15, and so on. Google is recognizing this as duplicate meta tags and meta descriptions via HTML Suggestions. I have a couple of questions. 1. What should my goal be? Is it to have Google crawl the page with 9 items, or the page with all items in the category? In Search Console, the first part of setting up a URL parameter asks, "Does this parameter change page content seen by the user?" In my opinion, the answer is Yes. Then, when I select how the parameter affects page content, I assume I'd choose "Narrows," because it's either narrowing or expanding the number of items displayed on the page. 2. When setting up my URL parameters in Search Console, do I want to select "Every URL" or just let Googlebot decide? I'm torn, because the description of "Every URL" says the setting could result in Googlebot unnecessarily crawling duplicate content on your site (it's already doing that). Reading further, I begin to second-guess the "Narrows" option. Now I'm at a loss on what to do. Any advice or suggestions will be helpful! Thanks.
SERP Trends | | dkeipper0 -
Is Google Knowledge Graph in Other Countries?
Hello, I have a website that receives international traffic from all over the world. Many of the keywords we target unfortunately trigger the Google Knowledge Graph. I'm curious whether the Knowledge Graph shows up in Google's search results in other countries for the keywords my site targets. Is there any way to test this? Thanks!
SERP Trends | | thetimenow0 -
"something" Search Operatos Usage
Hey guyz,
SERP Trends | | atakala
I know this "" operator to force the search results to include show the results what includes inside the operator exactly the same.
But when I do some google searches, some times google doesn't show the exact match what I wrote in the quote. Like this https://www.google.com/?gws_rd=ssl#q="rfid+çözümleri+şuan" I'm waiting for your replies,
Thank you guyz.0 -
I was wondering if anyone had any advice on Hyperlocal Searches and optimizing for them?
We're planning on building our own directory. Does anyone here have any experience with using directories to drive inbound traffic? Any advice in general on Hyperlocal Searches?
SERP Trends | | PeterConnor0 -
Google Merchant Center Feed Disapproved - Data Quality Good - No Warnings
I have noticed Google Merchant Center has been making many changes over the last month; feeds can now be optimized for certain product attributes. The dilemma is that I have a Google Merchant Center data feed that shows zero warnings and good data quality, yet the entire feed has been disapproved. On many other websites where I noticed the same issues, I was able to fix all the warnings and the feeds were accepted perfectly; this one site's issues are eluding me. Does anybody have suggestions or experience dealing with this problem? Possible issues I have looked into that could be affecting the feed (the Merchant Center guidelines have been reviewed multiple, multiple times): 1. The website has limited duplicate content taken from distributors' product listings (I have fought an unending battle with the site owner to make all product content original). 2. Refurbished products: the site's feed lists all products as "new," but I found some product content marked "refurbished." The guidelines state that refurbished products must be listed and marked as refurbished in the feed. To overcome this issue, I disabled all refurbished products and resubmitted the feed, which did produce a good, approved data feed.
SERP Trends | | SEMCLIX0 -
What is your favorite alternative search engine?
Outside of Google and Bing, what is your favorite search engine, and why? This could be for personal use, competitive research, or user experience.
SERP Trends | | elephantseo0