Correct way to block search bots temporarily... HTTP 503?
-
Hi,
What is the best way to block Googlebot etc. temporarily? For example, if I am deploying a programming update to our Magento ecommerce platform and am unsure of the results and potential layout/file changes that may impact SEO (Googlebot continuously crawls our site).
How can you block the bots for 30 minutes or so?
Thanks
-
You can do that, but it is less specific about what you are actually doing with your server. The 503 with a Retry-After header lets the spiders know exactly what you are doing (no confusion). Thank you for the clever remark below.
-
Disregard mine; Clever was more... clever, and beat me to it as well.
-
Just disallow the root of the domain in your robots.txt file, and when you're ready to let the bots back in, edit the file back to normal.
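For reference, a minimal robots.txt that blocks all compliant crawlers from the entire site looks like this (note that Google may not re-fetch robots.txt immediately, so a short 30-minute window isn't guaranteed with this approach):

```
User-agent: *
Disallow: /
```

To let the bots back in, change the directive back to `Disallow:` (empty value) or restore your normal rules.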
-
See the response here
http://moz.com/community/q/temporarily-shut-down-a-site
In short, the 503 is correct; you want to include an HTTP Retry-After header so the crawler knows when to come back. It's also key to set this up on your robots.txt file, as Google keys off the status of that file: once it sees robots.txt returning a 503, it will wait until robots.txt returns a 200 again before it starts crawling the full site. Note that you still need to return the 503 on all pages, regardless.
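To illustrate the idea, here is a minimal sketch of a maintenance responder using only Python's standard library: it answers every path, including /robots.txt, with a 503 and a Retry-After header. In practice you would configure this in your web server (Apache/nginx) or use Magento's maintenance mode rather than run a separate process; this is just to show the exact response crawlers should see.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

RETRY_AFTER_SECONDS = 1800  # ask crawlers to come back in 30 minutes

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request, including /robots.txt, with 503 + Retry-After."""

    def do_GET(self):
        self.send_response(503)
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"Down for maintenance. Please retry shortly.\n")

    def do_HEAD(self):
        # Crawlers sometimes probe with HEAD; same status, no body.
        self.send_response(503)
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

def serve(port=8080):
    """Run the maintenance responder until interrupted."""
    HTTPServer(("", port), MaintenanceHandler).serve_forever()
```

Retry-After may be given in seconds (as here) or as an HTTP date; either tells well-behaved crawlers when to resume.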
Another option (that we use a lot on our larger sites) is mirrored sites behind a load balancer. We tell the load balancer to send traffic to www1/www2 while we work on www3/www4. When www3/www4 are updated, we switch the load balancer to them and work on www1/www2; when www1/www2 are done, we put them back into the mix. This makes the update seamless for users and for Google.
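As a rough sketch of that rotation, an nginx upstream block might look like this (the `www1`–`www4` hostnames are placeholders, and the exact mechanism depends on which load balancer you use):

```nginx
upstream magento_backend {
    server www1.example.com;        # in rotation, serving traffic
    server www2.example.com;        # in rotation, serving traffic
    server www3.example.com down;   # pulled out while being updated
    server www4.example.com down;   # pulled out while being updated
}
```

After updating www3/www4, you would move the `down` flag to www1/www2 and reload the configuration; visitors and crawlers never see a maintenance page.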
Cheers