Longevity of robots.txt effects on Google rankings
-
This may be a difficult question to answer without a lot more information, but I'm curious whether there's any general guidance that could shed some light on the following scenario, which I recently heard about and would like to be able to offer sound advice on:
An extremely reputable non-profit site with excellent rankings went through a redesign and migration to WordPress. During development, a robots.txt file blocking crawlers was used on the dev site on the dev server.
Two months later, it was noticed through GA that traffic to the site was way down. It was then discovered that the blocking robots.txt file had never been removed: the new site (same content, same navigation) had gone live with it in place. The file was removed and a recrawl was requested. If rankings have been damaged, how long might it take for the site to reappear and regain its past standing in the SERPs? What would the expected recovery time be?
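For reference, the kind of robots.txt typically left on a dev server (and, in a scenario like this, accidentally carried over to the live site) is a blanket block like the following — a generic example, not the site's actual file:

```text
User-agent: *
Disallow: /
```
Two short lines are enough to tell every crawler, Googlebot included, to stay out of the entire site.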
-
They were paying attention to GA but lapsed, and when they checked back in they saw the drop in traffic. Great point about that "critical" message. The developers did force a crawl, and I hope you're correct about the time it might take.
-
Thank you methodicalweb. Great suggestions.
-
Thanks, Travis. You've offered a lot of very interesting points.
I will double-check that they have looked at the server log files, but I'm pretty confident they have done that.
They did assure me that the proper redirects were done, but I'm not sure what they did regarding extensions. There was also a server change...
-
Thanks for clarifying KeriMorgret. Much appreciated. As are all your thoughts. I will definitely suggest that the monitoring software be used to avoid any future problems. This was such an unnecessary and frustrating experience.
-
If they had been paying attention to WMT, they would have seen a "critical" message right away saying the site was blocked. Forcing a crawl (crawl all URLs) should result in the site getting indexed extremely quickly, and rankings should return to where they were before.
-
The only thing I would add to the existing responses is that if, following a "site:www.mysite.com" query, you notice that some key landing pages haven't been indexed, then submit them via Webmaster Tools (Fetch as Google).
I would also make sure your sitemap is up to date and submitted via WMT. It will also tell you how many of the sitemap URLs have been indexed.
These two things could speed up your re-indexing. My guess is that if it's a reputable site, and the migration of URLs was done properly, you'll probably get re-indexed quickly anyway.
George
-
Hi Gina,
Yes, that is what I mean. The dev team (or you, if you chose) would get an email saying the robots.txt file had changed. I was in-house at a non-profit where we had an overseas dev team that wasn't too savvy about SEO, so I was the one who would get the emails, then go and send them an email asking them to fix it.
I don't believe there's a hard and fast answer here, as it in part depends on how quickly your site is crawled.
-
If possible, take a look at the server log files. That should give you a better idea of when, and how often, Google crawled the site in recent history. The user agent you're looking for is Googlebot.
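For example, here's a quick one-liner for counting Googlebot hits per day in a combined-format access log. The log lines below are fabricated for illustration, and the log path will vary by server; point the `grep` at your real log instead:

```shell
# Fabricated sample of a combined-format access log (first two lines are
# Googlebot, third is an ordinary visitor):
cat > /tmp/access.log <<'EOF'
66.249.66.1 - - [10/Dec/2012:06:25:24 +0000] "GET / HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/Dec/2012:09:12:01 +0000] "GET /about HTTP/1.1" 200 4211 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.5 - - [10/Dec/2012:09:13:44 +0000] "GET / HTTP/1.1" 200 5123 "-" "Mozilla/5.0"
EOF

# Keep only Googlebot lines, pull the date out of the timestamp field,
# and count hits per day:
grep -i "googlebot" /tmp/access.log \
  | awk '{print $4}' | cut -d: -f1 | tr -d '[' \
  | sort | uniq -c
# prints something like: "   2 10/Dec/2012"
```
A steady daily count that suddenly drops to zero around the relaunch date would confirm when the block took effect.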
Aside from the robots.txt faux pas, it's also possible that the proper redirects weren't put in place; that would also account for a dip in traffic. WordPress URLs are generally extensionless, which means any previous URL that contained an extension (.php, .html, .aspx) won't resolve properly, so the site would lose a chunk of referral traffic and link equity. Further, if URL names have been changed, from something like /our-non-profit.html to /about-our-non-profit, those require redirects as well.
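To illustrate the extension issue, here is a sketch of the kind of 301 rules involved, assuming an Apache server with mod_rewrite (e.g. in the site's .htaccess); the paths reuse the example above, and the exact rules depend on the site's URL structure:

```apache
RewriteEngine On

# Renamed pages need explicit one-to-one redirects, matched first:
RewriteRule ^our-non-profit\.html$ /about-our-non-profit [R=301,L]

# Then send any other extensioned URL to its extensionless equivalent,
# e.g. /contact.html -> /contact:
RewriteRule ^(.+)\.(html|php|aspx)$ /$1 [R=301,L]
```
Putting the specific rename rule before the catch-all matters: otherwise /our-non-profit.html would be redirected to /our-non-profit, which no longer exists.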
I've seen brand new domains index in a matter of days, then rank very well in as little as one month. But that's the exception, not the rule.
Provided proper redirects are in place and nothing too drastic happened to on-page considerations, I would guesstimate two weeks to a month. If you start heading into the month time frame, it's time to look a little deeper.
edit: If the server changed, that would also add another wrinkle to the problem. In the past, one of my lovely hosts decided to force a change on me. It took about a month to recover.
-
Thanks so much for your response, KeriMorgret. I'm not sure I fully understand your suggestion, unless you're saying it would have alerted the dev team to the problem? If so, I'll pass this on to them; thank you.
The developer removed the robots.txt file, which fixed the problem, and I'm trying to ascertain whether there's a general expectation of how quickly something like this - a de-indexing - gets reversed within the Google algorithm.
-
I don't know how long it will take for reindexing, but I do have a suggestion (I've been in a very similar situation at a non-profit in the past).
Use monitoring software like https://polepositionweb.com/roi/codemonitor/index.php that will check your robots.txt file daily on your live server and any dev servers, and email you if there is a change. Also, suggest that the live server's robots.txt file be made read-only, so it's harder to overwrite when updating the site.
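The core of such a monitor is just a daily fetch-and-compare. Here's a minimal sketch of that logic; the "fetched" copies are written from literal strings so the example is self-contained, whereas a real cron job would pull the live file with curl and email on change:

```shell
# Simulate yesterday's healthy robots.txt and today's fetch, where the
# dev server's blocking file has been pushed live by mistake:
printf 'User-agent: *\nDisallow:\n'   > /tmp/robots_yesterday.txt
printf 'User-agent: *\nDisallow: /\n' > /tmp/robots_today.txt

# Compare the two copies byte-for-byte; any difference triggers an alert:
if cmp -s /tmp/robots_yesterday.txt /tmp/robots_today.txt; then
    echo "robots.txt unchanged"
else
    echo "ALERT: robots.txt changed"   # a real script would email the team here
fi
```
Run daily from cron, even this crude check would have caught the two-month block described above on day one.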