Does a robots.txt file permanently affect a website even after it has been removed?
-
A client has a WordPress blog that sits alongside their company website. While they were developing its design, they kept it hidden from search engines: the site was live, but WordPress put a blocking robots.txt in place. When they were ready, they removed the block by clicking the "allow search engines to crawl this site" button.
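For reference, the blocking file WordPress generates while that setting is on typically looks like this (a sketch; the exact output varies by WordPress version):

User-agent: *
Disallow: /

Once crawling is allowed again, WordPress serves a permissive robots.txt instead, so crawlers are no longer turned away.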
It took a month and a half for their blog to show up in search engines once the robots.txt block was removed.
Google is now recognising the site (as a "site:" search shows); however, it doesn't rank well for anything, despite the fact that they are targeting keywords with very little organic competition.
My question is: could the fact that they developed the site behind a robots.txt block (rather than offline) mean the site is permanently affected in the eyes of the search engines, even after that robots.txt has been removed?
Thanks in advance for any light you can shed on the situation.
-
No problem! Good luck!
-
That is a very fair point. It is a completely new site, and I hadn't even thought about things like domain age. It does show up under a "site:http://www.____.com" search; I was just wondering whether this is one of those things Google keeps a memory of, if that makes sense.
Thanks for your response, Mike.
-
That is a very good suggestion. I'll try it (and a useful URL as well, so thanks for sharing).
Thanks for the response, Matthew.
-
I think the much more likely culprit is that it is a new site. What do you get when you enter "site:http://www._____.com" in Google? If the pages are indexed, you can't blame the robots.txt file for the lack of rankings.
Good luck!
Mike
-
Have you submitted the updated robots.txt to Google? This is separate from updating the sitemap. Here is a Google help page that walks you through it.
https://support.google.com/webmasters/answer/6078399?hl=en
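If you want to double-check that the live file really does permit crawling before resubmitting, a quick sanity check with Python's standard urllib.robotparser works (a minimal sketch; example.com is a placeholder for the client's actual blog domain):

from urllib.robotparser import RobotFileParser

# Placeholder URL: substitute the client's actual blog domain.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

# Googlebot should be allowed to fetch the homepage once the block is lifted.
print(parser.can_fetch("Googlebot", "https://www.example.com/"))  # expect: True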
Best!
Matthew