Accidentally blocked our site for an evening?
-
Yesterday at about 5pm I switched our site to a new server and accidentally blocked our site from Google for the evening. Our domain is posnation.com and we rank in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake, went to Google Webmaster Tools, and saw that the site was blocked, so I used Fetch as Googlebot and corrected it. Now the message says:
Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)
robots.txt file: http://www.posnation.com/robots.txt
Downloaded: 1 hour ago
Status: 200 (Success)
When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
-
If you have any sort of caching installed, you could try refreshing it and resubmitting the sitemap.
I checked your robots.txt file with the validator at http://tool.motoricerca.info/robots-checker.phtml and it flagged the Allow line. I don't think that would cause a problem, but you could try removing the "Allow: /" line and see if that helps.
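If you want to double-check the file yourself rather than rely on a web validator, Python's standard library can evaluate robots.txt rules for a given user agent. A minimal sketch; the file contents below are an assumed example of what posnation.com's robots.txt might look like, not the actual file:

```python
from urllib.robotparser import RobotFileParser

# Assumed example robots.txt contents (with the flagged "Allow: /" line)
robots_lines = [
    "User-agent: *",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# Check whether Googlebot may fetch the homepage and the sitemap
print(parser.can_fetch("Googlebot", "http://www.posnation.com/"))            # -> True
print(parser.can_fetch("Googlebot", "http://www.posnation.com/sitemap.xml")) # -> True
```

In practice you would point `RobotFileParser` at the live URL with `set_url()` and `read()`, but parsing the lines directly lets you test a proposed change before uploading it.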
-
Hey Nick, thanks for your response. I did the first part, but when I resubmit the sitemap.xml it won't take, due to this error:
URL restricted by robots.txt
But my sitemap file is here: http://posnation.com/sitemap.xml
and robots.txt is not blocking it. Any ideas on what to do next?
-
No, you're OK. It used to be that if your site went down for even a few hours while the spiders came around, you could get deindexed. Now Google seems to understand that stuff happens, and thankfully you have a pretty long grace period before you get deindexed.
Good suggestions by Nick. You can also increase the Googlebot crawl rate on your site in GWMT to get Google to come around again more quickly.
-
If it was just blocked overnight you should be OK. Sites do go down for extended periods occasionally, and I would assume Google won't de-index based on a relatively short outage.
To be safe, or at least to feel like you have done what you can: resubmit your XML sitemap in Webmaster Tools. Also go to the "Fetch as Googlebot" section and fetch your home page. Once it is fetched, click the submit link and tell it to submit the page and all linked pages. You are probably OK without doing that, but it couldn't hurt to resubmit.