Server is taking too long to respond - What does this mean?
-
A client has 3 sites that he would like for me to look at. Whenever I attempt to visit them on my home internet I get this message:
The connection has timed out
The server is taking too long to respond. When I take my iPhone off Wi-Fi and use AT&T, the site comes up fine. What is going on here?
-
More than likely it is one of three things: a DNS issue, a peering issue, or a temporary ban.
If you were FTPing into the site with too many concurrent threads open (usually more than 4 or 5, though it all depends on the server settings), the server can issue a temporary ban on your IP address. Depending on how the server is set up, you either get an explicit message, which is bad, or a generic error like yours, which is good: it means the server is just shedding load.
A DNS issue could mean a name server is down somewhere or having other problems. You generally cannot do anything about this, and such problems are usually fixed quickly because the sheer amount of sites and information hosted on those servers makes them vital.
A peering problem, like a DNS issue, is usually spotty, and more than likely that is what was happening here. A peering issue means you cannot access the "chunk" of the internet that the affected peer routes traffic through. So you may still be able to reach 99.9% of everything you want, because that traffic does not go through the problematic peer.
The best tools for diagnosing these problems: Tor, a SOCKS proxy that routes your traffic through other networks, so you are effectively accessing the site through a different ISP, one that may not be having peering or DNS issues with the hosting ISP. You can also use http://www.whatsmydns.net/, which shows what DNS servers around the world return for a domain and will tell you if a major DNS server is having an issue. For a quick general check, there is also http://www.downforeveryoneorjustme.com/.
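If you want to run this kind of multi-resolver check yourself, the core of what whatsmydns.net does can be sketched in a few lines of Python. This is a rough hand-rolled illustration, not the actual tool; the resolver IPs are the well-known Google and Cloudflare public resolvers, and the domain is just a placeholder:

```python
import socket
import struct

def build_dns_query(domain, query_id=0x1234):
    # Standard query header: id, flags (recursion desired), 1 question, 0 answers
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # Encode the domain as length-prefixed labels, e.g. 7"example" 3"com" 0
    qname = b"".join(bytes([len(p)]) + p.encode() for p in domain.split("."))
    qname += b"\x00"
    # QTYPE=1 (A record), QCLASS=1 (IN)
    return header + qname + struct.pack(">HH", 1, 1)

def query_resolver(domain, resolver_ip, timeout=3):
    """Send a raw A-record query to one specific resolver; return the raw reply or None."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(build_dns_query(domain), (resolver_ip, 53))
        return sock.recvfrom(512)[0]
    except (socket.timeout, OSError):
        return None
    finally:
        sock.close()

if __name__ == "__main__":
    # Ask a few public resolvers for the same record and compare
    for name, ip in [("Google", "8.8.8.8"), ("Cloudflare", "1.1.1.1")]:
        reply = query_resolver("example.com", ip)
        print(name, "answered" if reply else "no reply / timed out")
```

If one resolver answers and another times out, that points at a DNS or routing problem on one path rather than at the site itself.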
-
Check with your client's IT folks or hosting service. I think this is an outside chance, but if you have been running spiders from your home computer to check the site, you may have been hitting it too hard and slowing it down, and the server may be blocking your IP because you are seen as a spammer. That is why when you change ISPs you are golden: you are seen as a different "user".
I took down one of our sites once with a spidering tool. They were pushing new code right when I hit the site, and the number of requests per second that I thought was OK turned out to be hitting during peak traffic time. (D'oh!)
I adjusted my crawl rate down and everything was fine. Again, this is just a guess, but it's worth checking given your symptoms.
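Turning the crawl rate down boils down to enforcing a minimum delay between requests. Here's a minimal generic sketch in Python (not any particular tool's implementation; the 2-second delay is an arbitrary example value, and most commercial crawlers expose this as a "crawl delay" or "requests per second" setting):

```python
import time
import urllib.request

class PoliteCrawler:
    """Tiny throttled fetcher: never issues requests closer together than delay_seconds."""

    def __init__(self, delay_seconds=2.0):
        self.delay_seconds = delay_seconds
        self._last_request = 0.0

    def seconds_to_wait(self, now):
        # Time still left before the next request is allowed
        return max(0.0, self._last_request + self.delay_seconds - now)

    def fetch(self, url):
        # Sleep off any remaining delay, then record the request time
        time.sleep(self.seconds_to_wait(time.time()))
        self._last_request = time.time()
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read()
```

One request every couple of seconds is slow for a big site, but it will not knock anything over during a deploy or a traffic peak.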
Good luck!
-
Yeah they all work for me too.
So this remains one of the weirder topics on here, but for different reasons than I first suspected. I'm really not sure what to tell you. Sorry.
-
They all work for me; the topsmagic site takes a while to load, though.
-
That's weird. What are the domains? Let's see if I can access them.
-
Wait, are you saying this is just for your client's sites, and you can access other sites just fine? Is that how you posted this question?
Sorry, I'm confused.
-
My internet is working fine. I'm on moz.org right now using my home connection. It's only when I attempt to visit those 3 websites that I get the error.
-
Your internet and/or router is down? Yeah, I'd power-cycle the router and modem and try again, or contact your cable company.
No offense but this is one of the weirdest Q&A posts I've seen here. I'm having a weird morning though so it totally fits.
Related Questions
-
Using GeoDNS across 3 server locations
Hi, I have multiple servers across the UK and USA. I have a web site that serves both areas, and I was looking at cloning my sites and using GeoDNS to route visitors to the closest server to improve speed and experience. So UK visitors would connect to the UK dedicated server, North America to the New York server, and so on. Is this a good way to do it, or would this affect SEO negatively? Cheers, Keith
Technical SEO | | Keith-0071 -
Server Connection Error when using Google Speed Test Insight and GTMetrix
Hi Guys, I recently ran into an issue when testing the load speed of my website (https://solvid.co.uk). Occasionally, Google PageSpeed Insights gives me a server connection error which states: _"PageSpeed was unable to connect to the server. Ensure that you are using the correct protocol (_http vs https), the page loads in a browser, and is accessible on the public internet." GTMetrix gives me an error as well, which states the following: "An error occurred fetching the page: HTTPS error: SSL connect attempt failed". All of my redirects seem to be set up correctly, as does the SSL certificate. I've contacted my hosting provider (GoDaddy); they are saying that everything is fine with the server and the installation. I also tried different browsers in incognito mode, and it still gives me the same error. Until yesterday I hadn't had such a problem. I've also attached the error screenshot links. I would really appreciate your help! Dmytro
Technical SEO | | solvid1 -
Connecting long tail keywords to pages
New to Moz and this forum, so be gentle. 🙂 I'm in the process of overhauling a generally neglected website and have just finished some research on long tail keywords. My question is, how do I implement these? For example, I've got a product "Acme Widget" which has its own page on the site (and ranking well for the product name itself). I have lots of long tail keyword sets which describe key benefits of the products – some of which appear in the product copy, others which don't (perhaps because the thing that a user may search for is ugly/bad English in copy). For the sake of argument, let's say I have the following long tail keywords for my Acme Widgets: cheap red widget los angeles, widget strong green, widgets florida. What is the best way to implement these? Do I need to simply incorporate the text into my main Acme Widgets page, or do I need to have separate pages which are highly targeted to each long tail keyword? The problem with the former is unnatural/ugly copy. The problem with the latter is that coming up with enough content to justify (and rank) a page on each keyword set would be quite a challenge. Regards,
Technical SEO | | Warren_Vick
Warren0 -
Need Third Party Input. Our Web host blocked all bots including Google and myself because they believe SEO is slowing down their server.
I would like some third-party input, partly for my sanity and also for my client. I have a client who runs a large online bookstore. The bookstore runs on Magento, and the developers are also apparently the web host. (They actually run the servers; I do not know if they are sitting under someone's desk or are actually in a data center.) Their server has been slowed down by local and foreign bots. They are under the impression my SEO services are sending spam bots to crawl and slow down their site. To fix the problem they disallowed all bots. Everything: Google, Yahoo, Bing. They also banned my access to the site. My client's organic traffic instantly took a HUGE hit. (Almost 50% of their traffic is organic, and over 50% is organic + AdWords, mostly from Google.) Their keyword rankings are taking a quick dive as well. Could someone please verify the following as true, to help me illustrate to my client that this is completely unacceptable behavior on the part of the host? I believe: 1.) You should never disallow ALL robots from your site as a solution for spam. As a matter of fact, most of the bad bots ignore robots.txt anyway; it is a way to limit where legitimate crawlers like Google search (which is obviously a technique to be used). 2.) On-site SEO work, as well as link building, etc., is not responsible for foreign bots and scrapers putting a heavy load on the server. 3.) Their behavior will ultimately lead to a massive loss of rankings (already happening) and a huge loss of traffic (already happening), and ultimately, since almost half the traffic is organic, the client can expect to lose a large sum of revenue from purchases made by organic visitors once that traffic disappears. Please give your input and thoughts. I really appreciate it!
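For reference, the blanket block described in point 1 is a two-line robots.txt. A more targeted version is sketched below; "BadBot" is an illustrative name, Crawl-delay is honored by engines like Bing and Yandex but ignored by Googlebot (whose crawl rate is set in Search Console), and truly abusive scrapers ignore robots.txt entirely and have to be blocked at the server or firewall level:

```text
# What the host deployed (blocks every compliant crawler, Googlebot included):
#   User-agent: *
#   Disallow: /

# A more targeted robots.txt: ban only a known offender, slow everyone else down
User-agent: BadBot
Disallow: /

User-agent: *
Crawl-delay: 10
```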
Technical SEO | | JoshuaLindley1 -
Rel="Follow"? What the &#@? does that mean?
I've written a guest blog post for a site. In the link back to my site they've put a rel="follow" attribute. Is that valid HTML? I've Googled it but the answers are inconclusive, to say the least.
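For what it's worth, "follow" is not one of the registered rel link types in the HTML spec; links are followed by default, so crawlers simply ignore the unknown token. A validator may flag it, but it does no harm. The recognized patterns look like this (URLs are placeholders):

```html
<!-- Default behavior: no rel attribute needed; the link is followed -->
<a href="https://example.com/">followed link</a>

<!-- The registered opt-out -->
<a href="https://example.com/" rel="nofollow">not followed</a>

<!-- "follow" is not a registered value; unknown rel tokens are ignored,
     so this behaves exactly like the first link -->
<a href="https://example.com/" rel="follow">treated as a normal followed link</a>
```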
Technical SEO | | Jeepster0 -
Long URL
I am using SEOmoz software as a trial. It has crawled my site, and a report is telling me that the URL for my forum is too long:

Title: Healthy Living Community
Meta Description: Healthy life discussion forum chatting about all aspects of healthy living including nutrition, fitness, motivation and much more.
Meta Robots: noodp, noydir
Meta Refresh: Not present/empty

1 Warning: Long URL (> 115 characters), found about 17 hours ago
Number of characters: 135 (over by 21)
Description: A good URL is descriptive and concise. Although not a high priority, we recommend a URL that is shorter than 75 characters.
URL: http://www.goodhealthword.com/forum/reprogramming-health/welcome-to-the-forum-for-discussing-the-4-steps-for-reprogramming-ones-health/

The problem is that when I check the page via edit or in the admin section of WordPress, the URL is as follows: http://www.goodhealthword.com/forum/ My question is that I cannot see where this long URL is located; it appears to be a valid page, but I can't find it. Thanks, Pete
Technical SEO | | petemarko0 -
How to extract URLs from a site (without bringing the server down!)
Hi everybody. One of my clients is migrating to a new ecommerce platform, and we need to get a list of urls from the existing site to start mapping out the 301 redirects. Usually, I'd use a tool like Xenu or Integrity to crawl and output a list. However, the database and server setup is so bad that it can't handle the requests from these tools and it sends the site down. This, unsurprisingly, is one of the reasons for the migration. Does anybody know of a way to get a full list of urls without having to make a bunch of http requests which will kill the site? Any advice would be much appreciated!
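One zero-crawl option, assuming the existing site exposes an XML sitemap (server access logs are another request-free source of URLs): fetch sitemap.xml once and read the loc entries out of it, which costs the server a single request instead of a full crawl. A rough Python sketch:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap_xml(xml_text):
    """Extract every <loc> entry from sitemap XML (works for urlset and sitemapindex)."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def fetch_sitemap(url):
    # One HTTP request for the whole URL list instead of thousands of crawl hits
    with urllib.request.urlopen(url, timeout=10) as resp:
        return urls_from_sitemap_xml(resp.read())
```

If the sitemap is stale or missing, combining it with URLs mined from access logs usually gets you close enough to a full list for redirect mapping.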
Technical SEO | | neooptic0 -
How do I redirect non www pages to www on a windows server?
As the .htaccess file cannot be used on a Windows server, I added this PHP 301-redirect code to all the pages (small website, 10 pages), which runs if the URL does not contain www: header( "HTTP/1.1 301 Moved Permanently" ); header( "Location: $location" ); I want to know if this is OK for SEO. Has anyone done this on a Windows server? Or if you have any better methods, it would be great if you could share. Please help. Thanks.
Technical SEO | | ArjunRajkumar0