Site Speed
-
I was wondering what the benefits are of investing time and money into speeding up an eCommerce site. We are currently averaging 3.4 seconds of load time per page, and I know from Webmaster Tools that Google considers the benchmark to be closer to 1.5 seconds. Is it worth the effort to get to 1.5 seconds? Any tips for doing so?
Thanks
-
@JustDucky We recently migrated to a new data center and the average load time dropped from ~4 seconds to ~0.9. Even so, I only noticed a 1-2% drop in bounce rate. It seems only that many people were being turned off by the load times. Then again, a 1-2% change could be caused by anything.
@John O'Haver I would invest the time simply because ~3.4 seconds is the average value, which means some loads take 10 seconds or more. Take a look at your analytics account and check the performance per country. Also, I've been benchmarking analytics against remote monitoring solutions and I find a discrepancy of about 30% (probably due to the limited sample data from analytics). I don't want to advertise any particular solution, but trying one won't hurt. You may find your times are better than you think (I hope).
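To illustrate the point about averages hiding a long tail, here is a small Python sketch. The per-request timings are made up for illustration (chosen so the mean matches the ~3.4s from the question); real numbers would come from your analytics export or a remote monitoring tool.

```python
from statistics import mean

# Hypothetical per-request load times in seconds, e.g. exported from analytics.
# An "average" of ~3.4s can hide a long tail of much slower page loads.
samples = [0.9, 1.2, 1.5, 1.8, 2.1, 2.4, 2.7, 3.0, 3.6, 4.5, 6.8, 10.3]

avg = mean(samples)
worst = max(samples)
slow_share = sum(1 for t in samples if t > 5) / len(samples)

# prints: average: 3.4s, worst: 10.3s, over 5s: 17%
print(f"average: {avg:.1f}s, worst: {worst:.1f}s, over 5s: {slow_share:.0%}")
```

Even with a 3.4s mean, roughly one visit in six here waits over 5 seconds, which is the segment most likely to bounce.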
-
Cypra correctly points out that faster sites make for a better user experience, and Alan pointed out how inexpensive a CDN can be. I installed a CDN on a site that already uses W3 Total Cache (W3TC). Page load times were cut in half, but the bounce rate (which is very high) dropped by only 1 or 2%.
Has anyone with multiple sites sampled their bounce rates before and after installing a CDN?
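For anyone who wants to run that comparison across sites, a minimal Python sketch of a before/after bounce-rate tally might look like this (the site names and rates below are made up for illustration; you would substitute figures from your own analytics):

```python
# Hypothetical bounce rates (before CDN, after CDN) for several sites.
sites = {
    "site-a": (0.71, 0.69),
    "site-b": (0.64, 0.63),
    "site-c": (0.80, 0.78),
}

drops = []
for name, (before, after) in sites.items():
    change = (after - before) * 100  # change in percentage points
    drops.append(change)
    print(f"{name}: {before:.0%} -> {after:.0%} ({change:+.1f} pts)")

print(f"average change: {sum(drops) / len(drops):+.1f} pts")
```

With several sites pooled like this, a consistent 1-2 point drop is easier to distinguish from ordinary week-to-week noise than a single site's numbers.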
-
As Doug just said, there is a strong correlation between page speed and user experience: when a user has to wait for a page or an element to load before getting the information, the bounce rate is higher. Since bounce rate is a strong indicator of user satisfaction that will sooner or later be incorporated into algorithmic ranking factors, it's good to address it right from the conception phase.
-
It's not just the search engines you need to consider. Is the speed of your site affecting user experience? Are people giving up because it's just too slow? How many abandoned sessions are you getting? Do you have any opportunity to get feedback from your users?
-
Matt Cutts has said that you need to be pretty slow to incur a penalty; less than 1% of sites fall into that category.
It all depends on what is taking so long. Is it the downloads, slow code, or the server itself?
If downloads are the problem, I would look into using a content delivery network (CDN): in short, hosting your images and static files in the cloud. I use Microsoft Azure cloud services. This will cost you very little, possibly as little as $1 a month.
You can also use Google's PageSpeed tool to get suggestions, but using a CDN would give you the biggest gain.
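As a rough sketch of what "hosting your static files in the cloud" means in practice, the snippet below rewrites relative static-asset URLs in a page to point at a CDN hostname. The `cdn.example.com` host is a placeholder, not a real endpoint; most CDNs and caching plugins (W3TC included) do this rewriting for you.

```python
import re

CDN_HOST = "https://cdn.example.com"  # hypothetical CDN endpoint

# File extensions worth serving from the CDN edge instead of the origin.
STATIC_EXT = r"\.(?:png|jpe?g|gif|css|js|svg|woff2?)"

def rewrite_static_urls(html, cdn=CDN_HOST):
    """Point relative static-asset URLs at the CDN host so images, CSS,
    and JS are served from edge locations instead of the origin server."""
    pattern = re.compile(r'(src|href)="(/[^"]+' + STATIC_EXT + r')"')
    return pattern.sub(lambda m: f'{m.group(1)}="{cdn}{m.group(2)}"', html)

page = '<img src="/images/logo.png"> <a href="/about">About</a>'
# Only the image URL is rewritten; the regular page link stays on the origin.
print(rewrite_static_urls(page))
```

Note that dynamic pages (like the `/about` link here) stay on your own server; only the heavy static assets move to the CDN, which is why the cost can be so low.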