Good technical parameters, worse load time.
-
I recently created a page and added expires headers, ETags set to none, and gzip compression to my .htaccess file (roughly the rules sketched below). Just after that, according to Pingdom Tools, my page load time doubled, although my YSlow score went from 78 to 92. I always get a little lost with these technical issues. Obviously a site should not produce worse results after adding these parameters, so the increase in load time is more likely down to bandwidth fluctuations during testing. I suppose I should leave this stuff in the .htaccess file. So what is an accurate way to tell whether you have made a real improvement to your site, or whether your load time has genuinely gone up?
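For reference, the rules I added look roughly like this. This is a sketch rather than my exact file, and it assumes mod_expires, mod_headers, and mod_deflate are enabled on the server:

```apache
# Far-future expires headers for static assets
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png              "access plus 1 month"
  ExpiresByType image/jpeg             "access plus 1 month"
  ExpiresByType text/css               "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>

# ETags set to none (YSlow's "Configure ETags" rule)
FileETag None
<IfModule mod_headers.c>
  Header unset ETag
</IfModule>

# Gzip text-based responses
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```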
The same question applies to CSS sprites, as I keep reading that spriting every picture is sometimes a waste of resources. How do you decide when to stop?
-
My page is a basic HTML page. I have already rewritten the code; there are only a handful of DOM elements, the CSS files are sprited, and so on. Page load time went from 230 milliseconds to 500 when I implemented the new features.
-
I think this must be the case; my page is basic HTML at around 200 KB. Thanks for the answer.
-
It takes time to compress and decompress a page, and for a lightweight page the compression overhead can actually outweigh the transfer savings.
If you have a heavy page, compression can be a good thing, but if your page is light it can work against you.
On a Windows server you can set how big a file has to be before it gets compressed; the default is 256 bytes. That should tell you something.
You can also cache the compressed versions of your static files so that compression does not have to run again on the next request.
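On Apache, one way to do that last part is to gzip the static files ahead of time and serve the pre-compressed copy whenever the browser accepts it. A minimal sketch, assuming mod_rewrite and mod_headers are available and that you have created a file.css.gz next to each file.css (same idea for .js):

```apache
<IfModule mod_rewrite.c>
  RewriteEngine On
  # If the client accepts gzip and a pre-compressed copy exists, serve it
  RewriteCond %{HTTP:Accept-Encoding} gzip
  RewriteCond %{REQUEST_FILENAME}.gz -f
  RewriteRule ^(.*)$ $1.gz [L]
</IfModule>

# Send the pre-compressed copies with the right type and encoding headers
<FilesMatch "\.css\.gz$">
  ForceType text/css
  Header set Content-Encoding gzip
  Header append Vary Accept-Encoding
</FilesMatch>
<FilesMatch "\.js\.gz$">
  ForceType application/javascript
  Header set Content-Encoding gzip
  Header append Vary Accept-Encoding
</FilesMatch>
```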
-
Can you share the load times you got before and after working on those technical parameters? On our website we usually use Webmaster Tools and at the same time compare ourselves to our competitors. For example, we are in the hotel business, so we try to compare our site's performance to the biggest hotel chains'.
That said, in my opinion, once you have worked on those technical parameters there are still other aspects of your site to check to improve load performance:
1. Check the size of your page. Initially our site loaded in around 10 seconds because of our layout and heavy use of images. The first step was to compress all our JPEGs without degrading their quality. The second was to restructure the layout and our scripts to reduce the number of DOM elements on the page. I noticed a difference in load time once our DOM element count dropped below 1,000 (see the console one-liner after this list for a quick way to measure that).
2. For sprite images, it is better to create the sprite image during development instead of spriting on the fly, which I think is why people say it can be a waste of resources. I use this site for spriting our images: http://spritegen.website-performance.org/ (a short CSS sketch at the end of this post shows how a sprite is then used).
3. You need to minify and combine all your CSS and JavaScript.
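A quick way to check the DOM element count from point 1: paste this one-liner into the browser's JavaScript console (standard DOM API, nothing site-specific):

```javascript
// Counts every element currently in the document
document.getElementsByTagName('*').length;
```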
I also follow all of the rules in YSlow and Page Speed, and I can see a significant improvement in our page load time. Without a CDN our site now loads in around 4-5 seconds, and with a CDN around 3-4 seconds.
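For point 2 above, once the generator hands you a combined image, the CSS looks roughly like this. The class names, offsets, and the /img/sprite.png path are hypothetical; the generator outputs the real offsets for you:

```css
/* One HTTP request for all icons: the sprite sheet is the shared background */
.icon {
  background: url(/img/sprite.png) no-repeat;
  display: inline-block;
  width: 16px;
  height: 16px;
}

/* Each icon only shifts the background to its slot in the sheet */
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -16px 0; }
.icon-cart   { background-position: -32px 0; }
```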