How do I load only the most-needed CSS for faster page loads?
-
I have been using the Firefox duster app to clean up my CSS so that only the CSS needed to render the page is loaded. But it doesn't seem to be working anymore. Does anyone know of another tool that will do this for me?
-
This seems like something that's a little beyond the scope of a single app. SitePoint did a great breakdown of the whole page-rendering ecosystem (http://www.sitepoint.com/optimizing-critical-rendering-path/) that details the steps to go through to load pages as quickly as possible. That said, a CDN like Cloudflare may be better suited to the task: https://www.cloudflare.com/features-optimizer. Cheers!
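As a rough illustration of the "critical CSS" step those guides describe, here is a minimal Node sketch. The function name, the sample markup, and the critical-CSS string are all hypothetical; a real setup would get the critical rules from a tool such as Penthouse or the `critical` npm package. It inlines the above-the-fold rules into `<head>` and defers the full stylesheet so it no longer blocks first paint:

```javascript
// Hypothetical sketch: inline a critical-CSS string and defer the full
// stylesheet. In practice `criticalCss` would come from an extraction tool.
function inlineCriticalCss(html, criticalCss) {
  // 1. Put the above-the-fold rules in a <style> block inside <head>.
  const withInline = html.replace(
    '</head>',
    `<style>${criticalCss}</style></head>`
  );
  // 2. Load the full stylesheet without blocking first paint:
  //    media="print" downloads without blocking; onload swaps it to "all".
  return withInline.replace(
    /<link rel="stylesheet" href="([^"]+)">/g,
    '<link rel="stylesheet" href="$1" media="print" onload="this.media=\'all\'">'
  );
}

// Example with placeholder markup:
const page =
  '<html><head><link rel="stylesheet" href="/main.css"></head><body></body></html>';
console.log(inlineCriticalCss(page, 'body{margin:0}'));
```

The `media="print"` swap is a widely used pattern for non-blocking stylesheet loading; the trade-off is a brief flash of unstyled below-the-fold content if the critical rules are incomplete.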
Related Questions
-
Need a Solution for a WordPress Site
Hi everyone, I started my new website on WordPress, but I'm facing some errors, such as sitemap indexing problems and the sidebar not showing. Can anyone check my website, Etrends News, and explain how to solve these issues? I'd be very grateful for your time. Thanks!
Technical SEO | Sonumahan727
Help with Getting Googlebot to See Google Charts
We received a message from Google saying we have an extremely high number of URLs linking to pages with similar or duplicate content. The main difference between these pages is the Google Charts we use. It looks like Google isn't able to see these charts; most of the text is very similar, and the charts (there are lots of them) are the main difference between the pages. So my question is: what is the best approach to letting Google see the data in these charts? I read here http://webmasters.stackexchange.com/questions/69818/how-can-i-get-google-to-index-content-that-is-written-into-the-page-with-javascr that a solution would be to code the text displayed in the charts into the HTML and hide it with CSS. I'm not sure, but it seems like a bad idea to have content that is seen by Google but hidden from the user by CSS. It just sounds like a cloaking hack. Can someone clarify whether this is even a solution, or is there a better one?
Technical SEO | ERICompensationAnalytics
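One cloaking-free alternative to the hidden-text idea in the question above is to render the chart's underlying data as a *visible* HTML table next to the chart: crawlers can read the table, users get an accessible fallback, and nothing is hidden. A hypothetical sketch (the element ID, class name, caption, and figures are placeholders, not the asker's real data):

```html
<!-- The JavaScript chart renders into this container as before. -->
<div id="salary_chart"></div>

<!-- Visible data table: crawlable, accessible, no cloaking risk. -->
<table class="chart-data">
  <caption>Median salary by region (data shown in the chart above)</caption>
  <tr><th>Region</th><th>Median salary</th></tr>
  <tr><td>Northeast</td><td>$62,000</td></tr>
  <tr><td>Midwest</td><td>$55,000</td></tr>
</table>
```

If the table is too bulky for the design, styling it compactly or placing it below the chart keeps it visible without dominating the page.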
Redirect to get better ranking
I have three pages of my website ranking for a keyword: a landing page and two blog posts. They all rank at the top of page 2 (positions 11-13). If I redirect these articles to the landing page, will it help bring it up in the rankings?
Technical SEO | imoney
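For reference, consolidating pages as the question above describes is usually done with 301 (permanent) redirects, for example in an Apache `.htaccess` file. The paths below are placeholders, not the asker's real URLs:

```apache
# Hypothetical paths -- substitute the real blog post and landing page URLs.
# A 301 tells search engines the move is permanent and passes most link
# equity to the target page.
Redirect 301 /blog/keyword-post-one/ /landing-page/
Redirect 301 /blog/keyword-post-two/ /landing-page/
```

Note the trade-off: the redirected posts' own rankings disappear, and the consolidated page is not guaranteed to climb, so it may be worth redirecting one post first and watching the result.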
Feedback needed on possible solutions to resolve indexing on ecommerce site
I’ve included the scenario and two proposed fixes I’m considering. I’d appreciate any feedback on which fix people feel is better and why, and/or any potential issues these fixes could cause. Thank you! Scenario of the problem: I’m working on an ecommerce website (built on Magento) that is having trouble getting product pages indexed by Google (and other search engines). Certain pages, like the ones I’ve included below, aren’t being indexed. I believe this is because of the way the site is configured in terms of internal linking. The site structure forces certain pages to be linked very deeply, so the only way for Googlebot to reach them is through a pagination page (such as www.acme.com/page?p=3). In addition, the link on the pagination page is buried; generally there are more than 125 links on the page ahead of it. One of the pages Google isn’t indexing: http://www.getpaper.com/find-paper/engineering-paper/bond-20-lb/430-20-lb-laser-bond-22-x-650-1-roll.html This page is linked from http://www.getpaper.com/find-paper/engineering-paper/bond-20-lb?p=5, where it is the 147th link in the source code. Potential fixes: Fix one: add <nav> tags to the template so that search engines spend less time crawling the navigation links and get to the deeper pages, such as the one mentioned above. Note: <nav> is an HTML5 element, but the Magento site this is built on does not use HTML5. Fix two: revise the templates and CSS so that the main navigation and the sidebar navigation sit at the bottom of the page rather than the top. This would put the links to the product pages ahead of the navigation links in the source code.
Technical SEO | TopFloor
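Fix two in the question above can be sketched without HTML5 elements, assuming the design can use flexbox: keep the product links first in the source (so crawlers reach them early) and let CSS place the navigation visually on top. The class names and structure here are assumptions, not the real Magento templates:

```html
<!-- Source order: products first, navigation second. -->
<div class="page">
  <div class="products"><!-- deep product links go here --></div>
  <div class="main-nav"><!-- main + sidebar navigation links --></div>
</div>

<style>
  /* Visual order: navigation displayed above the products. */
  .page     { display: flex; flex-direction: column; }
  .main-nav { order: 1; }
  .products { order: 2; }
</style>
```

The same effect was historically achieved with floats or absolute positioning; flexbox `order` is simply the least fragile way to decouple source order from visual order.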
Removed .html - Now Getting Duplicate Content
Hi there, I run a WordPress website and have removed the .html from my links. Moz has done a crawl, and now a bunch of duplicates are coming up. Is there anything I need to do, perhaps in my .htaccess, to help it along? Google appears to still be indexing the .html versions of my links.
Technical SEO | MrPenguin
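A common sketch for the situation above, assuming Apache and that WordPress now serves the extensionless URLs, is a 301 rule in `.htaccess` that sends the old .html addresses to the new ones, so only one version stays indexed:

```apache
# Hypothetical .htaccess sketch -- place above the default WordPress
# rewrite block. Sends /some-post.html to /some-post with a permanent
# redirect, skipping any real .html files that exist on disk.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.+)\.html$ /$1 [R=301,L]
```

Once the redirects are live, the duplicate reports should clear as crawlers revisit the .html URLs and follow the 301s.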
Need Urgent Help
I have found one mistake: my Places page address is slightly different from the address on all the local directories. The Places page address is 10010 S Tryon St #122, Charlotte, NC 28273, while the directories have 10010 South Tryon St 122, Charlotte, NC 28273. So the Places page has "S" instead of "South", and the "#" before 122 is missing on all the directories. What do you suggest? Should I change the address and re-verify the Places page? Will re-verifying lower the Places page's value?
Technical SEO | mnkpso
How do I get Google to index the right pages with the right keywords?
Hello, I notice that even though I have a sitemap, Google is indexing the wrong pages under the wrong keywords. As a result, the site is less relevant than it should be and is not ranking properly.
Technical SEO | ursalesguru
How similar do pages need to be in order to use the canonical tag?
Here is my specific situation. My company released new versions of a few documents in the fall. I was hoping that over time the old versions would decline and the new versions would rise, but after six months the old version continues to rank #1 and the new version #3. The old version needs to stay on our site, but users should really be getting the most recent version. I think the canonical tag would solve the issue, but I am concerned because the content on the two pages is not duplicate; it has been updated. Below are the two URLs so you can see the differences in content. http://www.sei.cmu.edu/library/abstracts/reports/06tr008.cfm http://www.sei.cmu.edu/library/abstracts/reports/10tr033.cfm Is this an appropriate situation for the canonical tag? If not, is there a better solution?
Technical SEO | SEI
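For reference, if the two reports in the question above were treated as versions of the same document, a canonical hint on the old page would look like the sketch below. But since the content genuinely differs, search engines may ignore the hint, and a visible "newer version available" link on the old page (or a 301 if the old page can eventually go away) is often the safer route:

```html
<!-- Sketch only: placed in the <head> of the OLD report page
     (06tr008.cfm), pointing at the newest version. rel="canonical"
     is intended for duplicate or near-duplicate pages, so treat this
     as a hint, not a guarantee. -->
<link rel="canonical"
      href="http://www.sei.cmu.edu/library/abstracts/reports/10tr033.cfm">
```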