Lazy Loading of products on an E-Commerce Website - Options Needed
-
Hi Moz Fans.
We are in the process of re-designing our product pages and we need to improve the page load speed.
Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on page load speed, I am concerned about the SEO impact.
We can have upwards of 50 associated products on a page, so we need a solution.
So far I have found the following solution online, which uses lazy loading and escaped fragments. The concern here is around serving an alternate version of the page to search engines.
The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words, the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to jump to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an "ugly" URL. When the server receives this ugly request, it is your responsibility to send back a static version of the page that renders an HTML snapshot (the otherwise non-indexed image, in our case). It seems complicated, but it is not; let's use our gallery as an example.
- Every gallery thumbnail has to have a hyperlink like:
http://www.idea-r.it/...#!blogimage=<image-number>
- When the crawler finds this markup, it will change it to:
http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
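The rewrite the crawler performs is mechanical, so it can be sketched as a small pure function. This is an illustrative simplification (the function name is my own, and real crawlers also percent-encode special characters in the fragment, which is omitted here to match the example URLs above):

```javascript
// Sketch: how a crawler rewrites a hashbang URL into the "ugly"
// _escaped_fragment_ form. Simplified: no percent-encoding of the fragment.
function toEscapedFragmentUrl(url) {
  const idx = url.indexOf('#!');
  if (idx === -1) return url;            // no hashbang: nothing to rewrite
  const base = url.slice(0, idx);
  const fragment = url.slice(idx + 2);   // drop the "#!"
  // Append with "&" if the URL already has a query string, "?" otherwise
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + fragment;
}

console.log(toEscapedFragmentUrl('http://www.idea-r.it/gallery#!blogimage=12'));
// → http://www.idea-r.it/gallery?_escaped_fragment_=blogimage=12
```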
Let's take a look at what you have to return on the server side to provide a valid HTML snapshot.
My implementation uses ASP.NET, but any server technology will do:

```
var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        ...
    }
}
```
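As noted, any server stack works. Here is the same logic as a plain JavaScript sketch, written as a pure function so it is easy to test; the query object stands in for your framework's parsed query string, and the returned snapshot markup is a hypothetical placeholder for whatever your template engine renders:

```javascript
// Sketch: server-side handling of the ugly _escaped_fragment_ request,
// mirroring the ASP.NET logic above. Returns the static HTML snapshot,
// or null to signal "serve the normal dynamic page".
function handleEscapedFragment(query) {
  const fragment = query['_escaped_fragment_'];
  if (!fragment) return null;                 // normal request
  const escapedParams = fragment.split('=');
  if (escapedParams.length !== 2) return null; // malformed fragment
  const imageToDisplay = escapedParams[1];
  // Hypothetical snapshot: the gallery rendered statically on the
  // requested image, so the crawler sees it without running JavaScript
  return '<html><body><img src="/gallery/' + imageToDisplay + '.jpg"></body></html>';
}
```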
What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side).
To make it perfect, we have to give the user a chance to bookmark the current gallery image.
90% of this comes for free; we only have to parse the fragment on the client side and show the requested image:

```
if (window.location.hash)
{
    // NOTE: remove initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    ...
}
```
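The same parsing logic can be pulled out into a pure function so it can be unit-tested outside the browser (the function name is my own, and it also tolerates the #! prefix):

```javascript
// Sketch: extract the image identifier from a location hash such as
// "#blogimage=5" or "#!blogimage=5". Returns null for plain bookmarks.
function imageFromHash(hash) {
  if (!hash) return null;
  const fragment = hash.replace(/^#!?/, '');  // drop leading "#" or "#!"
  const fragmentParams = fragment.split('=');
  return fragmentParams.length === 2 ? fragmentParams[1] : null;
}

// In the browser you would call: var img = imageFromHash(window.location.hash);
```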
The other option would be to use a recommendation engine to show a small selection of related products instead. This would cut down the total number of related products. The concern with this one is that we would be removing a massive chunk of content from the existing pages. Some of it is not the most relevant, but it is content nonetheless.
Any advice and discussion welcome.
-
Ok, cool. To reiterate: with escaped_fragment you are just serving the same content in a tweaked format, and Google recommends it rather than frowning upon it. Good to be sure, though.
See you at SearchLove!
-
Hi Tom, Thank you for the response,
The concern about serving an alternate version is that it would be frowned upon from an SEO perspective and may lead to a form of penalty.
I agree that escaped_fragment would be the best approach; I just wanted to satisfy my own concerns before I get the developers working on this.
Thank you and see you at Search Love
-
Hi,
I am not sure I follow your concerns around serving an alternative version of the page to search engines - is that because you worry it will be frowned upon, or is it a technical concern?
Using the escaped_fragment methodology would work for your purposes, and would be the best approach. If you have technical concerns around creating the HTML snapshots you could look at a service such as https://prerender.io/ which helps manage this process.
If that doesn't answer your question, please give more information so we can understand more specifically where your concerns are.