Google pagespeed / lazy image load
-
Hi,
we are using the Apache module of Google PageSpeed. It works really great and helps a lot. But today one question occurred to me:
Does the "lazy load" feature for images harm the ranking?
The module reworks the page to load the images only when they are visible on screen. Is this behavior also triggered by the Google bot? Or are the images invisible to Google?
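For reference, the rewrite looks roughly like this. This is only a generic sketch of the lazy-load pattern, not the module's exact output — the attribute names, placeholder path, and `lazyLoader` object are illustrative:

```html
<!-- Before the rewrite: a normal, crawlable image tag -->
<img src="/images/article-photo.jpg" alt="Article photo">

<!-- After the rewrite (sketch): the real URL moves into a data attribute,
     src points at a tiny placeholder, and a script swaps the real URL
     back into src once the image scrolls into view -->
<img data-lazy-src="/images/article-photo.jpg" src="/placeholder.gif"
     alt="Article photo" onload="lazyLoader.loadIfVisible(this);">
```

My worry is what a crawler sees if it never runs that script: just the placeholder.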
Any experience with that?
Best wishes,
Georg.
-
This does a pretty good job of explaining lazy load:
http://www.thesempost.com/lazy-loading-images-likely-will-indexed-google/
-
hey, that was a fast response, I usually don't get that from Google lol.. anyway post an update, ok? Would like to know the answer as well..
-
Yesterday I wrote a support mail to Bing Webmaster Tools. Surprisingly, I got a very comprehensive answer within hours! Thumbs up!
The answer: "Yes, you are right. Since this lazy load feature is a 3rd party application, as initial troubleshooting steps and to isolate the issue, please try to turn off this feature on your end."
Well, I'll turn off lazy load for the specific page and see what happens.
Best wishes,
Georg.
-
I think I already answered this question:
"what I know is that anything generated by JavaScript is unreadable by any search engine robot"
So probably that's the reason why it's not found in the image search.. anyway I'll wait for other answers too.
-
Hi,
test google versus bing:
I am searching for
site:schicksal.com Freitag, der 13.
Bing, organic: http://goo.gl/bfXAU0 - article found in 1st position
Bing, image search: http://goo.gl/EXDSdv - no search results
Google, organic: http://goo.gl/VIi5C6 - article found in 1st position
Google, image search: http://goo.gl/m5SRjA - main article image found in 1st position
I've done some other quick checks with Bing: the big images are NOT found in the image search, only the teaser images from the overview pages.
So, can anybody confirm this behavior? Does Bing have a problem with Google PageSpeed's lazy load?
Best wishes,
Georg.
-
I'm curious too. What I know is that anything generated by JavaScript is unreadable by any search engine robot.. they just don't know that language, it's client side.. But the thing with lazy load is that the content is there, just the image is not loaded until it's shown on screen.. I mean the tags wrapping up the image.. If the Webmaster Tools "fetch as googlebot" feature can fetch it, then you don't have to worry about anything.. but still I want to know others' opinions too.
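To make that concrete, here is a tiny sketch of what such a lazy loader does. It uses plain objects instead of real DOM elements, and the `data-lazy-src` attribute name and placeholder path are made up — check what your module actually emits:

```javascript
// Hypothetical sketch of a lazy-load rewrite and its client-side loader.

// Simulate the server-side rewrite on a parsed <img> element:
// the real URL is stashed in a data attribute, src becomes a placeholder.
function lazify(img) {
  return {
    'data-lazy-src': img.src,   // real image URL, deferred
    src: 'placeholder.gif',     // tiny placeholder the browser loads instead
  };
}

// Simulate the client-side script that runs as the user scrolls:
// only once the image is visible does the real URL go back into src.
function loadIfVisible(img, isVisible) {
  if (isVisible && img['data-lazy-src']) {
    img.src = img['data-lazy-src'];
    delete img['data-lazy-src'];
  }
  return img;
}

const img = lazify({ src: 'photo.jpg' });
console.log(img.src);                        // "placeholder.gif"
console.log(loadIfVisible(img, true).src);   // "photo.jpg"
```

The open question is exactly the second step: a crawler that doesn't execute `loadIfVisible` only ever sees the placeholder.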
-
Just tried the Google Webmaster Tools "fetch as googlebot" feature - the lazy-loaded images were shown in the screenshot.
But the question remains: is it possible that the Google bot doesn't see the images for ranking purposes because they are loaded with JavaScript?