Google PageSpeed / lazy image load
-
Hi,
We are using the Apache module of Google PageSpeed (mod_pagespeed). It works really well and helps a lot. But today a question came to my mind:
Does the "lazy load" feature for images harm our rankings?
The module rewrites the page so that images are only loaded once they become visible on screen. Is this behavior also triggered for Googlebot, or are the images invisible to Google?
Any experience with that?
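For reference, in case someone wants to reproduce this: the lazy load behavior comes from mod_pagespeed's lazyload_images filter. A minimal sketch of the relevant Apache configuration, assuming the usual pagespeed.conf setup (exact placement may differ per installation):

```apache
# Enable PageSpeed and its lazy load filter (typically in pagespeed.conf
# or the relevant <VirtualHost> block).
ModPagespeed on
ModPagespeedEnableFilters lazyload_images
```

As far as I understand it, the filter then replaces the src of each image with a tiny placeholder and keeps the real URL in a data attribute, which JavaScript swaps back in once the image scrolls into view - and that is exactly why I wonder whether a crawler that does not run the script ever sees the real image URL.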
Best wishes,
Georg.
-
This does a pretty good job of explaining lazy load:
http://www.thesempost.com/lazy-loading-images-likely-will-indexed-google/
-
Hey, that was a fast response; I usually don't get that kind of response from Google, lol. Anyway, post an update, ok? I'd like to know the answer as well.
-
Yesterday I wrote a support mail to Bing Webmaster Tools. Surprisingly, I got a very comprehensive answer within hours! Thumbs up!
The answer: "Yes, you are right. Since this lazy load feature is a 3rd party application, as initial troubleshooting steps and to isolate the issue, please try to turn off this feature on your end."
Well, I'll try turning off lazy load for that specific page and see what happens.
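In case it helps anyone else, this is roughly what I plan to try in the Apache config. I'm not certain that every mod_pagespeed version accepts per-Location filter settings, and the path below is just an illustration, so treat it as an untested sketch:

```apache
# Keep PageSpeed enabled site-wide, but switch off lazy loading for this
# one article so its images stay as plain <img src="..."> in the HTML.
<Location "/artikel/freitag-der-13">
    ModPagespeedDisableFilters lazyload_images
</Location>
```

If I remember the docs correctly, appending ?ModPagespeed=off to a URL also disables the rewriting for a single request, which is handy for a quick before/after comparison.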
Best wishes,
Georg.
-
I think I already answered this question:
"What I know is that anything generated by JavaScript is unreadable by any search engine robot."
So probably that's the reason why it's not found in image search. Anyway, I'll wait for other answers too.
-
Hi,
A test of Google versus Bing. I searched for:
site:schicksal.com Freitag, der 13.
Bing, organic: http://goo.gl/bfXAU0 - article found in 1st position
Bing, image search: http://goo.gl/EXDSdv - no results
Google, organic: http://goo.gl/VIi5C6 - article found in 1st position
Google, image search: http://goo.gl/m5SRjA - main article image found in 1st position
I've also done some other quick checks with Bing: the big article images are NOT found in image search, only the teaser images from the overview pages.
So, can anybody confirm this behavior? Does Bing have a problem with mod_pagespeed's lazy load?
Best wishes,
Georg.
-
I'm curious too. What I know is that anything generated by JavaScript is unreadable by any search engine robot; they just don't know that language, it's client-side. But the thing with lazy load is that the content is there, just the image is not loaded until it's shown on screen (I mean the tags wrapping the image). If the Webmaster Tools "Fetch as Googlebot" feature can fetch it, then you don't have to worry about anything. But I'd still like to hear other opinions.
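One common way to make lazy loading crawler-safe (not something mod_pagespeed does for you automatically, as far as I know) is to keep a normal img tag inside a noscript block, so a robot that doesn't execute the script still finds the image URL in the HTML. A hand-written sketch with made-up paths:

```html
<!-- Lazy-loaded image: the real URL lives in a data attribute and is
     swapped into src by JavaScript once the image scrolls into view. -->
<img src="/images/placeholder.gif"
     data-src="/images/article-large.jpg"
     alt="Freitag, der 13.">

<!-- Fallback for clients without JavaScript (and for crawlers that
     don't run the lazy load script): a plain, crawlable img tag. -->
<noscript>
  <img src="/images/article-large.jpg" alt="Freitag, der 13.">
</noscript>
```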
-
I just tried the Google Webmaster Tools "Fetch as Googlebot" feature - the lazy loaded images were shown on the screenshot.
But the question remains: is it possible that Googlebot does not see the images for ranking purposes because they are loaded with JavaScript?
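For anyone who wants to check this on their own pages without waiting for the index: fetching the page with a bot-like user agent and grepping for the image file name shows whether the real URL is still in the served HTML or only reachable through the lazy load script. A quick sketch (the article URL and file name are placeholders):

```bash
# Fetch the page roughly the way a crawler would, then look for the
# original image URL in the returned HTML.
curl -s -A "Googlebot/2.1 (+http://www.google.com/bot.html)" \
  "https://www.schicksal.com/some-article" | grep -i "article-large.jpg"
```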