Lazy loading images
-
Hello,
Currently we are working on a new website, and within it we have the option to lazy load images. My question: will this cause any SEO problems? Will Google detect/see all images properly? If not, how can we make sure that it does?
Thanks in advance!
Remco
-
@AMAGARD, read this article: developers.google.com/search/docs/crawling-indexing/javascript/lazy-loading
-
Some websites capture a screenshot of the main page (e.g., siteprice, wot, ...) and use it as a thumbnail to identify your website visually. If image loading is deferred, that preview may appear as text only. So this feature can have a negative UX impact, but not an SEO one.
-
Hello,
Google will index lazy-loaded images much as it would if they loaded normally, as long as they appear in the rendered page: Googlebot executes JavaScript when rendering, so properly implemented lazy loading still exposes the images to it. Visually, an image will take a little longer to appear, but you gain overall loading speed, since the work is deferred.
Greetings
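To make the answer concrete, here is a minimal sketch of the common lazy-loading pattern Google can index. The `data-src` attribute and the `hydrateImage` helper name are illustrative assumptions, not the behavior of any particular plugin:

```javascript
// Minimal lazy-loading sketch: the real image URL lives in a data-src
// attribute and is copied into src when the image should load.
function hydrateImage(img) {
  // Swap the placeholder for the real image, if one is pending.
  if (img.dataset && img.dataset.src) {
    img.src = img.dataset.src;
    delete img.dataset.src;
  }
  return img;
}

// In a browser you would typically wire this to an IntersectionObserver:
// const io = new IntersectionObserver(entries => {
//   for (const e of entries) if (e.isIntersecting) hydrateImage(e.target);
// });
// document.querySelectorAll('img[data-src]').forEach(img => io.observe(img));
```

Note that native `loading="lazy"` on the `<img>` tag is usually the simplest and safest option, because the real URL stays in the markup for crawlers that don't run scripts.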
Related Questions
-
Is the image property really required for Google's breadcrumbs structured data type?
In its structured data (i.e., Schema.org) documentation, Google says that the "image" property is required for the breadcrumbs data type. That seems new to me, and it seems unnecessary for breadcrumbs. Does anyone think this really matters to Google? More info about the breadcrumbs data type: https://developers.google.com/search/docs/data-types/breadcrumbs. I asked Google directly here: https://twitter.com/RyanRicketts/status/7554782668788531220
Intermediate & Advanced SEO | Ryan-Ricketts
-
What is the best way to hide duplicate, image-embedded links from search engines?
Hello! Hoping to get the community's advice on a technical SEO challenge we are currently facing. [My apologies in advance for the long-ish post. I tried my best to condense the issue, but it is complicated and I wanted to make sure I also provided enough detail.]

Context: I manage a human anatomy educational website that helps students learn about the various parts of the human body. We have been around for a while now, and recently launched a completely new version of our site using 3D CAD images. While we tried our best to design the new site with SEO best practices in mind, our daily visitors dropped by ~15% soon after we flipped the switch, despite drastic improvements in our user interaction metrics. SEOMoz's Website Crawler helped us uncover that we may now have too many links on our pages, and that this could be at least part of the reason behind the lower traffic; i.e., we are not making optimal use of links and are potentially 'leaking' link juice. Since students learn about human anatomy in different ways, most of our anatomy pages contain two sets of links:

1. Clickable links embedded via JavaScript in our images. This allows users to explore parts of the body by clicking on whatever object interests them. For example, if you are viewing a page on muscles of the arm and hand and you want to zoom in on the biceps, you can click on the biceps and go to our detailed biceps page.
2. Anatomy terms lists (to the left of the image) that list all the different parts of the body shown in the image. This is for users who might not know where on the arm the biceps actually are; such a user can simply click on the term "Biceps" and get to our biceps page that way.

Since many sections of the body have hundreds of smaller parts, many of our pages have 150 links or more each. And to make matters worse, in most cases the links in the images and in the terms lists go to the exact same page.

My Question: Is there any way we could hide one set of links (preferably the anchor-text-less, image-based links) from search engines, such that only one set of links would be visible? I have read conflicting accounts of different methods, from using JavaScript to embedding links in HTML5 tags. And we definitely do not want to do anything that could be considered black hat. Thanks in advance for your thoughts! Eric
Intermediate & Advanced SEO | Eric_R
-
Are pages that take more than 1.5 seconds to load penalized?
Hi all, I just read an article (in print) about the importance of having a fast website. The author claims that all pages taking longer than 1.5 seconds to load are penalized in the SERPs. Speed is of course a ranking factor, but I have never heard a statement like this before. Is 1.5 seconds a guideline from Google? Can anyone say where this number comes from? Is there maybe another guideline to be followed? Thanks in advance for your comments/answers 🙂 Best regards, Kenneth Karl Nielsen
Intermediate & Advanced SEO | KennethK
-
Does lazy loading images affect image SEO?
I'm using a WordPress plugin to lazy load images so that the site is a lot faster. Will this mess up image SEO? The image markup (with the site domain taken out) loads a 1x1.gif placeholder in the src attribute, with the real image URL (http://.com/wp-content/uploads/2012/02/IMG_9477.jpg) referenced separately and a noscript fallback. I see it loads the 1x1.gif to speed up the page, but does the fact that the link points to the correct place make it OK? Thanks for letting me know.
Intermediate & Advanced SEO | Gordian
-
SEO-Friendly Method to Load XML Content onto Page
I have a client who has about 100 portfolio entries, each with its own HTML page. Those pages aren't getting indexed because of the way the main portfolio menu page works: it uses JavaScript to load the list of portfolio entries from an XML file, along with metadata about each entry. Because it uses JavaScript, crawlers aren't seeing anything on the portfolio menu page. Here's a sample of the JavaScript used (this is one of many more lines of code):

```javascript
// load project xml
try {
  var req = new Request({
    method: 'get',
    url: '/data/projects.xml',
```

Normally I'd have them just manually add entries to the portfolio menu page, but part of the metadata that's getting loaded is project characteristics that are used to filter which portfolio entries are shown on the page, such as client type (government, education, industrial, residential, etc.) and project type (depending on the type of service that was provided). It's similar to the filtering you'd see on an e-commerce site. This has to stay, so the page needs to remain dynamic. I'm trying to summarize the alternate methods they could use to load that content onto the page instead of JavaScript (I assume that server-side solutions are the only ones I'd want, unless there's another option I'm unaware of). I'm aware that PHP could probably load all of their portfolio entries from the XML file on the server side. I'd like to get some recommendations on other possible solutions. Please feel free to ask any clarifying questions. Thanks!
Intermediate & Advanced SEO | KaneJamison
-
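A server-side sketch of the approach described in the question above: render the XML entries into crawlable HTML, so the list exists before any JavaScript runs. The `<project>` element and its `title`/`url` attributes are assumptions, since the real schema of /data/projects.xml isn't shown; a real implementation would also use a proper XML parser rather than a regex:

```javascript
// Sketch: turn a projects XML string into a crawlable HTML list.
// Assumed schema: <project title="..." url="..." />
function renderPortfolio(xml) {
  const items = [];
  const re = /<project\s+title="([^"]*)"\s+url="([^"]*)"\s*\/>/g;
  let m;
  while ((m = re.exec(xml)) !== null) {
    // One plain <li><a> per entry, visible to crawlers with no JS needed.
    items.push(`<li><a href="${m[2]}">${m[1]}</a></li>`);
  }
  return `<ul>${items.join('')}</ul>`;
}
```

Served this way, the menu page contains plain links for crawlers, and the JavaScript filtering can still run on top of the rendered list.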
Charity project for local women's shelter - need help: will Google notice if you alter the document title with JavaScript after the page loads?
I am doing some pro-bono work with a local shelter for female victims of domestic abuse. I am trying to help visitors to the site cover their tracks by employing a document.title change when the page loads, using JavaScript. This shelter receives a lot of traffic from Google. I worry that the Google bots will see this JavaScript change and somehow penalize the site or modify the title in the SERPs. Has anyone had any experience with this kind of JavaScript maneuver? All help would be greatly appreciated!
Intermediate & Advanced SEO | jkonowitch
-
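The title swap the question above describes is essentially a one-liner. A minimal sketch (the disguise text is a placeholder, and `doc` is passed in so the logic is testable outside a browser):

```javascript
// Sketch: replace the page title after load so the tab bar and
// browser history don't reveal the site. The disguise text below
// is a placeholder assumption.
function disguiseTitle(doc, disguise) {
  doc.title = disguise;
  return doc.title;
}

// In the browser:
// window.addEventListener('load', () => disguiseTitle(document, 'Weather Update'));
```

One caveat worth weighing: Googlebot does render JavaScript, so there is a real chance the scripted title is the one that gets indexed. Triggering the swap from a user interaction (a "quick exit" button, say) rather than on page load may avoid that, though that is a judgment call rather than documented behavior.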
SEOMoz mistaking image pages for duplicate content
I'm getting duplicate content errors, but they're for pages with high-res images on them. Each page has a different high-res image, but SEOMoz keeps flagging them as duplicate content, even though the images are different (and named differently). Is this something I can ignore, or will Google see it the same way too?
Intermediate & Advanced SEO | JHT
-
What To Do For A Website That Is Mainly Images
I have a website that is a desktop wallpaper script. People can come and upload hundreds of wallpapers to share with the community. This is where the problems come in. Files are normally called 27636dark.jpg or whatever and come with no description. This leads to two things:

1. No text content that Google can use to understand what the page/image is about.
2. Meta descriptions and URLs that just look like spam. Example: /car-wallpapers/7636dark.jpg

If a text description was added, it would still only be something like "Green trees in the distance", and as you may guess, with thousands of wallpapers you would end up with a lot of identical descriptions. Is there any advice for sites that focus on image-driven content?
Intermediate & Advanced SEO | rhysmaster