Infinite Scrolling: how to index all pictures
-
I have a page where I want to show 20 pictures in a slideshow. The idea is that the pictures only load as users scroll down the page (otherwise the page is too heavy). I have seen documentation on how to make infinite scrolling work so that search engines index all written content. However, I see no documentation on how to make this work for 20 pictures in a slideshow. It seems impossible to get a search engine to index all such pictures when they only appear as users scroll down the page. This is the documentation I am already familiar with, which does not address my issue:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
Thank you.
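For reference, the usual crawlable variant of the lazyload/unveil pattern linked above pairs the script-driven image with a `<noscript>` fallback, so a crawler that doesn't run the scroll-triggered JavaScript still finds a plain `<img>`. A minimal sketch (the paths, class name, and alt text are placeholders, not anyone's actual markup):

```javascript
// Sketch: emit lazy-load markup plus a <noscript> fallback. The data-src
// attribute is what lazyload/unveil-style plugins read and swap into src
// on scroll; the <noscript> copy is what a non-JS crawler sees.
function lazyImageMarkup(src, alt) {
  return (
    '<img class="lazy" data-src="' + src + '" alt="' + alt + '">' +
    '<noscript><img src="' + src + '" alt="' + alt + '"></noscript>'
  );
}

console.log(lazyImageMarkup('/photos/waikiki-01.jpg', 'Waikiki beachfront condos'));
```

Whether Google credits the page for `<noscript>` images is debatable, so treat this as a fallback layer rather than a complete answer to the indexing question.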
-
Hi Pete, I just wanted to confirm, based on what you wrote:
"I don't think the picture- and video-heavy pages are going to rank all that well by themselves. It's just a question of whether those additional pages are diluting your MLS listing pages (by using similar regional keywords, etc.)."I did following:
- Deleted the words "Home" and "Condo" from the title tag and H1, so the neighborhood name remains but there is no mention of home, condo, real estate, etc.
- Moved all written content from the "guides" (where the pictures and videos are) to the lower part of the MLS result pages. I imagine that over a 1-2 month period the MLS result pages will get the SEO credit for this unique written content (despite no 301 redirect).
- Interlinked from the picture/video pages to the MLS result pages with "neighborhood homes for sale" anchor text.
My hypothesis is that over the next few months, as Google gets a better idea of my website (the site is still only 5 months old and getting more popular), it will know which pages to rank for "neighborhood homes for sale" search terms.
Does that make sense?
-
That's right - zero search value. Maybe I can simply change the title tag, H1, etc.: get rid of the keyword (e.g. "Honolulu") and instead call it "Gallery 1". That way I can keep the structure without diluting the ranking potential of the MLS result pages?
-
I generally wouldn't NOINDEX something that's part of your navigation structure, unless it's a deep layer (and you want to cut off anything "below" it). If you're concerned that they don't have search value, I'd consider consolidating somehow, which I thought was the general plan from the original question. I just don't know that you need all of the content or to get too complicated with the consolidation.
-
Interesting, thanks. Could I do the following: add "noindex, follow" to those guide pages? That way they won't compete with the MLS result pages, which they currently do. The issue is that all that great unique picture and video content won't be indexed by Google... maybe not a big issue?
-
Yeah, I don't think the picture- and video-heavy pages are going to rank all that well by themselves. It's just a question of whether those additional pages are diluting your MLS listing pages (by using similar regional keywords, etc.).
At the scale of a large site, it's hard to tell without understanding the data, including where your traffic is coming from. If it's producing value (traffic, links, etc.), great. If not, then you may want to revisit whether those pages are worth having and/or can be combined somehow. I don't think "combined" means everything on both pages gets put onto one mega-page - you could pick and choose at that point.
-
Thanks, Pete. The guides are more for users who are curious about pictures and videos - not something I care about ranking for. Example: http://www.honoluluhi5.com/waikiki-condos-real-estate/
The MLS result pages are my bread and butter, and I moved a lot of written content to them to add unique content. Example: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/ (you will see the unique content below the map and MLS thumbnail pictures).
I feel this layout is ideal long-term. I link from each guide (as you can see above) to the corresponding MLS result page. Hope this makes sense.
-
That depends on a lot of factors. Consolidating those to one page has advantages, SEO-wise, but you're losing the benefits of the photo page. I lean toward consolidation, but it really depends on how the pages are structured in the navigation, what sort of content and meta-data they have, etc. I'm not clear on what's left on Page A currently, but the biggest issue is probably dilution from the extra pages. Since there are "guide" pages, though, I'm not sure how they fit your site architecture. To remove 200 of them, you may need to also rethink your internal link structure.
-
thx a lot. "Viewing it as manipulative" - it makes sense. I will certainly refrain from doing so.
I keep saying "last question," but this should be it: is moving some written content from Page A to Page B (keeping Page A, just with less content remaining on it) OK, and will that content after a while be viewed as Page B's original content so that Page B gets the SEO credit? This is done without a 301 redirect, since Page A is still a page with original, unique pictures that I want Google to index - just that a bunch of unique written content was moved from Page A to Page B. I have moved written content from about 200 different guide-type pages to 200 MLS result pages, as it makes more sense to have it there. Would it be safer to include the 301 redirect and simply lose the picture indexing, to play it safe?
-
That's a trick that used to occasionally work, but there's no evidence for it in the past couple of years. Google has gotten pretty good at understanding how pages are rendered and is no longer completely dependent on source-code order. In some cases, they may even view it as manipulative.
-
Thanks. One last, slightly different but related question: what is your view on placing written content above other content in the source code, while on the rendered page it displays below that other content? In my case: MLS thumbnail pictures and descriptions (the same as other realtors' websites) show at the top of the page, and as users scroll down they see a lot of unique, original written content. Search engines like written content higher on the page, so would it be a good idea to place the written content above the MLS data in the source code, even though on the page it will still display below the MLS data?
-
I don't think the risk of harm, done right, is high, but: (1) it's easy to do wrong, and (2) I suspect the benefits are small at best. I think your time/money is better spent elsewhere.
-
Thank you very much. The idea was to move a lot of great pictures from a "gallery" to a page I want to rank for. The gallery page serves no purpose except for users to see beautiful pictures and, obviously, for Google to index a lot of unique pictures. I guess I will leave the gallery as-is and simply interlink from the gallery to the important page.
Your suggestion can be implemented (my web developers have already built it, just not deployed it). However, it sounds to me, if I read between the lines correctly, that there is a risk Google may misinterpret such an implementation, and this could potentially even hurt my site with duplicate content issues...
-
By assigning a URL to each virtual "page", you allow Google to crawl the images, if done correctly. What Google is suggesting is that you then set up rel=prev/next between those pages. This tells them to treat all of the image URLs as a paginated series (like a multi-page article or search results).
My enterprise SEO friends have mixed feelings about rel=prev/next. The evidence of its effectiveness is limited, but what it's supposed to do is allow the individual pages (images, in this case) to rank while not looking like duplicate or near-duplicate content. The other option would be to rel=canonical these virtual pages, but then you'd essentially take the additional images out of ranking contention.
This infinite scroll + pagination approach is VERY technical and the implementation is well beyond Q&A's scope (it would take fairly in-depth knowledge of your site). Honestly, my gut reaction is that the time spent wouldn't be worth the gain. Most users won't know to scroll, and having 10-20 pictures vs. just a few may not add that much value. The SEO impact would be relatively small, I suspect. I think there may be easier solutions that would achieve 90% of your goals with a lot less complexity.
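To make the paginated-series idea above concrete, here is a minimal sketch of splitting a gallery into component pages, each with its own URL and rel=prev/next links. The paths, query-string pagination, and page size are illustrative assumptions, not a prescription for any particular site:

```javascript
// Split N image URLs into component pages and emit the <link rel="prev"/"next">
// tags each page's <head> would carry, so Google can treat them as one series.
function paginate(imageUrls, perPage, basePath) {
  const chunks = [];
  for (let i = 0; i < imageUrls.length; i += perPage) {
    chunks.push(imageUrls.slice(i, i + perPage));
  }
  return chunks.map(function (imgs, idx) {
    const links = [];
    if (idx > 0) {
      links.push('<link rel="prev" href="' + basePath + '?page=' + idx + '">');
    }
    if (idx < chunks.length - 1) {
      links.push('<link rel="next" href="' + basePath + '?page=' + (idx + 2) + '">');
    }
    return {
      url: basePath + '?page=' + (idx + 1),
      head: links.join('\n'),
      images: imgs
    };
  });
}

const urls = Array.from({ length: 20 }, (_, i) => '/photos/waikiki-' + (i + 1) + '.jpg');
const pages = paginate(urls, 5, '/waikiki-gallery');
console.log(pages.length);  // 4 component pages
console.log(pages[0].head); // first page carries only a rel="next" link
```

On the JavaScript side, the infinite-scroll script would then fetch each component page's images as the user scrolls, while crawlers simply follow the prev/next links.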
-
Hi Pete,
There is no mechanism that allows lots of different pictures in a slideshow to load only when users scroll to a certain part of a page, without slowing page speed, while all pictures still get indexed by Google. If you can show me one example on the Internet that solves this, I would love to see it. This is what is possible to create (not my website, just an example): http://diveintohtml5.info/examples/history/brandy.html - I can implement such a picture slideshow, which loads when users scroll down my page. Notice how the URL changes for each picture (as you change pictures), but the rest of the content on the page stays the same. Now, the big questions:
- Will the main (important) URL get the SEO credit for all these other URLs where each picture is located?
- Or, since each picture is on a different URL, will each URL get SEO credit separately, so the main URL gains nothing from these pictures from an SEO perspective?
- Since the written content is EXACTLY the same across each of these picture URLs, it will look like duplicate content, so would it be good to use a canonical to make sure the main URL gets all the SEO credit?
- How would you place 20 unique, copyrighted pictures on a URL and make sure that URL gets the SEO credit, keeping in mind the pictures can ONLY load after users scroll to a certain point on the page, as the page will otherwise load too slowly?
I highly appreciate your thoughts on this, since experts say there is a solution, but I have yet to see one concrete example.
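A minimal sketch of the canonical option raised above: each per-picture URL declares the main page as canonical, so the duplicated surrounding text consolidates there. The URLs are placeholders, and note the trade-off discussed elsewhere in this thread - canonicalized picture pages are taken out of ranking contention themselves:

```javascript
// Each picture URL's <head> points back at the main page via rel=canonical,
// telling Google which URL should collect the SEO credit.
function withCanonical(pictureUrls, mainUrl) {
  return pictureUrls.map(function (u) {
    return { url: u, headTag: '<link rel="canonical" href="' + mainUrl + '">' };
  });
}

const pages = withCanonical(
  ['/gallery/photo-1', '/gallery/photo-2', '/gallery/photo-3'],
  'https://example.com/waikiki-condos/'
);
console.log(pages[0].headTag);
```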
-
There should be no real difference, in terms of Google's infinite scroll solution. If you can chunk the content into pages with corresponding URLs, you can put any source code on those pages - text and/or images, along with corresponding alt text, etc. Once you've got one solution implemented, it should work for any kind of HTML. Not sure why images would be different in this case.
There are also ways to create photo galleries that can be crawled, mostly using AJAX. It's complex, but here's one example/discussion:
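On the crawlable-gallery point, the History-API example linked above (brandy.html) boils down to giving each slide its own real URL. A minimal sketch, with illustrative paths and a hypothetical onSlideChange hook; the pushState call is guarded so the snippet also runs outside a browser:

```javascript
// Pure helper: compute the URL for a given slide in a gallery.
function slideUrl(galleryPath, slideIndex) {
  return galleryPath + '/photo-' + (slideIndex + 1);
}

// In the browser, call this when the slideshow advances: the address bar
// updates without a reload, and each picture becomes individually addressable
// (the server must also respond to these URLs for crawlers to benefit).
function onSlideChange(galleryPath, slideIndex) {
  const url = slideUrl(galleryPath, slideIndex);
  if (typeof history !== 'undefined' && history.pushState) {
    history.pushState({ slide: slideIndex }, '', url);
  }
  return url;
}

console.log(onSlideChange('/waikiki-gallery', 0)); // "/waikiki-gallery/photo-1"
```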
-
CORRECTION: URL 1 and URL 2 are the opposite of what I described. In other words, I want to move pictures from 1) to 2). I already moved written content from 1) to 2).
-
On this URL 1) http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/ - you will see written content at the lower part of the page. This written content was originally on this URL 2) http://www.honoluluhi5.com/oahu/honolulu-homes/. I moved it because URL 1) is the page I want to rank for, and 2) served more as a guide. I want to move the pictures from 2) to 1) as well and then add a 301 redirect. However, this is NOT possible, because if I place pictures on 1) where users only see them after scrolling down to a certain point on the page, Google is not able to index all those pictures. The only way to index those pictures is to have them load when users land on the page, which would slow down the page and be a terrible user experience.
I am told there is a solution to get these pictures indexed, but so far no one has been able to present a concrete solution.
-
thank you, Pete.
- All images are my own and unique (ex: http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/)
- Infinite scrolling is what I plan to use; otherwise loading will be too slow. Issue: when users scroll and the pictures load, how do I set it up so those images are indexed by Google? For written content it is easy to get the content indexed with infinite scrolling. With images, however, there seems to be no solution. In other words: if a URL has 10 images that only show after users scroll down to the lower part of the page, then those 10 images will not be indexed by Google and the page will not get the SEO credit. Any solution to this? These sources deal with the infinite scrolling and indexing issues, but do not apply to images:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
-
Keep in mind that just adding 20 images/videos to this page isn't going to automatically increase its quality. Images offer limited information Google can crawl, and unless they're unique images that you own, they'll potentially be duplicated across the web. If adding those 20 images slows down the page a lot, that could actually harm your SEO and usability.
-
Unfortunately, it depends entirely on your implementation - specifically, whether the images are loaded all at once and only displayed on scroll, or loaded as you scroll. The latter is essentially what "infinite scrolling" is - it's generally not actually infinite, but scrolling triggers load events until there's nothing left to load.
The key is that the content has to be crawlable somehow and can't only be triggered by the event, or Google won't see it. So, if you're going to load as you go, the infinite scrolling posts should apply. If the images are pre-loaded, then you shouldn't have a problem, but I'd have to understand the implementation better.
-
I missed your point here. The page does not naturally suit infinite scrolling, in your opinion?
-
It's not an infinitely scrolling website. I'm going to drown myself now.
-
Travis: a slightly different but related question: the written content you see at the lower part of the URL I want to rank for used to be on the other URL, and I recently moved it (no 301 redirect, since I still have the pictures and video on the other URL). Will Google over time accept the unique content on the URL I want to rank for and credit that URL fully, OR will Google notice the content was originally on the unimportant URL, so I risk that the URL that now has the content will not get any credit for it?
-
Thanks, Travis. The idea is not about being fancy - I do not want infinite scrolling. It comes down to wanting to move a lot of great pictures and a video to this page that I want to rank for:
http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
…and here are the pictures and video: http://www.honoluluhi5.com/waikiki-condos-real-estate/
The latter page means nothing to me, except as nice pictures and video for the user. However, if I placed them under the written content on the first URL, that would add extra "juice" of quality content to that page, and I would rank that much better long-term. However, those pictures would tremendously slow loading, and that is the issue...
-
I would say don't use infinite scrolling - not yet. Designers don't understand; they want everything to be fancy, and Google isn't terribly ready for fancy yet.
At this point, I think infinite scroll is a horrible thing that needs to be shot in the face.
"Hey guys, let's load the entire site - all of the bells and whistles at once!"
That can really mess with page load speed. And what about time to first byte? It doesn't matter if the first byte arrives at the speed of light if you're loading 450 MB.
If the Webmaster Central Blog didn't answer your question, you're pretty well on your own.