Google Indexing of Images
-
Our site is experiencing an issue with indexation of images. The site is real estate oriented. It has 238 listings with about 1,190 images. The site submits two versions (different sizes) of each image to Google, so there are about 2,400 images in total. Only several hundred are indexed.
Can adding Microdata improve the indexation of the images?
Our sitemap is submitting images that are on no-index listing pages to Google. As a result, more than 2,000 images have been submitted but only a few hundred have been indexed. How should the sitemap deal with images that reside on no-index pages? Do images that are part of pages set up as "no-index" need a special "no-index" label or other special treatment?
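For reference, in Google's image sitemap extension each image is nested under the page it appears on, so every image submission is tied to a host page (URLs below are invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The listing page the image lives on -->
    <loc>https://example.com/listings/123-main-street</loc>
    <image:image>
      <image:loc>https://example.com/images/123-main-street-front.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Because of that nesting, an image submitted under a no-index page is effectively a contradictory instruction.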
My concern is that so many images that are not indexed could be a red flag to Google, signaling poor quality content.
Is it worth investing in correcting this issue, or will correcting it result in little to no improvement in SEO?
Thanks, Alan
-
I am chiming in a year late, but there is just one thing I am not sure I understand. Why would you want to index images on no-index pages? What are these pages that you want no-indexed in the first place? If you do not want these pages to be found when searching in Google, why would you want some of their content, like images, to be found instead?
I am with Michael and recommend that you fix the sitemap. I am also curious to know what has happened in the past year. Have your issues been resolved? Has your SEO improved?
-
I would definitely update that sitemap. If your sitemap is telling Google one thing, and the pages themselves are contradicting the sitemap, AND it's happening thousands of times--that's a negative quality signal to Google, and could affect all sorts of things, from crawl budget to indexation to rankings.
ALT tags are worth fixing as well. That's really the #1 clue Google has to what an image is about. (Other clues: the image filename and the page title, if it's the main image on the page.) Here, I'm presuming that the images are ones you hope to have show up in image search results (otherwise why would you bother creating an image sitemap?)...in which case, you really, REALLY need to put the ALT text on them.
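For example, a descriptive ALT attribute on a listing photo (hypothetical listing and filename) might look like:

```html
<img src="/images/123-main-street-front.jpg"
     alt="Front exterior of a 3-bedroom colonial at 123 Main Street, Springfield">
```

Note how the filename and the ALT text reinforce each other; generic values like `alt="photo1"` give Google nothing to work with.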
-
Apparently our sitemap submits images to Google even when they are on pages that are marked noindex.
The result is that only about 250 out of 2,250 images are actually indexed by Google. Apparently Google (as you suggested) is not indexing images that are on pages marked "noindex".
Do you think it makes sense for my developers to modify the sitemap so it no longer submits images that are on pages marked noindex? Is it worth investing resources in fixing this? If this is not going to cause SEO problems, I would just as soon leave it alone.
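As a sketch of what that sitemap fix could look like (assuming the generator has access to each page's HTML; function and data names below are invented for illustration), the generator can simply skip any page whose robots meta tag contains a noindex directive:

```python
# Sketch only: names and data shapes are hypothetical, not from the actual site.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Detects a <meta name="robots"> tag whose content includes "noindex"."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            if "noindex" in attrs.get("content", "").lower():
                self.noindex = True

def page_is_noindex(page_html):
    """Return True if the page carries a noindex robots directive."""
    parser = RobotsMetaParser()
    parser.feed(page_html)
    return parser.noindex

def filter_sitemap_entries(pages):
    """pages: iterable of (page_url, page_html, image_urls) tuples.
    Keep only entries whose page is indexable, so images on noindex
    pages are never submitted in the image sitemap."""
    return [(url, images)
            for url, page_html, images in pages
            if not page_is_noindex(page_html)]
```

The same check could instead run against the live site as a one-off audit, comparing the sitemap's URLs to each page's robots directives before deciding whether the fix is worth prioritizing.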
Also, the way images are set up, we do not have the ability to customize alt tags. Is this worth fixing? Could repairing these issues with images improve overall ranking?
Thanks, Alan
-
I've not seen instances where Google would index an image that's on a page that's marked noindex.
Be sure that you have consistency between your sitemap and your noindex/index tags on the pages, i.e. don't include a page or image in your sitemap where the page itself (or containing page) indicates noindex.
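For reference, a page is typically marked noindex in one of two ways, and a sitemap generator should treat both the same when deciding what to exclude:

```html
<!-- Either as a robots meta tag in the page's <head>: -->
<meta name="robots" content="noindex, follow">

<!-- ...or as an HTTP response header (shown here as a comment): -->
<!-- X-Robots-Tag: noindex -->
```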
If you look at how Webmaster Tools OOPS I guess I mean "Search Console" (will Google EVER let a product keep the same name forever???) shows indexation of images in an image sitemap, you'll notice they pair the image indexation count with the web page indexation count. I take that as an indication that they're not interested in indexing images on noindexed pages (which I have to say makes sense to me).