If I have an HTTPS page with an HTTP img that redirects to an HTTPS img, is it still considered by Google to be a mixed content page?
-
With Google starting to crack down on mixed content, I was wondering: if I have an HTTPS page with an HTTP img that redirects to an HTTPS img, is it still considered by Google to be a mixed content page?
E.g. in an old blog article, there are images that weren't updated when the blog migrated to HTTPS but were just 301ed to new HTTPS images. Is that still considered a mixed content page?
-
Thanks, I think I'm going to try to get it done, just because I like things neat and tidy, lol. Also, who knows when Google will switch it; might as well fix it now.
-
That is a leading cause of that error! If you have someone smart and confident who can write a script to rewrite all the links in like 30 minutes, it's worth it. If it sounds like more of a 3-hour thing, don't bother.
-
I also caught them in SEMrush, and there are a lot of them. I assume that when they migrated the site they didn't bother with all the images, and just 301ed them in a big batch later when they saw an issue in Search Console.
The question is whether it's worth getting the developers to update all the imgs. I agree it ideally should be done; it's just that, from a practical, time-cost perspective, I know they are going to ask me whether it really matters.
-
It comes up as an error in SEMrush a lot when you produce mixed content like that. I'd play it safe myself; it's not much effort to just rewrite the links to HTTPS using a script or something. If it takes seconds to fix, it's probably not worth the potential risk of leaving it. If, for some reason, it would take much longer to patch on your site, it may not be worth doing.
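For what it's worth, the link rewrite the answers describe doesn't have to be fancy. A minimal sketch in Python, assuming the links live in a directory of static HTML files (the `site/` path, function names, and regex are illustrative, not from the thread):

```python
import re
from pathlib import Path

# Match the scheme of an <img src="http://..."> attribute so it can be
# swapped for https:// without touching other http:// links on the page.
IMG_SRC = re.compile(r'(<img\b[^>]*\bsrc=["\'])http://', re.IGNORECASE)

def rewrite_html(html: str):
    """Return (updated_html, number_of_image_links_rewritten)."""
    return IMG_SRC.subn(r"\1https://", html)

def rewrite_tree(root: Path) -> int:
    """Rewrite every .html file under root in place; return total links changed."""
    total = 0
    for path in root.rglob("*.html"):
        updated, count = rewrite_html(path.read_text(encoding="utf-8"))
        if count:
            path.write_text(updated, encoding="utf-8")
            total += count
    return total
```

Whether this is a 30-minute job or a 3-hour one mostly depends on whether the links sit in flat files like this or in a database or CMS templates.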
-
Thanks, I thought so. I just wasn't sure whether, with a 301, Google follows the final destination or doesn't even consider it relevant to the current page. Also, I checked in the browser's developer tools, and a page I know to have an HTTP img redirecting to an HTTPS img isn't showing any security issues.
-
Yes, if you are directing users or their browser away from the secure web in any way (loading HTTP resources on an HTTPS page), then it counts as mixed content and you should sort it out.
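The rule in that answer comes down to a simple predicate: only the scheme of the page and the scheme of the resource as referenced matter, and where a redirect eventually lands never enters into it, because the browser flags the insecure request before any 301 response arrives. A sketch (the function name and example URLs are mine, not from the thread):

```python
from urllib.parse import urlparse

def is_mixed_content(page_url: str, resource_url: str) -> bool:
    """True when an https page references a resource over plain http.
    The browser flags this at request time, before any redirect is seen,
    so an http -> https 301 does not rescue the page."""
    return (urlparse(page_url).scheme == "https"
            and urlparse(resource_url).scheme == "http")
```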
Related Questions
-
Is it time to go HTTPS sitewide?
Hello mozzers, we are currently running a Magento 1 store and are in the planning phase of migrating to Magento 2. As an eCommerce website, it goes without saying that we use secure pages wherever exchange of sensitive information is required; however, we do not use HTTPS on all pages. I recently read a Moz article stating that adoption of sitewide HTTPS is at around 50% for top websites, and I see Moz is also using it. I know that right now Google states it does not give an SEO bump for HTTPS pages, but part of me thinks the time might be right (while re-platforming to M2) to make the move, if this is what the future is going to be. Do you agree? Is this something that everyone will eventually have to do? I am also unsure whether 301 redirects would be needed for HTTP -> HTTPS. And is there a how-to or checklist for making the switch to sitewide HTTPS without inadvertently causing an SEO drop? Thanks!
Algorithm Updates | yacpro13
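On the 301 question in that post: a sitewide HTTPS migration normally pairs every HTTP URL with a permanent redirect to its exact HTTPS twin, and the mapping itself is mechanical. A sketch of that 1:1 mapping (the function name and example URL are illustrative, not from the thread):

```python
from urllib.parse import urlparse, urlunparse

def https_target(url: str) -> str:
    """Return the https equivalent of an http URL (same host, path, and
    query), i.e. the destination a 1:1 sitewide 301 should point at."""
    return urlunparse(urlparse(url)._replace(scheme="https"))
```

In practice this logic usually lives in the web server's redirect rules rather than application code, but the invariant is the same: nothing about the URL changes except the scheme.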
Google's Latest Algorithmic Change about HTTPS & Mobile Friendliness
How effective has the latest algorithmic change Google's search engine made, favoring mobile friendliness and HTTPS (a valid SSL certificate), proved for anyone? I see a good change being made under the eCommerce category for sites used for online shopping. Let me know if anyone observes a major difference.
Algorithm Updates | mozexpone
Do omitted results shown by Google always mean that a website has duplicate content?
Google search results for a particular query used to show my page in the top 10 results, but now the page appears only after clicking on the "omitted results" link from Google. My website lists different businesses in a particular locality, and sometimes results for different localities are the same, because we show results from a nearby area if the number of businesses in the locality searched by users is fewer than 15. Will this be considered "duplicate content"? If yes, what steps can be taken to resolve this issue?
Algorithm Updates | prsntsnh
Google is forcing a 301 by truncating our URLs
Just recently we noticed that Google has indexed truncated URLs for many of our pages that get 301'd to the correct page. For example, we have http://www.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html as the URL linked everywhere, and that's the only version of that page that we use. Google somehow figured out that it would still go to the right place via 301 if they removed the HTML filename from the end, so they indexed just http://www.eventective.com/USA/Massachusetts/Bedford/107/. The 301 is not new. It used to 404, but (probably 5 years ago) we saw a few links come in with the HTML file missing on similar URLs, so we decided to 301 them instead, thinking it would be helpful. We've preferred the longer version because it has the name in it, and users who pay attention to the URL can feel more confident they are going to the right place. We've always used the full (longer) URL, and Google used to index them all that way, but just recently we noticed about half of our URLs have been converted to the shorter version in the SERPs. These shortened URLs take the user to the right page via 301, so it isn't a case of the user landing in the wrong place, but over 100,000 301s may not be so good. If you look at site:www.eventective.com/usa/massachusetts/bedford/ you'll notice all of the URLs to businesses at the top of the listings go to the truncated version, but toward the bottom they have the full URL. Can you explain to me why Google would index a page that is 301'd to the right page and has been for years? I have a lot of thoughts on why they would do this, and even more ideas on how we could build our URLs better, but I'd really like to hear from some people who aren't quite as close to it as I am. One small detail that shouldn't affect this, but I'll mention it anyway: we have a mobile site with the same URL pattern, http://m.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html. We did not have the proper 301 in place on the m. site until the end of last week. I'm pretty sure it will be asked, so I'll also mention we have the rel=alternate/canonical set up between the www and m sites. I'm also interested in any thoughts on how this may affect rankings, since we seem to have been hit by something toward the end of last week. Don't hesitate to mention anything else you see that may have triggered whatever may have hit us. Thank you,
Michael
Algorithm Updates | mmac
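One way to keep the truncated form from competing with the full one, as described in that question, is to make the 301's destination an explicit lookup rather than an inferred rewrite, so each directory URL maps to exactly one canonical filename. A hypothetical sketch using the URL from the question (the table and function are illustrative, not the site's actual code):

```python
from typing import Optional, Tuple

# Hypothetical lookup: each truncated directory URL maps to exactly one
# canonical filename, so the 301 target is deterministic.
CANONICAL = {
    "/USA/Massachusetts/Bedford/107/":
        "/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html",
}

def redirect_for(path: str) -> Optional[Tuple[int, str]]:
    """Return (301, canonical_path) for a known truncated URL, else None."""
    target = CANONICAL.get(path)
    return (301, target) if target else None
```

Pairing a redirect like this with a rel=canonical on the full URL gives Google two consistent signals about which form to index.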
Local Pages Help
Hi all, I have a client who is looking heavily at Google+ Local. He has a main business with a number of location-based franchises, and he has created a local listing for each of these franchise pages. The question he has asked is, "How do I improve my rankings for these local listings?" Some of them seem to rank well without any work performed to improve them, but some do not. My question is: what can we do to improve the rankings of Google+ Local listings? This has changed greatly since I last looked into it, so anyone who can say "right, this is what you need to do to improve Google+ Local listings" would be greatly appreciated! Many thanks, guys!
Algorithm Updates | Webrevolve
Is it normal to receive two emails from Google?
I filed a reconsideration request that was answered in less than a week. Subsequently I was told that no manual penalty was in place, but that various algorithmic factors might be causing my heavy drops in ranking. Then I got a second email that was even more specific. This was great, really heartening stuff, and a total surprise, as it was very helpful. Is it normal to receive two emails from Google with such clear information? I have been very pleased by the comments they made, as they showed me Google is more customer-focused than all the research I did before the reconsideration request had led me to believe. Has anyone else had a clear outline of what they needed to fix, and did their site subsequently rebound after fixing it?
Algorithm Updates | swimwithfishes
What do you think Google analyzes for SERP ranking?
I've been doing some research trying to figure out how the Google algorithm works. The one thing that is constant is that nothing is constant. This makes me believe that Google takes a variable that all sites have and divides by it. One example would be taking the load time in ms and dividing the total number of points the website scored by it. This would give all of the websites a random-looking appearance, since that variable would throw off all the other constants. I'm going to continue doing research, but I was wondering what you guys think matters in the Google algorithm. -Shane
Algorithm Updates | Seoperior
Removing a secure subdomain from the Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and then duplicating our complete eCommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific robots.txt file for the secure subdomain to disallow everything:
User-agent: *
Disallow: /
However, these duplicated secure pages remain in the index. My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main eCommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages in the index, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you
Algorithm Updates | marketing_zoovy.com
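A side note on the robots.txt rules quoted in that question: you can sanity-check what a given robots.txt body actually blocks without crawling anything, using Python's standard-library parser (the rules below are the subdomain-wide ones from the question; the check itself is my illustration). Keep in mind that Disallow only stops crawling; it does not by itself remove already-indexed URLs, which is why the removal-tool request still matters.

```python
import urllib.robotparser

# The subdomain-wide rules described in the question: block everything.
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)  # feed the rules directly instead of fetching over the network

# Every path on the secure subdomain should now be off-limits to crawlers.
allowed = rp.can_fetch("Googlebot", "https://secure.domain.com/login.cgis")
print(allowed)  # False with these rules
```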