Recovery from an HTTP to HTTPS migration using 302s?
-
If a website did an HTTP to HTTPS migration using 302 redirects that were corrected to 301s about four months later, what is the expected impact? Will the website see a full recovery, or has the damage been done? Thanks to anyone who can shed some light on this...
-
Hey,
If you check today's Whiteboard Friday with Dr. Pete (https://mza.seotoolninja.com/blog/arent-301s-302s-canonicals-all-basically-the-same-whiteboard-friday), he mentions this case:
"Some types of 302s just don't make sense at all. So if you're migrating from non-secure to secure, from HTTP to HTTPS and you set up a 302, that's a signal that doesn't quite make sense. Why would you temporarily migrate?"
So to answer your question, Google probably treated your initial HTTP -> HTTPS redirects as 301s.
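For reference, the recommended setup is a permanent redirect. A minimal sketch of the server-side decision, with a hypothetical function and host rather than any particular web server or framework:

```python
def https_redirect(scheme: str, host: str, path: str):
    """Redirect non-secure requests with a permanent 301.

    A 301 tells search engines the move to HTTPS is final and that
    ranking signals should transfer; a 302 would claim the move is
    temporary, which makes no sense for an HTTPS migration.
    """
    if scheme == "http":
        return 301, f"https://{host}{path}"
    return None  # already secure; nothing to do
```

The point is only the status code choice: permanent moves get 301, and anything else sends a mixed signal.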
Related Questions
-
How can I use AMP HTML on a CMS?
I have been trying to research using AMP to improve our mobile speed. We have a whole lot of sites on the same platform, managed by a CMS. From what I have read, AMP HTML can only be used on static pages. Does that mean we would not be able to incorporate it into the HTML through our CMS? I would like to implement this across all our homepages to test its effectiveness if possible, but there is no way to rebuild all our homepages statically. Any advice is much appreciated!
Intermediate & Advanced SEO | chrisvogel
Recovery from manual penalty, several sites sell same products
We are still struggling with the consequences of a manual penalty. Here is the history:
- We got a manual penalty in September 2013 on our main site kinderwagen.com for unnatural links.
- We cleaned up our link profile and the penalty was lifted in February 2014.
- We did nothing in terms of link building until December 2014.
- From December 2014 we started to build natural links; we mostly gave away our products for reviews. That way we built first one content link per month, now 2-3 content links per month.
- We expanded our social activities on Facebook and Google Plus.
- We started a blog at the beginning of this year and rewrote our guide section.

For now, the results of all this have been very modest. We are basically still stuck where we were after the penalty. In contrast, our sites in other languages, where we do similar activities, perform quite well. There is one difference: in Germany (where kinderwagen.com is) we have other niche sites which partially sell the same products. We made sure that every product is indexed only once; however, it might still hurt the main site. Does anyone have experience with a similar situation, or advice on what we could do? Thanks in advance. Dieter
Intermediate & Advanced SEO | Storesco
Should I use meta noindex and robots.txt disallow?
Hi, we have an alternate "list view" version of every one of our search results pages:
- The list view has its own URL, indicated by a URL parameter.
- I'm concerned about wasting our crawl budget on all these list view pages, which effectively doubles the number of pages that need crawling.
- When they were first launched, I had the noindex meta tag placed on all list view pages, but I'm concerned that they are still being crawled.

Should I therefore go ahead and also apply a robots.txt disallow on that parameter to ensure that no crawling occurs? Or will Googlebot/Bingbot also stop crawling those pages over time? I assume that noindex still means "crawl"... Thanks 🙂
Intermediate & Advanced SEO | ntcma
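One detail worth sanity-checking in the question above: a robots.txt Disallow blocks crawling entirely, which means a compliant bot can never fetch the page to see its meta noindex tag. A sketch using Python's standard-library parser (the `/search?view=list` URL is a hypothetical stand-in for the list-view parameter):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule blocking the list-view variant of a results page.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /search?view=list",
])

# The list view is blocked: a compliant bot never fetches it, so it
# also never sees the meta noindex tag on that page.
blocked = not rp.can_fetch("*", "https://example.com/search?view=list")

# The plain search results page remains crawlable.
allowed = rp.can_fetch("*", "https://example.com/search")
```

So "noindex plus disallow" is somewhat contradictory: once the disallow is in place, the noindex tag can no longer be read.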
When to Use Schema vs. Facebook Open Graph?
I have a client who for regulatory reasons cannot engage in any social media: no Twitter, Facebook, or Google+ accounts. No social sharing buttons allowed on the site. The industry is medical devices. We are in the process of redesigning their site, and would like to include structured markup wherever possible. For example, there are lots of schema types under MedicalEntity: http://schema.org/MedicalEntity Given their lack of social media (and no plans to ever use it), does it make sense to incorporate OG tags at all? Or should we stick exclusively to the schemas documented on schema.org?
Intermediate & Advanced SEO | Allie_Williams
Penguin Recovery: Possible Solution (when all else fails)?
Hi,

INTRO: We were hit pretty badly, first with an unnatural links warning and then (we assume) by Penguin. We removed a lot of links and disavowed the removed ones along with all the others we couldn't remove. The manual penalty was revoked, but the site is still down. I understand that Penguin and unnatural links are not the same. I assume that while our removal and fixes were enough for the manual penalty to be lifted, the Penguin algorithm still disapproves of us. Also, I am not expecting to be where we were, but we know our current positions don't make sense (several pages seem to be de-indexed).

AND THE QUESTION... Since all else has failed, we are considering removing the main landing pages (which were the target of the link building) and building new ones with new URLs, serving a 404 from the old ones rather than a 301. This means that all the spammy links that were built will point to non-existing pages (404), besides those that point to the homepage... Do you think this will resolve the problem? Or, since the spammy links still point to our domain, are we still in trouble, even if they point to 404 pages? The way we see it, this is the last resort before dropping the domain! Thanks
Intermediate & Advanced SEO | BeytzNet
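The plan described in that question amounts to a hard cutoff: retired URLs return a 404 so the spammy backlinks dead-end instead of passing signals through a 301. A minimal sketch of that routing decision (the paths are hypothetical):

```python
# Hypothetical set of retired landing-page paths that attracted spammy links.
RETIRED_PATHS = {"/old-landing-a", "/old-landing-b"}

def status_for(path: str) -> int:
    """Return 404 for deliberately removed pages so inbound spam links
    point at nothing, and 200 for everything still live."""
    return 404 if path in RETIRED_PATHS else 200
```

The new replacement pages would simply live at fresh paths outside the retired set, with no redirect connecting old to new.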
Using Images Instead of Text to Control Keywords on Page
We have recently updated a key page on our website. It is a template page that is used many times to display search results. The words "price", "revenue", "cash flow", and "not disclosed" are used for each listing on the page; to minimize their impact on keyword density, we used images for these words. Here you can see some examples: http://www.businessbroker.net/State/Florida-Businesses_For_Sale.aspx http://www.businessbroker.net/City/Los Angeles-Businesses_For_Sale.aspx http://www.businessbroker.net/Industry/Auto_Car_Wash-Businesses_For_Sale.aspx You will note that these words on these pages are images and not regular text. We are certainly not doing this to "dupe" visitors or Google; we just want to ensure that each page has keywords pertinent to what the page is about. Bottom-line question: is this an OK practice? Are we running any risk with Google by doing this? I'm particularly nervous these days with all of the Google changes. Your thoughts and guidance on this issue would be much appreciated. Thanks. MWM
Intermediate & Advanced SEO | MWM3772
Do you lose Link Equity when using RanDom CasE?
I've seen a site linking internally using capitals from the home page to sub-pages, while the rest of the site links in lower case. Are there any disadvantages in terms of link juice or duplication in doing this? Example link from the homepage: /blah/Doctors.aspx Example link from another internal page: /blah/doctors.aspx The site is on a Windows-based server, not Linux. Thanks in advance
Intermediate & Advanced SEO | 3wh
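On a case-insensitive (Windows/IIS) server, both casings serve the same page, so search engines can treat /blah/Doctors.aspx and /blah/doctors.aspx as duplicate URLs that split any link equity. A sketch of one common mitigation, lowercasing the path when generating internal links, using only the standard library and not tied to any framework:

```python
from urllib.parse import urlsplit, urlunsplit

def lowercase_path(url: str) -> str:
    """Canonicalize the URL path to lower case so both casings of an
    internal link resolve to one form. The query string is left
    untouched, since parameter values can be case-sensitive."""
    parts = urlsplit(url)
    return urlunsplit(parts._replace(path=parts.path.lower()))
```

Running every internally generated link through a normalizer like this (or adding a rel="canonical" pointing at the lower-case form) collapses the two casings into one URL.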
HTTP Errors in Webmaster Tools
We recently added a 301 redirect from our non-www domain to the www version. As a result, we now have tons of HTTP errors (403s, to be exact) in Webmaster Tools. They're all from over a month ago, but they still show up. How can we fix this?
Intermediate & Advanced SEO | kylesuss