Push for site-wide https, but all pages in index are http. Should I fight the tide?
-
Hi there,
This is my first Q&A question.
So I understand the problems caused by having a few secure pages on a site. A few links to the https version of a page and you have duplicate content issues.
While there are several posts here at SEOmoz that talk about the different ways of dealing with secure pages, the majority of that content assumes the SEO's goal is to make sure no duplicate https pages end up in the index.
The posts also suggest that https should only be used on login pages, contact forms, shopping carts, etc.
That's the root of my problem. I'm facing the prospect of switching to https across an entire site. In light of other https-related content I've read, this might seem unnecessary or overkill, but there's a valid reason behind it.
I work for a certificate authority, a company that issues SSL certificates, the cryptographic files that make the https protocol work. So there's an obvious need for our site to "appear" protected, even if no sensitive data is being moved through the pages. The stronger push, however, stems from our membership in the Online Trust Alliance: https://otalliance.org/
Essentially, in the parts of the internet that deal with SSL and security, there's a push for all sites to use HSTS headers and force sitewide https. PayPal and Bank of America are leading the way in this initiative, and other large retailers, banks, etc. will no doubt follow suit. Regardless of how you feel about all that, the reality is that we're looking at a future that involves more privacy protection, more SSL, and more https.
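For anyone unfamiliar with HSTS: it boils down to sending a Strict-Transport-Security response header over https, which tells browsers to refuse plain-http connections to the domain for a set period. A minimal sketch for Apache follows, assuming mod_headers is enabled; the one-year max-age is just an illustrative value, not a recommendation from any of the posts above:

```
# Minimal HSTS sketch for Apache (assumes mod_headers is enabled).
# max-age is in seconds (31536000 = one year).
# includeSubDomains should only be added once every subdomain
# actually serves https, or you will lock visitors out of http-only
# subdomains for the full max-age period.
<IfModule mod_headers.c>
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</IfModule>
```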
The bottom line for me is: I have a site of ~800 pages that I will need to switch to https.
I'm finding it difficult to map the tips and tricks for keeping the odd pesky https page out of the index onto what amounts to a sitewide migration.
So, here are a few general questions.
- What are the major considerations for such a switch?
- Are there any less obvious pitfalls lurking?
- Should I even consider trying to maintain an index of http pages, or should I start work on replacing (or have googlebot replace) the old pages with https versions?
- Is that something that can be done with canonicalization, or would something at the server level be necessary?
- How is that going to affect my page authority in general?
- What obvious questions am I not asking?
Sorry to be so long-winded, but this is a tricky one for me, and I want to be sure I'm giving as much pertinent information as possible.
Any input will be very much appreciated.
Thanks,
Dennis
-
Hi Dennis Lees,
I had to deal with something similar in the past: the website handled online donations and wanted to look secure.
All pages were 301 redirected to their https versions, and it didn't seem to affect their rankings.
If you are going to force sitewide https, I suggest 301 redirecting all http pages to their https versions (a minimal sketch follows below); the search engine spiders will do their job of crawling the new URLs and replacing them in the search results.
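As a rough sketch of what that looks like on an Apache server, assuming mod_rewrite is available (other servers have their own equivalents):

```
# Minimal sitewide http -> https redirect sketch for Apache .htaccess
# (assumes mod_rewrite is enabled).
RewriteEngine On
# Only rewrite requests that did not arrive over https
RewriteCond %{HTTPS} !=on
# Permanent (301) redirect to the same host and path over https
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The R=301 flag is the important part: a permanent redirect is what signals search engines to transfer the old URLs' equity to the https versions, which a temporary 302 would not.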
Don't expect this to happen overnight! It will take some time, and you might see some rankings fluctuate greatly, but things should get back to normal, and you'll definitely be better off than having duplicate content all over the place.
Best regards,
Guillaume Voyer.