Confused About Problems Regarding Adding an SSL Certificate
-
After reading Cyrus' article: http://moz.com/blog/seo-tips-https-ssl, I am now completely confused about what adding SSL could do to our site. Bluehost, our hosting provider, says if we get their SSL, they just add it to our site and it's up in a few hours: no problem whatsoever. If that's true, that'd be fantastic...however, if that's true, there wouldn't need to be like 10 things you're supposed to do (according to Cyrus' article) to ensure your rankings after the switch.
Can someone clarify this for me?
Thanks,
Ruben
-
Thanks Cyrus!
-
Hi Ruben,
Thanks for writing in. I'm unfamiliar with Bluehost's HTTPS service, but I assume they're taking care of the top-level issues. You'll still want to go through the checklist to make sure everything is valid and you're following SEO best practices. In short:
- Check your links
- Check your assets (images, CSS, JavaScript)
- Update canonical tags
- Register with Google Webmaster Tools
- Update your sitemaps and robots.txt files
This covers the important stuff. As you noted, a few more tips here: http://moz.com/blog/seo-tips-https-ssl
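For the sitemaps and robots.txt step above, the main change is that every URL should point at the HTTPS version. A minimal sketch (the domain is a placeholder, not your actual site):

```
# robots.txt after the switch (example.com is hypothetical)
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The `<loc>` entries inside the sitemap file itself need the same https:// update, then resubmit it in Webmaster Tools.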
-
Maybe it was obvious to everybody, but a 301 redirect for every single page is also a fundamental step; otherwise you're going to have broken external links, not to mention that WMT wouldn't be satisfied by just the canonical update.
The sitemap must be updated as well.
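The per-page 301s mentioned above can usually be handled with one site-wide rule rather than individual redirects. A hedged sketch for Apache with mod_rewrite enabled via .htaccess (common on shared hosts, but confirm with yours):

```apache
# Redirect every HTTP request to the same path on HTTPS with a 301
# (assumes mod_rewrite is enabled and .htaccess overrides are allowed).
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Because the rule reuses %{REQUEST_URI}, every old URL maps to its exact HTTPS equivalent, which preserves external links and link equity.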
We recently switched a website from HTTP to HTTPS, and in terms of performance there was no difference after the update, at least according to WMT and analytics.
I was a bit scared before the update, but in the end everything was smoother than expected; WMT took around 10 days to completely re-index the HTTPS version.
Of course, we kept finding non-HTTPS links embedded here and there in some pages for days afterwards, and we had to manually edit some content to avoid SSL warnings in browsers.
-
I have no idea what CMS you're using, but check the server-side code generating the links, not just the code sent to the browser.
We recently switched to SSL, and our CMS was already building internal links using the protocol of the incoming HTTP request, so they updated themselves.
-
Thanks Highland!
-
Great, thanks!
-
Ruben, I had a look at your website, and your URLs all have HTTP in them, so these will need to be updated across your site before you make the switch to HTTPS. Because you are using WordPress, this should be as simple as updating the site URL to https://www.kempruge.com.
The tip by @Highland about using Firebug is excellent. This will allow you to quickly debug if there are non-HTTPS links remaining - in the WordPress theme or template, for example.
Have a look at the WordPress HTTPS documentation also.
-
Hi Alex,
I'm not really sure if we use a protocol-less linking pattern or not. I don't see http:// in any of our URLs, so if that's the criterion, I'm guessing we don't? I included a screenshot of one of our URLs. Would you mind telling me if it's clear from the image whether we do or do not?
Thanks for your response. I really appreciate your time and input.
Best,
Ruben
-
One major tip I always point people to is that using protocol-less links for anything external is a great way to make sure your site always supports SSL without issue.
Firebug is a great way to make sure everything is loading HTTPS. Turn it on, switch to the Net tab, and load your page. It will show you every request sent as part of your page. It makes spotting non-SSL requests easy.
You can turn HSTS on yourself if your provider uses Apache and supports htaccess. (sorry I can't link an article, Moz won't let me). If they don't, you will have to have your host enable it on their end.
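If your host does run Apache and allows .htaccess overrides, a minimal HSTS sketch looks like this (mod_headers must be enabled; start with a short max-age while testing, since browsers cache the policy and HSTS is hard to undo):

```apache
<IfModule mod_headers.c>
    # Tell browsers to use HTTPS for the next year (31536000 seconds).
    # includeSubDomains is optional and affects every subdomain, so only
    # add it once all subdomains serve valid HTTPS.
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</IfModule>
```

The `<IfModule>` wrapper just keeps the site from erroring out if mod_headers is missing; the header itself is what activates HSTS.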
-
Implementing SSL should be straightforward for the most part.
You need to ensure that links around your site (including canonical links) are updated to use HTTPS (so https://example.com/link as opposed to http://example.com/link where example.com is your domain name). If you are already using a protocol-less linking pattern (//example.com/link) you don't need to update the links.
You can also configure your web server to only serve HTTPS. If your web server is Apache you can do this with the SSLRequireSSL directive.
<Location "/">
    SSLRequireSSL
</Location>
HTTPS also adds some overhead while the browser and the server negotiate a secure connection. If your site has already been optimized for speed it should not cause a problem, but if in doubt, revisit that process and ensure that you are getting the best possible speed for your visitors.
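Most of that negotiation cost is in the initial handshake, and Apache's mod_ssl can cache TLS sessions so returning visitors resume without repeating it. A hedged sketch (the directives are standard mod_ssl, but the cache path and sizes are assumptions; this goes in the server config, not .htaccess, so on shared hosting you'd ask your provider):

```apache
# Cache TLS sessions in shared memory so repeat connections resume
# without a full handshake (requires mod_ssl and mod_socache_shmcb).
SSLSessionCache        shmcb:/var/run/ssl_scache(512000)
SSLSessionCacheTimeout 300
```

With a session cache in place, only a visitor's first connection pays the full handshake cost, which keeps the HTTPS speed penalty small.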
The article by Cyrus has a great checklist to double check everything.