Can you keep your old HTTP XML sitemap when moving to HTTPS site-wide?
-
Hi Mozers,
I want to keep the HTTP XML sitemap live on my HTTP site to keep track of indexation during the HTTPS migration. I'm not sure if this is doable, since once our tech team forces the redirects, every HTTP page will become HTTPS.
Any ideas? Thanks
-
Hi Zack!
When you migrate your site to HTTPS, all your URLs will become HTTPS. So there will be no need to keep the old sitemap alive or to keep track of the HTTP indexation.
Of course, you must keep track of the indexation of the new site. Remember to create a new Search Console profile for that. Here is an excellent article and checklist on everything you should do in an HTTPS migration:
The HTTP to HTTPs Migration Checklist in Google Docs to Share, Copy & Download, from Aleyda Solis.
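As a reference, forcing HTTPS site-wide is usually done with a single 301 rule at the server level. A minimal sketch, assuming Apache with mod_rewrite enabled (your tech team's exact setup may differ):

    RewriteEngine On
    # Send every HTTP request to the same URL on HTTPS with a permanent (301) redirect
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

With a catch-all rule like this in place, the old HTTP sitemap URL itself will also 301 to its HTTPS counterpart, which is why indexation monitoring is normally moved to the new HTTPS Search Console profile instead.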
Hope this helped you.
Best of luck.
GR. -
I searched the subject and the only thing I have found is this:
https://productforums.google.com/forum/#!topic/webmasters/9uXVsQ18WQk
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that - for the home page - GoogleBot continues to only access via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of access via HTTP/2 working
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still shows the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so. Totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to only go through HTTP/1.1 and not HTTP/2. A possibly related issue - and of course what is causing concern - is that new pages of the site seem to index and perform well in the SERPs... except the home page. This never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
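For reference, a minimal sketch of what HTTP/2 support looks like on the server side, assuming Apache 2.4.17+ with mod_http2 (this part is apparently already confirmed by the host; the directives live in the server/vhost config, not .htaccess):

    # Load the HTTP/2 module and prefer h2 over HTTP/1.1 for TLS connections
    LoadModule http2_module modules/mod_http2.so
    Protocols h2 http/1.1

Note that whether Googlebot actually crawls a given site over HTTP/2 is decided on Google's side, so correct server support alone does not guarantee HTTP/2 crawling.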
Technical SEO | | AKCAC1 -
Should I keep a website which is outdated or close it down? It has a few links. If I keep it, can I redirect people to our newer site?
We are in the process of buying some intellectual property, and its websites are very dated and only have around 5 external links each. What's the best course of action? Do we close down the sites and then redirect the URLs to our current website, or do we leave the sites up but redirect people to our new site? Reference: current website: www.psychometrics.com. Old sites that come with the intellectual property: http://www.eri.com/ plus http://www.hrpq.com/. Thanks, Dan Costigan
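If the redirect route is chosen, a catch-all 301 on each old domain is the usual mechanism. A rough sketch, assuming the old sites run on Apache and the current site is served at https://www.psychometrics.com (the https:// scheme is an assumption, and mapping old URLs to the most relevant equivalent pages is generally better than sending everything to the homepage):

    RewriteEngine On
    # Send every request on the old domain to the current site with a permanent redirect
    RewriteRule ^ https://www.psychometrics.com/ [R=301,L]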
Technical SEO | | dcostigan0 -
Robots.txt on http vs. https
We recently changed our domain from HTTP to HTTPS. When a user enters any URL on HTTP, there is a global 301 redirect to the same page on HTTPS. I cannot find instructions about what to do with robots.txt. Now that HTTPS is the canonical version, should I block the HTTP version with robots.txt? Strangely, I cannot find a single resource about this...
Technical SEO | | zeepartner0 -
Redirecting old sitemaps to a new XML sitemap
I've discovered a ton of 404s from Google's WMT crawler looking for mydomain.com/sitemap_archive_MONTH_YEAR. There are tons of these monthly archive XMLs. I've used a plugin that for some reason created individual monthly archive XML sitemaps, and now I get 404s. Creating rules for each archive seems like a bad solution. My current sitemap plugin creates a single clean one: mydomain.com/sitemap_index.xml. How can I create a redirect rule in the Redirection WP plugin that will redirect any URL that has the 'sitemap' and 'xml' strings in it to my current XML sitemap? I've tried using a wildcard like so: mysite.com/sitemap*.*, mysite.com/sitemap ., mysite.com/sitemap(.), mysite.com/sitemap (.) but none of the wildcard uses got the general redirect to work. Is there a way to make this happen with the WP Redirection plugin? If not, is there an htaccess rule, and what would the code be for it? I'm not very fluent with general redirects in htaccess, unfortunately. Thanks!
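A single mod_rewrite rule in .htaccess should cover this without listing each archive. A minimal sketch, assuming the 404ing URLs all start with /sitemap_archive as reported in WMT (adjust the pattern if the real URLs differ):

    RewriteEngine On
    # Permanently redirect every old /sitemap_archive_* URL to the current index sitemap
    RewriteRule ^sitemap_archive /sitemap_index.xml [R=301,L]

The Redirection plugin also has a regex option that should achieve the same thing with a similar source pattern, but the .htaccess route avoids loading WordPress just to issue the redirect.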
Technical SEO | | IgorMateski0 -
Switching site from HTTP to HTTPS. Should I do the entire site?
Good morning! As many of you have read, Google seems to have confirmed this morning that they will give a small boost to sites with SSL certificates. So my question is, does that mean we have to switch our entire site to HTTPS? Even simple information pages and blog posts? Or will we get credit for the HTTPS boost as long as the sensitive parts of our site have it? Anybody know? Thanks in advance.
Technical SEO | | rayvensoft1 -
Can view pages of site, but Google & SEOmoz return 404
I can visit and view every page of a site (can also see the source code), but Google, SEOmoz and others say anything other than the home page is a 404, and Google won't index the sub-pages. I have checked robots.txt and .htaccess and can't find anything wrong. Is this a DNS or server setting problem? Any ideas? Thanks, Fitz
Technical SEO | | FitzSWC0 -
Can we use our existing site content on a new site?
We added 1000s of pages of unique content to our site, and soon after, Google released Penguin and we lost our rankings for major keywords. After months of effort we decided to start a new site. If we use all the existing site content on the new domain, is Google going to penalize the site for duplicate content, or will it be treated as unique? Thanks
Technical SEO | | mozfreak0 -
I changed the domain and structure of my site, is there anything I can do to help speed the recovery in SERPs?
I changed the domain of my site in March (pretty much exactly when Panda hit, by coincidence). Our search traffic has dropped by 90% in that time with little recovery. Webmaster Tools shows about 400,000 pages on the new domain and about 85,000 still indexed on the old domain. I set up custom 301 redirects to all of the new pages on the new domain, so everything that was moved has a good one-hop redirect. I've been told that the only thing I can do is sit back and wait for everything to finish transitioning. The problem is that it has been 5 months of poor traffic, which means 5 months of slow sales. Is there anything I can do to speed up the transition?
Technical SEO | | iJeep0