Correct Indexing problem
-
I recently redirected an old site to a new site. All the URLs were the same except the domain. When I redirected them, I failed to realize the new site had https enabled on all pages. I have noticed that Google is now indexing both the http and https versions of pages in the results. How can I fix this? I am going to submit a sitemap, but I don't know if there is more I can do to get this fixed faster.
-
Okay, I may have understood your original post differently than you meant.
So the case is: you have HTTPS enabled, but Google is indexing both the HTTP and HTTPS versions of your pages, and you only want the HTTP version indexed. You are also running a cart or checkout that is HTTPS-only; those pages are likely not relevant to Google, so I would recommend blocking them with robots.txt.
I would recommend coding an IF statement to deal with the duplicate indexing (https & http) and setting up a robots.txt file to prevent crawling of pages that have no search value and are there for customer use only.
Something like this would work in PHP. (The markup inside the echo calls was lost from the original post; a rel="canonical" link pointing at the http version is the likely intent, with example.com standing in for your domain.)
<?php
// If the request arrived over HTTPS, tell search engines the
// http version is canonical so only one version gets indexed.
if ( isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on' ) {
    echo '<link rel="canonical" href="http://www.example.com' . $_SERVER['REQUEST_URI'] . '">' . "\n";
}
?>
I'm not sure of the equivalent code in ASP since I rarely use Windows servers, but you should be able to find that with Google.
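If it helps to test the idea outside PHP, here is a minimal Python sketch of the same canonical-URL logic (a hypothetical helper, not from the original post; the domain is a placeholder):

```python
from urllib.parse import urlsplit, urlunsplit

def http_canonical(url: str) -> str:
    """Return the http:// form of a URL, so both the http and https
    versions of a page point search engines at a single canonical URL."""
    parts = urlsplit(url)
    # Swap only the scheme; host, path, and query string are untouched.
    return urlunsplit(("http",) + parts[1:])

print(http_canonical("https://www.example.com/catalog/product.php?id=7"))
# http://www.example.com/catalog/product.php?id=7
```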
Then set up your robots.txt to block all URLs that are specific to personal data, like this (example; note the User-agent line is required for the rules to apply):
User-agent: *
Disallow: /catalog/account.php
Disallow: /catalog/account_edit.php
Disallow: /catalog/account_history.php
Disallow: /catalog/account_history_info.php
Disallow: /catalog/account_password.php
Disallow: /catalog/add_checkout_success.php
Disallow: /catalog/address_book.php
Disallow: /catalog/address_book_process.php
Disallow: /catalog/checkout_confirmation.php
Disallow: /catalog/checkout_payment.php
Disallow: /catalog/checkout_process.php
Disallow: /catalog/checkout_shipping.php
Disallow: /catalog/checkout_shipping_address.php
Disallow: /catalog/checkout_success.php
Disallow: /catalog/cookie_usage.php
Disallow: /catalog/create_account.php
I hope that helps,
Don
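As a quick sanity check on rules like those above, Python's standard urllib.robotparser can confirm which URLs a well-behaved crawler would skip (a hypothetical test, not part of the original answer; the domain is a placeholder):

```python
from urllib import robotparser

# A few of the rules from the example above.
rules = """\
User-agent: *
Disallow: /catalog/account.php
Disallow: /catalog/checkout_process.php
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A blocked personal-data page vs. a normal product page.
print(rp.can_fetch("*", "http://www.example.com/catalog/account.php"))       # False
print(rp.can_fetch("*", "http://www.example.com/catalog/product_info.php"))  # True
```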
-
My site should be running http on all pages except the checkout. Would this work the opposite of what you have written, so I can make a rule to allow https on the checkout only?
Thanks
jared
-
If your site is running on https only, then a simple edit to your .htaccess file will correctly redirect (301) any request for an http page to the corresponding https page.
Sample code:
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
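For the opposite case described above (http everywhere except the checkout), the same idea can be inverted. This is an untested sketch that assumes the checkout pages live under a /checkout path; adjust the patterns to match your actual URLs:

```apache
RewriteEngine On
# Send any https request outside the checkout back to http
RewriteCond %{HTTPS} =on
RewriteCond %{REQUEST_URI} !^/checkout
RewriteRule .* http://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
# Force https on the checkout itself
RewriteCond %{HTTPS} !=on
RewriteCond %{REQUEST_URI} ^/checkout
RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
```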
There are several ways to handle this, so you may also benefit from searching for ".htaccess 301 redirect http to https".
Hope that helps.