Correcting an HTTP/HTTPS Indexing Problem
-
I recently redirected an old site to a new site. All the URLs were the same except the domain. When I redirected them, I failed to realize the new site had HTTPS enabled on all pages. I have noticed that Google is now indexing both the HTTP and HTTPS versions of pages in the results. How can I fix this? I am going to submit a sitemap, but I don't know if there is more I can do to get this fixed faster.
-
Okay, I may have understood your original post differently than what you meant.
So the case is: you have HTTPS enabled, but Google is indexing both the HTTP and HTTPS pages, and you want it to index only the HTTP version. You are also running a cart or checkout that is HTTPS-only; those pages are of no value to Google, so I would recommend blocking them with robots.txt.
In short, I would code an IF statement to deal with the duplicate indexing (HTTPS and HTTP), and set up a robots.txt file to prevent crawling of pages that have no search value and exist for customer use only.
Something like this would work in PHP. (The markup inside the original echo statements was stripped when this was posted; the version below assumes the intent was to output a canonical link tag pointing at the HTTP URL.)

<?php
// If the page was requested over HTTPS, emit a canonical link pointing
// search engines at the HTTP version of the same URL.
if (isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on') {
    echo '<link rel="canonical" href="http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'] . '">' . "\n";
}
?>

I'm not sure of the equivalent code in ASP, since I rarely use Windows servers, but you should be able to find it with a quick Google search.
Then set up your robots.txt to block all URLs that are specific to personal data. For example:
User-agent: *
Disallow: /catalog/account.php
Disallow: /catalog/account_edit.php
Disallow: /catalog/account_history.php
Disallow: /catalog/account_history_info.php
Disallow: /catalog/account_password.php
Disallow: /catalog/add_checkout_success.php
Disallow: /catalog/address_book.php
Disallow: /catalog/address_book_process.php
Disallow: /catalog/checkout_confirmation.php
Disallow: /catalog/checkout_payment.php
Disallow: /catalog/checkout_process.php
Disallow: /catalog/checkout_shipping.php
Disallow: /catalog/checkout_shipping_address.php
Disallow: /catalog/checkout_success.php
Disallow: /catalog/cookie_usage.php
Disallow: /catalog/create_account.php
I hope that helps.
Don
-
My site should be running HTTP on all pages except the checkout. Would this work the opposite way from what you have written, so that I could make a rule allowing HTTPS only for the checkout?
Thanks
jared
-
If your site is running on HTTPS only, then a simple edit to your .htaccess file will correctly redirect (301) any request for an HTTP page to the corresponding HTTPS page.
Sample code:
RewriteEngine On
# Redirect any request that did not arrive over HTTPS to the HTTPS version
RewriteCond %{HTTPS} !=on
RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
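For the reverse setup Jared asked about (HTTP everywhere except the checkout), a minimal sketch along the same lines might look like this; the /checkout/ path here is an assumption, so adjust the pattern to match your cart's actual URLs:
RewriteEngine On
# Force HTTPS on the checkout pages only (assumes the cart lives under /checkout/)
RewriteCond %{HTTPS} !=on
RewriteCond %{REQUEST_URI} ^/checkout/ [NC]
RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
# Force HTTP everywhere else
RewriteCond %{HTTPS} =on
RewriteCond %{REQUEST_URI} !^/checkout/ [NC]
RewriteRule .* http://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
The [NC] flag makes the path match case-insensitive; if your checkout pages live under a different directory, swap the pattern accordingly.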
There are several ways to handle this, so you may also benefit from searching for ".htaccess 301 redirect http to https".
Hope that helps.