Htaccess Question - Is this the way to go to consolidate?
-
Hi all,
My site seems to be accessible at www.xyz.com, http://www.xyz.com, http://xyz.com and other variations, left over from an old agency. Each version shows different backlinks etc., so I want to merge them so I can just look at one analytics account, with everything combined.
I want to consolidate everything to https://www.xyz.com, as the client wants. How do I do this? Does it take long to take effect?
Also, I presume I'll have to set the preferred domain in Webmaster Tools?
Thanks very much for any advice
-
We can help you get all those URLs to point to the HTTPS version on the server. But which reports are you referring to?
-
It's really just by looking at link reports for the variations that I can see they fluctuate between the differing versions. So I'm looking for a simple way to make sure all the prefixes redirect to https://. Thanks for the comments so far.
-
Excellent addition. Probably a combination of both since he states that the client wants to migrate to HTTPS. We'll have to get clarification from Giles.
-
Ryan,
I was going to mention this, but after re-reading it, it seems he is dealing with an issue inside Google Webmaster Central that shows reports for more than one domain. I have seen this before, where they list domains like:
http://www.site.com
http://www.site.com/shop
http://www.site.com/
Each shows different metrics.
Please correct me if I'm wrong.
-
Hi Giles. You'll end up with something similar to this:
RewriteEngine On
# Redirect the entire domain to HTTPS. Note that a RewriteCond only
# applies to the RewriteRule immediately following it.
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://%{SERVER_NAME}/$1 [R=301,L]
# Or, to redirect just a specific page:
RewriteCond %{HTTPS} !=on
RewriteRule ^page1(.*)$ https://%{SERVER_NAME}/page1$1 [R=301,L]
The first is an example of redirecting an entire domain to HTTPS and the second, a specific page. The R=301 flag matters here: without it, mod_rewrite issues a temporary 302 redirect, and for consolidation you want a permanent 301 so link equity is passed.
The effects of the redirection are immediate in the sense that once the .htaccess file is changed, the redirection will be in place. What takes longer is for the change to carry over into Google's rankings and throughout the web. Using the tools in GWT and Bing can help.
Here are a couple of guides that you might find useful: http://moz.com/blog/web-site-migration-guide-tips-for-seos and http://moz.com/learn/seo/redirection.
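Since the goal here is to consolidate onto both HTTPS and the www host, the HTTPS rule can be paired with a host rule. A minimal sketch, assuming the canonical host is www.xyz.com (substitute the real domain):

```apache
RewriteEngine On

# Force HTTPS: any request arriving over plain HTTP gets a permanent redirect
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://www.xyz.com/$1 [R=301,L]

# Force the www host: catch https://xyz.com/... and send it to https://www.xyz.com/...
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.xyz.com/$1 [R=301,L]
```

Because both rules redirect straight to the canonical https://www.xyz.com host, no visitor chains through more than one redirect, and all four variations (http/https, www/non-www) collapse to a single URL.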
Related Questions
-
Do I have to go off page to establish an entity?
So I have always thought of Google establishing an Entity (by entity I mean https://goo.gl/RP8e9B) by looking at the markup on a website. Recently watched a video that discussed creating an entity by using external websites. I have a feeling that you do not have to go off of the main website to have an entity created. Is there anything valid in this concept/video? (https://goo.gl/CNe7qD) The concept focuses on creating branded pages on properties owned by Google and various other well known websites. My main concern is whether this is something I should do for our website or if just doing the little bit of social marketing we are doing is fine.
Intermediate & Advanced SEO | | marghutch1 -
Question about Indexing of /?limit=all
Hi, I've got your SEO Suite Ultimate installed on my site (www.customlogocases.com). It's a relatively new Magento site (around 1 year old). We have recently been doing some PR/SEO for the category pages, for example /custom-ipad-cases/. But when I search on Google, it seems that Google has indexed /custom-ipad-cases/?limit=all. This ?limit=all page has no links pointing to it and only has a PA of 1, whereas the standard /custom-ipad-cases/ without the query string has a much higher PA of 20 and a couple of links pointing towards it. So I would want that page to be the one that Google indexes, and by the same logic, that page really should be able to achieve higher rankings than the ?limit=all page. Is my thinking here correct? Should I disallow all the /? URLs now, even though these are the ones that are indexed and the others currently are not? I'd be happy to take the hit while Google figures it out, because the higher-PA pages are what I am ultimately getting links to... Thoughts?
Intermediate & Advanced SEO | | RobAus0 -
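For reference on questions like this one: Google supports `*` wildcards in robots.txt, so a rule blocking crawling of every URL with a `limit=` query might look like the sketch below. (Note this is only one option, and robots.txt stops crawling, not indexing — already-indexed URLs can linger until a canonical tag or recrawl cleans them up.)

```
User-agent: *
# Block crawling of any URL whose query string starts with limit=
Disallow: /*?limit=
```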
Quickest way to deindex large parts of a website
Hey there, my client's website was set up with subdirectories for almost every country in the world, plus multiple languages in each country. The content in each subfolder is (almost) identical. So no surprise: they have a big problem with duplicate content and ranking fluctuations. Since they don't want to change the site's structure, I recommended limiting the languages available in each subfolder with robots.txt. However, before doing this we marked the contents to be excluded with noindex, nofollow. It's only been 2 days now, but I hardly notice any decline in the number of indexed pages. I was therefore wondering if it would speed things up if I marked the pages with just noindex instead of noindex and nofollow. It would be great if you could share your thoughts on that. Cheers, Jochen
Intermediate & Advanced SEO | | Online-Marketing-Guy0 -
.htaccess newbie
Sorry to ask a really dumb question. I want to sort out a load of old 404 errors. I've exported the list of URLs and I'm more than happy to go through it and work out what needs to go where. After that, my only option at the moment is to use the redirect function in my WordPress install and do all the work manually. There are loads to do, so I want to be able to upload all the redirects at once. I know I need to create an .htaccess file and upload it, and I know where to upload it. This is where I get nervous: I need to get this file right. Is there a really obvious idiot's file which I can use and then save as the correct file type? I've got all the URLs in a CSV at the moment. Sorry for being a bit thick. Hope you can help.
Intermediate & Advanced SEO | | GlobalLingo0 -
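For bulk redirects like the ones described above, the .htaccess file is just a plain text file (named exactly `.htaccess`, with no other extension) containing one `Redirect 301` line per old URL. A minimal sketch with made-up paths — the real ones would come from the exported CSV:

```apache
# Each line: Redirect 301 <old path> <full new URL>
Redirect 301 /old-page.html http://www.example.com/new-page/
Redirect 301 /2011/old-post/ http://www.example.com/blog/new-post/
Redirect 301 /products/widget.php http://www.example.com/shop/widget/
```

A spreadsheet formula can concatenate the CSV columns into lines in this format, which are then pasted into the file and uploaded. Testing a couple of the URLs in a browser before doing the full batch is a cheap safety check.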
Best way to transfer pagerank from one site to another
We currently own two sites (with unique domains) that accomplish a similar goal, but are completely different (so there's no duplicate content, etc) and were developed independently. Both sites have very good pagerank due to great press and inbound links over several years. Also both have thousands of pages and get a lot of inbound deep links. We plan on shutting one of the sites down so we can focus on the other. We'd like to transfer as much traffic and SEO/pagerank value from the one we're shutting down to the one we're continuing to focus on. What's the best way to do that? Should we just do a 301 redirect? Or keep the site running in some diminished form and link it to the site we're focusing on? I saw SEOmoz has a good guide on moving sites http://www.seomoz.org/learn-seo/redirection which recommends a 301 redirect, but I wanted to see if the same applies when merging sites as we are in this case.
Intermediate & Advanced SEO | | 212areacode0 -
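For what it's worth, the whole-site 301 discussed in the guide linked above is usually done with a rewrite in the retired domain's .htaccess, so every deep link maps to the matching path on the surviving site. A sketch, assuming hypothetical domains olddomain.com and newdomain.com:

```apache
RewriteEngine On
# Send every request for the retired domain to the same path on the new one
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```

Mapping paths one-to-one like this (rather than sending everything to the new homepage) is what preserves the value of the inbound deep links.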
Page Titles... question about which is better
Hi, I'm kind of a newbie and I'm working on an e-commerce website. I would love to be able to optimize the site so that the keyword "dog boutique" ranks for the homepage. Because a lot of the pages use PHP to create the meta data, most of the generated page titles look like "Product Name, Category - Moondoggie Dog Boutique". My question is: would it be more helpful to have "Moondoggie Dog Boutique" in the page title only on the page I would like to rank for "dog boutique", and use "Moondoggie Inc." or just "Moondoggie" in its place on all of the other pages? Would this help or make it worse? Thanks! KristyO If you would like to see the site: http://www.moondoggieinc.com
Intermediate & Advanced SEO | | KristyO0 -
Is there a way to contact Google besides the google product forum?
Our traffic from Google has dropped more than 35% and continues to fall. We have been on this forum and Google's webmaster forum trying to get help. We received great advice and have waited months, but instead of our traffic improving, it has worsened. We are being penalized by Google for many keywords such as trophies, trophies and awards, and countless others; we were on page one previously. We filed two reconsideration requests and were told both times that there were no manual penalties. Some of our pages continue to rank well, so it is not across the board (but all of our listings went down a bit). We have made countless changes (please see below). Our busy season was from March to May and we got clobbered. Google, as most people know, is a monopoly when it comes to traffic, so we are getting killed. At first we thought it was Penguin, but it looks like we started getting hit late last year. Lots of unusual things happened: we had a large spike in traffic for two days, then lost our branded keywords, then our main keywords. Our branded keywords came back pretty quickly, but nothing else did. We have received wonderful advice and made most of the changes. We are a very reputable company and have a feeling we are being penalized for something other than spamming. For example, we have a mobile site we added late last year, and a wholesale system was added around the same time. Since the date does not coincide with Penguin, we think there is some major technical driver, but have no idea what to do at this point. The webmasters have all been helpful, but nothing is working. We are trying to find out what one does in a situation like this, as we are trying to avoid closing our business. Thank you!
Changes made:
1. We had many crawl errors, so we reduced them significantly.
2. We had introduced a mobile website in January which we thought may have been the cause (splitting traffic, duplicate content, etc.), so we had our mobile provider add the site to their robots.txt file.
3. We were told by a webmaster that there were too many links from our search provider, so we had them put the search pages in a robots.txt file.
4. We were told that we had too much duplicate content. This was / is true, as we have hundreds of legitimate products that are similar: for example, trophies and certificates that are virtually the same but are for different sports or have different colors and sizes. Still, we added more content and added noindex tags to many products. We compared our percentage of duplicates to competitors' and it is far less.
5. At the recommendation of another webmaster, we changed many pages that might have been splitting traffic.
6. Another webmaster told us that too many people were linking into our site with the same text, namely Trophy Central, and that it might have appeared we were trying to game the system somehow. We have never bought links and don't even have a webmaster, although over the last 10 years we have worked with programmers and SEO companies (but we don't think any have done anything unusual).
7. At the suggestion of another webmaster, we have tried to improve our link profile. For example, we found Yahoo was not linking to our URL.
8. We were told to set up a 404 page, so we did.
9. We were told to ensure that all of the similar domains were pointing to www.trophycentral.com, so we set up redirects.
10. We were told that a site was linking to us from too many places, so we reduced it to one.
Our key pages have A rankings from SEOmoz for the selected keywords. We have made countless other changes recommended by experts but have seen no improvements (actually got worse). I am the president of the company and have made most of the above recent changes myself. Our website is trophycentral.com
Intermediate & Advanced SEO | | trophycentraltrophiesandawards0
Rel Alternate tag and canonical tag implementation question
Hello, I have a question about the correct way to implement the canonical and alternate tags for a site supporting multiple languages and markets. Here's our setup. We have 3 sites, each serving a specific region, and each available in 3 languages. www.example.com: serves the US, default language is English. www.example.ca: serves Canada, default language is English. www.example.com.mx: serves Mexico, default language is Spanish. In addition, each site can be viewed in English, French or Spanish by adding a language-specific subdirectory prefix (/fr, /en, /es). The implementation of the alternate tag is fairly straightforward. For the homepage, on www.example.com, it would be:
<link rel="alternate" hreflang="es-MX" href="http://www.example.com.mx/index.html" />
<link rel="alternate" hreflang="fr-MX" href="http://www.example.com.mx/fr/index.html" />
<link rel="alternate" hreflang="en-MX" href="http://www.example.com.mx/en/index.html" />
<link rel="alternate" hreflang="fr-US" href="http://www.example.com/fr/index.html" />
<link rel="alternate" hreflang="es-US" href="http://www.example.com/es/index.html" />
<link rel="alternate" hreflang="fr-CA" href="http://www.example.ca/fr/index.html" />
<link rel="alternate" hreflang="en-CA" href="http://www.example.ca/index.html" />
<link rel="alternate" hreflang="es-CA" href="http://www.example.ca/es/index.html" />
My question is about the implementation of the canonical tag. Currently, each domain has its own canonical tag, as follows:
<link rel="canonical" href="http://www.example.com/index.html" />
<link rel="canonical" href="http://www.example.ca/index.html" />
<link rel="canonical" href="http://www.example.com.mx/index.html" />
I am now wondering if I should set the canonical tag for all my domains to:
<link rel="canonical" href="http://www.example.com/index.html" />
This is what seems to be suggested in this example from the Google help center: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
What do you think?
Intermediate & Advanced SEO | | Amiee0