Second rebranding, what's the best approach?
-
Our client rebranded in 2007, and it worked very successfully from an SEO perspective: they put page-to-page 301 redirects in place, and the new website quickly replaced the old one in the SERPs at similar positions.
The market has changed and they now need to rebrand again, so they are moving to a third domain.
In 2007 they redirected DomainA to DomainB, and now they are moving to DomainC.
DomainA has existed since 1996, so the majority of the link profile still points to DomainA and passes through it via 301 to DomainB.
Is the best approach:
1. To simply redirect DomainB to DomainC, letting the DomainA links pass through a second set of 301 redirects?
2. Or to change the redirects on DomainA to point directly to DomainC (the theory being that each 301 dilutes a little of a link's value, so removing a hop could be better)?
Option 2. Redirect chains should always be avoided; every redirect should go straight to the final destination URL.
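As a sketch of what option 2 looks like in practice, assuming the old domains are served by Apache (the domain names here are hypothetical placeholders), both retired domains can 301 in a single hop to the final destination, preserving the requested path so the page-to-page mapping survives:

```apache
# Hypothetical virtual hosts for the two retired domains.
# Both redirect in one hop to the final domain, so DomainA's
# links no longer pass through a DomainA -> DomainB -> DomainC chain.

<VirtualHost *:80>
    ServerName domain-a.example
    # Redirect permanent matches the path prefix and appends the
    # remainder, so /old-page goes to https://domain-c.example/old-page.
    Redirect permanent / https://domain-c.example/
</VirtualHost>

<VirtualHost *:80>
    ServerName domain-b.example
    Redirect permanent / https://domain-c.example/
</VirtualHost>
```

If the old URL paths don't map one-to-one onto DomainC's structure, individual `Redirect permanent /old-path https://domain-c.example/new-path` lines (or RedirectMatch rules) would be needed instead of the blanket prefix redirect.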
-
Option 2, but also try to get some of the backlinks that point to DomainA or DomainB updated to point directly to DomainC if you can.
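The "remove the hop" logic in option 2 can be sketched as a small helper (a hypothetical illustration, not an actual Moz tool): given a map of source URL to redirect target, collapse every chain so each source points straight at its final destination.

```python
def flatten_redirects(redirects):
    """Collapse redirect chains so every source maps directly to its
    final destination: A -> B -> C becomes A -> C and B -> C."""
    flat = {}
    for source in redirects:
        seen = {source}
        target = redirects[source]
        # Follow the chain until the target is not itself redirected,
        # guarding against redirect loops along the way.
        while target in redirects:
            if target in seen:
                raise ValueError(f"redirect loop at {target}")
            seen.add(target)
            target = redirects[target]
        flat[source] = target
    return flat

# Hypothetical example mirroring the question's DomainA -> DomainB -> DomainC chain:
chains = {
    "http://domain-a.example/page": "http://domain-b.example/page",
    "http://domain-b.example/page": "https://domain-c.example/page",
}
print(flatten_redirects(chains))
```

After flattening, both old URLs 301 directly to the DomainC URL in a single hop, which is exactly what both answers above recommend.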
Related Questions
-
GSC Performance completely dropped off, but Google Analytics is steady. Why can't GSC track my site anymore?
Hey everyone! I'm having a weird issue that I've never experienced before. For one of my clients, GSC has a complete drop-off in the Performance section. All of the data shows that everything fell flat, or almost completely flat. But in Google Analytics, we have steady results. No huge drop-off in traffic, etc. Do any of you know why GSC would all of a sudden be unable to crawl our site? Or track this data? Let me know what you think!
Thanks!
Algorithm Updates | TaylorAtVelox
Meta robots on every page rather than robots.txt for blocking crawlers? How will pages get indexed if we block crawlers?
Hi all, The suggestion to use the meta robots tag rather than a robots.txt file is to make sure pages do not get indexed if their hyperlinks are available anywhere on the internet. I don't understand how the pages would be indexed if the entire site is blocked. Even though links to the pages are available, will Google really index those pages? One of our sites has been blocked by its robots.txt file, and although internal links to it have been available on the internet for years, the pages have not been indexed. So technically a robots.txt file is quite enough, right? Please clarify and guide me if I'm wrong. Thanks
Algorithm Updates | vtmoz
What does it mean to build a 'good' website?
Hi guys. I've heard a lot of SEO professionals, Google (and Rand in a couple of Whiteboard Fridays) say it's really important to build a 'good' website if you want to rank well. What does this mean in more practical terms? (Context: I've found some sites rank much better than they 'should' based on the competition. However, when I built my own site (well optimised on-page, based on thorough keyword research), it was nowhere to be found (not even top 50, after I'd 'matched' the backlink profile of others on page 1). I can only put this down to 'good quality website' signals lacking in the latter example. I'm not a web developer, so the website was a pretty basic WordPress site.)
Algorithm Updates | isaac663
Is Having Content 'Above the Fold' Still Relevant for Website Design and SEO?
Hey there, So I have a client who recently 're-skinned' their website, and now there is little to no content above the fold. Likewise, I've noticed that since the transition to this new front-end design there has been a drop in rankings for a number of keywords related to one of the topics we are targeting. Is there any correlation here? Is having content 'above the fold' still a relevant factor in a website's rankings? I appreciate you reading and look forward to hearing from all of you. Have a great day!
Algorithm Updates | maxcarnage
Best practice for cleaning up multiple Google Places listings and multiple Google accounts when logins were lost.
We are an inbound marketing agency; most of our clients do not rely on local SEO. I have a pretty good understanding of it when starting fresh, but not so much when joining a "movie in progress" kind of scenario. Recently we've brought on two clients who have had their websites in place for a while and have made small attempts at marketing themselves online over the years. This has resulted in multiple Google Places listings, variations of the company names (one of them changed its name), and worry that there are more accounts out there they aren't aware of (Analytics and others, from well-intentioned employees and past service providers, with no internal leadership at the company level). Reading the Google help forums, I see some people recently having their accounts suspended when they try to clean things up; in one case a person set up a new Google account thinking he would start fresh, and in trying to claim listings, get rid of duplicates, etc., his account was suspended. What is the CURRENT recommended course of action in situations like these? With all the changes going on at Google, I don't know which route to take, and I have combed the internet reading articles about this (including Google's resources). I would like some current, real-world advice.
Algorithm Updates | rhgraves65
Does changing a website's design reduce traffic? I don't think so.
Hi, Around November I was working on the website and, for various reasons, had to change its design. I saw my traffic going down and down (70 to 100/day), so I rolled back to the previous design. After that it improved a little, but not to its previous level (250 to 300/day). Question: all URLs, content, and links are the same, so how can the design change affect traffic? We have removed all the errors shown in the SEOmoz report, but traffic is still an issue. We are working hard on SEO and trying to recover, so your suggestions on how I can overcome this would be helpful. I am looking forward to your answers. Thanks, Regards
Algorithm Updates | lucidsoftech
Difference between Google's link: operator and GWT's "links to your site"
I haven't used Google's link: operator for a while, and I noticed that there is a big disparity between the link: operator and GWT's "links to your site". I compared these results on a number of websites, my own and competitors', and the difference seems to be the same across the board. Has Google made a recent change to how they display link results via the operator? Could this be an indication that they are cleaning out backlinks?
Algorithm Updates | tdawson09
No-follow tags on links in the footer...do it or don't do it?
With some of the great reports SEOmoz has provided, I've been able to start taking the correct steps towards fixing crawl issues, on-page issues, etc. One of my websites allows a customer to drill down to their specific state and then their city to apply for an auto loan. The SEOmoz reports told me these pages specifically had too many links. One way to remedy this would be to add "nofollow" tags to the links in the footer as well as the links to the cities. Am I steering myself in the right or wrong direction? Should I be approaching this problem from a different perspective? Any help is greatly appreciated!
Algorithm Updates | fergseo