Can you regain any SERP rankings / link juice from links that have 404'd?
-
We have a client whose 301 redirects disappeared and have been gone for about 6 months now. We are going to be putting the 301 redirects back in place.
Will we be able to regain any of the previous SERP positions or link juice from the old links, or is it all lost?
Thanks in advance!
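
For context, restoring the redirects simply means reinstating the old 301 rules. A minimal Apache .htaccess sketch (the paths are hypothetical placeholders, not the client's actual URLs):

# Reinstate the old 301s (mod_alias); paths below are hypothetical.
Redirect 301 /old-page https://www.example.com/new-page
Redirect 301 /old-section/old-article https://www.example.com/new-section/new-article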
-
Awesome, thank you for sharing your experience; that's a great data point.
-
In my experience, yes, you can regain value.
I redirected an old microsite with external links (it had been abandoned a few years prior) to the relevant page on my current, active site, and that page quickly grew to become one of the strongest pages.
[There was a previous thread on this here on Moz if you want to read more.]
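
Once the redirects are back in place, it's worth confirming each old URL really returns a 301 rather than a 302 or 404. A quick sketch, assuming Node 18+ (built-in fetch) run as an .mjs module; the URL list is hypothetical:

// Confirm restored redirects return 301s, without following them.
const oldUrls = [
  'https://example.com/old-page',       // hypothetical -- use the old URLs that earned links
  'https://example.com/old-microsite/',
];
for (const url of oldUrls) {
  const res = await fetch(url, { redirect: 'manual' }); // inspect the status, don't follow
  console.log(res.status, url, '->', res.headers.get('location'));
}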
Related Questions
-
Spam pages / content created due to hack. 404 cleanup.
A hosting company's server was hacked, and one of our customer's sites was injected with 7,000+ pages of fake, bogus promotional content. The server was patched and the spammy content removed from it. Reviewing Google Webmaster Tools, we see all the hacked pages showing up as 404s and a severe drop in impressions, rank and traffic. GWT also reports 'Some manual actions apply to specific pages, sections, or links'... What do you recommend for:

- Cleaning up the 404s to the spammy pages? (I am not sure a redirect to the home page is the right thing to do; is it?)
- Cleaning up the links that were created off-site to the spam pages
- Getting rank back

What would you do in addition to the above?
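
On the 404 question: rather than redirecting the spam URLs to the home page, a common suggestion is to let them 404, or to return 410 Gone explicitly so the removal reads as permanent. A sketch in Apache mod_rewrite, assuming the injected pages shared a recognizable URL pattern (the pattern below is hypothetical):

# Return 410 Gone for the injected spam URLs.
# 'promo-spam/' is a hypothetical pattern; match whatever the hack actually used.
RewriteEngine On
RewriteRule ^promo-spam/ - [G]

Either status code will eventually drop the URLs from the index; a 410 just signals more strongly that they are gone for good.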
Technical SEO | GreenStone
-
Manual Action - When requesting links be removed, how important to Google is the address you're sending the requests from?
We're starting a campaign to get a bunch of links removed and then submitting a disavow report to Google, to get rid of a manual action. My SEO vendor says he needs an email address on the domain of the site in question (@travelexinsurance.com) to send and receive emails from webmasters; he says Google won't consider correspondence to and from webmasters if it's sent from a domain other than the one with the manual action penalty. Due to company/compliance rules, I can't give a vendor outside our building an email address like that. I've seen other people mention they just used a GMAIL.com account, or we could use a similar domain such as @travelexinsurancefyi.com. My question: how critical is it that the correspondence with the webmasters come from the exact domain of the penalized website?
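
For reference, the disavow file itself is plain text: one URL per line, a 'domain:' prefix to disavow an entire domain, and '#' for comments. A sketch with hypothetical entries:

# Hypothetical disavow file; domains and notes are placeholders.
# Removal requests sent to these webmasters; no response received.
domain:spammy-link-farm.example
http://another-site.example/paid-links-page.html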
Technical SEO | Patrick_G
-
Does a GTLD extension 'count' as part of the target keyword?
Hopefully someone can shed some light on this for me. Reading about GTLDs, I came across this quote from TSO Host: 'What we don’t know is whether an extension can double up as a keyword, which is picked up by Google and treated identically to the rest of a domain name. I.e. - would ‘bristolguitars.music’ have more ranking potential than ‘bristolguitars.com’ as ‘music’ is a relevant search word?' Source: https://www.tsohost.com/blog/how-do-new-gtlds-affect-seo

Does anyone know whether a GTLD extension does double up as a keyword? For example, if Nike buys 'Nike.shoes', does this double as the keyword 'Nike shoes', or do Google and other search engines just look at the domain name before the GTLD extension? I'm looking at .photography as an example (not my niche) and seeing folks have mixed results ranking for 'Keyword + Photography', so I'd be keen to hear your thoughts.
Technical SEO | ecommercebc
-
Transferring link juice on a page with over 150 links
I'm building a resource section that will hopefully attract a lot of external links, but the problem is that the main index page will carry a large number of links: around 150 internal links in total, 120 pointing to resource sub-pages and 30 site navigation links. This will dilute the link juice passed through the page and possibly waste some of it. Each of those 120 sub-pages will in turn contain about 50-100 external links plus the 30 internal navigation links.

To better visualise the matter, think of this resource as a collection of hundreds of blogs, categorised by domain across those 120 sub-pages, with the index page listing every category.

The question is how to build the primary page (the one with 150 links) so it passes the most link juice to the site. Or is this OK as it is and I shouldn't be worried about it? (I know there used to be a rough limit of 100 links per page.) Any ideas? Many thanks
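
As a rough back-of-the-envelope model, the classic PageRank formulation splits a page's passable value evenly across its outlinks. A sketch (the numbers are purely illustrative; real ranking is far more nuanced):

// Naive even-split model of link value per outlink.
var damping = 0.85;   // classic PageRank damping factor
var pageValue = 1.0;  // hypothetical value of the index page
[30, 100, 150].forEach(function (outlinks) {
  var perLink = (pageValue * damping) / outlinks;
  console.log(outlinks + ' links -> ~' + perLink.toFixed(4) + ' per link');
});

In this naive model, going from 100 to 150 links cuts each link's share by a third, which is why consolidating or tiering an index page can help.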
Technical SEO | flo2
-
Why am I not showing up in the SERPs or Google Local?
I have been trying to optimise the following site for both the Google SERPs and Google Local: Pixel Primate. The URL has been around for around 3 years now, but the site was updated and relaunched in December 2012. I did the on-page optimisation early in January 2013, and Google seems to have indexed the changes, for the home page at least. One major keyword I am targeting for the home page is 'Web Design Leicester'.

I understand that the DA is fairly low (24), so this is something I need to improve. However, I've seen positive results fairly quickly from on-page optimisation alone on other sites I have worked on. This site just doesn't seem to be ranking at all for any keywords. Maybe the industry is just extremely competitive, but I find it very strange to not be visible anywhere in the SERPs. The site does not seem to have any penalties, as it ranks for 'Pixel Primate' and all pages appear when doing a site: search.

What's also strange is that I set up the Google Local listing years ago, but it doesn't appear anywhere in the local listings, not even when I search for it manually. Any suggestions would be appreciated.
Technical SEO | CWseo
-
Outlinks not linking with profitok.com/index.php
Dear sir, my website www.profitok.com is not being indexed in Google. Can you give me the right answer on what to do?
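
Before anything else, it's worth ruling out the basics: a blocking robots.txt or a noindex directive. A first-pass sketch, assuming Node 18+ (built-in fetch) run as an .mjs module:

// First-pass indexation checks: robots.txt contents and X-Robots-Tag header.
const site = 'http://www.profitok.com';
const robots = await fetch(site + '/robots.txt');
console.log('robots.txt:', robots.ok ? await robots.text() : '(none found)');
const home = await fetch(site);
console.log('X-Robots-Tag:', home.headers.get('x-robots-tag')); // 'noindex' here would block indexing

Also check the page source for a robots meta tag set to noindex, and run a site:profitok.com search to see what Google currently has.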
Technical SEO | Luckykhullar
-
How Can I Block Archive Pages in Blogger when I am not using the classic/default template
Hi, I am trying to block all the archive pages of my blog, as Google is indexing them, and this could lead to a duplicate content issue. I am not using the default Blogger theme or the classic theme, so I cannot use the code that works there. Please suggest how I can instruct Google not to index the archive pages of my blog. Looking for a quick response.
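
For Layout (non-classic) themes, one commonly cited approach is a conditional robots meta tag in the <head> of the theme's XML. This is a sketch to verify against your own template (back it up first), not a guaranteed fix:

<b:if cond='data:blog.pageType == "archive"'>
  <meta content='noindex,follow' name='robots'/>
</b:if>

This asks Google not to index archive pages while still following the links on them.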
Technical SEO | SoftzSolutions
-
Follow-up from http://www.seomoz.org/qa/discuss/52837/google-analytics
Ben, I have a follow-up question from our previous discussion at http://www.seomoz.org/qa/discuss/52837/google-analytics. To summarize, to implement what we need, we need to do three things:

1) Add GA code to the Darden page:

_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.darden.virginia.edu']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);

2) Change links on the Darden page (e.g., on http://www.darden.virginia.edu/web/MBA-for-Executives/) from:

<a href="https://darden-admissions.symplicity.com/applicant">Apply Now</a>

to:

<a href="https://darden-admissions.symplicity.com/applicant" onclick="_gaq.push(['_link', 'https://darden-admissions.symplicity.com/applicant']); return false;">Apply Now</a>

3) Have Symplicity add this code:

_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.symplicity.com']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);

Our CMS does not allow the user to add an onclick attribute to the link, so we CANNOT implement part 2). What will be the result if we have only 1) and 3) implemented? Will the data still be fed to GA account 'UA-12345-1'? If not, how can we get cross-domain tracking if we cannot change the link code? Nick
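
One possible workaround for the CMS restriction is to attach the _link call with a script rather than an inline onclick attribute. This is a sketch under the assumption that a script block can be added to the Darden page template; it is not Google's documented method:

// Bind the cross-domain _link call at runtime, since the CMS
// won't allow inline onclick attributes in the markup.
document.addEventListener('click', function (event) {
  var el = event.target;
  while (el && el.tagName !== 'A') { el = el.parentNode; } // walk up to the enclosing link
  if (el && el.hostname === 'darden-admissions.symplicity.com') {
    event.preventDefault();
    _gaq.push(['_link', el.href]); // ga.js appends the linker parameters, then navigates
  }
});

Functionally this does what part 2) does: _gaq.push(['_link', ...]) is what carries the visitor's cookies across domains. With only 1) and 3) in place, pageviews on both sites will still be recorded under UA-12345-1, but the Symplicity visit will typically start a new session (often attributed as a referral) rather than being tied to the original visitor.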
Technical SEO | Darden