When to consolidate and when to bid Link Juice farewell?
-
Greetings all!
I've got a couple of questions about when, if ever, it's alright to let accumulated Link Juice (LJ) slip into oblivion. I arrived four years late to the ticketing website I work for (www.charged.fm) and found it in a certain state of disarray. For the past six months I've been trying to wrap my head around SEO and our 750k+ page site, and lately we've been making good progress cleaning things up and redesigning. I'm at a loss, though, as to what to do with some pages.
Example: For years, the blog director has been using hashtags that each generate a new page, which created many instances of two [bytag] pages for two different hashtags carrying the same article:
http://www.charged.fm/blog/bytag/31631/steve-masiello-usf
http://www.charged.fm/blog/bytag/31632/steve-masiello-south-florida
We've added 'noindex, follow' to this directory (which is the correct solution, right?), but now I'm wondering whether some of these pages should be 301'd to more relevant sections of the site, or back to the blog homepage. I know this could be bad for user experience, but I don't believe they're frequently used pages, and I don't want to let these PA 15 pages go to waste. Any thoughts on this?
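A minimal sketch of the two options being weighed here: 301 a duplicate bytag URL to its canonical twin, or fall back to serving it with 'noindex, follow'. The mapping and the handler shape are hypothetical; a real version would be generated from the blog's tag database:

```python
# Sketch: collapse duplicate /blog/bytag/ pages. The mapping below is
# illustrative -- in practice it would come out of the CMS.
DUPLICATE_TAGS = {
    # duplicate tag slug -> canonical tag slug
    "31632/steve-masiello-south-florida": "31631/steve-masiello-usf",
}

def handle_bytag(slug: str) -> tuple[int, str]:
    """Return (status, payload) for a /blog/bytag/ request.

    Duplicates get a 301 to the canonical tag page; everything else is
    served with a noindex, follow directive so crawlers keep following
    links but drop the page from the index.
    """
    if slug in DUPLICATE_TAGS:
        return 301, f"/blog/bytag/{DUPLICATE_TAGS[slug]}"
    return 200, '<meta name="robots" content="noindex, follow">'
```

This keeps the default safe (noindex, follow) while letting you redirect only the tags worth the effort.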
Example 2: A similar situation is that 302s were used to redirect to search results pages instead of proper category pages. So now there are hundreds, if not thousands, of search results pages with a PA of 15 or more.
http://www.charged.fm/search/results/music-tickets
We're working on restructuring the site and removing the 302s, but I'm wondering if it's necessary to 301 all of the search results pages to the new category pages like so:
http://www.charged.fm/search/results/music-tickets >>> http://www.charged.fm/concert-tickets
This would require the programmer to create new search/results pages in order to 301 the old ranking ones, correct? Should I put this in the queue for him, or just leave the search results pages with 'noindex, follow' and let the PA 15 go to waste?
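If the programmer does take this on, the work reduces to a lookup table from old ranking search URLs to new category pages, rendered as redirect rules. The second mapping entry and the Apache-style output format are illustrative assumptions, not the site's actual configuration:

```python
# Sketch: build 301 rules mapping old ranking search-results URLs to the
# new category pages. The entries are hypothetical examples; a real list
# would come from ranking/PA data (e.g. every search page with PA >= 15).
SEARCH_TO_CATEGORY = {
    "/search/results/music-tickets": "/concert-tickets",
    "/search/results/sports-tickets": "/sports-tickets",
}

def redirect_rules(mapping: dict[str, str]) -> list[str]:
    """Render one Apache mod_alias-style Redirect line per mapped URL."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]
```

Search pages without an entry can simply stay on 'noindex, follow'.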
There are many other instances like this, such as a login page with PA 20, and I just can't decide whether everything should be redirected or what to leave as dust in the wind. Because all we are is dust in the wind ; )
Thanks for any help,
Luke
-
Thanks Jane! That's the affirmation I was looking for. If I might, one more question:
In your opinion, is PA 15 too valuable to leave on a page with no real purpose? Is it relative to the site?
Thanks again,
Luke Thomas -
Hi Luke,
Noindex, follow will work fine for the duplicated tag pages, although you could also consider canonicalising them, or redirecting them to a more useful resource, if you can do this en masse in an automated way (or limit the manual work to tags/topics of high importance).
302ing to the search pages isn't good for two reasons: one, search engines traditionally don't follow or pass PageRank through a 302; two, they prefer that you don't include your own search results pages in their indexes. The "easy" way around this is exactly as you describe: produce quality category pages in place of what was a search results page. You can probably get away with having search pages indexed, and many companies get them to rank, but what Google wants to avoid is the indexation of hundreds or thousands of random search results pages from a website, often with complex query strings that can create an almost infinite number of pages.
Cheers,
Jane
Related Questions
-
Will link juice still be passed if you have the same links in multiple outreach articles?
We are developing high-quality, unique content and sending it out to bloggers for guest posts. In these articles we have links to 2-3 sites. While the links are completely relevant, each article points to the same 2-3 sites. The link text varies slightly from article to article, but the linked-to sites/URLs remain the same. We have read that it is best to have 2-3 external links, not all pointing to the same site. We have followed this rule, but the 2-3 external sites are the same across all the articles. I'm having a hard time explaining this, so I hope it makes sense. My concern is: will Google see this as a pattern so that link juice won't be passed to the linked-to URLs, or worse, penalize some or all of the sites being linked to or linked from? Someone I spoke to suggested that my "link scheme" describes a "link wheel" and the site(s) will be penalized by Penguin. Is there any truth to this?
Intermediate & Advanced SEO | Cutopia
-
GWT app and link emulation
Hi, I have a GWT site - https://www.whatiswhere.com. I have a tab control which emulates the menu. I am planning to put links instead of plain text into the tab labels to create internal links, and to add JavaScript that stops the onclick event - otherwise I will get a new session of the GWT site. What I want is to just change the tab, but at the same time have a link for the crawler. Would this approach work? Would it be equivalent to a nofollow link? Would it improve ranking compared to the 'no link at all' case? Thanks, Andrei.
Intermediate & Advanced SEO | Anazar_2001
-
Link from Google.com
Hi guys, I've just seen a website get a link from Google's rich snippets testing tool. Basically, they've linked to a results page for their own website's test. Here's an example of what this would look like for a result on my website: http://www.google.com/webmasters/tools/richsnippets?q=https%3A%2F%2Fwww.impression.co.uk There's a meta nofollow, but I just wondered what everyone's take is on trust, etc., passing down? (Don't worry, I'm not encouraging people to go out spamming links to results pages!) Looking forward to some interesting responses!
Intermediate & Advanced SEO | tomcraig86
-
Do image "lightbox" photo gallery links on a page count as links and dilute PageRank?
Hi everyone, On my site I have about 1,000 hotel listing pages, each of which uses a lightbox photo gallery that displays 10-50 photos when you click on it. In the code, each photo is wrapped in an "a href", as the photos rotate when you click on them. Going through my Moz analytics, I see that these photos are being counted by Moz as internal links (they point to an image on the site), and Moz suggests that I reduce the number of links on these pages. I also just watched Matt Cutts's new video where he says to disregard the old "100 links max on a page" rule, yet also states that each link divides your PageRank. Do you think this applies to links in an image gallery? We could switch to another viewer that doesn't use "a href" if we think this is really an issue. Is it worth the bother? Thanks.
Intermediate & Advanced SEO | TomNYC
-
Link juice site structure?
If we have a top nav with Contact Us, About Us, Delivery, FAQ, Gallery, How to Order etc., but none of these are pages we want to rank, and then we have the usual left-hand nav, are we wasting juice with the top nav? Would we be better off either removing it and putting those links further down the page, or consolidating them and adding an extra Products tab so the product pages come first?
Intermediate & Advanced SEO | BobAnderson
-
How quickly should you acquire links?
Hi guys, How quickly should you acquire links without getting into trouble with Google? Should you acquire a link a day? Or a link every two days? What should it be? Thanks guys, Gareth
Intermediate & Advanced SEO | GAZ09
-
Google consolidating link juice on duplicate content pages
I've observed some strange findings on a website I am diagnosing, and they have led me to a theory that seems to fly in the face of a lot of accepted thinking. My theory is:
When Google sees several duplicate content pages on a website and decides to show just one version of the page, it at the same time aggregates the link juice pointing to all the duplicate pages, and ranks the one version it shows as if all the link juice pointing at the duplicates were pointing at it. E.g.:
Link X -> Duplicate Page A
Link Y -> Duplicate Page B
Google decides Duplicate Page A is the most important and applies the following formula to decide its rank: Link X + Link Y (minus some dampening factor) -> Page A
I came up with this idea after I seem to have reverse engineered it: the website I was sorting out for a client had this duplicate content issue, so we put unique content on Page A and Page B (and not just one pair like this, but many). Bizarrely, after about a week all the Page A's dropped in rankings, indicating the possibility that the old consolidated link value had been re-associated with the two now-unique pages, so Page A only gets Link Value X. Has anyone got any tests/analysis to support or refute this?
Intermediate & Advanced SEO | James77
-
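The consolidation theory described in the question above can be written as a toy formula. This is purely an illustrative model with a made-up damping constant, not a claim about Google's actual algorithm:

```python
# Toy model of the link-consolidation theory: while pages are treated as
# duplicates, the shown version is credited with the damped sum of link
# value pointing at every duplicate; once the pages become unique, each
# keeps only its own links.
DAMPING = 0.85  # hypothetical stand-in for the "dampening factor"

def page_value(inbound_links: list[float]) -> float:
    """Damped sum of the link value a page is credited with."""
    return DAMPING * sum(inbound_links)

link_x, link_y = 10.0, 6.0
# While A and B are duplicates, shown Page A is credited with both links:
while_duplicated = page_value([link_x, link_y])
# Once A and B have unique content, Page A keeps only Link X:
after_dedup = page_value([link_x])
```

Under this model, de-duplicating the content would indeed drop Page A's value, matching the ranking drop observed in the question.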
Link Architecture - Xenu Link Sleuth Vs Manual Observation Confusion
Hi, I have been asked to complete some SEO contracting work for an e-commerce store. The navigation looked a bit unclean, so I decided to investigate it first.
a) Manual observation: Within the catalogue view, I loaded the page source, hit Ctrl-F and searched for "href" - it turns out there are around 750 links on this page, and most of the other sub-catalogue and product pages also have about 750 links each. Ouch! My SEO knowledge tells me this is non-optimal.
b) Link Sleuth: I crawled the site with Xenu Link Sleuth and found 10,000+ pages. I exported into Open Calc and ran a pivot table to count the number of pages per site level. The results looked like this:
Level | Pages
0 | 1
1 | 42
2 | 860
3 | 3268
Now this looks more like a pyramid. I think this is because Link Sleuth can only read one layer of the nav bar at a time - it doesn't 'hover' and read the rest of the nav bar (unlike searching the page source for "href"). Question: how are search spiders going to read the site - as in (a) or as in (b)? Thank you!
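Whichever set of hrefs a spider can see, it assigns "levels" by breadth-first traversal from the homepage, which is what Xenu's per-level counts reflect. A minimal sketch with a made-up miniature link graph (not the store's actual structure):

```python
from collections import deque

def pages_per_level(links: dict[str, list[str]], start: str) -> dict[int, int]:
    """Count reachable pages at each click depth from `start` (BFS)."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    counts: dict[int, int] = {}
    for d in depth.values():
        counts[d] = counts.get(d, 0) + 1
    return counts
```

If a crawler parses the full page source (case a), the hover-only nav links appear in its graph and flatten the pyramid; if it only sees the rendered top layer (case b), the deeper levels swell instead.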
Intermediate & Advanced SEO | DigitalLeaf