Why isn't my uneven link flow among index pages causing uneven search traffic?
-
I'm working with a site that has millions of pages. The link flow through the index pages is atrocious: for the letter A (for example), the index page A/1.html has a page authority of 25, and authority drops on each subsequent page until A/70.html (the last index page listing pages that start with A) has a page authority of just 1. However, the pages linked from the low-authority index pages (that is, the pages whose second letter falls at the end of the alphabet) get just as much traffic as the pages linked from A/1.html (the pages whose second letter is A or B). The site gets a lot of traffic and has a lot of pages, so this is not just a statistical blip. The evidence is overwhelming that pages linked from the low-authority index pages get just as much traffic as pages linked from the high-authority index pages. Why is this? Should I "fix" the bad link flow if traffic patterns indicate there's no problem? Is it hurting me in some other way? Thanks. (A sketch of the kind of check behind the "just as much traffic" claim follows below.)
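To make "not just a statistical blip" concrete, here's a minimal sketch of the kind of check I mean. The CSV export, file name, and column names are hypothetical stand-ins for whatever your analytics and Moz exports actually produce:

```python
# Minimal sketch: does an index page's authority predict the organic
# traffic of the pages it links to? Assumes a hypothetical CSV export
# "pa_vs_traffic.csv" with columns: index_page, page_authority,
# organic_sessions (sessions summed over the pages each index page links to).
import csv
from statistics import correlation  # Pearson r; requires Python 3.10+

pa, sessions = [], []
with open("pa_vs_traffic.csv", newline="") as f:
    for row in csv.DictReader(f):
        pa.append(float(row["page_authority"]))
        sessions.append(float(row["organic_sessions"]))

r = correlation(pa, sessions)
print(f"Pearson r between index-page PA and organic sessions: {r:.3f}")
```

A correlation near zero across all 70 index pages for a letter would be consistent with what we're seeing.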
-
Thanks Everett, I appreciate it!
-
Hello Gil,
With regard to user-generated profile pages, I typically recommend that clients noindex,follow these until they reach a minimum threshold of completeness (e.g. 75% complete), to avoid filling the index with thin "stub" pages or pages created by spam profiles.
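As a minimal sketch of what that threshold logic might look like (the profile fields, the scoring, and even the 75% cutoff here are illustrative assumptions, not a prescription):

```python
# Sketch: emit a robots meta tag based on profile completeness.
# The Profile fields and the 75% threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Profile:
    name: str = ""
    photo_url: str = ""
    bio: str = ""
    location: str = ""

def completeness(p: Profile) -> float:
    fields = [p.name, p.photo_url, p.bio, p.location]
    return sum(1 for f in fields if f.strip()) / len(fields)

def robots_meta(p: Profile, threshold: float = 0.75) -> str:
    # noindex,follow keeps thin stubs out of the index while still
    # letting crawlers follow the links on them.
    if completeness(p) < threshold:
        return '<meta name="robots" content="noindex,follow">'
    return '<meta name="robots" content="index,follow">'

print(robots_meta(Profile(name="Jane Doe", bio="SEO consultant")))
# -> noindex,follow (only 2 of 4 fields filled in = 50% complete)
```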
If these are local-business-type pages, as in the WhitePages example, the more "supporting content" you customize those pages with, the better. For example, a local business listing page could link to similar businesses in the area, provide star ratings, allow visitors to leave reviews/comments, share demographic data for the area, include links to the business's social profiles, embed videos (commercials, etc.) for the business, and many other things.
I realize these pages might be getting traffic at the moment, but as Google updates the machine learning algo to incorporate feedback from the quality raters, who are now being asked to look at supporting content, your client may find their traffic to those pages (and indeed the site as a whole) slowly declining over the next year or two.
That's about as far as I can take it without seeing the pages. Good luck and I hope we've been of some assistance!
-
Hi Travis,
Thanks for your reply.
As I just wrote to Everett, I can't share too many details for confidentiality reasons. My site is somewhat similar to WhitePages, where http://www.whitepages.com/ind/p-001 has a Moz Page Authority of 45, but http://www.whitepages.com/ind/p-150 has a Moz PA of 1. We have a similar PA distribution among our index pages, but pages linked from the PA 1 index pages get just as much organic search traffic as pages linked from the PA 45 index pages. So I don't know whether my client should spend time fixing the problem.
Thanks
-
Thanks. I can't share too many details for confidentiality reasons. I realize that makes it hard / impossible to diagnose correctly, and I'm sorry about that.
These are person pages. The site's link structure naturally gives more link power to the people with the most connections. We could noindex (or mask links to) pages that don't have much information, but I think such a system would probably be complex and might backfire.
So there's not the kind of taxonomy / directory / long-tail keyword structure that you would expect from a large product directory (for example).
Let's pretend we're discussing WhitePages.com where http://www.whitepages.com/ind/p-001 has a Moz Page Authority of 45, but http://www.whitepages.com/ind/p-150 has a Moz PA of 1. I can fix the problem and get the back pages to have higher PA, but I can't recommend that my client spend resources to fix this since the pages at the back of the index get just as much organic search traffic as the pages at the top.
Thanks
-
As others have stated, we can't really say much with certainty unless we view the site. However, here are my two pennies anyway...
The farther you go down into the directory structure (assuming you have a logical taxonomy and site architecture), the more long-tail and specific the keywords will be. The more long-tail and specific the topic, the less page authority is needed to rank.
With that said, if I were working on a site with millions of pages I'd look into doing a content audit to determine which ones even SHOULD be in the index. Very few sites can scale quality landing pages into the millions. (A rough sketch of a first pass at that kind of audit is below.)
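A first pass at that kind of audit can be surprisingly mechanical. Here's a rough sketch; the input file, column names, and thresholds are all illustrative assumptions, and a real audit would layer in links, rankings, and manual review:

```python
# Rough first-pass content audit at scale. The input file, column names,
# and thresholds are illustrative assumptions, not recommendations.
# Assumes "pages.csv" with columns: url, word_count, organic_sessions_12mo.
import csv

buckets = {"keep": 0, "improve": 0, "noindex-candidate": 0}
with open("pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        words = int(row["word_count"])
        sessions = int(row["organic_sessions_12mo"])
        if sessions > 0:
            buckets["keep"] += 1  # earning organic traffic: leave it indexed
        elif words >= 300:
            buckets["improve"] += 1  # substantial but invisible: improve or link better
        else:
            buckets["noindex-candidate"] += 1  # thin and traffic-less

for bucket, count in buckets.items():
    print(f"{bucket}: {count} pages")
```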
-
You shouldn't expect anyone to solve anything that technical, with any sort of certainty, without stating the actual domain.
If it's getting organic traffic, great. Could it get more? Maybe.
No one can speak with any sort of certainty based upon what you have written at this point.
Apologies if I appear a little cranky. I'm getting tired of all of these "I have a problem with a bajillion possible issues, but I won't tell you what I'm looking at" questions.
That said, you can always PM me; I'm not coming after your client, it's the problem that interests me.
-
Yes, an even distribution of organic search traffic seems to indicate that the pages are indexed and ranking. Gains might be made via external links, but as far as modifying your link flow goes, it doesn't seem like the site needs it based on what you've described.
-
Thanks. Sorry I wasn't clear: when I say the traffic is pretty evenly distributed among the pages, I'm referring specifically to organic traffic. I'm wondering if the relatively even distribution of organic traffic is proof that better balancing the link flow won't increase traffic.
-
If you're speaking in terms of just organic search visits, it doesn't seem to be a problem, but "traffic" in your example is a little broad. There could be paid search being targeted to those pages, some sort of social media mechanism that causes people to visit their specific page, and so on.
A segmented look at your analytics for the site (or site section) will give you a good idea of whether or not the pages have a problem getting organic search traffic. If they don't, I wouldn't worry about link flow. Really, the main reason to adjust it is if you're lacking indexation or rank, and from what you've described so far, you're not.
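For what it's worth, here's a minimal sketch of that segmented look, assuming a hypothetical analytics export named landing_pages.csv with landing_page, channel, and sessions columns, and index URLs shaped like the WhitePages example in this thread:

```python
# Sketch: segment organic sessions by index-page depth to see whether
# "back of the index" pages underperform. Assumes a hypothetical export
# "landing_pages.csv" with columns: landing_page, channel, sessions,
# and index URLs shaped like /ind/p-001 ... /ind/p-150.
import csv
import re
from collections import defaultdict

organic_by_bucket = defaultdict(int)
with open("landing_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["channel"] != "Organic Search":
            continue  # ignore paid, social, referral, etc.
        m = re.search(r"/ind/p-(\d+)", row["landing_page"])
        if not m:
            continue
        page_num = int(m.group(1))
        bucket = (page_num - 1) // 25  # group index pages 25 at a time
        organic_by_bucket[bucket] += int(row["sessions"])

for bucket in sorted(organic_by_bucket):
    lo, hi = bucket * 25 + 1, (bucket + 1) * 25
    print(f"index pages {lo:>3}-{hi:>3}: {organic_by_bucket[bucket]} organic sessions")
```

If organic sessions per bucket stay roughly flat from p-001 through p-150, link flow isn't the bottleneck.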