Why isn't my uneven link flow among index pages causing uneven search traffic?
-
I'm working with a site that has millions of pages. The link flow through the index pages is atrocious: for the letter A (for example), the index page A/1.html has a page authority of 25, and authority drops with each subsequent page until A/70.html (the last index page listing pages that start with A) has a page authority of just 1. However, the pages linked from the low-authority index pages (that is, the pages whose second letter falls at the end of the alphabet) get just as much traffic as the pages linked from A/1.html (the pages whose second letter is A or B). The site gets enough traffic, across enough pages, that this isn't just a statistical blip: the pages reached from the low-authority index pages are clearly getting as much traffic as those reached from the high-authority ones. Why is this? Should I "fix" the bad link-flow problem if traffic patterns indicate there's no problem? Is it hurting me in some other way? Thanks
-
Thanks Everett, I appreciate it!
-
Hello Gil,
With regard to user-generated profile pages, I typically recommend that clients noindex,follow these until they reach a minimum threshold of completeness (e.g., 75% complete), to avoid filling the index with thin "stub" pages or spam profiles.
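That threshold check is easy to automate at render time. Here's a minimal sketch in Python; the tracked field names and the 75% cutoff are illustrative assumptions, not anything prescribed:

```python
# Decide the robots meta directive for a user-generated profile page.
# PROFILE_FIELDS and the 0.75 threshold are hypothetical examples.

PROFILE_FIELDS = ["name", "photo", "bio", "location", "website", "phone", "email", "hours"]

def completeness(profile: dict) -> float:
    """Fraction of tracked fields the user has actually filled in."""
    filled = sum(1 for f in PROFILE_FIELDS if profile.get(f))
    return filled / len(PROFILE_FIELDS)

def robots_directive(profile: dict, threshold: float = 0.75) -> str:
    """Index only once the profile clears the completeness threshold;
    keep 'follow' either way so link equity still flows through stubs."""
    if completeness(profile) >= threshold:
        return "index,follow"
    return "noindex,follow"

stub = {"name": "Acme Plumbing", "phone": "555-0100"}
full = {f: "filled" for f in PROFILE_FIELDS}
print(robots_directive(stub))  # noindex,follow
print(robots_directive(full))  # index,follow
```

The resulting value would go into the page's robots meta tag; keeping "follow" on the stubs means link equity continues to flow through them even while they sit out of the index.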
If these are local-business-type pages, as in the White Pages example, the more "supporting content" you customize those pages with, the better. For example, a local business listing page could link to similar businesses in the area, provide star ratings, let visitors leave reviews/comments, share demographic data for the area, link to the business's social profiles, embed videos (commercials, etc.) for the business, and much more.
I realize these pages might be getting traffic at the moment, but as Google updates the machine learning algo to incorporate feedback from the quality raters, who are now being asked to look at supporting content, your client may find their traffic to those pages (and indeed the site as a whole) slowly declining over the next year or two.
That's about as far as I can take it without seeing the pages. Good luck and I hope we've been of some assistance!
-
Hi Travis,
Thanks for your reply.
As I just wrote to Everett, I can't share too many details for confidentiality reasons. My site is somewhat similar to WhitePages, where http://www.whitepages.com/ind/p-001 has a Moz Page Authority of 45 but http://www.whitepages.com/ind/p-150 has a Moz PA of 1. We have a similar PA distribution among our index pages, yet the pages linked from the PA 1 index pages get just as much organic search traffic as the pages linked from the PA 45 index pages. So I don't know whether my client should spend time fixing the problem.
Thanks
-
Thanks. I can't share too many details for confidentiality reasons. I realize that makes it hard / impossible to diagnose correctly, and I'm sorry about that.
These are person pages. The site's link structure naturally gives more link power to the people with the most connections. We could NoIndex (or mask links to) pages that don't have much information but I think such a system would probably be complex and may backfire.
So there's not the kind of taxonomy / directory / long-tail keyword structure that you would expect from a large product directory (for example).
Let's pretend we're discussing WhitePages.com, where http://www.whitepages.com/ind/p-001 has a Moz Page Authority of 45 but http://www.whitepages.com/ind/p-150 has a Moz PA of 1. I could fix the problem and raise the PA of the back pages, but I can't recommend that my client spend resources on it, since the pages at the back of the index get just as much organic search traffic as the pages at the top.
Thanks
-
As others have stated, we can't really say much with certainty unless we view the site. However, here are my two pennies anyway...
The farther you go down into the directory structure (assuming you have a logical taxonomy and site architecture) the more long-tail and specific the keywords will be. The more long-tail and specific the topic, the less page authority is needed to rank.
With that said, if I were working on a site with millions of pages, I'd look into doing a content audit to determine which ones even SHOULD be in the index. Very few sites can scale quality landing pages into the millions.
-
You shouldn't expect anyone to solve anything that technical, with any sort of certainty, without stating the actual domain.
If it's getting organic traffic, great. Could it get more? Maybe.
No one can speak with any sort of certainty based upon what you have written at this point.
Apologies if I appear a little cranky. I'm getting tired of all of these "I have a problem with a bajillion possible issues, but I won't tell you what I'm looking at" questions.
You can always PM me; I'm not coming after your client. It's the problem that interests me.
-
Yes, an even distribution of organic search traffic seems to indicate that the pages are indexed and ranking. Gains might be made via external links, but as far as modifying your link flow goes, it doesn't seem like the site needs it based on what you've described.
-
Thanks. Sorry I wasn't clear: when I say the traffic is pretty evenly distributed among the pages, I'm referring specifically to organic traffic. I'm wondering whether the relatively even distribution of organic traffic is proof that better balancing the link flow won't increase traffic.
-
If you're speaking in terms of just organic search visits, it doesn't seem to be a problem, but "traffic" in your example is a little broad. There could be paid search being targeted to those pages, or some sort of social media mechanism that causes people to visit their specific page, and so on.
A segmented look at your analytics for the site (or site section) will give you a good idea of whether or not the pages have a problem getting organic search traffic. If they don't I wouldn't worry about link flow. Really the main reason to adjust it is if you're lacking indexation or rank, and so far from what you've described you're not.
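To make that segmentation concrete, here's a rough pandas sketch over a hypothetical analytics export; the page names and session counts below are invented purely for illustration:

```python
import pandas as pd

# Hypothetical analytics export: one row per landing page per channel.
df = pd.DataFrame({
    "page":     ["A/1.html", "A/1.html", "A/70.html", "A/70.html"],
    "channel":  ["Organic Search", "Paid Search", "Organic Search", "Social"],
    "sessions": [900, 300, 870, 250],
})

# Keep only the organic search segment, then total sessions per index page.
organic = df[df["channel"] == "Organic Search"]
by_page = organic.groupby("page")["sessions"].sum()
print(by_page)  # both the high- and low-PA pages show comparable organic sessions
```

If the per-page organic numbers stay roughly flat across the PA gradient, that supports leaving the link flow alone.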