
The Nature of the "100-link Limit" of Google

Ryan Chooai

This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.


SEOmoz is one of my favorite places to have some SEO fun, because you can always find sparkling ideas about SEO there. Today Dr. Pete wrote a post titled How Many Links Is Too Many? and explained why we shouldn't pay it any extra attention. The post is brilliant, so I just can't help chipping in with my own opinions on the so-called "100-link limit" of Google.

Dr. Pete has already covered "Where Did We Get 100?", "Could You Be Penalized?", "Is 100 Still The Limit?", "So, Does It Still Matter?" and "What's The Right Number?", but I will approach the question from the perspective of Googlebot and explain how everything works.

Does Googlebot identify all the links on a page?

Yes, definitely. When crawling a page, Googlebot identifies all the links on it, because that is simply how search engine spiders are designed to explore the Internet. The way Googlebot sees your web page is almost the same as the way a Lynx browser does. Let's check out the screenshot below to see what my own website looks like to Googlebot:

Lynx view of website

We can see, in the eyes of Googlebot, any page can be divided into two parts:

  1. Page content
  2. Links on the page

All links can be seen, regardless of the number of links.
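To make the "content plus links" split concrete, here is a minimal sketch of how a crawler separates a page into those two parts, using only Python's standard library. The sample HTML and class name are hypothetical; this is an illustration of the idea, not Googlebot's actual parser.

```python
# Separate a page into its two parts: visible text and links.
from html.parser import HTMLParser

class LinkAndTextExtractor(HTMLParser):
    """Collects every href on the page plus the visible text."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        if data.strip():
            self.text_parts.append(data.strip())

html = """<html><body>
  <p>Welcome to my site.</p>
  <a href="/page1">Page 1</a>
  <a href="/page2">Page 2</a>
</body></html>"""

parser = LinkAndTextExtractor()
parser.feed(html)
print(parser.links)       # every link is collected, however many there are
print(parser.text_parts)  # the page content, much as Lynx would render it
```

Notice that nothing in the extractor counts the links; whether a page has 5 links or 500, they all end up in the same list.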

So what does Googlebot do with the links on a page?

When crawling a web page, Googlebot (and other spiders) mainly has two tasks: "reading" the content and "remembering" the links. It doesn't follow those links right away, but puts them together in storage to check later. It does this because on today's Internet, most websites repeat the same elements across many pages, such as the sidebar menu, navigation bar, footer links, etc.

Say one day two bots (Bot A and Bot B) come to a website that has three pages: Page 1, Page 2 and Page 3. Bot A visits Page 1 and Bot B visits Page 2, which share the same navigation bar. If the two bots simply followed the links on their separate pages right after crawling them, they might both visit Page 3. That would be a waste of time and resources for Google. So to avoid a situation like this, Bot A and Bot B meet up before proceeding, pooling the links they have seen and removing all the duplicates.
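The two-bot example boils down to a shared "seen" set and a shared queue. Here is a toy sketch of that "remember now, dedupe, crawl later" idea; the names `seen`, `frontier` and `remember_links` are illustrative, not Google's real infrastructure.

```python
# A shared link store: duplicates reported by different bots collapse to one.
from collections import deque

seen = set()        # every URL any bot has already queued
frontier = deque()  # URLs waiting to be crawled later

def remember_links(links):
    """Each bot reports the links it saw; duplicates are dropped."""
    for url in links:
        if url not in seen:
            seen.add(url)
            frontier.append(url)

# Bot A crawled Page 1, Bot B crawled Page 2; both pages link to Page 3
# through the shared navigation bar.
remember_links(["/page2", "/page3"])   # from Bot A
remember_links(["/page1", "/page3"])   # from Bot B

print(list(frontier))  # "/page3" appears only once
```

This is why the footer links repeated on every page of a site don't cost Google a crawl each time they appear.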

What if there are more than 100 links on a page?

Googlebot doesn't really care how many links there are on a page, because when Googlebot deals with links on a page, it is always a "see, remember, check back later" process.

Googlebot cares about only one thing: pleasing Google by spending fewer resources while finding more valuable content. So when it comes back to the storage where all links are saved, it first chooses the most important links to crawl, then the second most important, then the third, and so on. In other words, it is not that Google ignores links after a certain number; rather, it prioritizes the important links and, to save resources, checks the less important ones later. If some links on a page are of really little value to Googlebot, its return visit can be so delayed that the links feel ignored.
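The "prioritize, don't ignore" behavior described above is essentially a priority queue. Here is a sketch using Python's `heapq`; the URLs and scores are made up, and only the highest-importance-first mechanism is the point.

```python
# Crawl links in order of importance rather than dropping any of them.
import heapq

# heapq pops the smallest item first, so we negate the importance score
# to get highest-importance-first ordering.
queue = []
heapq.heappush(queue, (-0.9, "/products"))       # important section page
heapq.heappush(queue, (-0.1, "/tag/misc?p=42"))  # low-value deep link
heapq.heappush(queue, (-0.5, "/blog/post-1"))    # ordinary content page

crawl_order = [heapq.heappop(queue)[1] for _ in range(len(queue))]
print(crawl_order)  # low-value links are not ignored, just crawled last
```

Nothing here ever drops a link after some count; the low-value link simply waits at the back of the queue, which from the outside can look like being ignored.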

How does Googlebot decide which links to check first?

I believe there are a few factors that can affect Googlebot's evaluation of the importance of links within a page. For example, we all know that Google dislikes deep subdirectories, so deep nesting of subdirectories can make Googlebot put those links at the back of the queue. And URLs that contain suspicious words such as "cgi", "bin" and "admin" may also be discriminated against by Googlebot.
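The two factors above can be sketched as a simple scoring heuristic. The weights, the word list and the function name `link_priority` are all invented for illustration; Google's real signals are of course far richer and undocumented.

```python
# A hypothetical link-importance score penalizing depth and suspicious words.
SUSPICIOUS = {"cgi", "bin", "admin"}

def link_priority(url: str) -> float:
    path = url.split("?")[0].strip("/")
    segments = path.split("/") if path else []
    score = 1.0
    score -= 0.2 * max(0, len(segments) - 1)       # penalize deep nesting
    if any(seg in SUSPICIOUS for seg in segments):
        score -= 0.5                               # penalize suspicious words
    return score

print(link_priority("/about"))          # shallow, clean URL scores highest
print(link_priority("/a/b/c/d/page"))   # deeply nested URL scores lower
print(link_priority("/cgi/bin/form"))   # suspicious segments score lowest
```

A score like this would feed straight into the priority queue from the previous section: every link still gets crawled eventually, just in score order.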

"What's the right number?" doesn't matter; what matters is the order.

Since we know that Googlebot always crawls a page from top to bottom, if certain links are of greater importance to you, you can place them nearer to the top and push those of less value toward the bottom.

Ryan Chooai
SEO Consultant and Hottest Marketer of the Year at Chinese SEO Shifu
