Pagination and links per page issue.
-
Hi all,
I have a listings based website that just doesn't seem to want to pass rank to the inner pages.
See here for an example:
http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK
I know that there are far too many links on this page and I am working on reducing the number by altering my grid classes to output fewer links.
The page also displays a number of links to other page numbers for these results. My script adds the string " - Page2" to the end of the title, description and URL when the user clicks on page two of these results.
My question is:
Would an excessive amount (200+) of links on a page result in less PR being passed to this page (i.e. looking spammy)?
And would using rel canonical on page numbers greater than 1 result in better trust/ranking?
Thanks in advance.
-
I believe 100 is the recommended limit for links on a page, and yes, the more links there are, the less PR is passed to each linked page.
But 100 links on the home page means you can have 100 child pages with 100 links on each, putting 10,000 pages only two clicks from the home page.
As for rel=canonical: is page 2 unique? Then yes, treat it just as you would any other page.
I assume you are aware of flat link structure; if not, I think this page, though old, is a must-read:
http://www.webworkshop.net/pagerank.html
It's a long read, but very informative.
-
Rel canonical doesn't tell engines not to index the page (the meta robots noindex tag does that); rather, it tells the engine to index the canonical URL in place of the current page of your paged results. This helps reduce duplicate content penalties and consolidates your PA onto a single URL. I could be wrong, but I imagine you would also prefer organic users to land on the first page of results rather than, say, page 6.
The result is that engines will still crawl your pages and find your listings (and index those), but index only the first page of your paginated results.
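To illustrate the difference between the two approaches, here is a minimal sketch of the head tags involved. The URLs are the ones from this thread; the exact page-2 URL pattern is an assumption, so adapt it to your own pager:

```html
<!-- Option A: rel=canonical on page 2, pointing at page 1.
     Page 2 is still crawled, but indexing signals consolidate on page 1. -->
<link rel="canonical" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK" />

<!-- Option B: meta robots noindex,follow on page 2.
     Page 2 is kept out of the index, but its links to listings are still followed. -->
<meta name="robots" content="noindex, follow" />
```

You would use one or the other, not both, depending on whether you want to consolidate signals (canonical) or simply keep the pager pages out of the index (noindex,follow).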
I hope that helps to clarify things!
Andrew
-
I'm sorry, what I meant was that you should make the pagination pages self-canonical, i.e. page 2's canonical tag points to page 2 itself...
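A sketch of what that looks like for page 2, assuming a hypothetical "-Page2" URL pattern like the one described earlier in the thread:

```html
<head>
  <title>Hotels For Sale in the UK - Page2</title>
  <!-- Self-referencing canonical: page 2 declares itself the canonical version,
       so its listings remain independently indexable -->
  <link rel="canonical" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK-Page2" />
  <!-- Optionally, rel=prev/next hints to describe the pager sequence -->
  <link rel="prev" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK" />
  <link rel="next" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK-Page3" />
</head>
```

With self-canonicals, every paginated page stays crawlable and indexable, which matters if older listings only appear on deep pages.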
-
But wouldn't rel canonical make listings on page 2 and above unindexable?
Listings are added chronologically on my site, and I still want crawlers to be able to reach adverts created years ago. These listings could be on page 100. Surely rel canonical tells engines not to index the page, as it is not the canonical version?
-
200 links on a page isn't that bad. Once you get to 250+, I would rethink the architecture.
Yes, you should use rel canonical on your pagination pages.
A good way to pass ranking between deep pages like this is to have a section at the bottom that offers similar listings in the area. This way you are giving the bots multiple ways to find each listing, rather than just from one page/category. Do it like this - http://www.estatesgazette.com/propertylink/advert/kensingtonrooms_hotel-_131_137_cromwell_road_london_sw7_4du-3264453.htm. They have a "More Properties from this Advertiser" section.