Should I noindex user-created fundraising pages?
-
Hello Moz community!
I work for a nonprofit where users can create their own fundraising pages on the website, where supporters can donate directly. Some are rarely used; others are updated frequently by their hosts. There are likely a ton of these on our site. Moz's crawl says we have ~54K pages, and when I do a "site:[url]" search on Google, 90% of the first 100 results are fundraising pages.
These pages aren't controlled by our staff, but I'm wondering whether adding a meta noindex to them could have a big effect on our SEO rankings. Has anyone tried anything similar, or know whether this strategy could have legs for our site?
My only concern is that users might no longer be able to find their fundraising page via the Google CSE implemented on our website.
Any insight you fine folks could provide would be greatly appreciated!
-
I'd tread very carefully here, as thing 1 and thing 2 seem to contradict each other at face value. You're right that Google can send traffic to a site in ways other than keywords, but it's not the norm. The next thing I'd look at is: how are we tracking keyword rankings? Is it an online, cloud-based rank tracker that relies on you specifying all of (and all of the right) keywords to track? Most of those trackers follow between 50 and 300 keywords (daily or weekly), but it's not uncommon for such sites to have 10,000+ keywords contributing traffic. If they're not all in there, you're looking at a bad sample. Connect Google Search Console to Google Analytics, let it run for a few weeks, then analyse the 'search query' data from within Google Analytics (which can be done once it's all hooked up). GSC usually only lets you export 1K keywords (sometimes more), but GA will give you 5K, and that's much better for your analysis. You might be surprised to find those pages rank for more keywords than you thought: maybe hundreds of little ones instead of a few big ones.
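Once that search-query data is exported, a quick one-off script can show how the long tail breaks down. A minimal sketch in Python, assuming a hypothetical export of (query, landing page, clicks) rows; the field names and values here are illustrative, not the actual GSC or GA export format:

```python
# Hypothetical rows from a search-query export:
# (query, landing_page, clicks) — values are made up for illustration.
rows = [
    ("charity run donate", "/fundraising/jane-smith", 3),
    ("jane smith fundraiser", "/fundraising/jane-smith", 12),
    ("nonprofit 5k 2019", "/events/5k", 40),
    ("bob jones marathon page", "/fundraising/bob-jones", 5),
    ("donate to local shelter", "/fundraising/bob-jones", 2),
]

# Count how many distinct queries (and how many clicks) the
# fundraising pages actually capture vs. the site as a whole.
fundraising = [r for r in rows if r[1].startswith("/fundraising/")]
print("distinct fundraising queries:", len({q for q, _, _ in fundraising}))
print("fundraising clicks:", sum(c for _, _, c in fundraising))
print("total clicks:", sum(c for _, _, c in rows))
```

Run over a real export, a tally like this makes the "hundreds of little keywords" pattern visible at a glance before you decide anything about noindexing.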
-
Effectdigital is right in looking at your analytics and backlinks to help make this decision.
In the Moz case study we referenced earlier, they were removing pages that provided no value to anyone, and those pages probably didn't have any links pointing to them either, so it made sense to get rid of them.
Since your pages are providing value (it seems) and you're getting a third of your traffic through them, we would tread carefully on meta noindexing them.
You might consider meta noindexing only the group of them that hasn't brought in any traffic this whole year and that doesn't have any links pointing to them. That way, you won't lose any existing traffic you're getting, but you can see whether the trimming helps your site's overall traffic and rankings.
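That "no traffic and no links" filter is easy to script. A minimal sketch, assuming hypothetical per-page metrics pulled from your analytics and a link index; the URLs, field names, and numbers are all illustrative:

```python
# Hypothetical per-page metrics: year-to-date sessions from analytics,
# external inbound links from a link index export.
pages = {
    "/fundraising/jane-smith": {"sessions_ytd": 340, "ext_links": 2},
    "/fundraising/bob-jones": {"sessions_ytd": 0, "ext_links": 1},
    "/fundraising/old-gala-2015": {"sessions_ytd": 0, "ext_links": 0},
    "/fundraising/team-walkathon": {"sessions_ytd": 0, "ext_links": 0},
}

# Only pages with zero traffic AND zero inbound links make the
# candidate list for a meta noindex test group.
candidates = sorted(
    url for url, m in pages.items()
    if m["sessions_ytd"] == 0 and m["ext_links"] == 0
)
print(candidates)
```

Note that a page with links but no traffic (like the second entry above) is deliberately excluded: those links may still pass value even if no one lands there.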
-
Appreciate the word of caution; I'm relatively new and am looking for well-rounded opinions about the repercussions such a massive move could have for our site. In response:
Thing #1: We don't have many fundraising pages that rank highly for keywords, as we're still working on improving our regular site pages' SERP performance. I was mainly wondering whether the glut of fundraising pages could be harming our SERP results. Some certainly have duplicate content, but that's beyond our control, and I'm not sure whether that could be significantly harming our results. Any thoughts on that?
Thing #2: Great call on checking the data. Year to date, nearly a third of our user sessions have landed on one of these fundraising pages. I'm guessing that's either hosts using Google to find their page and then log in, or friends searching for it on Google and then navigating over to donate. We do still have a Google Custom Search Engine on our site; presumably people could find them that way?
If you have any additional opinions or feedback given what I detailed above, I'd very much appreciate it!
-
Be VERY careful
Thing #1) Just because you stop Google indexing and crawling some pages doesn't mean it will give that same traffic (the keywords connecting with those pages) to other URLs on your site. Google may decide that your other URLs do not satisfy the specific keywords connecting with the fundraiser URLs.
Thing #2) CHECK. Go into Google Analytics and actually check what percentage of your Google traffic (and overall traffic, I guess) comes specifically through these URLs. If it's like 2-3%, no big deal. If most of your traffic lands on these pages, noindexing them all could be the single largest mistake you ever make.
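That percentage check can be done with a one-off script over a landing-page export rather than eyeballing reports. A minimal sketch, assuming a hypothetical list of (landing page, organic sessions) rows from analytics; all values are made up for illustration:

```python
# Hypothetical (landing_page, organic_sessions) rows from an
# analytics landing-page export.
landings = [
    ("/", 5000),
    ("/about", 800),
    ("/fundraising/jane-smith", 1500),
    ("/fundraising/bob-jones", 700),
    ("/events/5k", 1000),
]

total = sum(s for _, s in landings)
fundraising = sum(s for p, s in landings if p.startswith("/fundraising/"))
share = 100 * fundraising / total
print(f"{share:.1f}% of organic sessions land on fundraising pages")
```

If the real number comes out anywhere near the one-third the poster mentions, that's the strongest possible argument against a blanket noindex.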
Blog posts and articles are fun, but they're no substitute for checking your own real, actual, factual data. Always, always do that.
-
Thanks! I've been wondering about this for a while and actually stumbled upon this very article today, which prompted the question.
-
Britney Muller, at Moz, did just that when she meta noindexed over 70,000 low-quality profile pages created by users. As a result, Moz saw an increase in organic users of almost 9% the following month, and then a 13.7% year-over-year lift in organic traffic.
You can read all about it or watch the interview about it here: https://www.getcredo.com/britney-muller/
We think it's worth a try for sure.