Duplicate Page Content, Indexing and Rel Canonical Just DOUBLED! Need Advice to Fix
-
Last Friday (the Penguin 2.1/Penguin 5 update) my website dropped way off the grid, and I noticed in my Moz Pro campaign dashboard that all of the following just doubled on my website: duplicate page content, Google-indexed pages, and rel canonicals. I also noticed that some of my pages, images, tags, and categories now have /page/2/ or a -2 appended to their URLs. I just set tags to noindex, but left media, pages, posts, and categories set to index. I'm currently using All in One SEO as my SEO plugin. Any advice would be much appreciated, as I'm stuck on this issue.
-
What plugins have you added recently? What settings have you changed in WordPress?
Something is causing a ton more URLs to be generated. I would worry less about the rel canonicals and more about the duplicate content. I'm with Rohit on his advice: you need to noindex a ton of content, including archives, tags, and maybe categories - though I would worry less about categories.
-
Hey Lucas,
I'd suggest you try WordPress SEO by Yoast and noindex archives and tags especially, and, if possible, categories. Once they're noindexed, search engines won't care about the duplicate content - it's the duplicate content in the search engines' indexes that they care about.
You also have to look for the root of the issue: what caused the sudden rise in the number of indexed pages.
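Once archives and tags are set to noindex, it's worth spot-checking that the tag actually shows up in the rendered HTML. A minimal sketch of such a check (this is illustrative Python, not part of any SEO plugin; the `is_noindexed` helper and the sample markup are hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if (attr_map.get("name") or "").lower() == "robots":
                self.directives.append((attr_map.get("content") or "").lower())

def is_noindexed(html):
    """Return True if the page carries a noindex robots directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# A no-indexed archive page typically carries a tag like this:
archive_html = '<head><meta name="robots" content="noindex,follow"></head>'
print(is_noindexed(archive_html))  # True
```

Fetching each archive/tag URL and running its HTML through a check like this confirms the plugin setting actually took effect before waiting on the next crawl.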
Related Questions
-
Large site with content silos - best practice for deep indexing silo content
Thanks in advance for any advice/links/discussion; this honestly might be a scenario where we need to do some A/B testing.
We have a massive (5 million page) content silo that is the basis for our long-tail search strategy. Organic search traffic hits our individual "product" pages, and we've divided our silo with a parent category and then secondarily with a field (so we can cross-link to other content silos using the same parent/field categorizations). We don't anticipate, nor expect, top-level category pages to receive organic traffic - most people search for the individual/specific product (long tail). We're not trying to rank or get traffic for searches of all products in "category X"; others are competing and spending a lot in that (head) area. The intent/purpose of the site structure/taxonomy is to make it easier for bots/crawlers to get deeper into our content silos. We've built the pages for humans, but included link structure/taxonomy to assist crawlers.
So here's my question on best practices: how to handle categories with 1,000+ pages of pagination. In our most popular product categories, there might be hundreds of thousands of products in one category. My top-level hub page for a category looks like www.mysite/categoryA, and the page shows 50 products and then pagination from 1 to 1,000+. Currently we're using rel=next for pagination, and pages like www.mysite/categoryA?page=6 reference themselves as canonical (not the first/top page www.mysite/categoryA). Our goal is deep crawling/indexation of our silo. I use Screaming Frog and the SEOmoz campaign crawl to sample (the site takes a week+ to fully crawl), and with each of these tools it "looks" like crawlers have gotten a bit bogged down in large categories with tons of pagination. For example, rather than crawling multiple categories or fields to reach multiple product pages, some bots will hit all 1,000 (rel=next) pages of a single category.
I don't want to waste crawl budget going through 1,000 pages of a single category versus discovering/crawling more categories, and I can't seem to find a consensus on how to approach the issue. I can't have a page that lists "all" - there's just too much - so we're going to need pagination. I'm not worried about category pagination pages cannibalizing traffic, as I don't expect any (should I make pages 2-1,000 noindex and have them canonically reference the main/first page in the category?). Should I worry about crawlers going deep into the pagination of one category versus getting to more top-level categories? Thanks!
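For reference, the pagination setup described above (self-referencing canonical plus rel=next/prev) would look something like this in the head of a mid-series page; the URLs are the placeholders from the question, not real addresses:

```html
<!-- On www.mysite/categoryA?page=6 (placeholder URL from the question) -->
<link rel="canonical" href="http://www.mysite/categoryA?page=6">
<link rel="prev" href="http://www.mysite/categoryA?page=5">
<link rel="next" href="http://www.mysite/categoryA?page=7">
```

The noindex alternative floated at the end of the question would instead point the canonical at www.mysite/categoryA and add a robots noindex,follow meta tag on pages 2+.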
Moz Pro | DrewProZ
-
Duplicate Page
I just checked the crawl status and got a Duplicate Page Content error, as mentioned below:
Songs.pk | Download free mp3, Hindi Music, Indian Mp3 Songs - http://www.getmp3songspk.com
Songs.pk | Download free mp3, Hindi Music, Indian Mp3 Songs - http://getmp3songspk.com
I then added these lines to my .htaccess file:
RewriteBase /
RewriteCond %{HTTP_HOST} !^www\.getmp3songspk\.com$ [NC]
RewriteRule ^(.*)$ http://www.getmp3songspk.com/$1 [L,R=301]
But I still see that error again when I run a new crawl.
Moz Pro | Getmp3songspk
-
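The intended behavior of those two rewrite lines can be sketched in Python (a hypothetical `canonical_host_url` helper, just to illustrate the mapping, not part of any real stack): every request on a non-www hostname should 301 to the same path on the www host.

```python
def canonical_host_url(host, path):
    """Mirror of the RewriteCond/RewriteRule pair: any request whose host
    is not www.getmp3songspk.com is 301-redirected to the www host."""
    target_host = "www.getmp3songspk.com"
    if host.lower() != target_host:
        return 301, f"http://{target_host}{path}"
    return None, None  # host already canonical; serve the page as-is

status, location = canonical_host_url("getmp3songspk.com", "/songs.html")
# status == 301, location == "http://www.getmp3songspk.com/songs.html"
```

If the live site really does redirect like this, the remaining duplicate-content entries may simply be stale crawl data that clears on a later crawl.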
How long for authority to transfer from an old page to a new page via a 301 redirect? (& Moz PA score update?)
Hi, how long, approximately, does Google take to pass authority via a 301 from an old page to its new replacement page? Does Moz Page Authority reflect this in its score once Google has passed it? All the best
Dan
Moz Pro | Dan-Lawrence
-
On the Link Analysis tab my best pages are 301 and 404 pages.
I looked in my redirect file and found that /* redirects to /v/404.asp. However, if you look below at the link analysis, the 404 page itself is returning a 404 error. The homepage ecowindchimes.com/ is returning a 301 (but I don't know where it redirects to). The third one is also redirected.
1. [No Data] ecowindchimes.com/ ||| 301 ||| 2 ||| 36
2. [No Data] ecowindchimes.com/v/404.asp ||| 404 ||| 2 ||| 34
3. [No Data] ecowindchimes.com/index.html?lang=en-us&target=d2.html ||| 301 ||| 1 ||| 33
So I have two questions: 1) should this be fixed, and 2) how? This is a Volusion site, and I believe the "catchall" redirect was done by them.
Moz Pro | sbetzen
-
Duplicate Page Titles & Content
We have just launched a new version of a website, and after running it through SEOmoz we have over 6,000 duplicate title & content errors. (Awesome.) 😕 We have products that show up multiple times under different URLs; however, we "thought" we had implemented rel=canonical correctly. My question is: do these errors still show up in SEOmoz despite the canonical tags being there, OR, if they were "correct", would we be getting zero errors?
Moz Pro | ZaddleMarketing
-
Inbound Links To Deleted Pages
Hi, I recently deleted some pages from my website and believe there are external inbound links pointing to those pages. I would like to find them and put redirects in place - can anybody tell me how to use SEOmoz to find which external links point to moved/deleted pages? Thanks
Moz Pro | stayin
-
SEOmoz crawling filtered pages
Hi, I just checked an SEO campaign we started last week, so I opened SEOmoz to see the crawl diagnostics. Lots of duplicate content & duplicate titles are showing up, but that's because Rogerbot is crawling all of the filtered pages as well. How do I exclude these pages from being crawled?
/product/brand-x/3969?order=brand&sortorder=ASC
/product/brand-x/3969?order=popular&sortorder=ASC
/product/brand-x/3969?order=popular&sortorder=DESC&page=10
/product/brand-x/3969?order=popular&sortorder=DESC&page=11
Moz Pro | nvs.nim
-
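One common way to keep a crawler out of parameter-filtered URLs like the ones listed in the question above is a robots.txt pattern rule - a sketch, assuming the crawler honors `*` wildcard matching (the `order=` pattern is taken from the URLs above):

```
User-agent: rogerbot
Disallow: /*?order=
```

Note that Disallow only stops crawling; filtered URLs that are already indexed elsewhere may also need a noindex directive to drop out of the index.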
Notice rel canonical
Hi, why do my sites get the crawler notice for rel canonical when using the PRO account crawlers? The canonical is there and it works, and to me it looks just like any other canonical link. The notice appears only on some links but not all - why is that?
Moz Pro | careeron