What's the best way to tackle duplicate pages in a blog?
-
We installed a WP blog on a website, and the URLs below are just one example. All of them lead to the same content. What's the best way to resolve it?
http://www.calmu.edu/blog/
http://www.calmu.edu/blog/calmu-business-spotlight-veev/
http://www.calmu.edu/blog/category/business-buzz/
Thanks.
-
No, almost all blog platforms have this kind of problem; Joomla does too, but with Joomla I'm more accustomed to solving it. On Joomla I use YOOtheme ZOO, which is a great blog tool, and for SEO I use AceSEF, which integrates with ZOO, so I can configure away the duplicate problem easily in the AceSEF ZOO extension.
With WP you will find a lot of SEO plugins that will solve this :D
-
Thanks Joao. Are you saying blogs on the Joomla platform do not have this problem?
-
Do you think it would work? What are the negative aspects of using a redirect?
-
I did see this page but I was unsure of how to guide my developer on putting canonical links. Can you help?
-
Thanks a lot @ParagonDigital. This issue was detected after the SEOMoz crawl of the website and they are listing all these pages as duplicate pages.
-
Hi Sangeeta,
It looks like you are using the WordPress All-in-One SEO Pack, which should take care of most duplicate pages in categories and archives using canonicalization.
Both of these pages:
http://www.calmu.edu/blog/calmu-business-spotlight-veev/
http://www.calmu.edu/blog/category/business-buzz/
have this line added by the SEO plugin, which tells the bots that the page in the blog directory is the real page and to treat the category and archive versions as the same page:
<link rel="canonical" href="http://www.calmu.edu/blog/calmu-business-spotlight-veev/" />
The home page of a blog is always going to have some duplication from the recent posts. The only fix I know of for that would be to have WordPress display a blurb for each post with a read-more link, instead of the whole post, on the home page.
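In template terms, that usually means calling `the_excerpt()` instead of `the_content()` in the theme's home-page loop. A rough sketch, assuming a standard theme layout (file names and markup vary by theme):

```php
<?php
// In the theme's index.php (or home.php) loop:
if (have_posts()) :
    while (have_posts()) : the_post();
        the_title('<h2>', '</h2>');
        the_excerpt(); // prints a short blurb instead of the full post body
    endwhile;
endif;
```

Alternatively, inserting a `<!--more-->` tag into each post truncates `the_content()` output on the home page without any template edits.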
-
I'd make sure you have canonical links too. Google has a page dedicated to them here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
-
Usually, creating a 301 redirect in your .htaccess file is an effective way of dealing with duplicate content. You could use a free redirect generator if you're not too familiar with writing .htaccess rules.
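For example, a 301 rule in .htaccess might look like this (a sketch with a hypothetical duplicate URL; adjust the paths to your real ones):

```apache
# Permanently redirect a duplicate URL to the preferred version
Redirect 301 /blog/category/business-buzz/ http://www.calmu.edu/blog/
```

One caveat: a redirect removes the duplicate page entirely for visitors, whereas a canonical tag keeps it browsable while still consolidating ranking signals on the preferred URL.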
-
I think this plugin solves your problem. I don't use WordPress a lot, I prefer Joomla, but I've heard this recommended for WP:
http://wordpress.org/extend/plugins/platinum-seo-pack/
You can also use a script like this in header.php to output the tags yourself (the markup inside the echoes was lost in this post; presumably a self-referencing canonical link on the main pages and a noindex meta tag everywhere else):
if ((is_home() && ($paged < 2)) || is_single() || is_page() || is_category()) {
    echo '<link rel="canonical" href="http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'] . '" />';
} else {
    echo '<meta name="robots" content="noindex,follow" />';
}
I would prefer to use the plugin, though, because it can handle other on-page SEO things as well.