How can I correct this massive duplicate content problem?
-
I just updated a client's website, which resulted in about 6,000 duplicate page content errors. The way I set up the new site is: I created a subfolder called blog and installed WordPress in that folder. So when you go to suncoastlaw.com you're taken to an HTML website, but if you click the blog link in the nav, you're taken to the blog subfolder. The problem I'm having is that the URLs seem to be repeating themselves. So for example, if you type in
http://suncoastlaw.com/blog/aboutus.htm/aboutus.htm/aboutus.htm/aboutus.htm/
that is somehow a legitimate URL and is being considered duplicate content of http://suncoastlaw.com/aboutus.htm. This repeating URL only seems to be a problem when blog/ is in the URL. Any ideas as to how I can fix this?
-
Anybody got a similar plugin for Joomla?
-
Hi Scott, take a look at this: http://wordpress.org/support/topic/duplicate-url-in-posts. All the best, Richard
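This repeating-segment pattern is often caused by relative links in the theme resolving against an already-bad path. One common stopgap, not mentioned in the thread and assuming the site runs on Apache with mod_rewrite, is an .htaccess rule that 301-redirects any URL whose path continues past the first `.htm`/`.html` segment back to that segment:

```apache
# Sketch, assuming Apache with mod_rewrite enabled.
# If the requested path continues past the first ".htm"/".html" segment
# (e.g. /blog/aboutus.htm/aboutus.htm/), 301-redirect to the path up to
# and including that first segment. The non-greedy .+? keeps the capture
# at the first match rather than the last.
RewriteEngine On
RewriteCond %{REQUEST_URI} ^(.+?\.html?)/. [NC]
RewriteRule ^ %1 [R=301,L]
```

The redirect also tells search engines to consolidate any of the repeated URLs they have already crawled, rather than just blocking them.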
Related Questions
-
Problem with On-page
I have an issue. I have added 5 keywords, but when I go to the "On-Page" tab, they are not there... So I press "Add keyword" and it takes me to another page where I can see all my keywords. Then I go back to "On-Page" and no keywords show up. I want a summary of the weekly crawl for the on-page status of these keywords and it's not showing up 😞 Does anybody know why?
Moz Pro | | theseolab0 -
Will pages marked noindex with a robots meta tag still be crawled and flagged as duplicate page content by SEOmoz?
When we mark a page as noindex with a robots tag like `<meta name="robots" content="noindex" />`, will it still be crawled and marked as duplicate page content? (It is already duplicate content within the site, i.e., two links pointing to the same page.) So we are marking both links as not needing to be indexed by search engines. But after we made this change, the crawl reports show no difference: they still list the duplicates, including the noindex-marked pages. Please help solve this problem.
Moz Pro | | trixmediainc0 -
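When two URLs serve the same page, a rel=canonical tag is often a better signal than noindex, because it consolidates the duplicates onto one URL instead of dropping both from the index. A minimal sketch (the URL here is a hypothetical placeholder, not one from this thread):

```html
<!-- Placed in the <head> of each duplicate URL; the href points at the
     one address you want search engines to treat as authoritative.
     example.com/page/ is a made-up address for illustration. -->
<link rel="canonical" href="http://www.example.com/page/" />
```

Note that third-party crawlers may still report the duplicate until they honor the hint, since canonical is a suggestion rather than a directive.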
How can I set up a campaign to track just directories on a specific subdomain?
I am trying to set up a campaign to track a specific subdomain and all its directories. For example, I want to track example.abc.com/11111 and example.abc.com/22222 and so on. I have no interest in tracking abc.com itself. Is this possible?
Moz Pro | | BalihooSearch1 -
Competitive Analysis Problem
I just discovered that my competitive analysis is set to judge my naked domain instead of www. My rel="canonical" points to www, and thus www has a much higher domain authority than my naked domain. However, I don't want to lose all of the history I have accumulated over the years for this domain. How do I make SEOmoz match against www instead of my naked domain? Which is wiser to gauge by?
Moz Pro | | bks_seo0 -
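For the underlying www/naked-domain split itself, a common approach (a sketch, assuming Apache with mod_rewrite; the asker did not describe their server) is a site-wide 301 from the naked domain to www, so links and authority consolidate on one host:

```apache
# Sketch, assuming Apache with mod_rewrite: redirect any request whose
# Host header does not start with "www." to the www equivalent,
# preserving the originally requested path and query string.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

This keeps the redirect consistent with the existing rel="canonical" tags pointing at www.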
Duplicate content pages
Crawl Diagnostics Summary shows around 15,000 duplicate content errors for one of my projects. It shows the list of pages and how many duplicate pages there are for each one, but I don't have a way of seeing the duplicate page URLs for a specific page without clicking each page link and checking manually, which is going to take forever to sort. When I export the list as CSV, the duplicate_page_content column doesn't show any data. Can anyone please advise? Thanks
Moz Pro | | nam2
Crawl Diagnostics returning duplicate content based on session id
I'm just starting to dig into Crawl Diagnostics and it is returning quite a few errors. Primarily, the crawl is indicating duplicate content (page titles, meta tags, etc.) because of a session id in the URL. I have set up a URL parameter in Google Webmaster Tools to help Google recognize the existence of this session id. Is there any way to tell the SEOmoz spider the same thing? I'd like to get rid of these errors since I've already handled them for the most part.
Moz Pro | | csingsaas0 -
Can you please guide me how to use Blogscape for Social Media Monitoring?
Please provide me with complete details on how to use Blogscape for social media monitoring. Can you also provide a video tutorial, if any, for the same? Awaiting your reply at the earliest. Regards, Prashakth Kamath
Moz Pro | | 1prashakth0 -
Broken Links and Duplicate Content Errors?
Hello everybody, I'm new to SEOmoz and I have a few quick questions regarding my error reports: 1. In the past, I have used IIS as a tool to uncover broken links, and it revealed a large number of "broken links" of varying types on our sites. For example, some were links on my site to external sites that were no longer available; others were missing images in my CSS and JS files. According to my campaign in SEOmoz, however, my site has zero broken links (4XX). Can anyone tell me why the IIS errors don't show up in my SEOmoz report, and which of these two reports I should really be concerned about (for SEO purposes)? 2. Also in the "errors" section, I have many duplicate page title and duplicate page content errors. Many of these "duplicate" content reports are actually showing the same page more than once. For example, the report says that "http://www.cylc.org/" has the same content as "http://www.cylc.org/index.cfm", and that, of course, is because they are the same page. What is the best practice for handling these duplicate errors? Can anyone recommend an easy fix?
Moz Pro | | EnvisionEMI0
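For the / vs. /index.cfm case in question 2, one common fix (a sketch, assuming the server runs Apache with mod_rewrite; a ColdFusion site on IIS would need the equivalent IIS URL Rewrite rule instead) is to 301-redirect explicit requests for the index document back to the directory root, so only one URL for the home page is ever crawled:

```apache
# Sketch, assuming Apache: when a visitor explicitly requests /index.cfm,
# 301-redirect to "/". Matching on THE_REQUEST (the raw request line)
# avoids a redirect loop when "/" is internally served by index.cfm
# via DirectoryIndex.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/index\.cfm[\s?] [NC]
RewriteRule ^index\.cfm$ / [R=301,L]
```

After the redirect is in place, the duplicate page title and content errors for the two URLs should clear on the next crawl.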