My page says it has 16 errors, need help
-
My page says it has 16 errors, and all of them are due to duplicate content. How do I fix this? I believe it's only due to my meta description tag.
-
Glad to help
Hope it's an easy fix.
-
and apparently in unison!
-
You are very welcome and good luck!
-
HAHA! What can I say other than great minds think alike!
-
Thank you both! I will look into Dr. Pete's guide, pronto!
Cheers!
-
Hey Jake,
SNAP!
Sha
-
Hi Gajendra,
The Pro App identifies two types of duplicate errors (the red button in your Crawl Diagnostics Summary): Duplicate Page Content (a significant amount of content on the page has been identified as duplicate) and Duplicate Page Title (only the page title has been identified as duplicate).
Duplicate Page Title errors are most often an internal issue, where many pages in the site have been given the same page title.
Duplicate Page Content errors can be either an internal and/or external issue. It may be that identical pages within the site are visible via multiple URLs, AND/OR the content on pages may be a duplicate of content on other websites. This happens a lot with sites that use product descriptions, content feeds, etc. from other sites.
To identify the actual URLs where duplicated content has been detected, click the Red error button in the Diagnostics Summary and you will see a list of pages where the error has been identified. In the second column, you will see a blue link which tells you how many duplicates there are for that particular item. When you click the link you will see a full list of URLs which are duplicates.
There are a number of things which can cause duplicate content errors on a site - many are due to the way the site is structured or functions. To really understand what is happening and how to deal with it, you should read Dr. Pete Meyers' landmark post, Duplicate Content in a Post-Panda World.
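As a quick illustration (the right fix depends on what is actually causing your duplicates): when the same page is reachable at more than one URL, the usual remedy is a rel=canonical tag in the head of each duplicate, pointing at the version you want indexed. A minimal sketch, with a placeholder URL:

    <!-- goes in the <head> of every duplicate version of the page;
         the href is a placeholder - swap in your preferred URL -->
    <link rel="canonical" href="http://www.example.com/preferred-page/" />

Dr. Pete's post covers when a canonical tag, a 301 redirect, or a noindex is the better tool for each situation.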
Hope that helps,
Sha
-
My advice would be to dive deep into the campaign you have running for your page and check what is causing the issue. It could be that your URLs aren't normalized, resulting in a copy of the page for each URL variation (www vs. non-www, trailing slashes, tracking parameters, and so on). This can be solved a few different ways, but unfortunately I'm not quite sure which problem is causing the duplicate content from the information you have provided. Dr. Pete has put together a fantastic Duplicate Content Guide I would recommend checking out. He goes over each variation and some great ways to deal with duplicate content issues.
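For instance, if the culprit turned out to be www vs. non-www duplication, a minimal sketch of one fix (assuming an Apache server with mod_rewrite enabled; example.com is a placeholder domain) would be a 301 rule in the site's .htaccess:

    # 301-redirect the bare domain to the www version so each page
    # resolves at a single URL (placeholder domain - use your own)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Other variations (trailing slashes, tracking parameters) have analogous rules, and Dr. Pete's guide goes over when a redirect versus a canonical tag is the better choice.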
Related Questions
-
Link Building - Blogger Outreach - Need Help!
I am looking to create a solid link building campaign for my affiliate blog, but what I want is completely GENUINE and effective link building. Could anyone please suggest what strategies I should opt for? I am planning to get guest blogging done, but is it effective anymore? I have read so much about companies posting on private networks. Are there actually companies out there that genuinely do blogger outreach? It would be helpful if somebody could refer one based on personal experience. Much appreciated!
Technical SEO | scott_eastman
-
Are image pages considered 'thin' content pages?
I am currently doing a site audit. The total number of pages on the website is around 400; 187 of them are image pages, coming up with a 'zero' word count in the Screaming Frog report. I need to know whether they will be considered 'thin' content by search engines. Should I include them as an issue? An answer would be most appreciated.
Technical SEO | MTalhaImtiaz
-
Website SEO Product Pages - Condense Product Pages
We are managing a website that has seen consistently dropping rankings over the last 2 years (http://www.independence-bunting.com/). Our long-term strategy has been purely content-based and of high quality, but it isn't seeing the desired results. It is an ecommerce site that has a lot of pages, most of which are category or product pages. Many of the product pages have duplicate or thin content, which we currently see as one of the primary reasons for the ranking drops.

The website has many individual products which share the same fabric and size options but have different designs, so it is difficult to write valuable content that differs between several products with similar designs. Right now each design has its own product page. We have a dilemma, because our options are:

A. Combine similar designs of a product into one product page where the customer must choose a design, a fabric, and a size before checking out. This way we can have valuable content and don't have to duplicate it on other pages, or try to find more to say about something there really isn't anything else to say about. However, this process will remove between 50% and 70% of the pages on the website. We know the number of indexed pages matters to search engines, and if they suddenly see that half of our pages are gone, we may cause more negative effects, despite the fact that we are aiming to provide more value to the user, not less.

B. Leave the product pages alone and try to write more valuable content for each one, which will be difficult because there really isn't much more to say, or more valuable ways to say it. This is the "safe" option, as it reduces our potential negative impact, but we won't necessarily see much positive trending either.

C. Test solution A on a small percentage of the product categories to see any impact over the next several months, then make sitewide updates to the product pages if we see positive impact, or revert to the old way if we see negative impact.

Any sound advice would be of incredible value at this point, as the work we are doing isn't having the desired effects and rankings are still dropping. Any information would be greatly appreciated. Thank you,
Technical SEO | Ed-iOVA
-
Title errors for pages behind a login
On our website we have content which is located behind a members' login. The SEOMoz crawl report has returned these pages with a "no title" error against them. It appears that these pages are being crawled until the website prompts for a login. I can only presume that the crawler follows the URL but doesn't have an opportunity to crawl the metadata. What is the solution for these pages? A 401, so that the bots know these pages are behind a login? Should we implement anything to ensure "noindex", "nofollow"? I searched the T'interwebs and couldn't find anything conclusive on this issue.
Technical SEO | digitalez
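-
One approach that matches what you describe is to serve those URLs with a noindex header, so the bots leave them out regardless of what they manage to fetch. A minimal sketch, assuming an Apache server with mod_headers (adjust for your actual stack), dropped into an .htaccess file in the members directory:

    # .htaccess for the members-only directory (a sketch, untested):
    # tells crawlers not to index or follow anything served from here
    <IfModule mod_headers.c>
        Header set X-Robots-Tag "noindex, nofollow"
    </IfModule>

Returning a 401 to unauthenticated requests is also a legitimate signal; the main thing is consistency, so the crawler stops recording title-less stub pages.
-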
Photo Attachment Page
Hi! I am using WordPress, and when I click on my photos, they go to an attachment page that shows only the photo. However, in my SEOmoz crawl results, these are showing up as duplicate content pages. Is it worth going back and changing all of my photos so they aren't clickable and therefore do not create a duplicate page? Thanks in advance!
Jodi - familytravelmagazine.com
Technical SEO | JodiFTM
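-
One common fix is to 301 the attachment pages to their parent posts rather than making the photos unclickable. A sketch, assuming your theme uses an image.php or attachment.php template (untested against your setup):

    <?php
    // A sketch: placed at the top of the theme's image.php (or
    // attachment.php), this 301-redirects each attachment page to
    // its parent post instead of rendering a thin, duplicate page.
    global $post;
    if ( $post && $post->post_parent ) {
        wp_safe_redirect( get_permalink( $post->post_parent ), 301 );
        exit;
    }
    // With no parent post, the template falls through and renders normally.

That way existing links to the attachment URLs still resolve somewhere useful.
-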
Splitting Page Authority with two URLs for the same page.
Hello guys, My website currently has two different URLs for the same page, and I am under the impression such a setup is dividing my Page Authority and link juice. The following page is visible at both of the URLs below:

www.wbresearch.com/soldiertechnologyusa/home.aspx
www.wbresearch.com/soldiertechnologyusa/

Analysing the page authority and backlinks, I identified that we are splitting the backlinks (links from sites, social media, and therefore authority):

"/home.aspx" - PA: 67 | Linking Root Domains: 52 | Total Links: 272
"/" - PA: 64 | Linking Root Domains: 29 | Total Links: 128

I am under the impression that if the URLs were the same, we would maximise our backlinks and therefore page authority. My question: how can I fix this? Should I have a 301 redirect from the page "/" to "/home.aspx", therefore passing the authority and link juice of "/" directly to "/home.aspx"? Trying to gather thoughts and ideas on this; suggestions are much appreciated. Thanks!
Technical SEO | JoaoPdaCosta-WBR
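-
A 301 would consolidate the two, and either direction works; just pick the URL you want as the canonical version. A sketch, assuming IIS 7+ with the URL Rewrite module (a guess, given the .aspx pages); this sends home.aspx to the extensionless URL:

    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <!-- a sketch, untested: 301 home.aspx to the directory
                 URL so links consolidate on a single version -->
            <rule name="canonical-home" stopProcessing="true">
              <match url="^soldiertechnologyusa/home\.aspx$" />
              <action type="Redirect" url="soldiertechnologyusa/" redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>
-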
Dynamic page
I have a few pages on my site that are of this nature: /locator/find?radius=60&zip=&state=FL. I read at Google Webmaster Central that they suggest not to change URLs like this. According to Google's blog (link below), they are able to crawl the simplified dynamic URL just fine, and using a simple dynamic URL is even encouraged: "It's much safer to serve us the original dynamic URL and let us handle the problem of detecting and avoiding problematic parameters."

http://googlewebmastercentral.blogspot.com/2008/09/dynamic-urls-vs-static-urls.html

Rewriting can also actually lead to a decrease, as per this line: "We might have problems crawling and ranking your dynamic URLs if you try to make your URLs look static and in the process hide parameters which offer the Googlebot valuable information."

The URLs are already simplified without any extra parameters, which is the recommended structure from Google: "Does that mean I should avoid rewriting dynamic URLs at all? That's our recommendation, unless your rewrites are limited to removing unnecessary parameters, or you are very diligent in removing all parameters that could cause problems."

I would love to get some opinions on this. Please also consider that those pages are not cached by Google for some reason.
Technical SEO | ciznerguy
-
Help needed with robots.txt regarding WordPress!
Here is my robots.txt from Google Webmaster Tools. These are the pages that are being blocked, and I am not sure which of the rules to get rid of in order to unblock my blog posts:

http://ensoplastics.com/theblog/?cat=743
http://ensoplastics.com/theblog/?p=240

These category pages and blog posts are blocked, so do I delete the /? rule? I am new to SEO and web development, so I am not sure why the developer of this robots.txt file would block pages and posts in WordPress. It seems to me that the whole reason someone has a blog is so it can be searched and gain more exposure for SEO purposes. Is there a reason I should block any pages contained in WordPress?

Sitemap: http://www.ensobottles.com/blog/sitemap.xml

User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
Disallow: /page/

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /trackback
Disallow: /comments
Disallow: /feed

Technical SEO | ENSO
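-
For what it's worth: the rule blocking both of your example URLs is Disallow: /*?, which matches any URL containing a query string; Disallow: /? additionally catches query strings on the homepage. Since your posts and category pages live at ?p= and ?cat= URLs, removing those two lines should unblock them. A sketch of the Googlebot block with just those two rules dropped:

    User-agent: Googlebot
    Disallow: /*/trackback
    Disallow: /*/feed
    Disallow: /*/comments
    Disallow: /page/

Test the change against your real URLs with the robots.txt checker in Google Webmaster Tools before deploying, since the rest of the file (and your permalink settings) may interact with it.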