Content Audit Questions
-
Hi Mozzers
Having worked on my company's site for a couple of months now correcting many issues, I'm now ready to begin a content review. Many areas of the site contain duplicate content; the main causes are:
1. Category Page Duplications
e.g.
Widget page contains "Blue Widget Extract"
Widget page contains "Red Widget Extract"
Blue Widget page contains the same "Blue Widget Extract"
Red Widget page contains the same "Red Widget Extract"
2. Product Descriptions
Item 1 (identical to Item 2 with the exception of a few words and technical specs)
Item 2
This is causing almost all the content on the site to be devalued. While I've cleared all Moz errors and warnings, I'm certain this is devaluing most of the website.
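Before rewriting, it can help to measure just how close those near-identical product descriptions really are. A rough sketch using word-shingle Jaccard similarity is below; the two descriptions are made-up examples, not the asker's real content, and any "duplicate" threshold you pick is a judgment call:

```python
# Rough sketch: estimate how much two product descriptions overlap
# using word-shingle Jaccard similarity (1.0 = identical shingle sets).

def shingles(text, n=3):
    """Return the set of n-word shingles in a lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b, n=3):
    """Jaccard similarity of two texts' shingle sets, in [0.0, 1.0]."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical near-duplicate descriptions differing only in a few specs
item1 = "Durable blue widget made from steel, 10cm wide, ships in 24 hours"
item2 = "Durable red widget made from steel, 12cm wide, ships in 24 hours"

score = jaccard(item1, item2)
print(f"similarity: {score:.2f}")  # a high score flags copy worth rewriting first
```

Running this across all product-description pairs gives a prioritised list of which pages need unique copy most urgently.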
I was hoping you could answer these questions so I know what to expect once I have made the changes.
- Will the pages that had duplicate content recover once they possess unique content, or should I expect a hard and slow climb back?
- The website has never received any warnings from Google. Does this mean recovery from issues like duplicate content will be quicker?
- Several pages rank on page 1 for fairly competitive keywords despite having duplicate and keyword-spammy content. What are the chances of shooting myself in the foot by editing this content?
- I know I will have to wait for Google to crawl the pages before I see the changes reflected, but how long after Google has crawled a page should I have a realistic idea of how positive the changes were?
As always, thanks for your time!
-
Hi there
1. Yes, they can come back if you create unique content for that page that takes on-site factors into consideration. How fast the climb is really depends on the industry and the queries you are trying to rank for.
2. No. Just because you have never had a penalty before doesn't mean the climb out of a penalty would be easier than for a site that has had multiple. The best course of action is to get valuable, unique content up as quickly (but efficiently) as you can and avoid penalties at all costs.
3. I would honestly run through the content audit process that Moz has published. It's a step-by-step process that will help you distinguish which content needs to be removed, updated, or consolidated. Remember, Google provides resources and steps to take when you have duplicate content.
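As a sketch of the consolidation Google's duplicate-content guidance describes: where two pages must keep overlapping copy, a rel="canonical" link in the head of the duplicate points the engine at the version you want indexed. The URLs below are placeholders, not the asker's real site:

```html
<!-- Placed in the <head> of the near-duplicate page -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widget/" />
```

For pages that get genuinely unique content instead, no canonical is needed; the tag is only for duplicates you intend to keep.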
4. SEO Theory has a great post about this that covers multiple areas of SEO, from meta tags to content to penalty recovery. I suggest you read it, as it's more comprehensive than anything anyone could write here. Marie Haynes also wrote a great post on traffic performance after a penalty if you want to read into that as well (not saying you have one! It's just useful information).
Hope this helps! Good luck!
Related Questions
-
HTTPS - implementation question
Hello, I am looking at a site on which they haven't 301'd http to https, so each URL resolves whether you request http or https at the beginning. Why would a site owner not 301 to https? Is there any logical reason not to use a 301? This particular website is simply using a canonical tag to point to the https version of each URL.
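For reference, the missing redirect would look something like the sketch below. nginx is shown purely as an assumption, since the question doesn't say what server the site runs on; the canonical tag alone leaves both protocol versions crawlable, while a 301 consolidates them at the server level:

```nginx
# 301-redirect every plain-HTTP request to its HTTPS equivalent
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```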
Intermediate & Advanced SEO | McTaggart
-
Duplicate Content Question With New Domain
Hey Everyone, I hope your day is going well. I have a question regarding duplicate content. Let's say that we have Website A and Website B. Website A is a directory for multiple stores & brands. Website B is a new domain that will satisfy the delivery niche for these multiple stores & brands (where they can click on a "Delivery" anchor on Website A and it'll redirect them to Website B). We want Website B to rank organically when someone types "[brand] delivery" into Google. Website B has NOT been created yet. The issue: Website B has to be a separate domain from Website A (no getting around this). Website B will also pull all of the content from Website A (menus, reviews, about, etc.). Will we face any duplicate content issues on either Website A or Website B in the future? Should we rel=canonical to the main website even though we want Website B to rank organically?
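One caution worth illustrating here: a cross-domain canonical from Website B back to Website A, sketched below with placeholder domains, tells Google to consolidate ranking signals onto Website A, which works directly against the goal of having Website B rank organically:

```html
<!-- On Website B's brand page; this consolidates signals to Website A -->
<link rel="canonical" href="https://website-a.example/brands/example-brand/" />
```

If Website B must rank on its own, it needs meaningfully unique content rather than a canonical pointing away from it.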
Intermediate & Advanced SEO | imjonny
-
Possible duplicate content issue
Hi, Here is a rather detailed overview of our problem; any feedback/suggestions are most welcome. We currently have 6 sites targeting the various markets (countries) we operate in. All websites are on one WordPress install but are separate sites in a multisite network; content and structure are pretty much the same barring a few regional differences. The UK site has held a pretty strong position in search engines for the past few years. Here is where we have the problem. Our strongest page (from an organic point of view) has dropped off the search results completely for Google.co.uk. We picked this up through a drop in search visibility in SEMrush, and confirmed it by looking at our organic landing page traffic in Google Analytics and Search Analytics in Search Console. Here are a few of the things we've checked and assumptions we've made:
- Crawl or technical issues: nothing serious found
- Bad backlinks: no new spammy backlinks
- Geotargeting: this was fine for the UK site; however, the US site, a .com (not a ccTLD), was not set to the US (we suspect this to be the issue, but more below)
- On-site issues: nothing wrong here. The page was edited recently, which coincided with the drop in traffic (more below), but these changes did not impact things such as title, h1, URL, or body content; we replaced some call-to-action blocks from a custom one to one built into the framework (div)
- Manual or algorithmic penalties: nothing reported by Search Console
- HTTPS change: we did transition over to HTTPS at the start of June. The sites are not too big (around 6K pages) and all redirects were put in place.
Here is what we suspect has happened: the HTTPS change triggered Google to re-crawl and reindex the whole site (we anticipated this). During this process, an edit was made to the key page, and through some technical fault the page title was changed to match the US version of the page. Because geotargeting was not turned on for the US site, Google filtered out the duplicate-content page on the UK site, thereby dropping it off the index. What further supports this theory is that a search on Google.co.uk returns the US version of the page. With country targeting on (i.e. only return pages from the UK), the UK version of the page is not returned. Also, a site: query from Google.co.uk DOES return the UK version of that page, but with the old US title. All these factors lead me to believe that it's a duplicate-content filter issue due to incorrect geotargeting. What does surprise me is that the .co.uk site has much more search equity than the US site, so it was odd that Google chose to filter out the UK version of the page. What we have done to counter this is as follows:
- Turned on geotargeting for the US site
- Ensured that the title of the UK page says UK and not US
- Edited both pages to trigger a last-modified date, so the two pages share fewer similarities
- Recreated a sitemap and resubmitted it to Google
- Re-crawled and requested a re-index of the whole site
- Fixed a few of the smaller issues
If our theory is right and our actions do help, I believe it's now a waiting game for Google to re-crawl and reindex. Unfortunately, Search Console is still only showing data from a few days ago, so it's hard to tell if there have been any changes in the index. I am happy to wait it out, but you can appreciate that some of the senior management are very nervous given the impact of losing this page and are keen to get a second opinion on the matter. Does the Moz Community have any further ideas or insights on how we can speed up the indexing of the site? Kind regards, Jason
Intermediate & Advanced SEO | Clickmetrics
-
Question about Indexing of /?limit=all
Hi, I've got your SEO Suite Ultimate installed on my site (www.customlogocases.com). I've got a relatively new Magento site (around 1 year old). We have recently been doing some PR/SEO for the category pages, for example /custom-ipad-cases/. But when I search on Google, it seems that Google has indexed /custom-ipad-cases/?limit=all. This /?limit=all page is one without any links, and only has a PA of 1, whereas the standard /custom-ipad-cases/ without the query string has a much higher PA of 20 and a couple of links pointing towards it. So I would want that page to be the one Google indexes, and by the same logic it really should be able to achieve higher rankings than the /?limit=all page. Is my thinking here correct? Should I disallow all the /? URLs now, even though these are the ones that are indexed and the others currently are not? I'd be happy to take the hit while Google figures it out, because the higher-PA pages are what I am ultimately getting links to... Thoughts?
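If the goal is simply to stop Google crawling the parameter variants, a robots.txt rule along the lines of the sketch below would do it (the pattern is an assumption; check it against the site's actual URL structure). A rel="canonical" from the ?limit=all pages to the clean category URL is usually the safer first step, though, because robots.txt blocking also hides any canonical hint on those pages:

```
# robots.txt sketch: block crawling of "show all" parameter variants
User-agent: *
Disallow: /*?limit=
```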
Intermediate & Advanced SEO | RobAus
-
I have 2 Questions
1. What if we do the interlinking on exact-match keywords? Does this count as a spam technique? For example, on http://blog.payscout.com/automotive-merchant-services/ I interlink the exact keyword. 2. Can we use the same image 2-3 times on the same website with different alt tags? For example, http://packforcity.com/what-to-wear-in-new-orleans-in-january/ and http://packforcity.com/what-to-wear-in-san-francisco-in-october/ use the same image with a different alt tag on each page.
Intermediate & Advanced SEO | AlexanderWhite
-
Client Question
Client question: how long will it take for this keyword to rank? Is there any tool or calculation to estimate the time for a particular keyword?
Intermediate & Advanced SEO | marknorman
-
Content question about 3 sites targeted at 3 different countries
I am new here, and this is my first question. I was hoping to get help with the following scenario: I am looking to launch 3 sites in 3 different countries, using 3 different domains. For example, the .com for the USA, the .co.uk for the UK, and a slightly different .com for Australia, as I could not purchase the .com.au since I am not a registered business in Australia. I am looking to set the geographic target in Google Webmaster Tools. So, for example, I have set the .com to the USA only; with the .co.uk I won't need to set anything; and I will set the other Australian .com to Australia. Now, initially the 3 sites will be "brochure" websites explaining the service that we offer. I fear that at the beginning they will most likely have almost identical content. However, in the long term I am looking to publish unique content for each site, almost on a weekly basis, so over time they would have different content from each other. These are small sites to begin with, so each site in its "brochure" form will have around 10 pages; over time each will have hundreds of pages. My question, or my worry, is: will Google look negatively at the fact that I have the same content across 3 sites, even though they are specifically targeted to different countries? Will it penalise my sites?
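For country-targeted sites with near-identical content, hreflang annotations are the standard way to tell Google the pages are regional alternates rather than duplicates. A sketch is below; the domains are placeholders standing in for the asker's actual three sites:

```html
<!-- Repeated in the <head> of each country's version of the same page -->
<link rel="alternate" hreflang="en-us" href="https://example.com/" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/" />
<link rel="alternate" hreflang="en-au" href="https://example-au.com/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Note that every page in the set must list all the alternates, including itself, or the annotations are ignored.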
Intermediate & Advanced SEO | ryanetc
-
Diagnosing duplicate content issues
We recently made some updates to our site, one of which involved launching a bunch of new pages. Shortly afterwards we saw a significant drop in organic traffic. Some of the new pages list similar content to what previously existed on our site, but in different orders. So our question is: what's the best way to diagnose whether this was the cause of our ranking drop? My current thought is to block the new directories via robots.txt for a couple of days and see if traffic improves. Is this a good approach? Any other suggestions?
Intermediate & Advanced SEO | jamesti