Is Google able to determine duplicate content every day, or only every month?
-
A while ago I talked to somebody who worked in MSN's engineering department a couple of years back. We talked about a recent dip one of our sites experienced. We argued this could be caused by the large amount of duplicate content we have on this particular website (over 80% of the site).
Then he said, and I quote: "Google seems only to be able to determine every couple of months instead of every day if the content is actually duplicate content". I don't doubt that duplicate content is a ranking factor. But I would like to hear your opinions on whether Google is really only able to determine this every couple of months instead of every day.
Have you seen or heard something similar?
-
Sorting out Google's timelines is tricky these days, because they aren't the same for every process and every site. In the early days, the "Google dance" happened about once a month, and that was the whole mess (index, algo updates, etc.). Over time, index updates have gotten a lot faster, and ranking and indexation are more real-time (especially since the "Caffeine" update), but that varies wildly across sites and pages.
I think you also have to separate a couple of different impacts of duplicate content. When it comes to filtering (Google excluding a piece of duplicate content from rankings, but not necessarily penalizing the site), I don't see any evidence that this takes a couple of months. It can take Google days or weeks to re-cache any given page, and to detect a duplicate they would have to re-cache both copies, so realistically that may take a month in some cases. I strongly suspect, though, that the filter itself happens in real time. There's no good way to store a filter for every scenario, and some filters are query-specific. Computationally, some filters almost have to happen on the fly.
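As a side note on what "detecting a duplicate" involves computationally: Google's actual method isn't public, but a classic textbook technique for near-duplicate detection is w-shingling with Jaccard similarity, which is cheap enough to run on the fly. This is purely an illustrative sketch, not Google's implementation:

```python
# Hypothetical illustration -- Google's real duplicate detection is not public.
# w-shingling: break each document into overlapping word n-grams ("shingles")
# and compare the resulting sets with Jaccard similarity. Near-duplicates
# score close to 1.0; unrelated documents score near 0.0.

def shingles(text: str, w: int = 3) -> set:
    """Return the set of w-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: |A intersect B| / |A union B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "The quick brown fox jumps over the lazy dog near the river bank"
copy = "The quick brown fox jumps over the lazy dog near the river bend"
unrelated = "Completely different article about search engine ranking factors"

print(jaccard(shingles(original), shingles(copy)))       # high: near-duplicate
print(jaccard(shingles(original), shingles(unrelated)))  # low: unrelated
```

At web scale this would be done with sketches/fingerprints rather than full set comparisons, but the point stands: once both copies are in the index, comparing them is fast. The slow part is the crawling and re-caching, which is why the lag shows up there.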
On the other hand, you have updates like Panda, where duplicate content can cause something close to a penalty. To the best of our knowledge, Panda data was originally updated outside of the main algorithm, probably about once a month. In the year-plus since Panda 1.0 rolled out, though, that timeline seems to have accelerated. I don't think it's real-time, but it may be closer to every two weeks (that's speculation, I admit).
So, the short answer is "It's complicated." I don't have any evidence to suggest that filtering duplicates takes Google months (and, actually, I have anecdotal evidence that it can happen much faster). It is possible that it could take weeks or months to see the impact of duplicates on some sites and in some situations, though.
-
Hi Donnie,
Thanks for your reply, but I was already aware that Google had/has a sandbox; I should have mentioned that in my question. I'm looking more for an answer about how, and on what basis, Google determines whether pages are duplicates.
I ask because I've seen dozens of cases where our content was indexed, whether or not we linked back to the 'original' source.
Just to be sure, I also want to make clear that in all of these cases the duplicate content was published with the agreement of the original sources.
-
In the past, Google had a sandbox period before any page (content) would rank. However, now everything is instant (just learned this today @seomoz).
If you release something, Google will index it as fast as possible. If that info gets duplicated, Google will only count the first copy indexed. Everyone else loses brownie points unless they trackback/link back to the main article (the first one indexed).