Which Algorithm Change Hurt the Site? A causation/correlation issue
-
The attached graph is from Google Analytics: about 14 months of organic Google visits overlaid with algorithm changes (update data from Moz, naturally).
Is there any way to tell from this which update affected the site? For example, #1 or #2 seems to be responsible for the first dip, #4 seems to fix it, and it broke again around #6. Or is the rise between #4 and #7 an anomaly, and did #1 or #2 actually cause a slide from its release all the way until #7 was released?
Sorry if the graph is a little cloak-and-dagger; that's partly because we don't have permission to reveal much about the site's identity, and partly because we were attempting a kind of double-blind, separating the data from our own biases.
We can say, though, that the difference between the level at the start and the end of the graph is at least 10,000 visits per day.
-
It's really tough (and even inadvisable) to try to pin a traffic change to an algorithm update based solely on spikes in a graph. On rare occasions it's pretty clear (Penguin is a good example, I've found), but in most cases there's a lot of gray area, and the graph leaves out a mountain of data.
The big issues I see here are seasonality and not knowing what happened to the site and the business. For example, you can look at #6 and #7 and call these dips, but that ignores the spike that preceded them. Is the dip the anomaly, or is the spike the anomaly? What drove up traffic between #4 and #6? Maybe that driver simply stopped, was a one-time event, or was seasonal.
Why was there volatility between #7 and #14, and then relative stability after #14? You could call #14 a "drop," but without knowing the timeline, it's hard to see how the curve might smooth over different windows. What it looks like is a period of highly volatile events followed by an evening out.
Without knowing the industry, the business, and the history, and without segmenting this data, making claims based purely on dips and spikes in the graph is pretty dangerous, IMO. In theory, this could have virtually nothing to do with the algorithm.
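One way to make the "dips vs. spikes" question less subjective is to quantify the shift around each update date instead of eyeballing the curve. Here's a minimal sketch in Python with pandas, assuming you have a daily visits series and a list of update dates; all dates and visit numbers below are made up for illustration, and a real analysis would still need the seasonality and segmentation caveats above.

```python
# Sketch: compare mean daily organic visits in a window before vs. after
# each algorithm update date. Illustrative only -- toy data, hypothetical dates.
import pandas as pd

def shift_around_updates(visits, update_dates, window_days=14):
    """For each update date, return mean visits in the `window_days` before
    and after it, plus the percentage change between the two windows."""
    rows = []
    for date in update_dates:
        date = pd.Timestamp(date)
        before = visits.loc[date - pd.Timedelta(days=window_days):
                            date - pd.Timedelta(days=1)]
        after = visits.loc[date:date + pd.Timedelta(days=window_days - 1)]
        if before.empty or after.empty:
            continue  # update falls too close to the edge of the data
        pct = (after.mean() - before.mean()) / before.mean() * 100
        rows.append({"update": date.date(), "before": before.mean(),
                     "after": after.mean(), "pct_change": round(pct, 1)})
    return pd.DataFrame(rows)

# Toy series: flat traffic that drops 20% after one hypothetical update.
idx = pd.date_range("2013-01-01", periods=60, freq="D")
visits = pd.Series([1000] * 30 + [800] * 30, index=idx)
print(shift_around_updates(visits, ["2013-01-31"]))
```

Even then, a before/after delta only tells you *that* traffic moved around a date, not *why* — which is exactly the correlation/causation trap the thread is about. Widening or narrowing `window_days` is a cheap way to see whether the change survives different smoothing windows.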
-
I don't understand how dates would help. Wasn't it clear that the red lines mark the dates of algorithm updates?
By abstracting the data, the hope was to gain insight into how to read these graphs in relation to updates generally, not just get help with specific updates, which wouldn't help much the next time we have to deal with a traffic-drop problem. More a question of how to think than what to think.
Reading between the lines, are you saying different algorithm changes take different amounts of time to kick in, and that's why a more detailed graph would be more useful? For example, if #1 was the first Penguin change, would your response be different than if it was the first Panda change?
-
You can use the Google Penalty Checker tool from Fruition: http://fruition.net/google-penalty-checker-tool/
I wouldn't trust the tool's results 100%, but it at least gives you an initial analysis; you'll need to dig deeper to double-check whether that initial analysis is actually relevant.
- Felipe
-
This doesn't tell me anything. If you at least had dates in there, you could compare the traffic dips to Google algorithm updates/refreshes.
I understand you can't reveal the domain, but I'll be shocked if somebody here can tell you anything without further information. This place is full of brilliant minds, but that would take some sort of mind reader to tackle...