Which Algorithm Change Hurt the Site? A causation/correlation issue
-
The attached graph is from Google Analytics: roughly 14 months of organic Google visits overlaid with algorithm changes (the update data is from Moz, naturally).
Is there any way to tell from this which updates affected the site? For example, #1 or #2 seems to be responsible for the first dip, but #4 seems to fix it, and it breaks again around #6. Or is the rise between #4 and #7 an anomaly, and did #1 or #2 actually cause a slide from its release all the way to when #7 rolled out?
Sorry if the graph is a little cloak-and-dagger; that's partly because we don't have permission to reveal much about the site's identity, and partly because we were trying to do a kind of double-blind test, separating the data from our biases.
We can say, though, that the difference between the level at the start and the end of the graph is at least 10,000 visits per day.
-
It's really tough (and even inadvisable) to try to pin a traffic change to an algorithm update based solely on spikes in a graph. On rare occasions it's pretty clear (Penguin is a good example, I've found), but in most cases there's just a lot of gray area, and the graph leaves out a mountain of data.
The big issue I see here is potential seasonality, plus not knowing what happened to the site and the business. For example, you can look at #6 and #7 and call those dips, but that sort of ignores the spike. Is the dip the anomaly, or is the spike the anomaly? What drove up traffic between #4 and #6? Maybe that simply stopped, was a one-time event, or was seasonal.
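One quick sanity check for seasonality is a year-over-year comparison. Here's a rough sketch in Python, assuming you've exported daily organic visits from GA into a date-to-visits mapping (the flat 365-day offset is a simplification that ignores leap years and day-of-week alignment):

```python
from datetime import date, timedelta

def yoy_ratio(daily_visits, day, window=7):
    # daily_visits: dict mapping datetime.date -> organic visits that day.
    # Compare total visits in a window around `day` with the same window
    # one year (365 days) earlier. A ratio near 1.0 hints the movement
    # is seasonal; a ratio far from 1.0 suggests something else changed.
    def total(center):
        return sum(v for d, v in daily_visits.items()
                   if abs((d - center).days) <= window)
    this_year = total(day)
    last_year = total(day - timedelta(days=365))
    if last_year == 0:
        return None  # no history to compare against
    return this_year / last_year
```

A ratio of roughly 1.0 for a "dip" means last year dipped too, which points toward seasonality rather than an update.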
Why was there volatility between #7 and #14 and then relative stability after #14? You could call #14 a "drop", but not knowing the timeline, it's hard to see how the curve might smooth in different windows. What it looks like is a period of highly volatile events followed by an evening out.
Without knowing the industry, the business, the history, and without segmenting this data, trying to make claims just based on dips and spikes in the graph is pretty dangerous, IMO. This could have virtually nothing to do with the algorithm, in theory.
-
I don't understand how dates would help. Wasn't it clear that the red lines are the dates of the algo updates?
By abstracting the data, the hope was to gain insight into how to read these graphs in relation to updates generally, not just get help with specific updates, which wouldn't help much the next time we have to deal with a traffic-drop problem. It's more a question of how to think rather than what to think.
Trying to read between the lines: are you saying different algo changes take different amounts of time to kick in, and that's why a more detailed graph would be more useful? For example, if #1 were the first Penguin update, would your response be different than if it were the first Panda update?
-
You can use the Google Penalty Checker tool from Fruition: http://fruition.net/google-penalty-checker-tool/
I wouldn't trust the tool's results 100%, but you can at least get an initial analysis; you'll need to dig deeper to double-check whether that initial analysis is actually relevant.
- Felipe
-
This doesn't tell me anything. If you at least had dates in there, you could compare the traffic dips to Google algo updates/refreshes.
I understand you can't reveal the domain, but I'll be shocked if somebody here can tell you anything without further information. This place is full of brilliant minds, but this would take some sort of mind-reader to tackle...
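That comparison is easy to mechanize once you have dates. A minimal sketch (the update names and dates below are made up for illustration; Moz publishes the real change history):

```python
from datetime import date

# Hypothetical update dates, purely for illustration.
ALGO_UPDATES = {
    "Update #1": date(2013, 3, 14),
    "Update #2": date(2013, 5, 22),
}

def nearby_updates(dip_date, updates=ALGO_UPDATES, tolerance_days=7):
    # Return the updates that landed within `tolerance_days` of a dip.
    # Proximity is correlation, not causation -- treat hits as leads only.
    return sorted(name for name, day in updates.items()
                  if abs((day - dip_date).days) <= tolerance_days)
```

Feed it each dip date you spot in GA and you get a shortlist of candidate updates to research, instead of squinting at red lines.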