Which Algorithm Change Hurt the Site? A Causation/Correlation Issue
-
The attached graph is from Google Analytics: roughly 14 months of organic Google visits correlated with algorithm changes (the update data is from Moz, naturally).
Is there any way to tell from this which update affected the site? For example, #1 or #2 seems responsible for the first dip, but #4 seems to fix it, and then traffic broke again around #6. Or is the rise between #4 and #7 an anomaly, and did #1 or #2 actually cause a slide from its release all the way until #7 was released?
Sorry if the graph is a little cloak-and-dagger. That's partly because we don't have permission to reveal much about the site's identity, and partly because we were attempting a kind of double-blind, separating the data from our biases.
We can say, though, that the difference between the level at the start and the end of the graph is at least 10,000 visits per day.
-
It's really tough (and even inadvisable) to try to pin a traffic change to an algorithm update based solely on spikes in a graph. On rare occasions it's pretty clear (Penguin is a good example, I've found), but in most cases there's just a lot of gray area, and the graph leaves out a mountain of data.
The big issues I see here are potential seasonality and not knowing what happened to the site and business. For example, you can look at #6 and #7 and call those dips, but that sort of ignores the spike. Is the dip the anomaly, or is the spike the anomaly? What drove up traffic between #4 and #6? Maybe that simply stopped, was a one-time event, or was seasonal.
Why was there volatility between #7 and #14, and then relative stability after #14? You could call #14 a "drop", but without knowing the timeline it's hard to see how the curve might smooth over different windows. What it looks like is a period of highly volatile events followed by an evening out.
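To make the "how the curve might smooth over different windows" point concrete, here's a rough sketch with made-up numbers: the same daily series looks like a dip-and-recovery under a short rolling window but like a flat trend under a longer one. The visit figures are purely hypothetical.

```python
# Hypothetical daily organic-visit counts: a short spike inside a flat baseline.
visits = [100, 102, 98, 150, 160, 155, 101, 99, 100, 97]

def rolling_mean(series, window):
    """Trailing rolling mean; windows are truncated at the start of the series."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1) : i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# A 3-day window preserves the spike; a 7-day window largely flattens it,
# which is why the choice of smoothing window changes what looks anomalous.
smooth_short = rolling_mean(visits, 3)
smooth_long = rolling_mean(visits, 7)
```

With real data you'd pick windows that match plausible seasonality (weekly, monthly) and compare the smoothed curves before calling anything a "dip" or a "spike".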
Without knowing the industry, the business, the history, and without segmenting this data, trying to make claims just based on dips and spikes in the graph is pretty dangerous, IMO. This could have virtually nothing to do with the algorithm, in theory.
-
I don't understand how dates would help. Wasn't it clear that the red lines are the dates of algo updates?
By abstracting the data, the hope was to gain insight into how to read these graphs in relation to updates, not just to get help with the specific updates, which wouldn't help much the next time we face a traffic-drop problem. It's more a question of how to think rather than what to think.
Trying to read between the lines: are you saying different algo changes take different amounts of time to kick in, and that's why a more detailed graph would be more useful? For example, if #1 was the first Penguin change, would your response be different than if it was the first Panda change?
-
You can use the Google Penalty Checker tool from Fruition: http://fruition.net/google-penalty-checker-tool/
I wouldn't trust the tool's results 100%, but it at least gives you an initial analysis; you'll need to go deeper to double-check whether that initial analysis is actually relevant.
- Felipe
-
This doesn't tell me anything. If you at least had dates in there, you could compare traffic dips to Google algo updates/refreshes.
I understand you can't reveal the domain, but I'll be shocked if somebody here can tell you anything without further information. This place is full of brilliant minds, but that would take some sort of mind-reader to tackle...
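If you do have the dates, the comparison is mechanical: for each update date, compare mean daily visits in a window before versus after it. Here's a minimal sketch with entirely hypothetical dates and visit counts (the real analysis would also need to control for the seasonality discussed above).

```python
from datetime import date, timedelta

# Hypothetical data: 60 days of visits, with a step change 30 days in.
start = date(2013, 1, 1)
visits = {start + timedelta(days=i): 1000 + (200 if i > 30 else 0)
          for i in range(60)}
update_day = date(2013, 1, 31)  # hypothetical algo-update date

def before_after(visits, update_day, window=14):
    """Mean daily visits in the `window` days before vs. after an update date."""
    before = [v for d, v in visits.items()
              if update_day - timedelta(days=window) <= d < update_day]
    after = [v for d, v in visits.items()
             if update_day < d <= update_day + timedelta(days=window)]
    return sum(before) / len(before), sum(after) / len(after)
```

A large before/after gap that lines up with one update date (and not with known seasonal events) is suggestive, though still correlation rather than proof of causation.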