Is there a way to map your on-page SEO changes with the organic growth?
-
Hi Mozzers,
I was just wondering if there's a way we can map our on-page SEO changes to the increase/decrease in organic traffic. For instance, I introduced brand page links in the product page breadcrumbs, and suddenly organic traffic to my brand pages increased from X to 2X within a couple of weeks. Now, this could be purely because of the breadcrumb change, or because of some algorithm update, or maybe bots started finding the content interesting and hence started ranking the pages higher (in case the brand pages were launched recently).
So, you can't say which change should be mapped to what increase/decrease in organic traffic. Or, is there a way to map this?
-
Thank you so much Sir Alan. Really appreciate your reply
-
I'm only going to add to all of these great responses by saying this:
1. Even if you make a change today, it does NOT mean you will be able to know EXACTLY when that change is acknowledged by Google. This is especially true on larger sites. It can take days, weeks, even months for Google to properly recrawl the entire site (even when Google crawls every day, much of each day's crawl revisits URIs it saw only a day or three earlier, and only a portion covers less recently crawled URIs). And then it can take weeks for all of Google's algorithms to catch up. Along the way, those algorithms may even evaluate only a PARTIAL understanding of the change (while waiting for Googlebot to get to all the other pages).
2. One additional suggestion is to look at in-page analytics within Google Analytics, or a 3rd party click tracking tool, to get a better idea of whether people are even clicking on a given link on-page. Just be careful in setting up 3rd party click tracking - do it poorly, and you can cause massive duplicate URL problems. And in-page analytics in GA often aggregates clicks across all the individual links on a page when several of them point to one common destination URI.
-
Oh wow! Will connect with you on Twitter to understand this better so I can plan it out. Hope you won't mind sharing the way you architected your internal tracking tool
-
Yeah, I am an analytics junkie. I built my own analytics system that lets me compare against GA while also diving deeper into the numbers, giving me a more detailed overview of behavior on my pages as well as my users.
It's cool
-
Hi Linda,
Yeah! High time to start exploring GA annotations. Needless to say, I'll definitely post here once I'm able to find/build a good solution for this
-
Hi Cesar Bielich,
Thank you so much for the very descriptive explanation; I'll start exploring GA annotations right away.
Yes, I can code, and I'm planning to work on an internal analytics system to track these granular pieces, but it'll take time to implement such a powerful system when you have 10 million+ pages, and hence it's not a P0 right now. We have integrated GTM as well and are tracking some of these values to some extent. But, as you correctly mentioned, none of these things can be directly mapped to any increase/decrease in organic traffic, so I should definitely think about prioritizing a project to understand the correlation between my changes and organic traffic, which could be an awesome asset for understanding these things.
-
Hi Nitin,
We found that using Fruition (actually a penalty checker) was pretty useful as well. It overlays all Google updates and SERP changes on your Analytics data. And of course: use annotations in Analytics.
If you figure out a great way to do this, please let me know!
Kind regards,
Linda Hogenes
-
Hello Nitin,
Honestly, I think there is no one-size-fits-all solution in this situation. Let's say you tweak your home page title and rankings go up by two positions; that never means this is a standard solution, and things might not work the same way for another website.
Even if you test one thing at a time to see how your changes impact results, you cannot control the environment completely. Let's say you fix all 404s on your website and a Panda update rolls out around the same date; you cannot say for sure whether the change in results came from fixing the 404 pages or from the Panda update.
I think it's very difficult to say exactly how much each factor impacts results, but you can run some tests and come to a point where you know which factors carry less weight compared to others within your industry.
Just a thought!
-
Well, there are a few factors you have to consider here, and unfortunately there is no definitive way to determine this with Google, but with patience, over time, you can see the benefits from your changes and track them.
When it comes to algorithmic changes, there is practically no way to monitor them. Google has told us time and time again that they make changes constantly (almost daily, up to 500 to 600 changes a year) to make the algorithm smarter, so you have to factor that in. Mapping your on-page SEO changes to specific algorithmic changes is pretty much impossible, BUT tracking the impact of your own changes is not, if you track them correctly. Remember that your users will give you all the information you need to determine whether your changes are working, and the happier your users are, the more they will share and spread the news, which eventually evolves into shares and backlinks.
Tracking on-page SEO changes to organic traffic
This one is simpler than you think as long as you know how to do it correctly. One of the best tools for this is Google Analytics. Here are a few things you can do.
- Google Analytics provides annotations for you to create markers when you make changes on your site. You can then track the changes you made with the annotation and see the difference in traffic.
- Track changes with the "compare to" option when selecting dates; it helps you see the difference in traffic from the previous period. For instance, if you made a change on November 1st, use the compare tool to compare the week after against the week before that date and see if there was an increase in organic traffic.
- You can use "compare to" the same way with more specific settings to see which pages on your site (or the ones you changed) increased or decreased after your changes. Just run the same "compare to" scenario in Google Analytics, but under Behavior > Site Content > All Pages, and see which pages increased in traffic from your on-page changes.
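The arithmetic behind that "compare to" view is just period-over-period percent change, and you can sanity-check it on exported data yourself. A minimal sketch (in Python, with made-up daily session counts purely for illustration):

```python
def percent_change(current: list[int], previous: list[int]) -> float:
    """Compare total sessions in the current window against a
    previous window of equal length, as GA's "compare to" does."""
    if len(current) != len(previous):
        raise ValueError("windows must cover the same number of days")
    prev_total = sum(previous)
    if prev_total == 0:
        raise ValueError("previous window has no traffic to compare against")
    return 100.0 * (sum(current) - prev_total) / prev_total

# Hypothetical daily organic sessions: the week after a Nov 1st
# change vs. the week before it.
after_change = [120, 135, 140, 150, 160, 155, 170]
before_change = [100, 110, 105, 115, 120, 118, 122]
print(round(percent_change(after_change, before_change), 1))  # → 30.4
```

Keeping the two windows the same length (and avoiding windows that straddle known algorithm-update dates) is what makes the comparison meaningful.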
When making these changes, remember to use Google Analytics to track organic traffic specifically by going to Acquisition > All Traffic > Source/Medium and then clicking on your search engine of choice (I'm assuming Google, of course). When tracking other changes, use different dimensions and metrics to isolate the organic traffic.
Can you code?
Depending on whether you are using WordPress or built your site from scratch, knowing how to include some code on your site to track your changes helps tremendously.
For instance, you can add some code to determine how many users are clicking on your breadcrumb links and see whether that helps create more organic traffic. PHP is great for this. Instead of having the breadcrumb links send the user directly to the target page, have them go to a script that logs the click in a database (so you can see how many users are clicking on your breadcrumb links, and which ones) and then sends them on to the desired page. Over a few weeks you will see which links are the most effective.
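A minimal sketch of that log-then-redirect idea (shown here in Python with SQLite purely for illustration; the same pattern works in the PHP setup described above, and the table and link labels are hypothetical):

```python
import sqlite3
from datetime import datetime, timezone

def init_db(conn: sqlite3.Connection) -> None:
    # One row per breadcrumb click: when, which link, where it led.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS breadcrumb_clicks "
        "(clicked_at TEXT, link_label TEXT, target_url TEXT)"
    )

def log_and_redirect(conn: sqlite3.Connection, link_label: str, target_url: str) -> str:
    """Record the breadcrumb click, then hand back the URL the
    script should redirect (302) the visitor to."""
    conn.execute(
        "INSERT INTO breadcrumb_clicks VALUES (?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), link_label, target_url),
    )
    conn.commit()
    return target_url

conn = sqlite3.connect(":memory:")
init_db(conn)
dest = log_and_redirect(conn, "brand-acme", "/brands/acme")
count = conn.execute("SELECT COUNT(*) FROM breadcrumb_clicks").fetchone()[0]
print(dest, count)  # → /brands/acme 1
```

One caution worth repeating from the earlier reply: make sure the tracking script issues a clean redirect to the canonical URL, or this kind of indirection can create the duplicate-URL problems mentioned above.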
If you need some help, you can private message me here at Moz and I can show you what I mean. I have been a web developer for over 15 years and I am an analytics junkie, so I can show you some things
Hope that helps