Adjusted Bounce Rate
-
Hi
I've been looking at analysing bounce rate in more depth, and I wondered what people's views on adjusted bounce rate (ABR) are. I've been reading this article: http://searchenginewatch.com/sew/how-to/2322974/how-to-implement-adjusted-bounce-rate-abr-via-google-tag-manager-tutorial
Is it worth implementing? Or is it just as useful to look at time on page instead of bounce rate?
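For context, the adjusted bounce rate technique in that article boils down to firing a timer-based interaction event after a chosen number of seconds, so engaged single-page visits stop counting as bounces. A minimal sketch with analytics.js (the 30-second threshold and event labels are placeholders of mine, not from the article):

```javascript
// Adjusted bounce rate sketch for analytics.js.
// Assumption: the standard GA snippet has already defined `ga`.
// The threshold and the event category/action/label are placeholders.
var ABR_SECONDS = 30;

// Build the interaction event hit that marks a visit as engaged.
function abrEvent(seconds) {
  return ['send', 'event', 'Adjusted Bounce Rate', 'Time on page', seconds + '+ seconds'];
}

// Arm the timer only in a browser where `ga` is available.
if (typeof window !== 'undefined' && typeof window.ga === 'function') {
  setTimeout(function () {
    window.ga.apply(null, abrEvent(ABR_SECONDS));
  }, ABR_SECONDS * 1000);
}
```

Because the event is a normal interaction hit (no non-interaction flag), firing it is enough to stop the session counting as a bounce.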
-
I've only just seen this.
Thank you! I'll try to get to grips with User Flow; I need to dedicate some time to analysing the data.
Becky
-
Hi
Thank you for the reply. I have looked at User Flow, but I tend to get a bit lost in the amount of data and struggle to find exactly what I need.
Can you segment and filter this by landing page?
I can see the drop-offs, but not the drop-offs for new users - or is this report based on new users only?
Thank you!
-
Hi Becky,
You are correct - normally, if a tag is fired, the visit won't be counted as a bounce (unless you set "nonInteraction" to true - check https://support.google.com/analytics/answer/1033068#NonInteractionEvents)
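To make the difference concrete, here is a sketch of the two kinds of event hit in analytics.js (the event names are made up for illustration):

```javascript
// Sketch: building two analytics.js event hits. The first is an
// interaction hit, so firing it stops the session being a bounce;
// the second sets nonInteraction and leaves bounce rate untouched.
function makeEvent(category, action, nonInteraction) {
  var fields = { hitType: 'event', eventCategory: category, eventAction: action };
  if (nonInteraction) {
    fields.nonInteraction = true;
  }
  return fields;
}

// In the browser these would be sent as:
//   ga('send', makeEvent('Video', 'play'));            // counts as interaction
//   ga('send', makeEvent('Banner', 'rotate', true));   // ignored for bounce rate
```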
Dirk
-
Amazing, thanks!
-
Picking up on Dirk saying:
I prefer to know if people scroll to the end of the page (so I assume they have read the article) rather than just put an arbitrary time to fire an event.
This was shared the other day - it's a way of pulling in scroll-depth data into your Google Analytics reports. Incredibly useful:
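Roughly, a scroll-depth tag does the following (a sketch with hypothetical names, not the actual plugin code): listen for scroll events, work out how far down the page the reader has got, and send one event per threshold crossed:

```javascript
// Scroll-depth sketch (hypothetical names; GTM scroll triggers and
// ready-made plugins do the equivalent of this for you).
var SCROLL_THRESHOLDS = [25, 50, 75, 100];

// Return the thresholds (in percent) crossed by the current scroll
// position that have not been reported yet.
function crossedThresholds(scrolledPx, viewportPx, pagePx, alreadySent) {
  var percent = Math.round(((scrolledPx + viewportPx) / pagePx) * 100);
  return SCROLL_THRESHOLDS.filter(function (t) {
    return percent >= t && alreadySent.indexOf(t) === -1;
  });
}

// Browser wiring (assumes the GA snippet has defined `ga`):
// var sent = [];
// window.addEventListener('scroll', function () {
//   crossedThresholds(window.pageYOffset, window.innerHeight,
//                     document.documentElement.scrollHeight, sent)
//     .forEach(function (t) {
//       sent.push(t);
//       ga('send', 'event', 'Scroll Depth', String(t) + '%');
//     });
// });
```

The 100% event is the "read to the end" signal Dirk describes; note each event is an interaction hit, so this also lowers bounce rate unless you set the non-interaction flag.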
-
Thanks. For me, I think I want to know which pages people find useful and which ones they don't, but with ecommerce it's a bit more difficult.
My overall goal is to provide the content users want to see on product pages.
On that last point: I thought that if you add code to fire an event when someone has been on a page for X amount of time, then even if that page is the only one they visit, the visit won't be counted as a bounce?
I'll read up on the ecommerce tracking too, thanks!
-
It can be useful - it depends on what you want to know. If you don't implement either of them, the time on site won't be correct, as no time on site is calculated for bounced visits.
Personally, I prefer to know whether people scroll to the end of the page (so I can assume they have read the article) rather than just picking an arbitrary time at which to fire an event. Either way, the time measurement on your site becomes more accurate, and both methods will reduce the bounce rate.
I think it's certainly useful for e-commerce - but then I would rather use enhanced e-commerce tracking.
I don't really understand what you mean with "I thought that if you took into account the time spent on page, and set these parameters in analytics, that it wouldn't in fact be counted as a bounce?" - could you explain?
Dirk
-
Hi Dirk,
Thanks for your response. So are you saying adjusted bounce rate is also not beneficial?
I thought that if you took into account the time spent on page, and set these parameters in analytics, that it wouldn't in fact be counted as a bounce?
I'll also look into the content tracking you mentioned - is it also useful for ecommerce? I'm not always expecting people to scroll right to the end of pages.
Thanks
-
Time on page has the same issue. Suppose somebody visits your site, spends 10 minutes reading an article and then goes to another site: it will be counted as a bounced visit - but even worse, the 10 minutes spent on your site will not be measured in Analytics (check http://cutroni.com/blog/2012/02/29/understanding-google-analytics-time-calculations/)
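The underlying reason (as the Cutroni post explains) is that GA derives time from the gaps between hit timestamps; with a single hit there is nothing to subtract. A small illustration, with made-up timestamps:

```javascript
// GA computes time-on-page from the difference between consecutive
// hit timestamps. A bounced visit has only one hit, so there is no
// second timestamp and the recorded time is zero.
function timeOnPageSeconds(hitTimestampsMs) {
  if (hitTimestampsMs.length < 2) return 0; // bounce: nothing to subtract
  return (hitTimestampsMs[hitTimestampsMs.length - 1] - hitTimestampsMs[0]) / 1000;
}

// One pageview at t=0; the reader leaves after 10 minutes:
//   timeOnPageSeconds([0])          -> 0 (the 10 minutes are invisible)
// Same visit, but a scroll/timer event fires at the 10-minute mark:
//   timeOnPageSeconds([0, 600000])  -> 600
```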
This is one of the advantages of the advanced content tracking - it gives you a better measure of what people are actually doing on your site. For me, the decrease in bounce rate isn't the big win; the more accurate time-on-site measurement, and being able to check the interaction (do they scroll to the end?), are the things that bring the benefit.
If you don't want to use the tag manager - you can also do this with the normal tracking code: http://cutroni.com/blog/2014/02/12/advanced-content-tracking-with-universal-analytics/ (Cutroni is the Analytics Advocate @Google)
Dirk