Searches & Clicks Research
-
Is there a way to check the percentage of clicks on specific websites based on searches that people do? For example, if I searched "sneakers", what percentage of searchers clicked on a particular site?
-
Thanks!
-
There is a company in the UK that offers a tool that does this. Not sure if this is the right link, but the tool is part of Experian.
http://www.experian.co.uk/integrated-marketing/web-analytics.html
They called me a month or so ago to demo it. It had amazing data but was extremely expensive (circa £10-50k per year, if I remember correctly).
-
I do not know of such a tool - maybe try SEMrush? They offer a lot in the way of competitive analysis.
-
I mean for all sites, i.e. competitors.
-
You mean for your own site? You can see this in both Bing and Google Webmaster Tools.
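To illustrate: if you export per-query clicks and impressions from either tool, CTR for each keyword is just clicks divided by impressions. A minimal sketch below - note the row format is a made-up example for illustration, not the exact export format of either tool:

```python
def ctr_by_query(rows):
    """Compute click-through rate per query from exported search data.

    rows: list of dicts with 'query', 'clicks', and 'impressions' keys
    (a hypothetical export format). Queries with zero impressions are
    skipped to avoid division by zero.
    """
    return {
        r["query"]: r["clicks"] / r["impressions"]
        for r in rows
        if r["impressions"] > 0
    }

# Example with made-up numbers:
sample = [
    {"query": "sneakers", "clicks": 120, "impressions": 4000},
    {"query": "running shoes", "clicks": 45, "impressions": 900},
]
print(ctr_by_query(sample))  # {'sneakers': 0.03, 'running shoes': 0.05}
```

That only covers your own site, though - neither tool shows you competitors' click data.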
-
Thank you - this is general info. I was wondering if there's an actual tool to see the click-through rate for certain keywords.
-
You could use the percentages from any of the click-through rate reports out there for a rough guess:
Coconut Headphones (there's a 2nd part to this article too)
Bear in mind, everyone's reports are always a bit different. There are so many variables in estimating click-through rate that it's nearly impossible to come up with exact percentages across the board - they vary by industry, the number of PPC ads, local search vs. general search, whether there are videos or images in the result, etc.
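As a rough-guess calculation, you'd multiply a keyword's estimated monthly search volume by the CTR that one of those reports gives for your ranking position. A quick sketch - the CTR curve below is entirely made-up for illustration, since (as noted above) real curves vary widely by industry and SERP features:

```python
# Hypothetical CTR-by-position curve (illustrative numbers only -
# substitute figures from whichever CTR study you trust).
HYPOTHETICAL_CTR_BY_POSITION = {
    1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
    6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02,
}

def estimated_clicks(monthly_search_volume, position):
    """Estimate monthly clicks: search volume times an assumed CTR.

    Positions beyond the curve fall back to a nominal 1% CTR.
    """
    ctr = HYPOTHETICAL_CTR_BY_POSITION.get(position, 0.01)
    return round(monthly_search_volume * ctr)

# 10,000 monthly searches at position 3 with an assumed 10% CTR:
print(estimated_clicks(10000, 3))  # -> 1000
```

Treat the output as a ballpark figure at best, for the reasons above.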
But hope those links help!
-Dan