Measure impact from new meta descriptions
-
Hi guys,
I'm looking to implement new meta descriptions across a site and I want to measure the impact.
So far I'm thinking of extracting the CTR data from GWT for the last 90 days to get the most accurate CTR averages for each URL. Then, once the new meta descriptions have been implemented, compare the new CTR with the old CTR averages across URLs.
Do you think this would be the most accurate way of measuring the impact?
Cheers,
Chris
-
Hey Chris,
I recently did the same thing: I looked at organic traffic to the specific pages whose meta descriptions I changed. I did 20 for my top sellers and 20 for my slow sellers, and watched for incremental increases YoY and WoW in GA. I did see increases; although each was small, in bulk they impact my overall traffic. I've since educated the rest of the business, and now we write better meta descriptions.
Your method seems sound, although, as Matt said, ultimately you want to make sure you are converting those people too.
All the best,
K
-
If you have goal tracking set up, it may make sense to compare conversions as well. While I like to use meta descriptions to improve CTR, I'd much rather improve conversions. If I can get someone to click through AND take action, that's the best meta description.
Step 1 - CTR
Step 2 - Conversion/Goal Conversion
If your CTRs are super-low, improving them may not help conversion a ton. If they're already decently high, you may only see a small increase but you could double conversions. I think using the two together makes the most sense.
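The before/after comparison described in this thread can be sketched in a few lines of Python. This is a hypothetical example, not an official Moz or Google tool: it assumes you've exported two Search Console performance reports as CSVs with "page", "clicks", and "impressions" columns (adjust the column names to match your actual export).

```python
# Hypothetical sketch: compare per-URL CTR before and after a meta
# description change, using two Search Console-style CSV exports.
import csv


def load_ctr(path):
    """Return {url: ctr} from a CSV with page/clicks/impressions columns."""
    ctr = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            impressions = int(row["impressions"])
            if impressions:  # skip URLs with no impressions
                ctr[row["page"]] = int(row["clicks"]) / impressions
    return ctr


def ctr_deltas(before, after):
    """Per-URL CTR change, for URLs present in both periods only."""
    return {url: after[url] - before[url]
            for url in before.keys() & after.keys()}
```

Usage would look like `ctr_deltas(load_ctr("ctr_before.csv"), load_ctr("ctr_after.csv"))`, sorted by delta to surface the biggest winners and losers. Note that only URLs appearing in both periods are compared, so pages that gained or lost all impressions need separate review.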