Change in Meta Description - 320 to 160
-
Why is Google showing only 160 characters instead of 320? Is there any official announcement from @Google?
I have noticed since last week that descriptions in the Google SERPs are showing at 160 characters again.
Please help me with any reliable information.
-
-
Do you have a link to that article?
... I can see that Shopify suggests having description tags that are 320 characters long, but that causes an issue in the Moz tool...
-
I've got a blog post going up tomorrow morning, but the short answer is that it looks like Google has reverted to the previous limit (roughly 155 characters). There is a small percentage of display snippets over 300 characters, but those are exceptions to the rule and seem to be somewhat connected to Featured Snippets. In most cases, those 300+ descriptions are pulled from page content.
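To put the limit discussed above into practice, here is a minimal sketch for auditing meta descriptions against it. Note this is an approximation: Google actually truncates display snippets by pixel width, so the 155-character figure is only a rough proxy for the cutoff.

```python
# Flag meta descriptions likely to be truncated in Google's display
# snippets. The 155-character limit is the approximate post-revert
# figure mentioned above; the old test ran to roughly 320.
CHAR_LIMIT = 155

def check_description(description: str, limit: int = CHAR_LIMIT) -> dict:
    """Return basic truncation diagnostics for one meta description."""
    length = len(description)
    return {
        "length": length,
        "over_limit": length > limit,
        # Preview what would survive a hard cut at the limit,
        # trimmed back to the last full word.
        "preview": description if length <= limit
                   else description[:limit].rsplit(" ", 1)[0] + "…",
    }

short = check_description("A concise description under the limit.")
long = check_description("x" * 40 + " word " * 60)
```

Running this over a crawl export would show which pages were written to the 320-character guidance and now need trimming.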
-
Thank you for the reply.
Please update us if you have any important information on this.
-
Danny confirmed a change today, but didn't provide much detail (just that some display snippets are shorter). Anecdotally, we're definitely seeing a change. We're collecting data now and should have more to say later in the week. Trying to get some measurements across a larger data set so that we can give people solid information.
Related Questions
-
Domain prefix changed, will this impact SEO?
Our web development team has changed our domain prefix from www to non-www due to a server change. Our SSL certificate would not be recognised under www and would produce a substantial error message when visiting the secure parts of our website. To prevent issues with old links, they have added a permanent 301 redirect from www to non-www URLs until our sitemap catches up. Would this impact our SEO efforts, or would it have no impact as a redirect has been placed? Thanks
Technical SEO | | Jseddon920 -
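For reference, the permanent redirect described in the question above is commonly implemented along these lines, assuming an Apache server with mod_rewrite enabled; `example.com` is a placeholder, not the poster's actual domain.

```apache
# .htaccess sketch: 301-redirect every www request to the non-www host,
# preserving the path and query string.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [R=301,L]
```

As long as the 301 covers every old URL (not just the homepage), the redirect itself should pass link signals through; the sitemap should simply list the non-www versions.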
Meta tags
Hello, does anyone know how long it takes for meta descriptions to show up in Google? I ask because I just updated the meta descriptions for the whole website, but while the Moz toolbar is showing them correctly, Google is still showing the old ones, even though I used the Fetch as Googlebot tool in Webmaster Tools. Thanks for a reply
Technical SEO | | socialengaged
Eugenio | Social Engagement0 -
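One useful first check in a situation like the one above is to confirm what the server is actually sending, since Google may keep displaying an older cached snippet regardless. A stdlib-only sketch for extracting the meta description from a page's HTML:

```python
# Pull the meta description out of an HTML document using only the
# standard library, to verify the live markup matches what you expect.
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        # Capture the first <meta name="description" content="..."> seen.
        if tag == "meta" and self.description is None:
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content", "")

def extract_description(html: str):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description

page = '<html><head><meta name="description" content="New copy here."></head></html>'
```

If this returns the new description for the live page, the markup side is fine and the delay is purely on Google's recrawl/refresh schedule.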
Web address change - Search impact?
Hi, I have whosjack.org and wjlondon.com; on there is a London-relevant news and events website. whosjack.org has been the main site for some time and has decent search pickup. Currently wjlondon.com just redirects to whosjack.org. However, having London in our actual address would be far more beneficial for us, so ideally I want to swap the two web addresses around: have the main site at wjlondon.com and have whosjack.org redirecting to it. However, I don't want to lose traffic from search. An idea I had was to create a separate site at wjlondon.com that was a feed of social content and links from whosjack.org, so that it starts to get decent search pickup, and then swap them over, but I'm not sure whether that would actually be detrimental, what with all the duplicate-content issues with Google etc. Any thoughts?
Technical SEO | | luwhosjack0 -
Timely use of robots.txt and meta noindex
Hi, I have been checking every possible resource for content removal, but I am still unsure how to remove already-indexed content. When I use robots.txt alone, the URLs remain in the index; no crawling budget is wasted on them, but having e.g. 100,000+ completely identical login pages within the omitted results still can't mean anything good. When I use meta noindex alone, I keep my index clean, but I also keep Googlebot busy crawling these no-value pages. When I use robots.txt and meta noindex together on existing content, I am asking Google to ignore my content, but at the same time I am blocking it from crawling the pages and ever seeing the noindex tag. Robots.txt combined with URL removal is still not a good solution, as I have failed to remove directories this way; it seems that only exact URLs can be removed like this. I need a clear solution that solves both issues (index and crawling). What I am trying now is the following: I remove these directories (one at a time, to test the theory) from the robots.txt file, and at the same time I add the meta noindex tag to all pages within the directory. The number of indexed pages should start decreasing (while useless page crawling increases), and once the number of indexed pages is low or zero, I would put the directory back into robots.txt and keep the noindex on all pages within the directory. Can this work the way I imagine, or do you have a better way of doing it? Thank you in advance for all your help.
Technical SEO | | Dilbak0 -
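The core conflict in the question above is that a robots.txt-blocked page can never be fetched, so a crawler never sees a meta noindex placed on it. The standard library's `urllib.robotparser` makes it easy to test which URLs a given robots.txt actually blocks; the `/login/` path here is hypothetical, standing in for the poster's directories.

```python
# Demonstrate the robots.txt / noindex interaction: while a directory is
# disallowed, a crawler cannot fetch its pages, so any on-page noindex
# tag there is invisible to it.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /login/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Blocked: the crawler can't reach the page, so a noindex tag goes unseen.
blocked = not parser.can_fetch("Googlebot", "https://example.com/login/session1")
# Allowed: the crawler can fetch the page and act on its noindex tag.
allowed = parser.can_fetch("Googlebot", "https://example.com/products/widget")
```

This is why the staged approach in the question (unblock, let noindex take effect, then re-block) is the order the two mechanisms have to be applied in.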
Should I change these "Overly dynamic URLs" ?
Hello, my client has pages that look like this: www.domain.com/blog/index.aspx?blogmonth=1&blogday=10&blogyear=2012&blogid=256 Question 1: SEOmoz says they are overly dynamic. Is that really an issue in this case, since the numbers indicate the year, month and day and do not change? Question 2: Should we change the URLs to proper SEO-friendly URLs such as www.domain.com/keyword1-keyword2? The pages are already ranking well, and we worry that changing the URLs may damage the rankings. Do we risk the pages going down in ranking by creating SEO-friendly URLs (and using a 301 to redirect from the old URLs)?
Technical SEO | | DavidSpivac0 -
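Since the dynamic parameters in the URL above already encode a date and an ID, a friendly path can be derived from them mechanically, which is the kind of target a 301 redirect would point at. A sketch, where the `/blog/YYYY/MM/DD/ID` scheme is just one illustrative choice of structure:

```python
# Derive a cleaner, date-based path from the dynamic blog URL in the
# question, suitable as the destination of a 301 redirect.
from urllib.parse import urlparse, parse_qs

def friendly_blog_path(dynamic_url: str) -> str:
    params = parse_qs(urlparse(dynamic_url).query)
    year = int(params["blogyear"][0])
    month = int(params["blogmonth"][0])
    day = int(params["blogday"][0])
    post_id = params["blogid"][0]
    # Zero-pad month and day so paths sort and group consistently.
    return f"/blog/{year:04d}/{month:02d}/{day:02d}/{post_id}"

old_url = ("http://www.domain.com/blog/index.aspx"
           "?blogmonth=1&blogday=10&blogyear=2012&blogid=256")
```

A mapping function like this makes the 301 rules deterministic: every old dynamic URL resolves to exactly one new path, so no link equity is left pointing at a dead address.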
How can I tell Google that a page has not changed?
Hello, we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating way too much traffic; half of our page views are generated by Googlebot. We would like to tell Googlebot to stop crawling pages that never change. This one, for instance: http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html As you can see, there is almost no content on the page and the picture will never change, so I am wondering if it makes sense to tell Google that there is no need to come back. The following header fields might be relevant. Currently our webserver answers with these headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate and Pragma: no-cache, and set Expires to e.g. 30 days in the future? I also read that a webpage that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us. Maybe Google would then also spend more time on pages that actually changed, instead of wasting it on unchanged pages. Do you have any other suggestions for reducing Googlebot traffic on irrelevant pages? Thanks for your help
Cord
Technical SEO | | bimp0 -
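The 304 mechanism mentioned in the question is conditional GET: if the client sends an `If-Modified-Since` header and the page has not changed since that date, the server answers 304 with no body. A minimal stdlib sketch of that decision logic (the 30-day `max-age` is the value floated in the question, not a recommendation):

```python
# Sketch of the conditional-GET idea: respond 304 instead of 200 when
# the page is unchanged since the client's If-Modified-Since date.
from email.utils import formatdate, parsedate_to_datetime
from datetime import datetime, timezone
from typing import Optional

def respond(last_modified: datetime, if_modified_since: Optional[str]):
    """Return (status, headers) for a page last changed at last_modified."""
    headers = {
        "Last-Modified": formatdate(last_modified.timestamp(), usegmt=True),
        # Freshness hint replacing the no-cache headers from the question.
        "Cache-Control": "public, max-age=2592000",  # 30 days
    }
    if if_modified_since is not None:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200, headers  # unparseable date: serve the full page
        if last_modified <= since:
            return 304, headers  # unchanged: no body needs to be sent
    return 200, headers

changed = datetime(2012, 1, 10, tzinfo=timezone.utc)
status_304, _ = respond(changed, "Tue, 01 May 2012 00:00:00 GMT")
status_200, _ = respond(changed, "Sun, 01 Jan 2012 00:00:00 GMT")
```

The 304 still costs a request, but it saves the response body, which is usually the bulk of the bandwidth a crawler consumes on unchanged pages.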
Frequent server changes
Hey guys, I have a server-related question. One of our websites is hosted with a nasty, slow company, and we want to make a change. The problem is that the site is six months old, so it started on one server; the client then moved it to this slow host about two months ago, and we now want to move it again. Will this negatively affect search engine rankings? As ever, thanks in advance 🙂
Technical SEO | | Nextman0 -
WordPress plugin for adding post descriptions
Could anyone recommend a plugin for creating WordPress post descriptions? There is a confusingly large selection of choices in the WordPress plugin directory.
Technical SEO | | catherine-2793880