Should I remove all meta descriptions to avoid duplicates as a short-term fix?
-
I'm currently trying to implement Matt Cutts's advice from a recent YouTube video, in which he said that it is better to have no meta descriptions at all than to have duplicates.
I know that there are better alternatives, but, if forced to make a choice, would it be better to remove all of the duplicate meta descriptions from the site than to leave the duplicates in place (keeping a lone meta description on the home page, perhaps)? This would be a short-term fix prior to making changes to our CMS to allow us to add unique meta descriptions to the most important pages.
I've seen various blogs across the internet that recommend removing all the tags in these circumstances, but I'm interested in what people on Moz think of this.
The site currently has a meta description which is duplicated across every page on the site.
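As a side note, before deciding, it can help to confirm the scale of the duplication by extracting the meta description from a sample of pages and grouping identical ones. A minimal sketch in Python using only the standard library (the page HTML is assumed to already be fetched; URLs here are placeholders):

```python
from html.parser import HTMLParser


class MetaDescriptionParser(HTMLParser):
    """Collects the content attribute of <meta name="description"> tags."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content")


def meta_description(html):
    """Return the meta description of an HTML document, or None."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description


def find_duplicates(pages):
    """pages: dict mapping URL -> HTML. Returns descriptions shared by 2+ URLs."""
    seen = {}
    for url, html in pages.items():
        seen.setdefault(meta_description(html), []).append(url)
    return {desc: urls for desc, urls in seen.items() if desc and len(urls) > 1}
```

Running `find_duplicates` over a crawl of the site would show exactly which URLs share the one site-wide description, which also gives you a priority list for the CMS fix later.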
-
Yes. If you can quickly write unique meta descriptions, start doing that immediately; if the site is not big and you can finish in one to two weeks, there is no point in deleting the meta descriptions first. If you see that it will take more than two weeks, it is better to delete the duplicated meta descriptions in the meantime. But the best solution is to write unique meta descriptions right away for the home page and the other important pages that rank right now.
-
Thanks, Marc, for answering what is in many ways an unfair question.
I definitely agree that the long-term objective should be different and relevant meta descriptions, as you say. It's also good to know that each of the approaches I suggested was ultimately bad practice, even if one of them is less bad than the other.
-
This is a tough choice between two bad "mistakes". If you leave the meta description blank or empty, you don't use the potential you have; if you have duplicates, the same applies, plus the worry about what could happen because Matt Cutts mentioned this topic.
You can't expect a really serious recommendation, because both options are a "no-go" to me at the moment. If you leave it blank, Google will decide what to display in its SERP snippets. If your pages have good, relevant text, this won't end in disaster, but what if there is nothing Google can pick from? You know what I mean? (Many shops have this problem.) On the other hand, this announcement might be a hint of what already happened with meta keywords: they have no relevance anymore. The only recommendation I would give you now is: try it out!
Take one page, or a few more, and delete the meta description. Leave it blank and wait a little; sooner or later you will be able to see the effect in the SERPs, and you can base your decision on that result. Don't do this with your start/home/entry page; use URLs that are linked more deeply in the site.
BUT your long-term goal should be to create different and relevant meta descriptions.
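If removal does end up being the stop-gap, the duplicated tag can be stripped from saved templates or pages with a small script. A rough sketch in Python, assuming the tag is always written with `name` before `content` (as a single duplicated template tag usually is); a real implementation should parse the HTML rather than use a regex:

```python
import re

# Matches a meta description tag such as:
#   <meta name="description" content="...">
# Assumes the name attribute precedes content, which holds when the tag
# comes from one shared template.
DESCRIPTION_TAG = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'][^"\']*["\']\s*/?>\s*',
    re.IGNORECASE,
)


def strip_meta_description(html):
    """Remove meta description tags from an HTML document."""
    return DESCRIPTION_TAG.sub("", html)
```

This is only worth doing for the pages you cannot give unique descriptions yet; the pages you rewrite by hand keep their new tags.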