Is this cloaking or some dangerous blackhat SEO tactic?
-
Hey wonderful SEO guys, I need your advice. Would the following be considered cloaking, or a black hat SEO tactic?
I performed the following search for Guess tops on Google: "Guess women's tops." Please see the attached image (Guess 1) of the description that comes up with this search. This is not the page's primary description, and when you visit the women's tops page, that text is not visible anywhere on it. It is, in fact, placed in the meta name section (see the Guess meta-name description image). The text appears as a description in the SERPs depending on the keyword search performed, but it is simply not visible on the tops page.
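For reference, a description of this kind lives in the document head rather than in the visible body. A generic sketch (the actual Guess markup is not reproduced here; the title and content text are placeholders):

```html
<head>
  <title>Women's Tops | Example Store</title>
  <!-- Not rendered on the page itself; search engines may display it as the SERP snippet -->
  <meta name="description" content="Shop women's tops: blouses, tees, and tanks in the latest styles.">
</head>
```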
Can this be considered a form of cloaking? If not, is this a dangerous black hat SEO tactic, or actually nothing to be worried about? We are thinking of doing something similar with some of our lengthy homepage introductions, making them invisible on the page but still appearing in the SERPs, as long as they relate to content that is clearly on the website, or to what the website is about.
Please advise. Thanks.
-
Re: original question - nothing funny here; it looks like a standard meta description.
Note: it used to be standard that Google only ever pulled the meta description, but now, if they consider it more relevant, Google will pull any page content that relates to the search query, whether it appears at the top, middle, or bottom of a page. You really can't dictate this, since you can't possibly predict every search query that will bring visitors to your site.
I'd stick to writing strong meta and page titles and descriptions, providing high-quality content, and letting Google handle what it pulls! You may want to look into Schema.org markup to see what you can control.
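Schema.org markup is commonly added as a JSON-LD block in the page source. A minimal sketch, assuming a product page; the name, description, and price values are placeholders, not taken from the site discussed above:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Women's Top",
  "description": "A short, crawlable product description.",
  "offers": {
    "@type": "Offer",
    "price": "29.99",
    "priceCurrency": "USD"
  }
}
</script>
```

Structured data like this influences how a result may be displayed (e.g. rich snippets), though Google still decides what to show.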
Sam
-
I have never seen that happen in terms of Google showing it in the SERPs. You can hide data in the back end using JSON and similar markup, but I have never seen Google use text as the description if it's not visible in the body... at least in my experience.
-
Thanks for your response, Hutch42. I know what meta descriptions are, their purpose, and how they can best be used to enhance click-through rates. Something changed on the Guess site since yesterday, when we were analyzing it, so my initial post is now irrelevant.
Here is another question: apart from a page's official meta description, Google will often pull text from the page itself and use it as the description in the SERPs, depending on the keywords used in the search. The text it pulls may not be that page's official meta description. Is there a way (and would you advise it) to add, say, a 200-word article about a business to a page in an invisible manner, while still having Google display any part of that article in the SERPs when a search includes keywords used in the article? So basically, the text is in the back end (not visible on the page), perhaps only in the HTML code, but Google can still pick it up and show its content as an unofficial description in the SERPs if a search contains one or more keywords from the article.
I hope you understand what I'm asking. Thanks
-
The function of the meta description is to provide a snippet for the search engine to display; it is not a form of masking. That site is using the meta description for its intended purpose.