Why is Rogerbot (or, if it is the case, Googlebot) not recognizing keyword usage in my body text?
-
I have a client that offers liposuction as one of their main services. They have ranked in the top 1-5 for variations of their keyword "sarasota liposuction" for a long time, but suddenly dropped about 10-12 places, down to #15 in the engine. I went to investigate and ran the page through the SEOmoz Pro on-page analysis tool, where oddly enough it says there is no mention of the target keyword in the body content (on-page analysis tool screenshot attached). I didn't quite understand why it would not recognize the obvious keywords in the body text, so I went back to the page and inspected further. The keywords are wrapped in featured links that point to an internally hosted keyword glossary, which defines terms people might not know. These definitions pop up in a lightbox upon clicking the keyword (liposuction lightbox screenshots attached). I have no idea why Google would not recognize these words, since the text sits between the link's opening and closing tags, yet if there is something wrong with the code syntax it might possibly hinder the engine from seeing the anchor text of the link.
Any help would be greatly appreciated! Thank you so much!
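One way to sanity-check whether the lightbox markup is hiding the keyword from parsers is to extract the visible text (including anchor text) from the page's HTML and look for the phrase. Here is a minimal sketch using Python's standard library; the sample markup below is invented to mimic the glossary-link pattern described above, not taken from the actual site:

```python
from html.parser import HTMLParser

class VisibleTextExtractor(HTMLParser):
    """Collects all visible text, including text inside <a> tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

    def text(self):
        # Join chunks and collapse whitespace.
        return " ".join(" ".join(self.chunks).split())

# Invented markup mimicking a glossary-linked keyword opened in a lightbox.
html = '<p>Looking for <a href="/glossary#liposuction" class="lightbox">Sarasota liposuction</a>?</p>'

parser = VisibleTextExtractor()
parser.feed(html)
body_text = parser.text().lower()
print("sarasota liposuction" in body_text)  # True: anchor text is still body text
```

If a check like this finds the phrase but the on-page tool does not, the issue is more likely how the tool fetches or renders the page (for example, if the text is injected by JavaScript) than the link markup itself.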
-
Hey there,
Sorry that you're getting some confusing On-Page reports.
Since this is more of a help question than an SEO question, I am going to migrate this over to ZenDesk (our help ticketing system) and answer you there. You should receive an email with a link to the ticket once it has been created.
In the future, we recommend that you direct any questions about troubleshooting our software to [email protected] so we can look into the problem for you. That's why the Help Team is here! :]
Thanks,
Chiaryn
-
When you did the on-page analysis, did you ask it to check for "sarasota liposuction" or just "liposuction"? I've noticed that the checker tool is very specific about matching the exact phrase.
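That distinction matters: an exact-phrase check can fail where a per-word check passes. A tiny illustration of the difference (the sample sentence is invented):

```python
# Invented body text: both words appear, but not adjacent and in order.
text = "Liposuction services in Sarasota by board-certified surgeons.".lower()

phrase = "sarasota liposuction"
exact_match = phrase in text                               # requires the exact phrase
all_words = all(word in text for word in phrase.split())   # each word anywhere

print(exact_match, all_words)  # False True
```

A tool that checks for the exact phrase would report the keyword as missing here, even though a human reader sees both words on the page.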
Related Questions
-
Why are two different pages showing for the same keyword on alternate days?
Hello Everyone, I was really confused by one of my client's search results for a single keyword. One day the home page shows up for the keyword, and the next day the pricing page shows in the search results. I made sure there is no keyword cannibalization, and more backlinks point to the home page than to the pricing page, but the pricing page still shows up every alternate day. I even checked Google Analytics, and the most visited page is the home page, not the pricing page. Also, when the pricing page shows up it sits on page 2 of Google's search results, while the home page, when it shows up, falls to page 4. Please help me figure out this issue. Thanks
Technical SEO | sandeep.clickdesk
-
Duplicate title tags being caused by upper-case and lower-case versions of URLs
Hi, GWT is reporting lots of duplicate titles for a client's new site. These are mainly due to two versions of each URL: one with words starting with an upper-case character and the other all lower case. The client's dev says this has something to do with the Windows server and is OK! Is this correct, or should I be telling them to delete the upper-case versions and 301 redirect them to the lower-case ones (since lower case is better practice), which would deal with the reported duplicate titles? All Best, Dan
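One way to see how large the problem is before asking the dev for redirects is to group crawled URLs by a case-normalized key; any group with more than one member is a case-only duplicate. A minimal sketch with invented example URLs:

```python
from collections import defaultdict
from urllib.parse import urlsplit

# Invented crawl data: same page reachable under mixed- and lower-case paths.
urls = [
    "http://example.com/Garden-Sheds/Large",
    "http://example.com/garden-sheds/large",
    "http://example.com/contact",
]

groups = defaultdict(list)
for url in urls:
    parts = urlsplit(url)
    # Hostnames are case-insensitive; paths are not, so lower-casing the path
    # collapses case-only duplicates onto one canonical key.
    key = parts.netloc.lower() + parts.path.lower()
    groups[key].append(url)

duplicates = {k: v for k, v in groups.items() if len(v) > 1}
for canonical, variants in duplicates.items():
    print(canonical, "->", variants)
```

Each key in `duplicates` is also the natural 301 target: redirect every mixed-case variant to the all-lower-case URL.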
Technical SEO | Dan-Lawrence
-
Help with on-page keyword optimization, site architecture, and how those aspects affect the SERPs.
Hey guys, I've made a post or two before, but my story is that I've been learning SEO for a while now and have only recently (in the last four months) had the opportunity to actually apply what I've been reading about. What I've learned while trying to put these things into practice is that it can be pretty tough sledding, even when it comes to basic elements like keywords and search results.

Anyway, to the good stuff. I've been helping my brother's startup company in my spare time because I want them to do well. They're on the last legs of their Series A funding and have no money to put towards SEO, content marketing or social, so I'm helping when and where I can for free. The company is Maluuba, a Siri-like personal assistant app for Android with a ton of different domains. They launched at TechCrunch Disrupt and actually have a lot of traction and a fair amount of publicity, so I'm not exactly working with scraps, but I don't work with them in their offices and only really communicate with my brother, who is having a really hard time getting buy-in for some of the stuff I want them to do.

Their initial website was pretty terrible, so my brother got the okay to redesign the site, and together we worked with a designer to implement the site I linked to. Because they have so many domains (search, social, organization), I thought creating specific pages along with one homepage would be a good way to optimize for different things and funnel a wider audience toward the one macro goal of the site: getting people to download the app.

The results haven't been exactly what I expected, and I fear I didn't correctly implement what I still think is a good plan. I've only tried to optimize the pages for a few keywords to start. The main keyword for the homepage, and indeed the brand, is 'personal assistant app', a fairly competitive keyword for which they now rank second on Google CA. I used 'siri-alternative' as a secondary keyword, since that's how they label themselves in the Play Store. For the three other main pages (search, social, organization) I used 'personal assistant app' as a secondary keyword and tried to optimize each page for 'search app', 'social app' and 'organizer app', respectively.

While I'm really quite proud that I managed to get a page ranking in the top three for our main keyword, I'm just as disappointed that it's the search page and not the homepage, mainly because I have no idea why it's happening. So, all of that to ask a few questions: Did I make a mistake by trying to add funnels to the site? Or did I just go about optimizing the pages incorrectly? Why does the search page rank really, really well for 'personal assistant app' while the other pages, including the one I intended to rank highest for that term, lag behind? I'd guess that Google is indexing this page alone as the main representative of 'personal assistant app', but that wasn't my intention. I'm also not using any rel=canonical tags, if that matters. Also, this page has been flipping around in the 1-3 range in the SERPs for about a month, but I still haven't noticed any traffic from 'personal assistant app'.

Alright, this is getting way too long. I'd very much appreciate any and all insights as to what I'm doing wrong or what I'm missing. It could be really obvious and thus make this post silly, but I really have read and tried to learn a lot. I just can't see what's going on here because I don't have any experience to compare it to. Thanks in advance for any help. Cheers, JD
Technical SEO | JDMcNamara
-
Is it okay to use anchor text almost exclusively for inbound links?
We are not spammy: each link is earned through a long process of relationship building and targeted guest-post writing. Because of this, we like each link to have anchor text, and they don't all point to the same page or use the same anchor text. Is this still something to be worried about? Do we still need to include plain URLs (www.example.com) for some of those links?
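A natural-looking profile usually mixes branded anchors, plain URLs, and keyword anchors, so one practical check is to tally the categories across your known backlinks. A minimal sketch; the brand string and anchor list below are invented for illustration:

```python
from collections import Counter

BRAND = "blue link"  # invented brand string for this example

def classify(anchor: str) -> str:
    """Bucket an anchor text as plain URL, branded, or keyword anchor."""
    a = anchor.lower()
    if a.startswith(("http://", "https://", "www.")):
        return "plain URL"
    if BRAND in a:
        return "branded"
    return "keyword"

# Invented backlink anchors for illustration.
anchors = [
    "erp software", "www.example.com", "Blue Link ERP",
    "inventory management software", "erp software",
]

mix = Counter(classify(a) for a in anchors)
total = sum(mix.values())
for category, count in mix.items():
    print(f"{category}: {count / total:.0%}")
```

If the "keyword" bucket dominates and the other two are nearly empty, that is the pattern worth diluting with some branded and plain-URL links.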
Technical SEO | BlueLinkERP
-
Keyword-based domains redirecting to a site... is it SPAM?
Keyword-based domains redirecting to a site are considered spam, aren't they? And if yes, is it spam in all cases, whether those domains are related or unrelated to the main site?
Technical SEO | | Personnel_Concept0 -
Location-Based Content / Googlebot
Our website has local content specialized to specific cities and states. The URL structure of this content is as follows: www.root.com/seattle, www.root.com/washington. When a user comes to a page, we auto-detect their IP and send them directly to the relevant location-based page, much the way Yelp does. Unfortunately, what appears to be occurring is that Google comes in to our site from one of its data centers, such as San Jose, and is being routed to the San Jose page. When a user does a search for relevant keywords, in the SERPs they are being sent to the location pages matching wherever the bots came in from. If we turn off the auto geo-detection, we think Google might crawl our site better, but users would then be shown less relevant content on landing. What's the win/win situation here? Also, we have some odd location/destination pages ranking high in the SERPs: locations that don't appear to correspond to one of Google's data centers. No idea why this might be happening. Suggestions?
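One common pattern (used with care, since treating crawlers very differently from users can look like cloaking) is to skip the automatic geo-redirect for known bot user agents so they crawl the neutral default page, while all location pages remain normally linked. A minimal sketch; the bot signature list and city lookup are simplified placeholders, and real bot detection should also verify crawlers by reverse DNS:

```python
# Simplified placeholder list; verify real crawlers via reverse DNS, not UA alone.
BOT_SIGNATURES = ("googlebot", "bingbot", "rogerbot")

def geo_target(user_agent: str, ip_city: str) -> str:
    """Return the path to serve: bots get the neutral root page,
    human visitors are redirected to their detected city page."""
    ua = user_agent.lower()
    if any(sig in ua for sig in BOT_SIGNATURES):
        return "/"                      # no redirect: let crawlers see the default page
    return f"/{ip_city.lower()}"        # e.g. /seattle for a Seattle visitor

print(geo_target("Mozilla/5.0 (compatible; Googlebot/2.1)", "San Jose"))  # /
print(geo_target("Mozilla/5.0 (Windows NT 10.0)", "Seattle"))             # /seattle
```

A gentler alternative is to drop the automatic redirect entirely and instead show all visitors (bots included) the default page with a dismissible "Looks like you're in Seattle" banner linking to the local page.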
Technical SEO | Allstar
-
Website isn't Ranking for Any Keyword
Hi, I launched a playhouses website in April this year and have been steadily link building to it over the past few months. I have gotten all of the internal optimisation correct (that I can see), however it is still not ranking for any keyword, and surprisingly all of our traffic is coming either direct or through Bing. The website is showing as being in Google's index, yet it is still not ranking for even the smallest of niche keywords. The only penalty risk I can see is some spammy blog links that my colleague acquired, which I have been trying to counteract with high-quality guest blogging. Any input is welcome. The URL is http://www.playhouses.co.uk/ Simon
Technical SEO | GardenGamer
-
Google Webmaster Tools - Keyword Variants & Misspellings
We have millions of URLs and the technical expertise to write code to fix the spelling of keyword variants Google has discovered and shows us in Webmaster Tools. Since Google has already recognized these as variants, is it worth our time to write code that fixes the spelling of obvious misses?
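As a rough illustration of the kind of cleanup code in question, the standard library's difflib can map misspelled variants back to a canonical keyword list. A minimal sketch with invented keywords and variants:

```python
import difflib

# Invented data: canonical keywords vs. variants surfaced by a webmaster tool.
canonical = ["playhouse", "garden shed", "liposuction"]
variants = ["playhous", "gardan shed", "lyposuction", "trampoline"]

corrections = {}
for variant in variants:
    # cutoff=0.8 keeps only close misspellings; unrelated terms are left alone.
    match = difflib.get_close_matches(variant, canonical, n=1, cutoff=0.8)
    if match:
        corrections[variant] = match[0]

print(corrections)
```

Whether running this at scale is worth it is a separate question: if Google already treats the variants as equivalent, the main benefit of fixing the spellings is cleaner on-page copy rather than a ranking change.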
Technical SEO | snoopcat