JavaScript hidden divs, links to anchor content
-
Hello,
I am working on a web project that breaks up its sections by using hidden divs, shown via JavaScript triggered by anchor links.
- First question: Is this SEO suicide? I have confirmed that the content is being indexed by searching for specific text, but I have been led to believe that hidden div content will be afforded a lower 'importance'. One suggestion was to have the text as display:block and then hide it on page load. Will this make a difference?
- Second: Is there any way to have Google index the anchored content by the specific anchor text?
An example for the second question: if you search Google right now for:
buyers like to look at floorplans Tom & Jan
You will get a link to:
but I would rather it be:
http://www.janandtom.com/#Interactive Floorplans
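For concreteness, the "visible by default, hide on load" pattern being discussed might look roughly like the sketch below (section names and class names are hypothetical, not taken from the actual site). The HTML ships every section as plain display:block content, so crawlers and no-JS visitors see everything; the script only hides sections once JavaScript actually runs.

```javascript
// Hypothetical sketch: resolve which section the URL hash points at.
// decodeURIComponent handles hashes like "#Interactive%20Floorplans".
function sectionIdFromHash(hash, fallbackId) {
  if (!hash || hash === '#') return fallbackId; // no hash -> default section
  return decodeURIComponent(hash.slice(1));     // "#Foo%20Bar" -> "Foo Bar"
}

// Browser wiring (commented out so the helper above stays testable):
// document.addEventListener('DOMContentLoaded', () => {
//   const sections = document.querySelectorAll('.section');
//   const active = sectionIdFromHash(location.hash, sections[0].id);
//   sections.forEach(s => {
//     s.style.display = (s.id === active) ? 'block' : 'none';
//   });
// });

console.log(sectionIdFromHash('#Interactive%20Floorplans', 'Home'));
// -> "Interactive Floorplans"
```

Because the hiding only happens client-side after load, a crawler that does not execute the script sees the same full text a user would.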
Sorry if this is redundant or has been addressed before. I tried searching the questions but wasn't finding any definitive direction, and this project is a little unique for me. Also, I'm just getting my feet wet with this 'high-end' SEO (new member of SEOmoz), so please bear with me. Any help would be greatly appreciated.
Thanks!
-
The guideline you want to watch out for is cloaking: showing one thing to the user and another to the search engine.
If you are hiding the text and not showing it to the user, then you have a problem. If the user has some way to click and then see the hidden text, you should be OK.
How does Google know the difference? I don't know if they can algorithmically, but if you rank well, a competitor will look into your site and report you if he can.
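The click-to-reveal behavior described above can be modeled as a sketch (element ids here are hypothetical): the text exists in the HTML for users and crawlers alike, and a click merely toggles its visibility. The pure helper keeps the toggle logic testable outside a browser.

```javascript
// Hedged sketch of click-to-reveal: returns the new list of visible
// section ids after the user clicks the link for `id`.
function toggleSection(visibleIds, id) {
  return visibleIds.includes(id)
    ? visibleIds.filter(v => v !== id) // already open -> close it
    : visibleIds.concat(id);           // closed -> open it
}

// In the page, the same idea is roughly:
// link.addEventListener('click', () => {
//   const div = document.getElementById(targetId);
//   div.style.display = (div.style.display === 'none') ? 'block' : 'none';
// });
```

Nothing here serves different content to different visitors; the same markup reaches everyone, which is what keeps the pattern on the right side of the cloaking guideline.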
Related Questions
-
Can we ignore "broken links" without redirecting to "new pages"?
Let's say we have replaced www.website.com/page1 with www.website.com/page2. Do we need to redirect page1 to page2 even if page1 doesn't have any backlinks? If it's not a replacement, can we ignore a "lost page"? Many websites lose hundreds of pages periodically. What's Google's stand on this? If a website has replaced or lost hundreds of links without reclaiming old links by redirection, will that hurt?
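The 301 in question can be sketched minimally in Node (the paths are the hypothetical ones from the question, and the lookup table is an assumption for illustration): a permanent redirect tells Google that page2 is the replacement, rather than leaving page1 to return a 404.

```javascript
// Hypothetical old-path -> new-path table for replaced pages.
const redirects = { '/page1': '/page2' };

// Returns the new location for a replaced page, or null for a plain 404.
function resolveRedirect(path) {
  return redirects[path] || null;
}

// Server wiring, commented out so the helper above stays testable:
// require('http').createServer((req, res) => {
//   const target = resolveRedirect(req.url);
//   if (target) { res.writeHead(301, { Location: target }); res.end(); }
//   else { res.writeHead(404); res.end(); }
// }).listen(8080);
```

If page1 truly has no backlinks and no traffic, letting it 404 is also a valid signal; the redirect is mainly worth it when there is equity or user intent to preserve.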
Algorithm Updates | vtmoz
-
404s in Google Search Console and javascript
At the end of April, we made the switch from http to https, and I was prepared for a surge in crawl errors while Google sorted out our site. However, I wasn't prepared for the surge in impossibly incorrect URLs and partial URLs that I've seen since then. I have learned that as Googlebot grows up, he/she is now attempting to read more JavaScript and will occasionally try to parse out and "read" a URL in a string of JavaScript code where no URL is actually present. So, I've "marked as fixed" hundreds of bits like /TRo39, category/cig, etc., etc.
But they are also returning hundreds of otherwise correct URLs with a .html extension when our CMS system generates URLs with a .uts extension, like this: https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.html
when it should be:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.uts
Worst of all, when I look at them in GSC and check the "linked from" tab, it shows they are linked from themselves, so I can't backtrack and find a common source of the error. Is anyone else experiencing this? Got any suggestions on how to stop it from happening in the future? Last month it was 50 URLs, this month 150, so I can't keep creating redirects and hoping it goes away. Thanks for any and all suggestions!
Algorithm Updates | LizMicik
-
Does adding lots of new content on a site at one time actually hurt you?
When speaking with a client today, he made the comment that he didn't want all of the new content we'd been working on to be added to the site all at once, for fear that he would get penalized for flooding the site with new content. I don't have any strong data to confirm or refute the claim; is there any truth to it?
Algorithm Updates | JordanRussell
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has had a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow submission to Google have been done; however, after months and months of waiting, nothing has happened. If anything, after the recent Penguin update, results have been further affected.
A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, however with little success. I have read up on this, and not many people appear to agree on whether this will work.
Therefore, my new decision is to start afresh using a new domain, switching from the .com to the .co.uk version, helping remove all legacy and all association with the spam-ridden .com. However, my main concern with this is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link-related and NOT content-related, as I imagine people may first query.
This could then cause duplicate content, given that this content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as a noindex, nofollow, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Then, once it has been deindexed, the new .co.uk site will go live with the exact same content.
So my question is whether Google will then completely forget that this content has ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience in the removal of a 301 redirect, detaching legacy, and its success would also be very helpful!
Thank you, Denver
Algorithm Updates | ProdoDigital
-
How does Google treat anchor tags on badges after penguin update?
We have a website builder that creates sites in sub-domains (i.e. yoursite.breezi.com). On every site, we have included a badge that has anchor text and an image. My question is: given that we will include this on many, if not most, of the sites created inside our builder, how will Google treat backlinks with the same anchor text from non-relevant sites after the Penguin update? I am concerned about the backlinks from non-theme-related sites and their SEO implications. Any help is greatly appreciated.
Algorithm Updates | breezi
-
Need some Real Insight into our SEO Issue and Content Generation
We have our site, www.practo.com, and our blog at blog.practo.com. We plan to move our main site to www.ray.practo.com in a month's time.
The issue: I will then need to direct all my existing traffic from www.practo.com to www.ray.practo.com. Keeping SEO in mind, and since I will be generating new content via our WordPress instance, what is the best way to do this so that Google does not have difficulty finding our content?
1. Would it be good if I put the WordPress instance at ray.practo.com/blog (the WordPress instance comes in here in the directory)/article-url?
2. Would it be better with www.practo.com/ray/blog/article-url?
I am using WordPress to roll out all our new SEO-based content on the various keywords and topics for which we want traffic, primarily because we needed a content-generation CMS platform so that we don't have to deal with HTML pages and publish every content page via a developer.
Is what I am planning above correct, keeping SEO in mind? Any suggestions are welcome. I seriously need to know: is writing SEO-based content on a WordPress instance and having it in the URLs a good idea? Or is only HTML a good idea? We need some CMS so that content writers can write content independently. Please guide accordingly. Thanks
Algorithm Updates | shanky1
-
Why is a link considered active, but is no longer on the page?
How come links sometimes show up in OSE or Yahoo Site Explorer, and then when you go to the page, they're not there anymore? Why is a link indexed or considered active when it is no longer on the page?
Algorithm Updates | MichaelWeisbaum
-
High bounce rates from content articles influencing our rankings for rest of site
We have a large content article section on our e-commerce site that receives a lot of visits but also has very high bounce rates. We are wondering if this is hurting the rest of our site's rankings. **When I say bounce rates, I mean whatever metrics Google is using to determine quality content (specifically after the Panda update).** We are trying to determine if having the content articles on our domain hurts us. We only have the content articles for link building.
Algorithm Updates | seozachz