Mask links with JS that point to noindexed pages
-
Hi,
In an effort to prepare our site for the Panda update, we dramatically reduced the number of pages that can be indexed (from 100k down to 4k). All of the remaining pages are being equipped with unique, valuable content.
We still have the other pages around, since they represent searches with filter combinations that we deem less interesting to the majority of users (hence they are not indexed). So I am wondering whether we should mask links to these non-indexed pages with JS, so that link juice doesn't get lost to them. Currently the targeted pages carry a "noindex, follow" meta tag; we might block them with robots.txt instead if the "site:" query doesn't show improvement.
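For reference, the two mechanisms mentioned here behave differently: the meta tag lets crawlers fetch the page and follow its links while keeping it out of the index, whereas robots.txt blocks crawling entirely (so the page can't pass link equity, and the noindex tag on it would never even be seen). A minimal sketch of each, with `/filters/` standing in for whatever URL pattern the filtered searches actually use:

```html
<!-- On each filtered-search page: keep it out of the index,
     but let Googlebot follow its links (the current setup) -->
<meta name="robots" content="noindex, follow">
```

versus, in robots.txt:

```
User-agent: *
Disallow: /filters/
```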
Thanks,
Sebastian
-
Well, we just want to show fewer links to Google than to the user (the links Google sees would still be a subset of the links shown to users). The links we'd turn into JS links are those to rarely applied search filters, which we don't index in order not to spam the search index.
Fortunately, if Google is smart enough to decipher the links, it wouldn't do any harm.
Thanks for your ideas though! Especially the site: thing I had considered myself; it really takes ages until something is de-indexed (for us, using robots.txt sped it up by an order of magnitude).
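As a sketch of the kind of JS link masking being described (the `data-href` attribute name and the base64 encoding are my own illustration, not a standard technique), the target URL is kept out of any `<a href>` and only resolved on click:

```javascript
// Hypothetical link "masking": the target URL is stored base64-encoded
// in a data attribute instead of an <a href>, so a crawler that does not
// execute JavaScript sees no link. Note the caveat raised in this thread:
// Google *does* execute JS, so this may not hide anything at all.
function decodeMaskedHref(encoded) {
  // atob() exists in browsers (and modern Node); fall back to Buffer otherwise
  return typeof atob === 'function'
    ? atob(encoded)
    : Buffer.from(encoded, 'base64').toString('utf8');
}

// In the page, spans such as
//   <span class="masked-link" data-href="L2ZpbHRlcj9jb2xvcj1yZWQ=">red</span>
// are wired up on page load:
function activateMaskedLinks(doc) {
  doc.querySelectorAll('.masked-link').forEach(function (el) {
    el.addEventListener('click', function () {
      window.location.href = decodeMaskedHref(el.dataset.href);
    });
  });
}
```

The encoded example above decodes to a hypothetical filter URL such as `/filter?color=red`.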
-
Not to mention Google's ability to decipher JS to one degree or another, and they're working on improving that all the time. I've seen content they found that was supposed to be hidden in JS.
-
First, be aware that the "site:" query won't show improvements for a long time. I had a 15-page website I built for someone get indexed on the dev server by accident. I 301'd every page to the new site's real URLs. If I site-search the dev URLs, they are still there, despite the fact that they have 301'd for nearly two months. One I did six months ago was only recently removed from the site search.
If you link to your own pages that are not indexed for whatever reason, you could try to mask them in JavaScript, but be aware of the fine line you walk. Google does not like anything that misleads them or users. Hiding a link from Google that is visible to users is not a good idea, in my opinion. If content isn't worth indexing, it shouldn't be worth linking to anyway.
Related Questions
-
Does rel=canonical combine link juice for 2 pages?
If two pages are very similar and one rel=canonicals to the other, will page authority pass from the page carrying the rel=canonical to the target page? Also, what happens when a page rel=canonicals to itself?
Technical SEO | SkinLaboratory
-
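For illustration of the markup involved (the example.com URLs are placeholders), the annotation lives in the `<head>` of the near-duplicate page, and a self-referencing canonical on the target page is harmless:

```html
<!-- In the <head> of the near-duplicate page, pointing at the preferred one -->
<link rel="canonical" href="https://www.example.com/preferred-page">

<!-- On the preferred page itself, a self-referencing canonical is fine;
     it simply confirms the page as its own preferred version -->
<link rel="canonical" href="https://www.example.com/preferred-page">
```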
What is Too Many On-Page Links?
In my campaigns I see a "Too Many On-Page Links" warning. What is this? Can anyone please explain?
Technical SEO | constructionhelpline
-
Why doesn't SEOmoz see internal/external links on my site?
My SEOmoz analysis says that my site contains neither external nor internal links. I have used other tools, and they all see the internal and external links on the pages. There aren't many, but they are there. Why isn't SEOmoz seeing them?
Technical SEO | iain
-
Similar pages: noindex or rel=canonical or disregard parameters?!
Hey all! We have a hotel booking website with search results pages per destination (e.g. hotels in NYC is dayguest.com/nyc). Pages are also generated for destinations depending on various parameters, such as star rating, amenities, style of the properties, etc. (e.g. dayguest.com/nyc/4stars, dayguest.com/nyc/luggagestorage, dayguest.com/nyc/luxury, etc.). In general, all of these pages are very similar; for example, there might be 10 hotels in NYC, and all of them will offer luggage storage. Pages can be nearly identical. Hence the problems of duplicate content and link juice lost to dilution. I was wondering what the best practice is in such a situation: should I just noindex all pages except the most important ones (e.g. dayguest.com/nyc)? Or set that as the canonical page for all variations? Or ask Google in Webmaster Tools to disregard the URL parameters? Or do something else altogether?! Thanks for the help!
Technical SEO | Philoups
-
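One common pattern for this situation (using the URLs from the question; whether the site serves them over https/www is an assumption) is to have each filtered variation declare the main destination page as canonical, e.g. in the `<head>` of dayguest.com/nyc/luxury:

```html
<link rel="canonical" href="https://dayguest.com/nyc">
```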
Search result pages - noindex but auto follow?
Hi guys, I don't index my search pages; currently they are tagged <meta name="robots" content="noindex">. Do I need to specify follow, or will links be followed automatically? Thanks Cyto
Technical SEO | Bio-RadAbs
-
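For what it's worth, `follow` is the default behavior, so the two tags below are generally treated as equivalent; the explicit form just documents the intent:

```html
<meta name="robots" content="noindex">
<!-- equivalent to -->
<meta name="robots" content="noindex, follow">
```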
Inbound links with anchor text about pills pointing to our URLs
Hi, I have just noticed that one of our websites has lots of inbound links with anchor text like "order cialis", "cialis professional online", "order viagra soft" and so on. I have checked the target URLs' source for any kind of suspicious code and found nothing. What should I look for, and what should I do in this case? Thanks
Technical SEO | ceci2710
-
Link to overall brand pages
On our website there are two ways to reach a brand environment: general brand pages and brand pages divided by category. At the moment the category brand pages get the most SEO value, because we link to them from our homepage (via the mega dropdown). The problem is that we would like to assign the SEO value to the general brand pages (with all the articles) instead of the category brand pages (with only the articles within one category). For now we prefer to optimize the general brand page even without a homepage link to it. For example, these two pages currently have the most SEO value:

www.debijenkorf.nl/herenmode/diesel
www.debijenkorf.nl/damesmode/diesel

but we would like to assign value to:

www.debijenkorf.nl/diesel

Do you have a solution for this problem? Thank you in advance! Kind regards,
Technical SEO | eCommerceSEO
-
How Best to Handle 'Site Jacking' (Unauthorized Use of Someone else's Dedicated IP Address)
Anyone can point their domain at any IP address they want. I've found at least two domains (same owner), totally unrelated to each other and to us, that are currently pointing at our IP address. The IP address is on our dedicated server (we control the entire physical machine) and is exclusive to that one domain, so it isn't a virtual-hosting misconfiguration. This has caused Google to index their two domains with duplicate content from our site (found by searching for site:www.theirdomain.com). Their sites don't come up in the first 50 results for any of the keywords we rank for, so Google evidently knows they are the duplicate content, not us (our site has been around for 12 years, much longer than theirs). Their registration is private, and we have not been able to contact these people. I'm not sure whether this is just a DNS mistake on the two domains or someone doing it intentionally to try to harm our ranking. It has been going on for a while, so it is most likely not a mistake; for two live sites, they would have noticed long ago that they were pointing at the wrong IP. I can think of a variety of actions to take, but I can find no information anywhere on what Google officially recommends in this situation, assuming you can't get a response. Here are my ideas:

a) Approach it as a copyright (DMCA) violation and go through the lengthy process of having their sites taken down. Pro: eliminates the issue. Con: it's a pain, and we could be leaving some link juice on the table.

b) Modify .htaccess to 301-redirect any request for a URL not using our domain to our domain. This means Google would see several domains pointing at the same IP, all except ours 301-redirecting to ours. Not sure whether that would harm (or help) us. Would we then receive link juice from any site linking to those other domains? Con: Google will see the context of those backlinks, and their link text will be unrelated to our site. Also, if any of these domains have backlinks from bad neighborhoods, I assume that could hurt us.

c) Modify .htaccess to return a 404 Not Found or 403 Forbidden error.

I posted in other forums and have gotten suggestions that are all over the map. In many cases the posters don't even understand what I'm talking about, thinking these are just normal backlinks. Argh! So I'm taking this to "The Experts" on SEOMoz.
Technical SEO | jcrist1
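Options (b) and (c) can be sketched in .htaccess with mod_rewrite roughly as follows; `ourdomain.com` is a placeholder for the real hostname:

```apache
RewriteEngine On

# Option (b): any request whose Host header is not our own domain
# is permanently redirected to the canonical host
RewriteCond %{HTTP_HOST} !^(www\.)?ourdomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.ourdomain.com/$1 [R=301,L]

# Option (c): alternatively, refuse to serve the hijacking domains
# at all by returning 403 Forbidden (comment out option (b) first)
# RewriteCond %{HTTP_HOST} !^(www\.)?ourdomain\.com$ [NC]
# RewriteRule ^ - [F,L]
```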