Is use of JavaScript to simplify information architecture considered cloaking?
-
We are considering using JavaScript to format URLs so as to simplify Googlebot's navigation through our site, while presenting a larger number of links to the user to ensure content is accessible and easy to navigate from all parts of the site. In other words, the user will see all internal links, but the search engine will see only those links that form our information hierarchy.
We would therefore be showing the search engine different content from the user only in so far as the search engine would see a more hierarchical information architecture, by virtue of the fact that fewer links would be visible to it, ensuring our content is well structured and discoverable.
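To make the idea concrete, here is a rough sketch of the sort of markup we have in mind (the URLs, element IDs, and link list below are purely illustrative): the hierarchy links would stay as plain anchors in the HTML, while the extra user-facing links would only be written in by a script after the page loads, so a crawler that does not execute the script would not follow them.

  <!-- A hierarchy link: a plain anchor that any crawler can follow -->
  <a href="/category/sub-category/product-123">Product 123</a>

  <!-- User-only convenience links: injected client-side after load -->
  <div id="extra-links"></div>
  <script>
    var userOnlyLinks = [
      { href: '/offers/spring-sale', text: 'Spring sale' },
      { href: '/brands/acme', text: 'Acme widgets' }
    ];
    var container = document.getElementById('extra-links');
    userOnlyLinks.forEach(function (link) {
      var a = document.createElement('a');
      a.href = link.href;
      a.textContent = link.text;
      container.appendChild(a);
    });
  </script>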
Would this be considered cloaking by Google, and would we be penalised?
-
Pagination is just links. Google can follow the links.
How you set up and offer your pages is important, especially for areas with a lot of pages.
If you have 40 pages of content, I would recommend a structure that offers page links something like "1, 2, 3, ... 20 ... 40". If you don't offer a middle selection, that content will probably never be seen.
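To sketch what that might look like in the markup (the URLs are only placeholders), each listing page would carry plain anchors to the start of the series, a middle jump point, and the last page, so every page stays within a few clicks:

  <!-- Crawlable pagination: first pages, a middle jump, and the last page -->
  <a href="/products?page=1">1</a>
  <a href="/products?page=2">2</a>
  <a href="/products?page=3">3</a>
  ...
  <a href="/products?page=20">20</a>
  ...
  <a href="/products?page=40">40</a>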
-
Does Googlebot follow pagination of search results? All our product pages are on the third tier, but their discovery would rely on Google following pagination if we cannot use our original approach to information architecture (i.e. using JavaScript to channel Googlebot to discover our tier 3 pages).
Thanks for your help!
-
Search engines will determine how deep to crawl a site based on its importance. You can use the Domain Authority and Page Authority metrics to gauge this.
In general, you want your content to be a maximum of 3 clicks from your landing page. If you have buried your content deeper, consider either flattening out your architecture or adding links to the buried content. It is also very helpful to build external links to the deeper content, which will help search engines discover those pages.
-
Ryan is right... you shouldn't do this. If you want to help the crawlers find their way through your site, you could submit a sitemap instead.
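A bare-bones XML sitemap (the URLs below are just placeholders) simply lists the pages you want discovered, including your deeper tier 3 product pages, and you then submit it through Webmaster Tools; something like:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/category/sub-category/product-123</loc>
    </url>
    <url>
      <loc>http://www.example.com/category/sub-category/product-124</loc>
    </url>
  </urlset>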
-
Hi Ryan
We use a navigation bar in the header, which means there are a large number of on-page links and no clear way to determine our information architecture from our internal link structure, i.e. many pages at different levels of the hierarchy can be reached from every page on the site.
Is this an issue? Or will the URL structure be sufficient for the search engines to categorise our content? How can we help the search engine discover content at level 3 in our hierarchy if we insist on using a navigation bar in the header which we believe gives a good user experience?
Thanks!!
-
I have to agree with Ryan. Yes it's cloaking. ... And if you get caught, you could and most likely would be penalized.
-
The actions you are describing define cloaking and would be penalized.
If that practice were allowed, it would be severely abused. Sites would remove less desirable links, such as the link to their privacy page. Sites might also add links.
Search engines insist upon seeing the same content that a user would see.
Related Questions
-
How to inform Google to remove 404 Pages of my website?
Hi, I want to remove more than 6,000 pages of my website because of bad keywords. I am going to drop all these pages and make them ‘404’. I want to know how I can inform Google that these pages do not exist, so it doesn't send me traffic from those bad keywords. Also, can I use Google's disavow tool to exclude these 6,000 pages of my own website?
Technical SEO | | renukishor4 -
Dynamically changing a title with javascript
Hi, I asked our IT team to be able to write custom page titles in our CMS, and they came up with a solution that writes the title dynamically with JavaScript. When I look at the page, I see the new title in the browser, but when I look in the source code, I see the original page title. I am thinking that Google won't see the new JavaScript title, so it will not be indexed and will have no impact on SEO. Am I right?
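Roughly, the solution they came up with does something like this (the title text is just an example): it changes what the browser tab shows, but the HTML source that is downloaded still contains the original title element, which is what a crawler that doesn't run the script will read.

  <title>Original CMS page title</title>
  <script>
    // Runs in the browser after the page loads; the served HTML source
    // still ships with "Original CMS page title"
    document.title = 'Custom page title written by JavaScript';
  </script>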
Technical SEO | | jfmonfette0 -
What is the advantage of using sub domains instead of pages on the root domain?
Have a look at this example (http://bannerad.designcrowd.com/). For each category of design, they have a landing page on a subdomain. Wouldn't it be better to have them as part of the same domain? What is the strategy behind using subdomains?
Technical SEO | | designquotes0 -
Using a non-visible H1
I have a developer that wants to use style="text-indent:-9999px" to make the H1 non-visible to the user. Being the conservative person I am, I've never tried this before and worry that Search Engines may think this is a form of cloaking. Am I worrying about nothing? And apologies if it's already been covered here. I couldn't find it. Thanks in advance!!!!
Technical SEO | | elytical0 -
What SEO factors do you consider when selecting a CMS?
I'm assisting in evaluating 5 different CMS solutions for a client. As a part of this evaluation, I want to ensure we fully explore the SEO capabilities and shortcomings of the different platforms. Which factors would you recommend we consider? Currently, I'm thinking: custom URLs; custom page titles, meta data, etc.; automatic sitemap updates; customizable robots/indexing settings; site load times; rel canonical support; code & CSS quality; 301 redirect functionality. What else should be on this list? Is there anything on my list that you would de-prioritize? At risk of making this question too large: any opinions out there on what the most SEO-friendly CMS systems are?
Technical SEO | | amastix0 -
Would duplicate listings affect a client's ranking if they used the same address?
Lots of duplication on directory listings using similar or the same address, just different company names, like so-and-so carpet cleaning and another listing with so-and-so janitorial services. Now my client went from a rank around 3-4 to not even in the top 50 within a week. Would duplication cause this sudden drop? There is not a lot of competition for the client's keyword (janitorial services nh); would a competitor that recently optimized a site cause this sudden drop? The client does need to optimize for this keyword, and they do need to clean up this duplication. (Unfortunately this drop happened at the first of March; I provided the audit and recommendations/implementation and am still awaiting the thumbs up to continue with implementation.) Did Google make a change and possibly find these discrepancies within listings and suddenly drop this client's ranking? And then there's Google Places: the client usually ranks #1 in Google Places with up to 12 excellent reviews, so they are still getting a good spot on the first page. The very odd thing, though, is that Google is still saying they need to re-verify their Google Places listing. I would really like to know, for my own knowledge, how a Google Places account could still need verification and yet still rank so well in Google Places results. Because of the great reviews? Any ideas here, too? _Cindy
Technical SEO | | CeCeBar
Different version of site for "users" who don't accept cookies considered cloaking?
Hi I've got a client with lots of content that is hidden behind a registration form; if you don't fill it out, you cannot proceed to the content. As a result it is not being indexed. No surprises there. They are only doing this because they feel it is the best way of capturing email addresses, rather than because they need to "protect" the content. Currently users arriving on the site will be redirected to the form if they have not had a "this user is registered" cookie set previously. If the cookie is set then they aren't redirected and get to see the content. I am considering changing this logic to only redirecting users to the form if they accept cookies but haven't got the "this user is registered" cookie. The idea being that search engines would then not be redirected and would index the full site, not the dead-end form. From the client's perspective this would mean only very few non-registered visitors would "avoid" the form, yet search engines are arguably not being treated as a special case. So my question is: would this be considered cloaking/put the site at risk in any way? (They would prefer not to go down the First Click Free route as this will lower their email sign-ups.) Thank you!
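A rough sketch of the logic I'm considering (the cookie names and form URL are invented for illustration): set a throwaway cookie to test whether cookies are accepted at all, and only send the visitor to the form when cookies work but the registration cookie is missing, so a client that never accepts cookies, such as a crawler, simply stays on the content.

  // Test whether this client accepts cookies at all
  document.cookie = 'cookie_test=1; path=/';
  var cookiesAccepted = document.cookie.indexOf('cookie_test=1') !== -1;
  var isRegistered = document.cookie.indexOf('user_registered=1') !== -1;

  if (cookiesAccepted && !isRegistered) {
    // Only visitors who accept cookies but haven't registered see the form
    window.location.href = '/register';
  }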
Technical SEO | | TimBarlow0 -
Use of + in URL good or bad?
Hi, I am working on an SEO project for a client. Some of the URLs have a + between the keywords, like www.example.com/make+me+happy/
Is this good or bad for SEO? Or is it maybe better to use -? Thanks!
Technical SEO | | MaartenvandenBos