Pros vs Cons - Navigation/content embedded within JavaScript
-
My programmer showed me this demo website where all the navigation and content is embedded within JavaScript: http://sailsjs.org/#!
A Google site: search returned 51 results, with pretty much unique title tags and meta descriptions on each page.
A Bing site: search returned 24 results with pretty much identical title tags and meta descriptions.
Matt Cutts has said it's fine, but to test first: http://www.youtube.com/watch?v=Mibrj2bOFCU
Has anyone seen any reason to avoid this web convention?
My gut is to avoid this approach; the main drawback I see is that websites like this won't do well on search engines other than Google, whose algorithms are less sophisticated.
Thoughts?
-
Your gut instinct is correct. Yes, it's entirely possible to crawl JavaScript. Is it still difficult to crawl? As far as I know, yes. So why put up that barrier if you don't have to?
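In the spirit of Matt Cutts' "test first" advice, one quick check is to look at the raw HTML the server actually sends, before any JavaScript runs, and see whether the navigation links are in it at all. A minimal sketch in Python; the page markup below is hypothetical:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags in raw, pre-JavaScript HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawlable_links(raw_html):
    """Return the links a non-JS-executing crawler would see in the served HTML."""
    parser = LinkCollector()
    parser.feed(raw_html)
    return parser.links

# A page whose navigation is injected by JavaScript exposes no links in its raw HTML:
js_only = "<html><body><div id='app'></div><script src='app.js'></script></body></html>"
plain = "<html><body><a href='/docs'>Docs</a><a href='/blog'>Blog</a></body></html>"
print(crawlable_links(js_only))  # -> []
print(crawlable_links(plain))    # -> ['/docs', '/blog']
```

If the raw HTML comes back empty of links, as in the first example, any crawler that doesn't execute JavaScript has nothing to follow.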
Related Questions
-
Duplicate / "<title> element too long" issues
I have a "duplicate <title>"/"<title> element too long" issue with thousands of pages. In the future I would like to automate these in a way that keeps them from being duplicated AND too long. The solution I came up with was to standardize these monthly posts with a similar, shorter <title>, but then differentiate by adding the month and the year of the post at the end of each <title>. Hundreds of these come out every week, so it is hard to sit there and come up with a unique <title> every time. With this solution the <title> tags would undoubtedly be short enough; however, my primary concern is, would simply adding the month and year at the end of each <title> be enough for Google/Moz to decide it is not a duplicate? How much variation is enough for it not to be deemed a duplicate <title>?
Intermediate & Advanced SEO | Brian_Dowd
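The month-and-year scheme described above is easy to automate. A minimal sketch; the base title and 60-character limit are illustrative assumptions, not recommendations, and whether a date suffix alone is "enough variation" for Google is the open question:

```python
def build_title(base, month, year, max_len=60):
    """Build a page title from a standardized base plus the post's month and
    year, truncating the base (never the date) so the result fits max_len."""
    suffix = f" - {month} {year}"
    room = max_len - len(suffix)
    if len(base) > room:
        # Trim the base and mark the truncation with an ellipsis
        base = base[:room - 1].rstrip() + "…"
    return base + suffix

print(build_title("Monthly Sales Report", "March", 2014))
# -> Monthly Sales Report - March 2014
```

Because the date goes at the end and only the base is trimmed, every month's title stays unique and under the length limit.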
Block lightbox content
I'm working on a new website that aggregates content. I'll show my users content from another website in lightbox windows when they click on the titles of the items. I don't have specific URLs for these items. What is the best way to tell search engines "Don't index these pages"?
Intermediate & Advanced SEO | JohnPalmer
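If the lightbox content is pulled from a separate endpoint on your server (e.g. an AJAX URL), one common approach is to serve that endpoint with a noindex header so it never enters the index, even without a page-level meta tag. A minimal sketch, assuming an Apache server with mod_headers enabled and a hypothetical /lightbox/ path:

```apache
# Hypothetical: mark everything served under /lightbox/ as noindex
# (requires mod_headers; goes in the main server config)
<LocationMatch "^/lightbox/">
    Header set X-Robots-Tag "noindex, nofollow"
</LocationMatch>
```

If the content is instead injected into the existing page with no fetchable URL of its own, there is nothing separate for search engines to index in the first place.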
Client wants a separate .tv domain for their media/videos instead of a subdomain/subfolder. What is the best way to pass link equity to a new domain?
We have a client that wants to place their video content on a .tv TLD instead of in a subfolder/subdomain of their .com website. They believe that the .tv domain will better represent the media experience of their business. We understand this client's position, but we are concerned that the .tv domain will lose out on the link equity it would keep if the content remained in the .com's subdomain/subfolder. Here are our questions:
1. What would be the best way to pass link equity from the .com website to a new .tv domain? Should we just have a video link on the .com website that 301 redirects to the new .tv domain?
2. Is there any SEO benefit to having a .tv domain for Google Video queries or even YouTube?
3. Is there any long-term value in having two different websites? For link-equity purposes we understand that it would be better if everything stayed on the .com; however, is a .tv domain ideal for a better representation of their media content?
We appreciate any feedback.
Intermediate & Advanced SEO | RosemaryB
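If the client does move to .tv, the standard mechanism for passing link equity is a permanent (301) redirect from the old .com URLs to their new home. A minimal sketch, assuming an Apache server and the hypothetical domains example.com / example.tv:

```apache
# Hypothetical: permanently (301) redirect the .com site's /videos section
# to the matching paths on the new .tv domain (requires mod_alias)
Redirect permanent /videos https://www.example.tv/videos
```

Redirecting path-for-path, as above, generally preserves more equity than pointing every old URL at the .tv homepage.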
2 URLs pointing to the same content
Hi, we currently have two URLs pointing to the same website (long story why we have it) - A and B. A is our main website, but we set up B as a rewrite URL to use for our pay-per-click campaign. Because it's the same site and B is just a URL rewrite, Google Webmaster Tools is seeing thousands of links coming in from site B to site A. I want to tell Google to ignore the site B URL, but I'm worried it might affect site A. I can't add a nofollow link on site B, as it's the same content, so it would also apply to site A. I'm also worried about using Google Disavow, as it might impact site A! Can anyone make any suggestions on what to do? I would like to hear from anyone with experience of this or who can recommend a safe option. Thanks for your time!
Intermediate & Advanced SEO | Party_Experts
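One low-risk option when two hostnames serve identical pages is a canonical link element on every page that always points at the main site's copy, so Google consolidates signals onto site A instead of counting cross-site links. A minimal sketch with hypothetical hostnames:

```html
<!-- In the <head> of every page, served on both hostnames;
     the href always uses site A's (main) hostname -->
<link rel="canonical" href="https://www.site-a.com/current-page/">
```

A server-level 301 redirect from site B to site A achieves the same consolidation, if the pay-per-click campaign can tolerate the redirect hop.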
Duplicate Page Content / Titles Help
Hi guys, my SEOmoz crawl diagnostics throw up thousands of duplicate page content / duplicate title errors, mostly from the forum attached to my website. In particular, it's the forum users' profile pages that are causing the issue; below is a sample of the URLs being penalised: http://www.mywebsite.com/subfolder/myforum/pop_profile.asp?mode=display&id=1308 I thought that adding http://www.mywebsite.com/subfolder/myforum/pop_profile.asp to my robots.txt file under 'Ignore' would cause the bots to overlook the thousands of profile pages, but the latest SEOmoz crawl still picks them up. My question is: how can I get the bots to ignore these profile pages (they don't contain any useful content), and how much will this be affecting my rankings (bearing in mind I have thousands of errors for duplicate content and duplicate page titles)? Thanks, Gareth
Intermediate & Advanced SEO | gaz3342
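For reference, robots.txt has no 'Ignore' directive; crawlers are excluded with Disallow rules. A minimal sketch using the profile path from the question (a Disallow rule matches any URL beginning with that path, so the query-string variants are covered):

```text
# robots.txt - keep bots out of forum profile pages
User-agent: *
Disallow: /subfolder/myforum/pop_profile.asp
```

Note that robots.txt only blocks crawling: URLs that are already indexed can still appear in results, and a meta robots noindex tag on the profile pages (which requires leaving them crawlable) is the surer way to get them removed.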
Which duplicate content should I remove?
I have duplicate content and am trying to figure out which URL to remove. What should I take into consideration? Authority? How close the page is to the root? How clear the path is? I would appreciate your help! Thanks!
Intermediate & Advanced SEO | Ocularis
Duplicate Content on Press Releases?
Hi, we recently held a charity night in-store and had a few local celebs turn up. We created a press release to send out to various media outlets; within the press release were hyperlinks to our site and links on certain keywords to specific brands on our site. My question is: should we be sending a different press release to each outlet to avoid duplicate content issues, or is sending the same release out to everyone OK? We will be sending approximately 20 of these out, some going online and some not. So far it has appeared on one local paper's website, a massive football website, and a local magazine site - all with pretty much the same content and a few pics. Any help, hints, or tips on how to go about this if I am going to be sending it out to a load of other sites/blogs? Cheers
Intermediate & Advanced SEO | YNWA
How are they avoiding duplicate content?
One of the largest soccer stores in the USA runs a number of whitelabel sites for major partners such as Fox and ESPN. However, the effect of this is that they are creating duplicate content for their products (and even the overall site structure is very similar). Take a look at:
http://www.worldsoccershop.com/23147.html
http://www.foxsoccershop.com/23147.html
http://www.soccernetstore.com/23147.html
You can see that practically everything is the same, including the product URL, product title, and product description. My question is: why is Google not classing this as duplicate content? Have they coded for it in a certain way, or is there something I'm missing that is helping them achieve rankings for all three sites?
Intermediate & Advanced SEO | ukss1984