Angular JS - Page Load
-
Our website is being built in AngularJS, and we are looking at prerendering the pages. However, since there will be fewer server requests, what would the page load look like to search engines?
Also, on the client side (browser), would there be any impact if we prerender the pages?
Cheers!
-
Have you read the Prerender documentation?
https://prerender.io/documentation/install-middleware#apache
There you can find two examples (Apache and nginx):
https://gist.github.com/thoop/8072354
https://gist.github.com/thoop/8165802
How do they work? Simple: bots receive a proxied version from this URL:
http://service.prerender.io/http://example.com/url
This works by switching your server into a specific proxy mode called a reverse proxy. It is similar to an ordinary (forward) proxy, but inverted. A forward proxy sits between a few computers on a network and the internet: the computers are the clients, they send requests, the proxy goes out to the internet, executes them, and returns the results. In the reverse case, the internet is the client, and the proxy serves requests from your internal infrastructure. This lets you hide that infrastructure, scale easily, or even build a complex site out of several internal servers (one handling /blog, another /shop, a third /support, and so on).
But this prerendered version is served only to bots. Normal clients (user agents not on the bot list) receive the AngularJS version of the HTML. Since everything is served from your own domain, you shouldn't hesitate to use it.
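A minimal nginx sketch of that bot-only reverse proxy, loosely based on the gists above (the user-agent list is abbreviated and `example.com` is a placeholder):

```nginx
server {
    listen 80;
    server_name example.com;
    root /var/www/example.com;

    location / {
        set $prerender 0;

        # Flag known crawlers only (abbreviated list) ...
        if ($http_user_agent ~* "googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit") {
            set $prerender 1;
        }
        # ... but never the Prerender fetcher itself, to avoid a loop.
        if ($http_user_agent ~ "Prerender") {
            set $prerender 0;
        }
        # Static assets don't need prerendering.
        if ($uri ~* "\.(js|css|png|jpg|jpeg|gif|ico|svg)$") {
            set $prerender 0;
        }

        if ($prerender = 1) {
            # Reverse-proxy bot requests to the Prerender service.
            rewrite .* /$scheme://$host$request_uri? break;
            proxy_pass http://service.prerender.io;
        }

        # Everyone else gets the normal AngularJS app.
        try_files $uri $uri/ /index.html;
    }
}
```

Normal visitors fall through to `try_files` and get the regular AngularJS `index.html`; only flagged bots hit the proxy.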
Second: do not (!!!) send the prerendered version to normal clients. Prerender needs to load the live pages from your server in order to prerender them; if it is served prerendered output back, you can easily overload your server in a redirect loop, and overload Prerender's servers too.
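The bot-only rule above boils down to a tiny user-agent check. A sketch, with an abbreviated, illustrative crawler list (real configs and the Prerender middleware use a much longer one):

```javascript
// Abbreviated, illustrative crawler list -- real configs use a longer one.
const BOT_AGENTS = ["googlebot", "bingbot", "yandex", "baiduspider", "twitterbot"];

// Decide whether a request should be proxied to the prerender service.
// Crucially, the Prerender fetcher itself must NOT be proxied, or the
// service would be fed its own output in a loop.
function shouldPrerender(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  if (ua.includes("prerender")) return false;        // avoid the redirect loop
  return BOT_AGENTS.some((bot) => ua.includes(bot)); // bots only
}

console.log(shouldPrerender("Mozilla/5.0 (compatible; Googlebot/2.1)")); // true
console.log(shouldPrerender("Mozilla/5.0 (Windows NT 10.0) Chrome/120")); // false
console.log(shouldPrerender("Prerender (+https://github.com/prerender/prerender)")); // false
```

The explicit `prerender` exclusion is what prevents the loop the warning above describes.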
Related Questions
-
Unlimited Product Pages
While browsing through my Moz campaign, I noticed that my site is pulling up unlimited numbers of product pages even though no products appear on them, e.g.:
http://www.interstellarstore.com/star-trek-memorabilia?page=16
http://www.interstellarstore.com/star-trek-memorabilia?page=100
http://www.interstellarstore.com/star-trek-memorabilia?page=200
I have no idea how to resolve this issue. I can't possibly 301 an unlimited number of pages, and I can see this being a big SEO problem. Any thoughts?
Intermediate & Advanced SEO | moon-boots | 0
-
Splitting down pages
Hello everyone, I have a page on my directory, for example:
https://ose.directory/topics/breathing-apparatus
The title on this page is small yet a bit unspecific:
Breathing Apparatus Companies, Suppliers and Manufacturers
On Webmaster Tools these terms hold different values for each category, so "topic name companies" sometimes has a lot more searches than "topic name suppliers". I was thinking: if I split the page into three separate pages, would that be better?
https://ose.directory/topics/breathing-apparatus (main - Title: Breathing Apparatus)
https://ose.directory/topics/breathing-apparatus/companies (Title: Breathing Apparatus Companies)
https://ose.directory/topics/breathing-apparatus/manufacturers (Title: Breathing Apparatus Manufacturers)
https://ose.directory/topics/breathing-apparatus/suppliers (Title: Breathing Apparatus Suppliers)
Two questions: Would this be more beneficial from an SEO perspective? Would Google penalise me for doing this, and if so, is there a way to do it properly? PS: The list of companies may be the same, but the page content would be ever so slightly different. I know this would not affect my users much, because the terms I am using all mean pretty much the same thing, and the companies do all three.
Intermediate & Advanced SEO | SamBayPublishing | 0
-
Home Page Authority
My site has several different homepage versions. I am running on Volusion eCommerce.
www.mydomain.com - Page Authority 44
www.mydomain.com/Default.asp - Page Authority 33
www.mydomain.com/default.asp - Page Authority 33
So here is the question: is it normal to have different Page Authorities for each version? Is this diluting my SEO for the homepage? Any input on this would be appreciated.
Intermediate & Advanced SEO | PartyStore | 0
-
I have 2 keywords I want to target, should I make one page for both keywords or two separate pages?
My team sells sailboats and pontoon boats all over the country. So while they are both boats, the target markets are two different types of people... I want to make a landing page for each state, so if someone types in "Pontoon Boats for sale in Michigan" or "Pontoon boats for sale in Tennessee," my website will come up. But I also want to come up if someone is searching for sailboats for sale in Michigan or Tennessee (or any other state, for that matter). So my question is: should I make one page for each state that targets both pontoon boats and sailboats (a total of 50 landing pages), or should I make two pages for each state, one targeting pontoon boats and the other sailboats (a total of 100 landing pages)? My team has seen success targeting each state individually for a single keyword, but we have not had a situation like this come up yet.
Intermediate & Advanced SEO | VanMaster | 0
-
Google+ Page Question
Just started some work for a new client, I created a Google+ page and a connected YouTube page, then proceeded to claim a listing for them on google places for business which automatically created another Google+ page for the business listing. What do I do in this situation? Do I delete the YouTube page and Google+ page that I originally made and then recreate them using the Google+ page that was automatically created or do I just keep both pages going? If the latter is the case, do I use the same information to populate both pages and post the same content to both pages? That doesn't seem like it would be efficient or the right way to go about handling this but I could be wrong.
Intermediate & Advanced SEO | goldbergweismancairo | 0
-
Keywords under product listing pages
Hi guys, One of my main concerns as we start redesigning the site Trespass.co.uk is that current pages like http://www.trespass.co.uk/snow-sports/clothing/ski-jackets/womens-ski-jackets are bordering on over-optimisation, since each product listed on the URL above has "womens ski jacket" under it. If we have 50 products on each product listing page with the product name + type of product (e.g. Flora womens ski jacket, XYZ mens waterproof jacket), are we over-optimising the page for the main keywords by having them under each product? Would that page be over-optimised for "womens ski jackets"? Thanks guys
Intermediate & Advanced SEO | Trespass | 0
-
How to improve ranking of deep pages?
While this may sound like an obvious or stupid question at first... let me explain... We are an e-commerce website which sells one type of item nationally; for the sake of an example similar to us, think of an e-commerce site that sells movie theater tickets in cities and towns across the country. Our home page ranks very well for the appropriate keywords, and some of our state and city pages rank very well for local searches. However, while some state and city pages rank well for their respective local searches, others have a low page rank, with some not even in the top 50 for their respective keywords. Our question is that we aren't clear why some pages rank well while others won't when the competition looks similar for those local searches, and in today's Panda/Penguin era we are unsure how to get more of these state/city pages ranking better. For the record, we are quite strict about on-page SEO: 99% of our 5,600 pages are crawled and we have minimal SEO errors from the SEOmoz crawls. Can anyone provide some feedback and thoughts?
Intermediate & Advanced SEO | CTSupp | 0
-
Deferred javascript loading
Hi! This follows on from my last question. I'm trying to improve the page load speed for http://www.gear-zone.co.uk/. Currently, Google rates the page speed of the GZ site at 91/100, with the JavaScript being the only place where points are being deducted. The only problem is, the JS relates to the Trustpilot widget and the social links at the bottom of the page, neither of which works when deferred. Normally, we would add the defer attribute to the script tags, but by doing so the browser waits until the page is fully loaded before executing the scripts. As both of the scripts I mentioned (reviews and buttons) use the document.write command, adding defer would write their output off the page and out of placement from where it should be. Anyone have any ideas?
Intermediate & Advanced SEO | neooptic | 0