How to optimally serve and load JavaScript files?


Problem Description


I'm hoping someone with more experience with global-scale web applications could clarify some questions, assumptions and possible misunderstandings I have.

Let's take a hypothetical site (heavy amount of client-side / dynamic components) which has hundreds of thousands of users globally and the sources are being served from one location (let's say central Europe).

  1. If the application depends on popular JavaScript libraries, would it be better to take them from the Google CDN and compile them into one single minified JS file (along with all application-specific JavaScript), or to load them separately from the Google CDN?
  2. Assetic vs. headjs: does it make more sense to load one single JS file or to load all the scripts in parallel (executing in order of dependencies)?

My assumptions (please correct me):

Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc. but loading all of these via headjs in parallel seems optimal, but I'm not sure. Server-side compiling of third party JS and application-specific JS into one file seems to almost defeat the purpose of using the CDN since the library is probably cached somewhere along the line for the user anyway.

Besides caching, it's probably faster to download a third party library from Google's CDN than the central server hosting the application anyway.
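
To make the headjs part of that assumption concrete, the classic head.js call fetches scripts in parallel but executes them in their listed order. This is only a sketch; the file names and the App.init() entry point are placeholders, not anything from the question.

    // Sketch only: head.js downloads these in parallel, executes them in order.
    // "/js/vendor.min.js" and "/js/app.min.js" are placeholder paths.
    head.js("/js/vendor.min.js", "/js/app.min.js", function () {
        // runs once both files have been fetched and executed
        App.init();   // hypothetical application entry point
    });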

If a new version of a popular JS library is released with a big performance boost, is tested with the application and then implemented:

  • If all JS is compiled into one file then every user will have to re-download this file even though the application code hasn't changed.
  • If third party scripts are loaded from CDNs then the user only has to download the new version from the CDN (or from cache somewhere).

Are any of the following legitimate worries in a situation like the one described?

  • Some users (or browsers) can only have a certain number of connections to one hostname at once, so retrieving some scripts from a third party CDN would result in overall faster loading times.
  • Some users may be using the application in a restricted environment, therefore the domain of the application may be white-listed but not the CDNs' domains. (If it's possible this is a realistic concern, is it at all possible to try to load from the CDN and load from the central server on failure?)

Solution

Compiling all application-specific/local JS code into one file

Since some of our key goals are to reduce the number of HTTP requests and minimize request overhead, this is a very widely adopted best practice.

The main case where we might consider not doing this is in situations where there is a high chance of frequent cache invalidation, i.e. when we make changes to our code. There will always be tradeoffs here: serving a single file is very likely to increase the rate of cache invalidation, while serving many separate files will probably cause a slower start for users with an empty cache.
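
One common way to soften that tradeoff (not something the answer prescribes, just a widespread technique) is to put a content hash in the bundle's filename, so the file can be cached aggressively and the URL only changes when the code actually changes. A minimal Node sketch, with assumed paths:

    // fingerprint.js - minimal sketch; the bundle path is an assumption
    var fs = require("fs");
    var crypto = require("crypto");

    var source = fs.readFileSync("public/js/app.bundle.js");
    var hash = crypto.createHash("md5").update(source).digest("hex").slice(0, 8);

    // e.g. app.bundle.3f9c2a1b.js - can be served with far-future cache headers
    fs.writeFileSync("public/js/app.bundle." + hash + ".js", source);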

For this reason, inlining the occasional bit of page-specific JavaScript isn't as evil as some say. In general though, concatenating and minifying your JS into one file is a great first step.
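
As a sketch of that first step (the file list is an assumption, and a minifier such as UglifyJS or Closure Compiler would normally be run on the result afterwards):

    // concat.js - minimal Node build sketch; file names are placeholders
    var fs = require("fs");

    var files = [
        "js/lib/helpers.js",
        "js/views.js",
        "js/app.js"
    ];

    // join with ";" so a file missing its trailing semicolon can't break the next one
    var bundle = files.map(function (f) {
        return fs.readFileSync(f, "utf8");
    }).join(";\n");

    fs.writeFileSync("public/js/app.bundle.js", bundle);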

using CDNs like Google's for popular libraries, etc.

If we're talking about libraries where the code we're using is fairly immutable, i.e. unlikely to be subject to cache invalidation, I might be slightly more in favour of saving HTTP requests by wrapping them into your monolithic local JS file. This would be particularly true for a large code base heavily based on, for example, a particular jQuery version. In cases like this bumping the library version is almost certain to involve significant changes to your client app code too, negating the advantage of keeping them separate.

Still, mixing request domains is an important win, since we don't want to be throttled excessively by the maximum connections per domain cap. Of course, a subdomain can serve just as well for this, but Google's domain has the advantage of being cookieless, and is probably already in the client's DNS cache.

but loading all of these via headjs in parallel seems optimal

While there are advantages to the emerging host of JavaScript "loaders", we should keep in mind that using them does negatively impact page start, since the browser needs to go and fetch our loader before the loader can request the rest of our assets. Put another way, for a user with an empty cache a full round-trip to the server is required before any real loading can begin. Again, a "compile" step can come to the rescue - see require.js for a great hybrid implementation.
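
A rough flavour of that hybrid approach (the module names here are invented for illustration): code is written as AMD modules during development, and the r.js optimizer can later concatenate the whole dependency tree into one file for production, so the loader still helps with structure without costing a round-trip per module.

    // js/app/dashboard.js - a hypothetical AMD module
    define(["jquery"], function ($) {
        return {
            init: function () {
                $("#dashboard").show();
            }
        };
    });

    // js/main.js - entry point; in production r.js bundles main.js and its dependencies
    require(["app/dashboard"], function (dashboard) {
        dashboard.init();
    });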

The best way of ensuring that your scripts do not block UI painting remains to place them at the end of your HTML. If you'd rather place them elsewhere, the async or defer attributes now offer you that flexibility. All modern browsers request assets in parallel, so unless you need to support particular flavours of legacy client this shouldn't be a major consideration. The Browserscope network table is a great reference for this kind of thing. IE8 is predictably the main offender, still blocking image and iFrame requests until scripts are loaded. Even back at 3.6 Firefox was fully parallelising everything but iFrames.
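
For reference, the two placements described above look roughly like this (the path is a placeholder):

    <!-- Option 1: at the end of the document, just before </body> -->
    <script src="/js/app.bundle.js"></script>

    <!-- Option 2: in the <head>, but marked so parsing isn't blocked;
         defer preserves execution order, async does not -->
    <script src="/js/app.bundle.js" defer></script>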

Some users may be using the application in a restricted environment, therefore the domain of the application may be white-listed but not the CDNs' domains. (If it's possible this is a realistic concern, is it at all possible to try to load from the CDN and load from the central server on failure?)

Working out if the client machine can access a remote host is always going to incur serious performance penalties, since we have to wait for it to fail to connect before we can load our reserve copy. I would be much more inclined to host these assets locally.
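
For completeness, the fallback the question asks about is usually written along these lines (jQuery and the paths are only examples); the caveat above still applies, since a user behind a whitelist pays for a failed or hanging CDN request before the local copy even starts to load:

    <!-- try the CDN first (cookieless, likely already in the DNS cache) -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
    <script>
        // if the CDN copy was blocked or unreachable, fall back to a locally hosted copy
        window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
    </script>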
