Web Optimization: Why are Combined Files Faster?


Problem Description


I have read that combining all of your CSS files into one big one, or all of your script files into a single script file, reduces the number of HTTP requests and therefore speeds up downloading.

But I don't understand this. I thought that if you had multiple files (up to a limit, which I believe is 10 in modern browsers), the browser would download them in parallel, thus REDUCING the total download time (divided by the number of connections allowed).

I am obviously missing a key piece of info here. Can someone turn on the lights?

Solution

There's overhead in every request/response. That's essentially what it comes down to.

Here's an example of a request header to Google ...

    GET http://www.google.com/ HTTP/1.1
    Accept: application/x-ms-application, image/jpeg, application/xaml+xml, image/gif, image/pjpeg, application/x-ms-xbap, application/vnd.ms-excel, application/vnd.ms-powerpoint, application/msword, application/x-shockwave-flash, */*
    Accept-Language: en-US
    User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/4.0; GTB0.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; OfficeLiveConnector.1.4; OfficeLivePatch.1.3)
    Accept-Encoding: gzip, deflate
    Connection: Keep-Alive
    Host: www.google.com
    Cookie: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx ... (several hundred bytes of anonymized cookie data)
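Headers like these are sent (and a similar set comes back) on every single request, and the cookie alone can run to hundreds of bytes. A rough back-of-the-envelope model makes the cost visible. This is an illustrative sketch in Python; the latency, header size, bandwidth, and file counts are all made-up assumptions, not measurements:

    # Toy model: every request pays a fixed round-trip latency plus header
    # overhead before its payload arrives. All numbers below are assumptions.
    LATENCY_S = 0.1            # assumed round-trip time per request (100 ms)
    HEADER_BYTES = 1500        # assumed combined request/response header size
    BANDWIDTH_BPS = 1_000_000  # assumed throughput: 1 MB/s
    TOTAL_PAYLOAD = 200_000    # 200 KB of CSS/JS, split into N files or combined

    def fetch_time(num_files, parallel=2):
        """Time to fetch TOTAL_PAYLOAD split across num_files requests,
        with at most `parallel` simultaneous connections per hostname."""
        per_file = TOTAL_PAYLOAD / num_files
        per_request = LATENCY_S + (HEADER_BYTES + per_file) / BANDWIDTH_BPS
        waves = -(-num_files // parallel)  # ceiling division: requests run in waves
        return waves * per_request

    print(f"20 separate files: {fetch_time(20):.2f} s")  # roughly 1.1 s here
    print(f"1 combined file:   {fetch_time(1):.2f} s")   # roughly 0.3 s here

The model is crude (real browsers cache, reuse keep-alive connections, and overlap work), but it shows why per-request overhead dominates once individual files are small.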

I wrote an article about this last year... http://swortham.blogspot.com/2010/03/latency-requests-css-sprites-and-you.html

You are right that multiple files can be downloaded in parallel (two or more per hostname, depending on the browser). That in turn lets the page load progressively, which is good. But it doesn't mean your homepage should be composed of 20+ CSS, JS, and image files. Ideally you'd want to consolidate quite a bit to optimize the site.
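To make that consolidation step concrete, here is a minimal, hypothetical build sketch; the file names and output path are invented for illustration, and a real project would more likely use a dedicated bundler or minifier:

    # Concatenate several CSS files into one, so the page makes a single
    # request instead of one per file. All paths here are hypothetical.
    from pathlib import Path

    SOURCES = ["reset.css", "layout.css", "theme.css"]  # assumed source files
    OUTPUT = Path("combined.css")                       # assumed output name

    parts = []
    for name in SOURCES:
        # Keep a marker comment so the combined file is still debuggable.
        parts.append(f"/* --- {name} --- */\n" + Path(name).read_text())

    OUTPUT.write_text("\n".join(parts))
    print(f"Wrote {OUTPUT} ({OUTPUT.stat().st_size} bytes)")

The same idea applies to script files, and multiple small images can often be merged into CSS sprites, which is what the linked article discusses.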

