Get number of concurrent requests by browser


Problem description


I'm trying to figure out whether it would be worthwhile to spread image requests across multiple sub-domains. [This article](link broken), for example, says:

> Most browsers can only make two requests at a time, so the browser will request two files, download them and then move on to the next two. The more HTTP requests, or separate components a page requires to display properly, the longer the user will have to wait.

When they say most, which browsers in particular? Is that number related to the number of concurrent XMLHttpRequests, per [this question](http://stackoverflow.com/questions/561046/how-many-concurrent-ajax-xmlhttprequest-requests-are-allowed-in-popular-browser)?

Solution

There are a lot of things to consider here. In most situations, I would only choose one cookieless domain/subdomain to host your images such as static.mywebsite.com. And ideally static files should be hosted by a CDN, but that's another story.

First of all, IE7 allowed only two concurrent connections per host. But most browsers today allow more than that. IE8 allows 6 concurrent connections, Chrome allows 6, and Firefox allows 8.
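The effect of the per-host connection cap can be illustrated with a simple scheduling model (an idealization, not a measurement of any real browser): with a limit of k connections, n equally sized images download in ceil(n/k) batches. A minimal sketch, assuming every image takes the same time:

```python
import math

def fetch_time(num_images, connections_per_host, ms_per_image):
    """Idealized total load time: images download in batches capped by
    the per-host connection limit, each batch taking ms_per_image."""
    rounds = math.ceil(num_images / connections_per_host)
    return rounds * ms_per_image

# 24 images at 100 ms each, all on one host:
print(fetch_time(24, 2, 100))  # IE7-style limit of 2 -> 12 rounds -> 1200 ms
print(fetch_time(24, 6, 100))  # modern limit of 6   ->  4 rounds ->  400 ms
```

With only 6 images and a 6-connection browser, the model gives a single round either way, which is why sharding buys nothing in that case.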

So if your web page only has 6 images, for example, then it'd really be pointless to spread your images across multiple subdomains.

So let's say you have 24 images on a page. Well, few things in life are free and there's such a thing as death by parallelization. If you host your images in 4 different subdomains, then that means that every single image could theoretically be downloaded in parallel. However, it also means that there are 3 additional DNS lookups involved. And a DNS lookup could be 100 ms, 150 ms, or sometimes longer. This added delay could easily offset any benefit of parallel downloads. You can see real-world examples of this by testing sites with http://www.webpagetest.org/
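The arithmetic in this paragraph can be sketched as a back-of-the-envelope model. This is an illustration only: it assumes images are spread evenly across subdomains, one extra DNS lookup per additional subdomain, and the example figures from the text (100 ms per image, 150 ms per lookup, 6 connections per host):

```python
import math

def sharded_fetch_time(num_images, shards, connections_per_host,
                       ms_per_image, ms_per_dns_lookup):
    """Idealized load time across `shards` subdomains: each extra
    subdomain adds its own connection pool but costs one DNS lookup."""
    per_shard = math.ceil(num_images / shards)
    rounds = math.ceil(per_shard / connections_per_host)
    extra_dns = (shards - 1) * ms_per_dns_lookup
    return rounds * ms_per_image + extra_dns

for shards in (1, 2, 4):
    print(shards, sharded_fetch_time(24, shards, 6, 100, 150))
# 1 shard: 400 ms; 2 shards: 350 ms; 4 shards: 550 ms --
# beyond two shards, the added DNS latency outweighs the extra parallelism.
```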

Of course the best solution is to use CSS sprites when possible to cut down on the number of requests. I talk about that and the inherent overhead of every request in this article and this one.

UPDATE

There's an interesting article from Steve Souders on the subject of sharding domains...

> Most of the U.S. top ten web sites do domain sharding. YouTube uses i1.ytimg.com, i2.ytimg.com, i3.ytimg.com, and i4.ytimg.com. Live Search uses ts1.images.live.com, ts2.images.live.com, ts3.images.live.com, and ts4.images.live.com. Both of these sites are sharding across four domains. What’s the optimal number? Yahoo! released a study that recommends sharding across at least two, but no more than four, domains. Above four, performance actually degrades.

http://www.stevesouders.com/blog/2009/05/12/sharding-dominant-domains/

Note however that this was written in 2009. And in 2011 he posted a comment...

> Since newer browsers open more connections per domain, it’s probably better to revise the number downwards. I think 2 is a good compromise, but that’s just a hunch. It’d be great if some production property ran a test to determine the optimal number.

You should also keep in mind that the big reason it's even necessary for the big sites like Yahoo and Amazon to do domain sharding is that their sites are so dynamic. The images are attached to products or stories which are displayed dynamically. So it's not feasible for them to use CSS sprites as aggressively as would be optimal.

A site like StackOverflow, however, is light on these sorts of images and they have cut down on the number of requests so much that they don't need to do sharding. A big step towards making that happen is their usage of this sprites.png image...

http://cdn.sstatic.net/Sites/stackoverflow/img/sprites.png?v=5

UPDATE #2

Steve Souders posted another update on domain sharding. He repeats much of what I've already mentioned. But the thing that stood out was SPDY and how that should affect your decision.

> Perhaps the strongest argument against domain sharding is that it’s unnecessary in the world of SPDY (as well as HTTP 2.0). In fact, domain sharding probably hurts performance under SPDY. SPDY supports concurrent requests (send all the request headers early) as well as request prioritization. Sharding across multiple domains diminishes these benefits. SPDY is supported by Chrome, Firefox, Opera, and IE 11. If your traffic is dominated by those browsers, you might want to skip domain sharding.
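If you do shard, one way to act on this advice is to make the shard count conditional on the negotiated protocol: multiplexed protocols get a single host, HTTP/1.x clients get the sharded hosts. A hypothetical sketch (the host names and the hash-based assignment are assumptions for illustration, not part of the original answer):

```python
import zlib

SHARDED_HOSTS = ["i1.example.com", "i2.example.com"]  # hypothetical shard hosts
SINGLE_HOST = "static.example.com"                    # hypothetical single host

def asset_host(path, protocol):
    """Pick a host for an asset URL. Under SPDY/HTTP/2, multiplexing makes
    sharding pointless (or harmful), so use one host. Under HTTP/1.x, map
    each path deterministically to a shard so the same asset always comes
    from the same host and browser caches stay warm."""
    if protocol in ("h2", "spdy/3.1"):
        return SINGLE_HOST
    index = zlib.crc32(path.encode()) % len(SHARDED_HOSTS)
    return SHARDED_HOSTS[index]

print(asset_host("/img/logo.png", "h2"))        # -> static.example.com
print(asset_host("/img/logo.png", "http/1.1"))  # -> a stable i*.example.com host
```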
