Why does web worker performance sharply decline after 30 seconds?


Problem description



I'm trying to improve the performance of a script when executed in a web worker. It's designed to parse large text files in the browser without crashing. Everything works pretty well, but I notice a severe difference in performance for large files when using a web worker.

So I conducted a simple experiment. I ran the script on the same input twice. The first run executed the script in the main thread of the page (no web workers). Naturally, this causes the page to freeze and become unresponsive. For the second run, I executed the script in a web worker.
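
For reference, the harness for the two runs looks roughly like this (a minimal sketch; parseFile and worker.js are hypothetical stand-ins for the real script):

    // Run 1: parse on the main thread and time it (this freezes the page).
    // parseFile is a hypothetical stand-in for the real parser.
    const t0 = performance.now();
    parseFile(file);
    console.log('main thread:', performance.now() - t0, 'ms');

    // Run 2: hand the same File to a single web worker and time it.
    // A File object can be sent to a worker via structured clone.
    const worker = new Worker('worker.js');
    const t1 = performance.now();
    worker.postMessage({ file });
    worker.onmessage = (e) => {
      if (e.data.done) {
        console.log('worker:', performance.now() - t1, 'ms');
      }
    };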

For small files in this experiment (< ~100 MB), the performance difference is negligible. However, on large files, parsing takes about 20x longer in the worker thread.

The blue line (the main-thread run) is expected: it should only take about 11 seconds to parse the file, and the performance is fairly steady.

The red line (the run inside the web worker) is much more surprising.

The jagged line for the first 30 seconds is normal (the jag is caused by the slight delay in sending the results to the main thread after every chunk of the file is parsed). However, parsing slows down rather abruptly at 30 seconds. (Note that I'm only ever using a single web worker for the job, never more than one worker thread at a time.)

I've confirmed that the delay is not in sending the results to the main thread with postMessage(). The slowdown is in the tight loop of the parser, which is entirely synchronous. For reasons I can't explain, that loop is drastically slowed down and it gets slower with time after 30 seconds.
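
That loop has roughly this shape (a minimal sketch; parseChunk is a hypothetical stand-in for the real parser, and FileReaderSync is the worker-only read API that the answer below identifies):

    // worker.js -- synchronous chunked parsing loop.
    const CHUNK_SIZE = 10 * 1024 * 1024; // ~10 MB chunks, per the question

    self.onmessage = (e) => {
      const file = e.data.file;
      const reader = new FileReaderSync();
      // This loop never returns to the worker's event loop until the
      // entire file has been parsed.
      for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
        const text = reader.readAsText(file.slice(offset, offset + CHUNK_SIZE));
        self.postMessage({ results: parseChunk(text) }); // per-chunk "jag"
      }
      self.postMessage({ done: true });
    };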

But this only happens in a web worker. Running the same code in the main thread, as you've seen above, runs very smoothly and quickly.

Why is this happening? What can I do to improve performance? (I don't expect anyone to fully understand all 1,200+ lines of code in that file. If you do, that's awesome, but I get the feeling this is more related to web workers than my code, since it runs fine in the main thread.)

System: I'm running Chrome 35 on Mac OS 10.9.4 with 16 GB memory; quad-core 2.7 GHz Intel Core i7 with 256 KB L2 cache (per core) and L3 Cache of 6 MB. The file chunks are about 10 MB in size.

Update: Just tried it on Firefox 30 and it did not experience the same slowdown in a worker thread (but it was slower than Chrome when run in the main thread). However, trying the same experiment with an even larger file (about 1 GB) yielded significant slowdown after about 35-40 seconds (it seems).

Solution

Tyler Ault suggested one possibility on Google+ that turned out to be very helpful.

He speculated that using FileReaderSync in the worker thread (instead of the plain ol' async FileReader) was not providing an opportunity for garbage collection to happen.

Changing the worker thread to use FileReader asynchronously (which intuitively seems like a performance step backwards) accelerated the process back up to just 37 seconds, right where I would expect it to be.
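
The asynchronous version looks roughly like this (again a minimal sketch with a hypothetical parseChunk); the key difference is that control returns to the worker's event loop between chunks, which gives the engine an opportunity to garbage-collect:

    // worker.js -- async FileReader version.
    self.onmessage = (e) => {
      const file = e.data.file;
      const CHUNK_SIZE = 10 * 1024 * 1024;
      let offset = 0;

      function readNext() {
        if (offset >= file.size) {
          self.postMessage({ done: true });
          return;
        }
        const reader = new FileReader();
        reader.onload = () => {
          self.postMessage({ results: parseChunk(reader.result) });
          offset += CHUNK_SIZE;
          readNext(); // onload fires asynchronously, so the stack stays flat
        };
        reader.readAsText(file.slice(offset, offset + CHUNK_SIZE));
      }

      readNext();
    };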

I haven't heard back from Tyler yet and I'm not entirely sure I understand why garbage collection would be the culprit, but something about FileReaderSync was drastically slowing down the code.
