How to keep Node.js from splitting socket messages into smaller chunks
Question
I've got a chat program which pushes JSON data from Apache/PHP to Node.js, via a TCP socket:
// Node.js (JavaScript)
var net = require("net");

var phpListener = net.createServer(function (stream)
{
    stream.setEncoding("utf8");
    stream.on("data", function (txt)
    {
        var json = JSON.parse(txt);
        // do stuff with json
    });
});
phpListener.listen("8887", 'localhost');
// Apache (PHP)
$sock = stream_socket_client("tcp://localhost:8887");
$written = fwrite($sock, $json_string);
fclose($sock);
The problem is, if the JSON string is large enough (over around 8k), the output message gets split into multiple chunks, and the JSON parser fails. PHP returns the $written value as the correct length of the string, but the data event handler fires twice or more.
Should I be attaching the function to a different event, or is there a way to cache text across event fires that won't succumb to race conditions under heavy load? Or some other solution I haven't thought of?
Thanks!
Answer
You should try using a buffer to cache the data, as Node.js tends to split data into chunks in order to improve performance.
http://nodejs.org/api.html#buffers-2
You can buffer all of the request data, and then call your handler with the complete data once it has arrived.