Async Bulk (batch) insert to MySQL (or MongoDB?) via Node.js


Question

Straight to the question:

The problem: to do async bulk inserts (not necessarily bulk, if MySQL can handle it) using Node.js (coming from a .NET and PHP background).

Example: Assume I have 40 (adjustable) functions doing some work (async), each adding a record to the table after its single iteration. It is very probable that at the same time more than one function makes an insertion call. Can MySQL handle that directly, considering there is going to be an auto-update field?
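For illustration, here is a minimal sketch of that scenario, assuming the `mysql2` npm package and a made-up table `jobs_log (id INT AUTO_INCREMENT PRIMARY KEY, worker INT, payload VARCHAR(255))`; the package, table, and column names are placeholders, not part of the original question.

```typescript
// Sketch only: 40 async "workers" each insert one row, sharing a small connection pool.
import { createPool, Pool } from 'mysql2/promise';

async function main() {
  const pool: Pool = createPool({
    host: 'localhost',
    user: 'app',
    password: 'secret',
    database: 'test',
    connectionLimit: 10, // hypothetical value; tune for your environment
  });

  // Launch 40 concurrent workers; their INSERT calls may overlap in time.
  const workers = Array.from({ length: 40 }, (_, i) => insertOne(pool, i));
  await Promise.all(workers);

  await pool.end();
}

async function insertOne(pool: Pool, worker: number) {
  // MySQL assigns the AUTO_INCREMENT id safely even when calls overlap.
  await pool.query('INSERT INTO jobs_log (worker, payload) VALUES (?, ?)', [
    worker,
    `result of iteration ${worker}`,
  ]);
}

main().catch(console.error);
```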

In C# (.NET) I would have used a DataTable to hold all the rows from each function, bulk-insert the DataTable into the database table at the end, and launch a thread for each function.

What approach would you suggest in this case?

Should the approach change if I need to handle 10,000 or 4 million rows per table? Also, the DB schema is not going to change; would MongoDB be a better choice for this?
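For comparison, here is a minimal MongoDB sketch of the same kind of batch, assuming the official `mongodb` npm driver; the collection name and document shape are invented for illustration.

```typescript
// Sketch only: one driver call inserts a whole batch of documents.
import { MongoClient } from 'mongodb';

async function run() {
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const col = client.db('test').collection('jobs_log');

  const docs = Array.from({ length: 10_000 }, (_, i) => ({
    worker: i % 40,
    payload: `row ${i}`,
  }));

  // ordered:false lets the server continue past an individual failed document.
  await col.insertMany(docs, { ordered: false });

  await client.close();
}

run().catch(console.error);
```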

I am new to Node and NoSQL and in the noob learning phase at the moment, so if you can provide some explanation with your answer, it would be awesome.

Thanks.

EDIT: Answer: Neither MySQL nor MongoDB supports any special bulk insert; under the hood it is just a foreach loop. Both of them are capable of handling a large number of connections simultaneously, and performance will largely depend on your requirements and production environment.

Answer

1) In MySQL, queries are executed sequentially per connection. If you are using one connection, your ~40 functions will result in 40 queries being enqueued (via an explicit queue in the mysql library, in your code, or in a system queue based on synchronisation primitives), not necessarily in the same order in which you started the 40 functions. MySQL won't have any race-condition problems with auto-update fields in that case.
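A minimal sketch of point 1, again assuming the `mysql2` package and the hypothetical `jobs_log` table: all inserts share one connection, so the driver queues them.

```typescript
// Sketch only: one connection, so the driver queues every query and MySQL
// executes them strictly one after another on that connection.
import { createConnection } from 'mysql2/promise';

async function run() {
  const conn = await createConnection({
    host: 'localhost',
    user: 'app',
    password: 'secret',
    database: 'test',
  });

  // Fire 40 inserts without awaiting each one: they pile up in the driver's
  // command queue and run sequentially, in enqueue order.
  const pending: Promise<unknown>[] = [];
  for (let i = 0; i < 40; i++) {
    pending.push(
      conn.query('INSERT INTO jobs_log (worker, payload) VALUES (?, ?)', [i, `row ${i}`]),
    );
  }
  await Promise.all(pending);

  await conn.end();
}

run().catch(console.error);
```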

2) If you really want to execute 40 queries in parallel, you need to open 40 connections to MySQL (which is not a good idea from a performance point of view, but again, MySQL is designed to handle auto-increments correctly for multiple clients).

3) There is no special bulk-insert command in the MySQL protocol at the wire level; any library exposing a bulk-insert API is in fact just issuing one long 'INSERT ... VALUES' query.
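A rough sketch of what that looks like in practice, assuming the `mysql2` package (whose `query()`, like the older `mysql` package, expands a nested array bound to `?` into a multi-row VALUES list); the table and chunk size are illustrative.

```typescript
// Sketch only: the "bulk" insert is one long INSERT ... VALUES (...), (...), ... string.
import { createPool } from 'mysql2/promise';

const pool = createPool({
  host: 'localhost',
  user: 'app',
  password: 'secret',
  database: 'test',
});

async function bulkInsert(rows: Array<[number, string]>, chunkSize = 1000) {
  // Chunking keeps a 10,000 or 4-million row load from building one giant
  // statement that could exceed max_allowed_packet.
  for (let i = 0; i < rows.length; i += chunkSize) {
    const chunk = rows.slice(i, i + chunkSize);
    // [chunk] expands to VALUES (?, ?), (?, ?), ... -- still a single query string on the wire.
    await pool.query('INSERT INTO jobs_log (worker, payload) VALUES ?', [chunk]);
  }
}

bulkInsert([[1, 'row 1'], [2, 'row 2'], [3, 'row 3']])
  .then(() => pool.end())
  .catch(console.error);
```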
