Error while inserting large volume data in MySQL using node.js (error code: 'ECONNRESET')


Problem Description

I encountered an error while inserting a large volume of data into MySQL using node.js. Here is the data:

instData = [ [ '73caf3d0-f6a4-11e8-8160-eb5f91ce3830',
               '20181017'],
             [ '73caf3d1-f6a4-11e8-8160-eb5f91ce3830',
               '20181019'],
             ... 49316 more items ]

Here is part of the connection code:

 const mysql = require('mysql');
 let pool = mysql.createPool(db);   // db holds the pool config (host, user, password, database, ...)

 module.exports = {
   connPool (sql, val, cb) {   // run one query on a pooled connection
     pool.getConnection((err, conn) => {
       if (err) {
         console.log('Connection Error: ' + err);
         cb(err);   // report the connection error to the caller as well
       } else {
         console.log('allConnections: ' + pool._allConnections.length);
         conn.query(sql, val, (err, rows, fields) => {
           if (err) {
             console.log('Query: ' + sql + ' error: ' + err);
           }
           conn.release();   // return the connection to the pool
           cb(err, rows, fields);
         });
       } // end if
     }); // end pool.getConnection
   },
  ......

When I run the insert code, I get the error below.

 let sql_inst = 'insert into demo (id, upload_time) values ?';
 func.connPool(sql_inst, [instData], (err, rows, fields) => {
    if (err == null) {
        res.json({code: 200, msg: 'success', data: req.body });
    } else {
        res.json({code: 400, msg: 'failed: ' + err});
    }
 });

The error message:

{ Error: read ECONNRESET
at TCP.onStreamRead (internal/stream_base_commons.js:111:27)
--------------------
at Protocol._enqueue (D:\Projects\test\node_modules\mysql\lib\protocol\Protocol.js:144:48)
at PoolConnection.query (D:\Projects\test\node_modules\mysql\lib\Connection.js:200:25)
at pool.getConnection (D:\Projects\test\sql\func.js:29:26)
at Ping.onOperationComplete (D:\Projects\test\node_modules\mysql\lib\Pool.js:110:5)
at Ping.<anonymous> (D:\Projects\test\node_modules\mysql\lib\Connection.js:502:10)
at Ping._callback (D:\Projects\test\node_modules\mysql\lib\Connection.js:468:16)
at Ping.Sequence.end (D:\Projects\test\node_modules\mysql\lib\protocol\sequences\Sequence.js:83:24)
at Ping.Sequence.OkPacket (D:\Projects\test\node_modules\mysql\lib\protocol\sequences\Sequence.js:92:8)
at Protocol._parsePacket (D:\Projects\vutest\node_modules\mysql\lib\protocol\Protocol.js:278:23)
at Parser.write (D:\Projects\test\node_modules\mysql\lib\protocol\Parser.js:76:12)
errno: 'ECONNRESET',
code: 'ECONNRESET',
syscall: 'read',
fatal: true }

It seems the connection was closed, but the thing is that I am using a connection pool, so it should not have to reconnect every time.

When I reduce the size of the data, e.g. to 100 records, the code runs successfully.
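As an illustration of that observation (this is not part of the original post), one way to keep each packet small is to split instData into fixed-size batches and insert them one at a time. Below is a minimal sketch, assuming the same func.connPool helper shown above; the helper name insertInChunks and the batch size of 1000 rows are hypothetical choices.

 // Hypothetical helper (not in the original post): insert `data` in batches of
 // `chunkSize` rows so no single INSERT packet exceeds the server's max_allowed_packet.
 function insertInChunks(sql, data, chunkSize, done) {
   let offset = 0;
   function next(err) {
     if (err || offset >= data.length) return done(err);   // stop on error or when all rows are sent
     const chunk = data.slice(offset, offset + chunkSize);
     offset += chunkSize;
     func.connPool(sql, [chunk], next);   // reuse the pooled query helper shown above
   }
   next();
 }

 // Example usage with the question's data:
 // insertInChunks(sql_inst, instData, 1000, (err) => { /* respond to the client here */ });

This is only a workaround sketch; the accepted fix below addresses the packet limit on the server side instead.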

Running environment: node v10.11.0, MySQL v5.7

How could I address this issue? Thanks a lot!

Recommended Answer

I have fixed this issue. It was caused by the default value of max_allowed_packet. Find max_allowed_packet in my.ini (C:\ProgramData\MySQL\MySQL Server 5.7), update it to max_allowed_packet=64M, and restart MySQL. Done.
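For completeness, and as an assumption on my part rather than something the answer spells out, the current limit can be checked from node.js before and after the change, since max_allowed_packet is an ordinary server variable:

 // Hypothetical sanity check (not part of the accepted answer): read the current
 // packet limit through the same pooled helper. SHOW VARIABLES reports the value in bytes.
 func.connPool('SHOW VARIABLES LIKE ?', ['max_allowed_packet'], (err, rows) => {
   if (err) throw err;
   console.log(rows[0].Variable_name + ' = ' + rows[0].Value + ' bytes');
 });

 // The equivalent my.ini change (under the [mysqld] section), followed by a MySQL restart:
 //   max_allowed_packet=64M

On MySQL 5.7 the value can also be raised at runtime with SET GLOBAL max_allowed_packet = 67108864;, but a value set that way does not survive a server restart, so editing my.ini is the durable fix.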
