Sql bulk copy memory issue


Question

We are using the SqlBulkCopy class in C# to insert bulk data into SQL Server. We have a table with 10 million records in it.

We are inserting the data in batches of 10,000 rows inside a loop.

We are facing a physical memory issue: memory usage keeps increasing and never gets released.

Below is our code. How can we release memory when SqlBulkCopy is used, or is there another way to do the bulk insert?

using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn, SqlBulkCopyOptions.TableLock, null))
{
    //bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn);
    bulkCopy.DestinationTableName = DestinationTable;
    bulkCopy.BulkCopyTimeout = 0;
    bulkCopy.BatchSize = dt1.Rows.Count;
    Logger.Log("DATATABLE FINAL :" + dt1.Rows.Count.ToString(), Logger.LogType.Info);
    if (SQlConn.State == ConnectionState.Closed || SQlConn.State == ConnectionState.Broken)
        SQlConn.Open();
    bulkCopy.WriteToServer(dt1); //DataTable
    SQlConn.Close();
    SQlConn.Dispose();
    bulkCopy.Close();
    if (bulkCopy != null)
    {
        ((IDisposable)bulkCopy).Dispose();
    }                        
}

Here is the updated complete code:

try
{
    using (SqlConnection SQlConn = new SqlConnection(Common.SQLConnectionString))
    {
        DataTable dt1 = FillEmptyDateFields(dtDestination);

        //SqlTableCreator ObjTbl = new SqlTableCreator(SQlConn);
        //ObjTbl.DestinationTableName = DestinationTable;
        using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn, SqlBulkCopyOptions.TableLock, null))
        {
            //bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn);
            bulkCopy.DestinationTableName = DestinationTable;
            bulkCopy.BulkCopyTimeout = 0;
            bulkCopy.BatchSize = dt1.Rows.Count;
            Logger.Log("DATATABLE FINAL :" + dt1.Rows.Count.ToString(), Logger.LogType.Info);
            if (SQlConn.State == ConnectionState.Closed || SQlConn.State == ConnectionState.Broken)
                SQlConn.Open();
            bulkCopy.WriteToServer(dt1);
            SQlConn.Close();
            SQlConn.Dispose();
            bulkCopy.Close();
            if (bulkCopy != null)
            {
                ((IDisposable)bulkCopy).Dispose();
            }
        }
    }

    dtDestination.Dispose();
    System.GC.Collect();
    dtDestination = null;
}
catch (Exception ex)
{
    Logger.Log(ex, Logger.LogType.Error);
    throw ex;
}

Answer

The key question here would be: what is dt1, where did it come from, and how have you released it? DataTable is actually quite tricky to clean out, and frankly I wouldn't normally recommend a DataTable source here. However, if you must use DataTable, then make sure to use a completely separate DataSet / DataTable per iteration, and release the old one so it can be recycled.
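For illustration, a minimal sketch of that per-iteration pattern. LoadNextBatch here is a hypothetical helper (not from the original code) that fills a fresh DataTable with the next 10,000 rows and returns null when the source is exhausted:

// Assumes using System.Data; and using System.Data.SqlClient;
// LoadNextBatch is a hypothetical helper: it returns a brand-new DataTable
// with up to 10,000 rows, or null when there is nothing left to load.
DataTable batch;
while ((batch = LoadNextBatch(10000)) != null)
{
    using (SqlConnection conn = new SqlConnection(Common.SQLConnectionString))
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock, null))
    {
        conn.Open();
        bulkCopy.DestinationTableName = DestinationTable;
        bulkCopy.BulkCopyTimeout = 0;
        bulkCopy.WriteToServer(batch);
    }
    // Drop every reference to this batch before building the next one,
    // so the old rows become eligible for garbage collection.
    batch.Clear();
    batch.Dispose();
}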

More efficient, however, is to use WriteToServer(IDataReader) - this allows you to handle rows in a streaming fashion. If you are copying between two SQL systems, you could even just use ExecuteReader() on a separate command / connection, but IDataReader is pretty simple, and you can write a basic IDataReader for most sources (or find libraries that do so, for example CsvReader for handling delimited files such as csv/tsv).
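A minimal sketch of that streaming approach, assuming the rows come from another SQL Server table (sourceConnectionString and the SELECT statement are placeholders, not from the original code):

// Assumes using System.Data.SqlClient;
using (SqlConnection src = new SqlConnection(sourceConnectionString))   // placeholder source
using (SqlConnection dest = new SqlConnection(Common.SQLConnectionString))
{
    src.Open();
    dest.Open();
    using (SqlCommand cmd = new SqlCommand("SELECT * FROM SourceTable", src))  // placeholder query
    using (SqlDataReader reader = cmd.ExecuteReader())
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(dest, SqlBulkCopyOptions.TableLock, null))
    {
        bulkCopy.DestinationTableName = DestinationTable;
        bulkCopy.BulkCopyTimeout = 0;
        bulkCopy.BatchSize = 10000;      // commit every 10,000 rows
        bulkCopy.WriteToServer(reader);  // rows stream through one at a time
    }
}

Because WriteToServer pulls rows from the reader as it goes, memory usage stays flat no matter how many rows are copied.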
