SqlBulkCopy Multiple Tables Insert under single Transaction OR Bulk Insert Operation between Entity Framework and Classic ADO.NET

Problem Description

I have two tables that need to be populated when my application runs.
Let's say that I have the tables below:

  • tbl_FirstTable and tbl_SecondTable

My problem is data volume.
I need to insert over 10,000 rows into tbl_FirstTable and over 500,000 rows into tbl_SecondTable.

So firstly, I used Entity Framework as follows.

public bool Save_tbl_FirstTable_Vs_tbl_SecondTable(List<tbl_FirstTable> List_tbl_FirstTable, List<tbl_SecondTable> List_tbl_SecondTable)
{
    bool IsSuccessSave = false;
    try
    {
        using (DummyDBClass_ObjectContext _DummyDBClass_ObjectContext = new DummyDBClass_ObjectContext())
        {           
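            // InsertOnSubmit only queues each row; the single SubmitChanges call
            // below sends all queued inserts to the database in one transaction.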
            foreach (tbl_FirstTable _tbl_FirstTable in List_tbl_FirstTable)
            {
                _DummyDBClass_ObjectContext.tbl_FirstTable.InsertOnSubmit(_tbl_FirstTable);
            }

            foreach (tbl_SecondTable _tbl_SecondTable in List_tbl_SecondTable)
            {
                _DummyDBClass_ObjectContext.tbl_SecondTable.InsertOnSubmit(_tbl_SecondTable);
            }

            _DummyDBClass_ObjectContext.SubmitChanges();
            IsSuccessSave = true;
        }
    }
    catch (Exception ex)
    {
        Log4NetWrapper.WriteError(string.Format("{0} : {1} : Exception={2}",
                                    this.GetType().FullName,
                                    (new StackTrace(new StackFrame(0))).GetFrame(0).GetMethod().Name.ToString(),
                                    ex.Message.ToString()));

        if (ex.InnerException != null)
        {
            Log4NetWrapper.WriteError(string.Format("{0} : {1} : InnerException Exception={2}",
                                    this.GetType().FullName,
                                    (new StackTrace(new StackFrame(0))).GetFrame(0).GetMethod().Name.ToString(),
                                    ex.InnerException.Message.ToString()));
        }
    }

    return IsSuccessSave;
}

That is where I faced a timeout exception.
I thought that exception would be solved if I used the code below.

_DummyDBClass_ObjectContext.CommandTimeout = 1800; // 30 minutes (value is in seconds)

So I used it. That solved the timeout, but then I faced another error, an OutOfMemoryException.
So I searched for solutions and, fortunately, found the articles below.

  1. Problem with Bulk insert using Entity Framework
  2. Using Transactions with SqlBulkCopy
  3. Performing a Bulk Copy Operation in a Transaction

Based on those articles, I changed my code from Entity Framework to classic ADO.NET.

public bool Save_tbl_FirstTable_Vs_tbl_SecondTable(DataTable DT_tbl_FirstTable, DataTable DT_tbl_SecondTable)
{
    bool IsSuccessSave = false;
    SqlTransaction transaction = null;
    try
    {
        using (DummyDBClass_ObjectContext _DummyDBClass_ObjectContext = new DummyDBClass_ObjectContext())
        {
            var connectionString = ((EntityConnection)_DummyDBClass_ObjectContext.Connection).StoreConnection.ConnectionString;
            using (SqlConnection connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (transaction = connection.BeginTransaction())
                {
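                    // A single transaction spans both bulk copies, so the two
                    // tables commit or roll back together.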
                    using (SqlBulkCopy bulkCopy_tbl_FirstTable = new SqlBulkCopy(connection, SqlBulkCopyOptions.KeepIdentity, transaction))                            
                    {
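                        // KeepIdentity preserves the source ID values; BatchSize
                        // controls how many rows are sent per round trip.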
                        bulkCopy_tbl_FirstTable.BatchSize = 5000;
                        bulkCopy_tbl_FirstTable.DestinationTableName = "dbo.tbl_FirstTable";
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("ID", "ID");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("UploadFileID", "UploadFileID");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("Active", "Active");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("CreatedUserID", "CreatedUserID");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("CreatedDate", "CreatedDate");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("UpdatedUserID", "UpdatedUserID");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("UpdatedDate", "UpdatedDate");
                        bulkCopy_tbl_FirstTable.WriteToServer(DT_tbl_FirstTable);
                    }

                    using (SqlBulkCopy bulkCopy_tbl_SecondTable = new SqlBulkCopy(connection, SqlBulkCopyOptions.KeepIdentity, transaction))                            
                    {

                        bulkCopy_tbl_SecondTable.BatchSize = 5000;
                        bulkCopy_tbl_SecondTable.DestinationTableName = "dbo.tbl_SecondTable";
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("ID", "ID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("UploadFileDetailID", "UploadFileDetailID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("CompaignFieldMasterID", "CompaignFieldMasterID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("Value", "Value");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("Active", "Active");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("CreatedUserID", "CreatedUserID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("CreatedDate", "CreatedDate");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("UpdatedUserID", "UpdatedUserID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("UpdatedDate", "UpdatedDate");
                        bulkCopy_tbl_SecondTable.WriteToServer(DT_tbl_SecondTable);
                    }


                    transaction.Commit();
                    IsSuccessSave = true;
                }
                connection.Close();
            }
        }
    }
    catch (Exception ex)
    {
        // If the failure happened inside the using block, Dispose has already
        // rolled the transaction back, and calling Rollback again would throw;
        // guard it so the original exception is the one that gets logged.
        try
        {
            if (transaction != null)
                transaction.Rollback();
        }
        catch (InvalidOperationException)
        {
            // Transaction was already completed or rolled back by Dispose.
        }

        Log4NetWrapper.WriteError(string.Format("{0} : {1} : Exception={2}",
                                    this.GetType().FullName,
                                    (new StackTrace(new StackFrame(0))).GetFrame(0).GetMethod().Name.ToString(),
                                    ex.Message.ToString()));

        if (ex.InnerException != null)
        {
            Log4NetWrapper.WriteError(string.Format("{0} : {1} : InnerException Exception={2}",
                                    this.GetType().FullName,
                                    (new StackTrace(new StackFrame(0))).GetFrame(0).GetMethod().Name.ToString(),
                                    ex.InnerException.Message.ToString()));
        }
    }

    return IsSuccessSave;
}

Finally, it performed the insert process for over 500,000 rows in less than 15 seconds.
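
For reference, a minimal caller sketch is below. The column names match the ColumnMappings above, but the column types and sample values are assumptions, since the real table schema is not shown:

// Hypothetical caller sketch: column names must match the ColumnMappings above;
// the column types and sample values below are assumptions.
DataTable dtFirst = new DataTable("tbl_FirstTable");
dtFirst.Columns.Add("ID", typeof(long));
dtFirst.Columns.Add("UploadFileID", typeof(long));
dtFirst.Columns.Add("Active", typeof(bool));
dtFirst.Columns.Add("CreatedUserID", typeof(long));
dtFirst.Columns.Add("CreatedDate", typeof(DateTime));
dtFirst.Columns.Add("UpdatedUserID", typeof(long));
dtFirst.Columns.Add("UpdatedDate", typeof(DateTime));
dtFirst.Rows.Add(1L, 100L, true, 7L, DateTime.UtcNow, 7L, DateTime.UtcNow);

DataTable dtSecond = new DataTable("tbl_SecondTable");
dtSecond.Columns.Add("ID", typeof(long));
dtSecond.Columns.Add("UploadFileDetailID", typeof(long));
dtSecond.Columns.Add("CompaignFieldMasterID", typeof(long));
dtSecond.Columns.Add("Value", typeof(string));
dtSecond.Columns.Add("Active", typeof(bool));
dtSecond.Columns.Add("CreatedUserID", typeof(long));
dtSecond.Columns.Add("CreatedDate", typeof(DateTime));
dtSecond.Columns.Add("UpdatedUserID", typeof(long));
dtSecond.Columns.Add("UpdatedDate", typeof(DateTime));
dtSecond.Rows.Add(1L, 1L, 3L, "sample value", true, 7L, DateTime.UtcNow, 7L, DateTime.UtcNow);

bool ok = Save_tbl_FirstTable_Vs_tbl_SecondTable(dtFirst, dtSecond);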

There are two reasons why I posted this scenario.

  1. I would like to share what I found out.
  2. As I am not perfect, I still need to get more suggestions from you.

So, any better solution would be appreciated.

Solution

1) Use EF6.x, which has much better performance than EF5.x
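
In EF6, DbSet.AddRange is also worth knowing here: it suspends change detection while adding, so it is much faster than calling Add once per entity. A minimal sketch, assuming a hypothetical EF6 DbContext named DummyDbContext for the same model:

// Hedged EF6 sketch: AddRange adds all entities in one call and suspends
// DetectChanges internally, unlike per-row Add. DummyDbContext is a
// hypothetical EF6 DbContext, not the question's ObjectContext.
using (var context = new DummyDbContext())
{
    context.tbl_FirstTable.AddRange(List_tbl_FirstTable);
    context.tbl_SecondTable.AddRange(List_tbl_SecondTable);
    context.SaveChanges();
}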

Here are more suggestions (from Bulk insert with EF)

2) Keep the active Context Graph small by using a new context for each Unit of Work

3) Turn off AutoDetectChangesEnabled - context.Configuration.AutoDetectChangesEnabled = false;

4) Batch the work: in your loop, call SaveChanges periodically (suggestions 2-4 are combined in the sketch below)
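
A minimal sketch combining suggestions 2), 3), and 4), again assuming the hypothetical EF6 DummyDbContext:

// Hedged sketch of suggestions 2)-4). Assumes using System.Linq for Skip/Take
// and a hypothetical EF6 DummyDbContext exposing DbSet<tbl_FirstTable>.
public void Save_tbl_FirstTable_Batched(List<tbl_FirstTable> rows)
{
    const int batchSize = 1000;

    for (int i = 0; i < rows.Count; i += batchSize)
    {
        // 2) A fresh context per batch keeps the change-tracking graph small.
        using (var context = new DummyDbContext())
        {
            // 3) Skip automatic change detection; this method only inserts.
            context.Configuration.AutoDetectChangesEnabled = false;

            foreach (tbl_FirstTable row in rows.Skip(i).Take(batchSize))
            {
                context.tbl_FirstTable.Add(row);
            }

            // 4) One SaveChanges per batch, rather than per row or one huge call.
            context.SaveChanges();
        }
    }
}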
