Bulk insert is not working properly in Azure SQL Server

Problem description

I'm not able to insert a bulk amount of data into an Azure SQL Server database using a C# Web API.

Consider:

I want to insert more than 60K rows into SQL. With my local SQL Server there is no problem, but with Azure SQL the connection times out.

My approaches (all of them work against the local SQL Server, but not against Azure SQL Server):

1) Tried inserting records one by one with EF (roughly 10 minutes for 10,000 rows, and it mostly times out).

2) Tried using the Bulk Insert Extension along with EF.

3) Tried SqlBulkCopy.

4) Tried increasing the connection timeout in the connection string.

5) Tried increasing the command timeout on the DbContext (a rough sketch of attempts 4 and 5 follows this list).
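
As a hedged illustration, attempts 4 and 5 might look roughly like this; the connection-string values are placeholders and DBModel is the EF context used in the code further down, so treat it as a sketch rather than the exact code from the question:

    // 4) Connection string with a longer connection timeout (placeholder values):
    //    "Server=tcp:yourserver.database.windows.net,1433;Database=yourdb;User ID=...;Password=...;Connection Timeout=120;"
    // 5) Longer command timeout on the EF6 DbContext:
    using (var db = new DBModel())
    {
        db.Database.CommandTimeout = 300;   // seconds; applies to commands EF issues through this context
        // ... perform the inserts ...
    }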

Exception stack trace

Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
System.Data.SqlClient.SqlException (0x80131904): Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception (0x80004005): The wait operation timed out
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync()
   at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket()
   at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer()
   at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value)
   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
   at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
   at System.Data.SqlClient.SqlBulkCopy.RunParser(BulkCopySimpleResultSet bulkCopyHandler)
   at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinuedOnSuccess(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
   at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)

Is there any solution for this, or any configuration that should be changed in Azure?

Update

Code used for the bulk insert:

    using (var dbConnection = new DBModel().Database.Connection as SqlConnection)
    {
        dbConnection?.Open();
        using (var sqlBulkCopy = new SqlBulkCopy(dbConnection))
        {
            try
            {
                // Column mappings set up here: each destination DB column is mapped
                // to the corresponding DataTable column.
                sqlBulkCopy.EnableStreaming = true;
                sqlBulkCopy.BulkCopyTimeout = 500;
                sqlBulkCopy.DestinationTableName = "LogTable";
                // dt is the populated DataTable
                sqlBulkCopy.WriteToServer(dt);
            }
            catch (Exception)
            {
                // Swallowing the exception here hides bulk-copy failures; rethrow (or log) instead.
                throw;
            }
        }
    }

Answer

I'd suggest you set sqlBulkCopy.BatchSize to a reasonable amount instead of inserting everything in one batch. Depending on the data you're inserting, try starting with 10,000 and work your way up or down until you're satisfied with the performance.
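
As a minimal sketch of that suggestion, reusing the SqlBulkCopy code from the question (the batch size of 10,000 is only a starting point to tune from):

    // using System.Data; using System.Data.SqlClient;
    using (var dbConnection = new DBModel().Database.Connection as SqlConnection)
    {
        dbConnection?.Open();
        using (var sqlBulkCopy = new SqlBulkCopy(dbConnection))
        {
            sqlBulkCopy.EnableStreaming = true;
            sqlBulkCopy.BatchSize = 10000;         // commit every 10,000 rows instead of all 60K+ at once
            sqlBulkCopy.BulkCopyTimeout = 500;     // seconds; raise this if batches still time out
            sqlBulkCopy.DestinationTableName = "LogTable";
            sqlBulkCopy.WriteToServer(dt);         // dt is the populated DataTable from the question
        }
    }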

Edit, for some extra clarification: when you consider your batch size, you need to take into account that SqlBulkCopy not only has to insert the data, but also read and send it. That last part is probably why it works on your local SQL Server but not on Azure. It also means that, if you're working with a large dataset, you will need a smaller batch size, or a considerably higher BulkCopyTimeout setting, so that each batch has a chance to finish before reaching the timeout limit.

You can read more about batch sizes in this post: What is the recommended batch size for SqlBulkCopy?

Other option:
I was reading up on this, and it could simply be because your insert reaches a critical DTU (Database Transaction Unit, basically a measure of the server's combined resources) usage point.

Performance levels are calibrated and governed to provide the needed resources to run your database workload up to the max limits allowed for your selected service tier/performance level. If your workload is hitting the limits in one of CPU/Data IO/Log IO limits, you will continue to receive the resources at the maximum allowed level, but you are likely to see increased latencies for your queries. These limits will not result in any errors, but just a slowdown in your workload, unless the slowdown becomes so severe that queries start timing out.

Taken from this link: https://azure.microsoft.com/da-dk/blog/azure-sql-database-introduces-new-near-real-time-performance-metrics/
Try starting the copy again while monitoring the DTU usage and see if it sits at 100% for long(er) periods. If that is the case, you might want to raise the pricing tier of the database.
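
If you prefer to check this from code rather than from the Azure portal, a sketch along these lines should work. It assumes the same connection string used for the bulk copy, and the helper name is chosen here purely for illustration; it queries sys.dm_db_resource_stats, a DMV that Azure SQL Database keeps with roughly one hour of 15-second samples:

    // using System; using System.Data.SqlClient;
    static void PrintRecentResourceStats(string connectionString)
    {
        // Resource usage is reported as a percentage of the limits of the current service tier.
        const string query = @"
            SELECT TOP (20) end_time, avg_cpu_percent, avg_data_io_percent, avg_log_write_percent
            FROM sys.dm_db_resource_stats
            ORDER BY end_time DESC;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(query, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Sustained values near 100% suggest the copy is being throttled by the DTU limit.
                    Console.WriteLine($"{reader[0]}  CPU {reader[1]}%  Data IO {reader[2]}%  Log IO {reader[3]}%");
                }
            }
        }
    }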
